Part of what I mean here is that we should be conscious not to continue our trend of war, from tribes to factions to religions to countries, onward to other races and "species". Encountering non-human entities may sound like a far-off prospect, but we are approaching sentient AI very quickly, along with other types of intelligent life and changes to our own human intelligence that could make one human very different from another. This is why I use the word "everything" instead of "everyone": we may not consider AI to be people in the classic sense of the word.
I also mean that the best path to take in a situation (especially in political decisions) would be better found not by asking ourselves "What would I gain from this?" but by asking "What would make the universe a better place?" Not just our group. Not just our state. Not just our country. Not even just our world.
Of course, we needn't consider the entire universe in its entirety when making every decision, but we should always take the broadest, most rational view necessary. This would be extremely unifying for factions that would otherwise be bound for war, even if only one side took up this more universal, rational mentality. Not only that, but global issues like CO2 emissions genuinely require this kind of thinking. We cannot afford to be so selfish.
This, of course, relies on a judgement of what "a better place for everything" means, which will depend on the views and knowledge of those making the decision.
At its core, this is just a slight modification of secular humanism, one that considers more than just the good of humanity.
Perhaps this is unrealistic, and perhaps this is flawed. Let me know either way.