The main problem with all that [1] is a hidden assumption of super aliens, omnipotent and omniscient, just because they have space travel and big ships. That's a science fiction trope but in reality space travel does not imply the ability to e.g. alter the trajectory of the moon or any other such world-ending capability. It just implies space travel.
As a for instance, humans could travel to Alpha Centauri using nothing but modern technology if we were more resistant to space radiation and lived (a lot) longer. So maybe the aliens live a thousand years and they are shielded against radiation. Maybe their ships are ten miles long because of all the radiation shielding, or maybe their tech is bulky and they haven't figured out e.g. how to miniaturise electronic components so they need big ships for all those mainframe-sized computers. Or maybe the aliens are a few miles long each, themselves, so they need space to wiggle their tails.
Just because they're "aliens" doesn't mean they're "super omnipotent godly aliens". And the same thing goes for future "AI". A lot in this kind of discussion hinges on the assumption that an artificial intelligence would be some kind of super computer god with no end of capability. Says who?
________
[1] Apart from the fact that a minority of humans have cancer and would need that elixir. Also: "we never defeated the bugs".