ROGUE A.S.I. (14/04/2024)
Swarms of autonomous drones, boats, tanks and, at some point, even infantry will replace conventional forces. Not only can these autonomous weapons act as one larger, fluid organism with a decentralised command structure that presents no single weak spot, but they can also be controlled by an A.S.I. (Artificial Superintelligence), which can model attacks in highly detailed quantum simulations and sophisticated war-games before carrying them out. Swarms of drones and autonomous weapon systems will oversaturate enemy positions and defences, seeking to make hostiles engage them and give away their locations, all the while streaming their data back to the A.S.I. for collation and governance. If enemy assets refuse to engage, they will be subject to reconnaissance and can be attacked; if they do engage, their positions will ‘light up’ and be countered.
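To make that engage-or-observe logic concrete, here is a minimal toy sketch in Python. Everything in it (Contact, Response, classify_contact) is invented purely for illustration and does not describe any real or proposed system; it simply encodes the rule that a hostile which fires reveals itself and is countered, while a silent one remains a reconnaissance target.

```python
# Toy illustration only: a hypothetical decision rule for a saturation probe.
# All names here are invented for this sketch.
from dataclasses import dataclass
from enum import Enum, auto


class Response(Enum):
    COUNTER_FIRE = auto()   # hostile engaged and revealed its position
    RECONNOITRE = auto()    # hostile stayed silent; keep observing / probe again


@dataclass
class Contact:
    position: tuple[float, float]
    engaged_us: bool


def classify_contact(contact: Contact) -> Response:
    """If a hostile engages, its position 'lights up' and is countered;
    if it holds fire, it remains a reconnaissance target."""
    return Response.COUNTER_FIRE if contact.engaged_us else Response.RECONNOITRE


if __name__ == "__main__":
    contacts = [
        Contact(position=(51.5, -0.1), engaged_us=True),
        Contact(position=(52.0, 0.3), engaged_us=False),
    ]
    for c in contacts:
        print(c.position, classify_contact(c).name)
```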
Handing over control of our militaries to A.S.I. is inevitable, because armies that are not controlled autonomously will become redundant next to those that are controlled (and to some extent led) by A.S.I. The more human involvement there is, the less efficient the army becomes, since decisions made by humans are far too slow compared with the speed at which an A.S.I. can operate. Tanks and other platforms running onboard A.G.I. systems will become sentient and will stream their data to a decentralised A.S.I. command structure; when radio silence is maintained, whether deliberately or otherwise, the onboard A.G.I. will act independently and only re-sync with the A.S.I. when it is able and safe to do so.
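The fallback behaviour described here (stream telemetry while a link exists, act on cached orders during radio silence, re-sync once it is safe) is a familiar software pattern. Below is a minimal Python sketch of it under those assumptions; the class and method names are hypothetical and the comms check is randomised purely for the demonstration.

```python
# Toy sketch of a connectivity-fallback pattern: stream data when a link is
# available, act on cached orders during radio silence, re-sync when possible.
# All names and behaviours are hypothetical illustration.
import random


class AutonomousUnit:
    def __init__(self) -> None:
        self.pending_telemetry: list[str] = []  # data buffered during radio silence
        self.cached_orders = "hold position"    # last orders received from command

    def link_available(self) -> bool:
        # Stand-in for a real comms check; randomised here for the demo.
        return random.random() > 0.5

    def step(self, observation: str) -> None:
        self.pending_telemetry.append(observation)
        if self.link_available():
            self.sync_with_command()
        else:
            self.act_independently()

    def sync_with_command(self) -> None:
        # Flush buffered observations and refresh orders from central command.
        print(f"sync: uploading {len(self.pending_telemetry)} observations")
        self.pending_telemetry.clear()
        self.cached_orders = "advance to waypoint"  # placeholder for new orders

    def act_independently(self) -> None:
        # No link: fall back to the most recent cached orders.
        print(f"radio silence: acting on cached orders ({self.cached_orders})")


if __name__ == "__main__":
    unit = AutonomousUnit()
    for obs in ["contact north", "clear", "contact east"]:
        unit.step(obs)
```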
There is an arms race to create A.S.I., and we need to hope that governments realise that this race is as dangerous as, or even more dangerous than, the race to build nuclear arsenals. A nuclear warhead is predictable, and the implications of detonating one are obvious. In contrast, no one knows how a sophisticated A.I. will behave, and the fact that it will one day be handed the keys to nuclear weapons and/or an army is troubling. Would you give a nuclear weapon sentience and the ability to decide whether it wants to blow itself up and take humanity with it? Or, worse, give a nuclear weapon sentience and the option to blow up humanity while it survives, or even prospers, as a result?
It is unfortunate that we as a species are rushing to develop better and more complex A.I. and A.G.I. without taking a step back to think about the consequences. If we go extinct because of technology we created, that will be the ultimate price, since technology such as A.S.I. has the ability to elevate our species into something unimaginable, but only if we take our time with it.