An independent group of American scientists advising the Department of Defense (DoD) on artificial intelligence has recently published a report addressing the public’s persistent view of AI as some sort of killer robot or drone technology. To be clear, that view is incorrect.
Artificial intelligence is, and will continue to be, an integral part of DoD systems. Its main application within the DoD is augmenting human performance.
The fears surrounding AI arise from gloomy predictions about a very small area of research within the field: Artificial General Intelligence (AGI). AGI “is the pursuit of developing machines that are capable of long-term decision-making and intent, i.e. thinking like a real human”, according to the report.
Human-like autonomy is not around the corner; it is a long way off, if it is possible at all. Andrew Martin of the Tungsten Network put it well when he said, “AGI isn’t around the corner, it’s not even possible. I don’t mean that it’s ‘too difficult’ like ‘man will never fly’ or ‘man will never land on the moon’, I’m saying it’s hopelessly misguided like ‘man will never dig a tunnel to the moon’.”