Professor Noam Lubell, Director of the Essex Armed Conflict and Crisis Hub, is behind the call for new guidance on the use of Artificial Intelligence in weapon systems, which are being actively pursued by militaries around the world.
The call comes as the House of Lords has launched the AI in Weapon Systems Committee in response to concerns about evolving military technology.
While giving evidence to the Committee in March, Professor Lubell emphasised that attention must be focused on the use of AI across the targeting cycle, rather than only on Terminator-like robots marching across a battlefield.
He warned that problems already identified in AI detection systems used by law enforcement must be addressed so that they are not repeated in military systems used for the detection and identification of targets.
Professor Lubell, of Essex Law School, went on to tell the Lords that he agreed with the UK Government’s view that existing international humanitarian law can be applied to the use of AI in weaponry, but insisted that a lot of work was needed to unpack what this means in practice.
He added that the general statements currently relied upon were unlikely to provide effective parameters for AI warfare.
Professor Lubell said: “There appears to be growing agreement on some form of context-appropriate human involvement in the use of weapons, but far more guidance is needed in applying this to the very different contexts that may arise, from basic object identification systems through to advanced swarm technology weapons.
“Regardless of whether we rely upon existing international humanitarian law or create new rules to prevent unlawful use of such systems, the perceived military advantages mean that some states – and non-state actors – may pay less heed to the rules, leading to a dangerous arms race to the bottom.”
While addressing the Lords Committee, Professor Lubell noted that the issue of AI determining life and death decisions is not unique to military contexts and can also be found to varying degrees in the fields of self-driving cars, traffic, law enforcement, and health.
He added: “AI is coming into critical life and death decisions across every realm of society and there is a much bigger question about human-machine interaction and how we incorporate AI into our society across all realms.
“A better understanding and guidance for how AI might be used by the military is crucial, but is only part of a much bigger picture.”