We’ve had many discussions about autonomous weapons and AI use in the military throughout this podcast series. In this episode, Wanda Muñoz and Richard Moyes join us to discuss what action needs to be taken to create a legal framework for autonomous weapons to prevent killer robots.
Richard is Managing Director of Article 36, a specialist non-profit organisation focused on reducing harm from weapons. Wanda is a member of SEHLAC, where her current focus lies in banning autonomous weapons. You can follow Richard on LinkedIn or Twitter (@rjmoyes) and Wanda on LinkedIn or Twitter.
“It’s not just about technology in the military. It’s about the kind of society we want.” — Wanda Muñoz
Although technology already plays a massive part in the military, fully autonomous weapons are still in development and have yet to be deployed. One of Wanda and Richard's main concerns in this conversation is the role of the human. Today's weapons, such as drones, still have humans in control; autonomous weaponry would take humans out of the loop in life-and-death decisions.
“Using algorithms to kill people opens another conversation about how we use AI in other parts of society.” — Richard Moyes
Wanda and Richard argue that machines should not decide whether someone lives or dies. Machines do not see people as people, and mistakes are highly likely; what would the consequences be if a machine killed the wrong person? Using algorithms to kill people opens up a whole new conversation about how we want to use AI in other parts of society: if we allow algorithms to decide who to kill, where do we draw the line?
Like all AI systems, autonomous weapons will raise many design issues. In our interview, Richard made a great point: humans themselves have always struggled to differentiate between an enemy and an ally, so how could we correctly train a killer algorithm? Wanda also pointed to the racial bias already documented in facial recognition technology. These biases have caused serious problems in policing, and they are likely to be amplified if the same technology is deployed in the military.
What do you think about a future with autonomous weapons? Join the conversation in our Slack channel!