The beginning of the 2020s has seen a rise in state violence. In 2022, the Russia-Ukraine war created disruptions across the world. More than two years on, the conflict seems far from its end. In October 2023, war broke out in the Middle East; since October 7, Israel has been on a killing spree. While the two conflicts are not related, one thing is common between them: the use of AI weapons in which human control is minimal and ‘computers’ are trained to hit targets. The destruction this kind of warfare can unleash is all too predictable, which is why a global conference in Vienna has urged the world to establish rules to regulate AI. The two-day conference (April 29-30) drew around 1,000 political leaders and members of civil society from over 140 countries. As the conference put it: “This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity.”
The use of remotely piloted drones and small robots has long been common in the West, but the large-scale deployment of such weapons in the war on Gaza has highlighted the extent of destruction they can cause. Although the Vienna conference did not specifically address the conflict in the Middle East, recent reports expose how unarmed civilians were killed with help from the AI targeting systems Lavender and Where’s Daddy. In some instances, the system misidentified a bicycle as an RPG, and the civilians carrying it were bombed. Such incidents show that these systems strictly follow the data sets they were trained on and lack the nuance needed in urban warfare. Governments around the world need to understand that the mistakes made today set a dangerous precedent that will allow the warmongers of tomorrow to justify their war crimes. Decades later, the atomic bombing of Hiroshima and Nagasaki remains a stark reminder of how human greed for power and thirst for invincibility lead to horrific crimes against humanity. To justify their relentless bombardment of Gaza, Israeli leaders have frequently cited the bombing of Japan as a precedent that allows them to do whatever they think is right for their defence.
Given this context, it is encouraging that leaders are aware of the situation and want to change course. But regulating AI is no small challenge. AI does not have a single task: it can do almost anything it is trained to do, and this poses unique governance problems. For example, the use of AI targeting systems in war results in the avoidable deaths of civilians and even unarmed militants. It also absolves military leaders of their responsibility for ensuring minimum civilian harm. Countries need to work together to address the governance issues AI has created. If the last eight months are anything to go by, Western superpowers are fast losing their grip on the world. Technology is meant to serve humans; it must not be allowed to turn into a threat. The world has to act now.