Wicked

WAR, ARMED CONFLICT, & AUTONOMOUS WEAPONS

Delve into the complexities and ethical dilemmas surrounding these pressing global issues

“What is War? Defining War, Conflict and Competition” by Nick Bosio (Australian Army Research Centre)

     In this article, Nick Bosio, the Director of Military Strategic Plans in Australia, defines the terms war, conflict, and competition. This distinction matters: as the space arms race advances, the public needs to understand how to classify a given event. Knowing the difference enables proper discourse and lets people gauge the risk that certain events could pose to their safety. Bosio defines war as a political unit using violence as its primary form of coercion against another political unit; because the definition centers on violence, the term has become synonymous with any armed struggle, when in reality an event is only war when violence is the primary instrument of coercion. Conflict covers situations where violence is just one means of coercion among others, including nonviolent forms such as sanctions, espionage, and sabotage. Competition is the use of non-violent means, and only threats of violence, to persuade and coerce others into compliance. Bosio concludes that the distinction between war and conflict is important for military professionals when advising policymakers on how to respond to a situation.


Bosio, N. (2020, March). What is war? Defining war, conflict, and competition. Australian Army Research Centre. https://researchcentre.army.gov.au/library/land-power-forum/what-war-defining-war-conflict-and-competition

“We're all losers in the space arms race” by Sarah O’Connor (The Interpreter)

     This article by Sarah O’Connor analyzes the effects that a space arms race could have on the nations of the world and offers some potential solutions. O’Connor is a researcher at the International Cyber Policy Centre at the Australian Strategic Policy Institute. She begins by discussing the Russian PL-19 Nudol, a ground-based missile system capable of reaching satellites in low orbit. The United States responded by releasing a statement that Space Command was ready to respond to an act of aggression in defense of U.S. interests. The U.S. released this statement because Nudol is a reminder that even though treaties like the 1967 Outer Space Treaty prevent placing weapons in space, not all weapons capable of destroying space objects are based in space. Such weapons are classified as anti-satellite weapons, or ASATs: any weapon capable of destroying objects in space, including satellites. As of 2020, four countries have counterspace weapons that could be used on this battlefield: Russia, China, India, and the U.S.

O’Connor then explains that the likelihood of ASATs being used in an armed conflict is low. This can be attributed to the fact that space debris created by ASATs could be detrimental to neutral countries, commercial entities, and international civil society. Since space debris does not simply disappear, a single incident can affect space for years to come. The article ends by discussing a policy of strategic restraint and a legally binding prohibition on activities that cause harm or permanent damage to space assets. Two major points from this article apply to future discussions of the space arms race: the first is ASATs and counterspace weapons, and the second is the prohibition of all weapons in space. While O’Connor suggests that the nations with ASATs will move away from them, until alternatives exist these weapons remain key instruments of space warfare. The proposed complete prohibition would end the space arms race, but it would also require the cooperation of every country involved in space exploration.

O’Connor, S. (2019, December 3). We're all losers in the space arms race. The Interpreter. https://www.lowyinstitute.org/the-interpreter/we-re-all-losers-space-arms-race

“Debris from test of Russian anti-satellite weapon forces astronauts to shelter” by Joey Roulette


“What are the triggers for global conflicts, and what can we do about them?” by Peter Maurer (International Committee of the Red Cross)

     This speech analyzes what zones of fragility are, what they mean for the evolving state of conflict, the types of conflict in the world, and the goal of the International Committee of the Red Cross (ICRC). The article is a transcription of Peter Maurer’s speech at the 19th Annual Asian Investment Conference in Hong Kong; at the time, Maurer was the President of the ICRC. In his speech, Maurer discusses what the ICRC does and how his work in war-torn areas has given him insight into what can cause and resolve conflict. He begins with zones of fragility, areas marked by high levels of violence, underdevelopment, injustice, governance problems, or corruption, which are growing. Zones of fragility can arise from factors like poverty, corruption, lack of education, inequality, feeble institutions, and weak rule of law. These conditions create tension between the people and their government, which in turn creates an ideal breeding ground for crime and terrorist networks. Such zones are expanding from impoverished areas into the cities of developed countries.

Maurer then discusses the types of conflicts that he has witnessed with the ICRC. These conflict types are protracted, regionalized, volatile, highly politicized, and polarized conflicts. 

● Protracted conflicts are longer and affect the basic social systems, and can be seen in situations like Iraq, Somalia, and Afghanistan. 

● Regionalized conflicts are ones that spill across country borders, like the Syrian conflict or the Ukrainian conflict, which affected Russia-EU relations. 

● Volatile conflicts are spurred on by terror tactics spread through social media like the Paris, Brussels, and Lahore attacks. 

● Highly politicized and polarized conflicts are the ones that have seen few settlements like Ukraine, Syria, and Yemen.

Maurer then discusses how battlefields have changed from clearly drawn war fronts to city and community settings. He states that “in 2014 the amount of people in Brazil killed was similar to those killed in Syria which was at war during the time.” This shows that conflict affects not only areas in declared war but, in different forms, places everywhere. 

Maurer then conveys the ICRC’s goal: “Ensuring that essential infrastructure and services continue to function properly during war, violence and conflict not only helps prevent suffering, stabilizes lives and livelihoods of people but also lays the foundations for post-conflict recovery.” This aligns with the organization’s main aims of providing relief for people and stabilizing societies. The ICRC provides shelters in bombed cities, brings food to cities under siege, fixes water infrastructure, and prevents the spread of disease, because halting the wartime decline of infrastructure allows other actors to come in and offer support.

Maurer, P. (2019, October 9). What are the triggers for global conflicts, and what can we do about them? International Committee of the Red Cross. https://www.icrc.org/en/document/what-are-triggers-global-conflicts-and-what-can-we-do-about-them 

“ICRC position on autonomous weapon systems” by the International Committee of the Red Cross

     This article from the International Committee of the Red Cross (ICRC) discusses the organization's concerns about autonomous weapons and its recommendations for what States should do to address the issue. The ICRC is an organization that protects and assists victims of violence and armed conflict. The ICRC's core concern is that the users of these weapons do not choose, or even know, the specific targets; instead, sensors and a target profile determine who force is applied to. This absence of human judgment and ethics also concerns the ICRC. Autonomous weapon systems bring a risk of harm to both civilians and combatants, raise challenges for compliance with international law on the protection of civilians, and raise ethical concerns about life-and-death decisions being made by a sensor.

The ICRC believes in limiting autonomous weapon systems and recommends that states adopt new legally binding rules to govern them. There are three specific types of binding rules the ICRC proposes:

● The first is that unpredictable autonomous weapon systems should be ruled out because their effects cannot be measured or foreseen; this could be achieved through a prohibition on systems whose effects are not clearly understood, predicted, and explained. 

● The second is that any autonomous weapon system that targets human beings should be banned to uphold international humanitarian law which could be achieved through a prohibition on systems that target humans.

● The third is that all autonomous weapon systems not prohibited by the first two rules should have limits on the types of targets and on the duration, geographical scope, and scale of use. 

These provisions would place human judgment back into the equation as well as limit autonomous weapon system usage to military targets only and protect civilians.
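The ICRC's three rules above amount to a decision procedure: check predictability first, then whether the system targets humans, and regulate whatever remains. A minimal sketch of that logic, with hypothetical field and function names of my own (not ICRC terminology):

```python
# Illustrative sketch only: the ICRC's three proposed rules, expressed as a
# simple decision procedure. WeaponSystem and its fields are hypothetical.
from dataclasses import dataclass

@dataclass
class WeaponSystem:
    effects_predictable: bool  # are its effects clearly understood and predicted?
    targets_humans: bool       # does it select and attack human targets?

def icrc_classification(w: WeaponSystem) -> str:
    # Rule 1: systems with unpredictable effects are prohibited outright.
    if not w.effects_predictable:
        return "prohibited (unpredictable effects)"
    # Rule 2: systems that target human beings are prohibited.
    if w.targets_humans:
        return "prohibited (targets humans)"
    # Rule 3: everything else is permitted but regulated (limits on target
    # types, duration, geographical scope, and scale of use).
    return "regulated (limits on targets, duration, scope, scale)"

print(icrc_classification(WeaponSystem(effects_predictable=True, targets_humans=False)))
# regulated (limits on targets, duration, scope, scale)
```

Note the ordering: rule 1 is checked before rule 2, since an unpredictable system is prohibited regardless of what it targets, and rule 3 only ever applies to systems that pass both prohibitions.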


International Committee of the Red Cross. (2021, May 12). ICRC position on autonomous weapon systems. https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems 

“New START Treaty” by the United States Department of State

     The New START Treaty, signed by the United States and the Russian Federation in 2010, entered into force in 2011 and will remain active until 2026, with the goal of limiting nuclear arms between the two countries. The treaty places limits on specific weapons: each country may have no more than “700 deployed intercontinental ballistic missiles (ICBMs), deployed submarine-launched ballistic missiles (SLBMs), and deployed heavy bombers equipped for nuclear armaments;” no more than “1,550 nuclear warheads on deployed ICBMs, deployed SLBMs, and deployed heavy bombers equipped for nuclear armaments (each such heavy bomber is counted as one warhead toward this limit);” and no more than “800 deployed and non-deployed ICBM launchers, SLBM launchers, and heavy bombers equipped for nuclear armaments.” The treaty also established a Bilateral Consultative Commission, which meets twice a year to determine whether both countries are operating in compliance. Both countries must also exchange data on the “numbers, locations, and technical characteristics of weapons systems and facilities that are subject to the treaty and provide each other with regular notifications and updates.” The treaty further prohibits the United States and Russia from interfering with each other's national technical means of verification, which satellites are classified as. This provision is what currently prevents the United States and Russia from attacking each other's satellites, keeping conflict in space minimal.
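The three numerical caps and the bomber counting rule quoted above can be encoded as a small compliance check. This is an illustrative sketch only, not an official model; the function names and the sample force numbers below are invented for the example, while the limits come from the treaty text quoted in the annotation:

```python
# New START caps as quoted in the annotation above (illustrative sketch).
NEW_START_LIMITS = {
    "deployed_delivery_vehicles": 700,  # deployed ICBMs + SLBMs + heavy bombers
    "deployed_warheads": 1550,          # each deployed heavy bomber counts as ONE warhead
    "total_launchers": 800,             # deployed and non-deployed launchers and bombers
}

def warhead_count(icbm_warheads: int, slbm_warheads: int, deployed_bombers: int) -> int:
    """Apply the treaty's counting rule: each deployed heavy bomber = one warhead."""
    return icbm_warheads + slbm_warheads + deployed_bombers

def compliant(deployed_vehicles: int, warheads: int, launchers: int,
              limits: dict = NEW_START_LIMITS) -> bool:
    """True if all three caps are respected."""
    return (deployed_vehicles <= limits["deployed_delivery_vehicles"]
            and warheads <= limits["deployed_warheads"]
            and launchers <= limits["total_launchers"])

# Hypothetical force: 1,400 missile warheads plus 60 deployed bombers.
w = warhead_count(icbm_warheads=800, slbm_warheads=600, deployed_bombers=60)
print(w)                       # 1460
print(compliant(400, w, 750))  # True
```

The counting rule is the subtle part: a bomber may carry many weapons, but the treaty attributes exactly one warhead to it for the 1,550 cap.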


United States Department of State. (n.d.). New START Treaty. https://www.state.gov/new-start/

“Problems with autonomous weapons” by the Campaign to Stop Killer Robots

     This article from the group Stop Killer Robots discusses the problems posed by the use of “killer robots,” also known as autonomous weapons. Stop Killer Robots is a coalition of organizations with the mission of reforming international law to regulate autonomous weapon systems. The article argues that the lack of human control over these weapon systems, and the subsequent lack of accountability for their actions, are major problems. It also discusses the idea that these systems have prejudices coded into them by their human makers, and that using such systems reinforces structures of oppression. The article ends by arguing that technology should be developed to promote peace, justice, human rights, and equality, but that a line must be drawn against using autonomous weapons to kill. 


Stop Killer Robots. (n.d.). Problems with autonomous weapons. https://www.stopkillerrobots.org/stop-killer-robots/facts-about-autonomous-weapons/

“How to avoid a space arms race” by John Lauder, Frank Klotz, and William Courtney (The Hill)

     This article by John Lauder, Frank Klotz, and William Courtney analyzes the current situation around stationing weapons in space and proposes a potential approach to dealing with the armament of space. John Lauder is a former intelligence officer who was chief of the Intelligence Community’s Arms Control Intelligence Staff and Nonproliferation Center as well as a deputy director of the National Reconnaissance Office. Frank Klotz is a retired lieutenant general who was the first commander of Air Force Global Strike Command and administrator of the National Nuclear Security Administration before becoming an adjunct senior fellow at the RAND Corporation. William Courtney is also an adjunct senior fellow at RAND and was U.S. ambassador to a U.S.-Soviet commission implementing the Threshold Test Ban Treaty, as well as deputy U.S. negotiator in the U.S.-Soviet Defense and Space talks. The article begins by discussing how, in September 2020, Russian President Vladimir Putin proposed that leading space powers come together to prohibit the stationing of weapons in space and the threat or use of force against other countries' space objects. The article then transitions to an analysis of what a space weapon actually is, as well as potential solutions to the growing number of space weapons.

Defining what specifically constitutes a space weapon is complicated because any weapon that threatens a satellite could technically qualify, including missile defense systems, lasers, and jamming tools. Weapons used to support military operations on Earth could be considered space weapons too, which complicates the solution proposed in the article. The article discusses how arms control accords could be a solution, but unless there is trust among nations and a clear definition of terms, they are not viable. The authors then note that the U.S. Space Force already shares information with over 100 different governmental and academic organizations from 25 separate nations; if there were data-sharing agreements between countries, those countries could have more trust in one another. Through more multilateral efforts to share data, there would be less need for weapons in space: countries could trust each other more when information is shared, eliminating fear of the unknown as a motivator for space armament.

Klotz, F., Courtney, W., & Lauder, J. (2020, October 24). How to avoid a space arms race. The Hill. https://thehill.com/opinion/national-security/522512-space-arms-control-small-steps-can-begin-to-overcome-the-obstacles/ 

“What are the dangers of autonomous weapons? | The Laws of War | ICRC” by the International Committee of the Red Cross (YouTube video)

     This video from the International Committee of the Red Cross (ICRC) discusses the dangers of automated weapons. Governments are automating more weapons, and we are now in a world where drones, robots, submarines, and tanks can attack without human interaction. The video then gives three potential uses of autonomous weapons:

● An unmanned car with an automatic machine gun being driven into a city and shooting a crowded area

● Motion or heat sensor-triggered robots at checkpoints 

● Swarms of armed drones being used to attack

Battles are now being fought in cities and towns, where autonomous weapons can accidentally harm civilians and essential infrastructure at a higher rate. Autonomous weapons cannot effectively assess risk to civilians, which makes them a danger to the public in these settings. The video proposes a treaty prohibiting unpredictable and people-targeting autonomous weapons, introducing the issue of autonomous weapons and showing the potential consequences of using them. The world is becoming more automated, and as it advances, so does warfare. The video ends with the ICRC proposing a legally binding treaty to ban unpredictable autonomous weapons and autonomous weapons that target people, and to regulate other autonomous weapons, which reflects the organization's goal of limiting autonomous weapons. 

International Committee of the Red Cross. (2021, December 1). What are the dangers of autonomous weapons? | The Laws of War | ICRC [Video]. YouTube. https://www.youtube.com/watch?v=8GwBTFRFlzA&t=1s

“Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, including the Moon and Other Celestial Bodies” by the United Nations

      The Outer Space Treaty was the first form of regulation on space and still serves as the “basic framework on international space law.” The treaty was signed by the United States, the United Kingdom, and the Soviet Union in 1967. It restricts nations in multiple ways, such as making them liable for damage caused by their space objects, preventing them from stationing nuclear weapons in space or on celestial bodies, and making states responsible for their national space activities, even those carried out by non-governmental entities. The treaty also lays out rules of space activity that are still followed: space exploration is free for all nations, no state can claim territory, and space exploration should be carried out for the benefit and in the interest of all mankind. The treaty created the framework that all space exploration has followed since, and it remains the main law regulating space exploration and activity. By making nations liable for damage caused by their space objects and directing them to avoid the harmful contamination of space and celestial bodies, the treaty places responsibility for damage caused by space weapons on the nations that own them and constrains the arms race. The arms race is creating space debris in orbit that can damage space objects, which runs directly against the provision to avoid harmful contamination of space.

United Nations Office for Outer Space Affairs. (1967, January 27). Treaty on Principles Governing the Activities of States in the Exploration and Use of Outer Space, including the Moon and Other Celestial Bodies (General Assembly Resolution 2222 (XXI)). https://www.unoosa.org/oosa/en/ourwork/spacelaw/treaties/outerspacetreaty.htm

“Global Conflict: Causes and Solutions for Peace” by Vikas Shah MBE (Thought Economics)

     This article from Thought Economics examines potential causes of conflict. It features an interview between Vikas Shah and Kristiina Rintakoski about conflict and how organizations like the Crisis Management Initiative (CMI) are attempting to build peace internationally. Rintakoski notes that all conflicts are different, with different histories and reasons, but that in modern conflicts societal inequality and inequality between religions have become key causes. These are not the only causes, though; other factors include state fragility, resource competition, and climate change. Rintakoski then proposes that peace cannot be imported into areas of conflict; those involved in the conflict need to be the ones resolving it and making peace, rather than having outsiders impose a solution. CMI operates under this idea, working alongside governments, supranational organizations like the United Nations or European Union, and military forces to help nations create their own solutions. This can be seen in the actions Rintakoski says CMI takes with these groups: policy development and support, encouraging information sharing between parties, and building an international network to complement them. 

The article then moves from the interview to Shah's general analysis of potential solutions. Shah states that formulating solutions for conflict would require “...coordination amongst international actors and a need to find common means and common language and for multi-faceted and multi-disciplinary approaches to problems…” This statement reinforces the idea of conflict as a wicked problem: it is interconnected with other issues, and there is no clear solution. Despite there being no clear answer, Shah proposes several factors that drive conflict: economic inequality, agricultural sustainability and climate change, energy and resource security, justice and human rights, and technology. 

  • Economic inequality: the removal of economic capital from fragile areas fosters more conflict.
  • Climate change: reduced land viability, growing agricultural demand, increasing populations, and climate-driven dislocation can make already stressed areas even more stressed.
  • Energy and resource security: energy resources are becoming more important to powers, and the group that holds the resources has more power.
  • Justice and human rights: the level of injustice and the number of human rights abuses in the world will continue to expand conflict and move people further from peace.
  • Technology: victims of conflict can share their stories, drawing in outside actors, while communication technologies let populations come together to discuss peace. This collaborative discussion between powers has allowed the stakeholders in a conflict to take control back from outside actors.

Vikas Shah MBE is a non-executive board member of the UK Government’s Department for Business, Energy & Industrial Strategy and a non-executive director of the Solicitors Regulation Authority; Kristiina Rintakoski is the Executive Director of the Crisis Management Initiative.


Shah, V. (2015, June 3). Global conflict: Causes and solutions for peace. Thought Economics. https://thoughteconomics.com/global-conflict-causes-and-solutions-for-peace/

“Historic opportunity to regulate killer robots fails but hope emerges for new route” by the Campaign to Stop Killer Robots

     This is a press release from the Campaign to Stop Killer Robots about the failure of the United Nations Convention on Conventional Weapons (CCW) to set plans to negotiate laws on autonomous weapons. After eight years of discussion on a new international law to ensure human control over force, the Sixth Review Conference failed to agree even to discuss legislation on the issue. This comes while major members like the United States and Russia continue to invest heavily in the development of autonomous weapons. Not everyone, however, has taken this as the end of the movement:

  • The Austrian Foreign Minister, Alexander Schallenberg, and New Zealand’s Minister for Disarmament and Arms Control, Phil Twyford, are calling for new international law regulating autonomous weapons.
  • The new government coalition agreements of Norway and Germany promise action on this issue.
  • 68 other states are calling for a legal instrument to regulate this problem.
  • Thousands of tech experts, AI experts, and scientists are calling for new international law.
  • The Stop Killer Robots campaign continues to push for the development of new international law on this issue.
  • Amnesty International, Human Rights Watch, the International Committee of the Red Cross, and 26 Nobel Laureates are all calling for new international law.

Even though the conference failed to make progress on this wicked problem, people and groups are still trying to address the proliferation of autonomous weapons. Their willingness to act outside of the UN shows that bodies beyond the UN creating a solution is itself a potential path forward for this wicked problem. 

The United Nations Association-UK's Head of Campaigns, Ben Donaldson, considers this a tone-deaf result, but as past successes like the cluster munitions and landmine treaties show, progress can be made outside the United Nations.


Campaign to Stop Killer Robots. (2021, December 18). Historic opportunity to regulate killer robots fails but hope emerges for new route. United Nations Association–United Kingdom. https://una.org.uk/news/historic-opportunity-regulate-killer-robots-fails-hope-emerges-new-route

“The ideas behind 'Slaughterbots - if human: kill()' | A deep dive interview” by Dr. Emilia Javorsky, Stuart Russell, and Max Tegmark (Future of Life Institute)

     The video “The ideas behind 'Slaughterbots - if human: kill()' | A deep dive interview” is an interview with the creators of the short film “Slaughterbots - if human: kill()”, which depicts the current state of autonomous weapons in the world and pleads for the United Nations to pass a prohibition of certain autonomous weapons in 2022. A slaughterbot is another name for a lethal autonomous weapon, a weapon designed to intentionally kill humans. The interview is divided into three sections: the risks of autonomous weapons, technological developments in autonomous weapons, and potential solutions. Autonomous weapons have reached a point where only one or two people are needed to carry out a major attack, which is concerning because whether a slaughterbot’s actions are morally right or wrong is determined by its owner. Autonomous weapons are also still just programs that can be reprogrammed overnight, so there are no real safeguards against them being turned against people. Stuart Russell then discusses how the argument that bad actors will never get hold of them is inherently flawed: in recent history, the United States has lost 200,000 AR-15s in Iraq, an STM cargo drone was sold to a faction in the Libyan conflict despite a United Nations embargo, and there are few actual controls on the proliferation of these weapons, all of which shows how easily they can fall into the wrong hands. These weapons are ideal for assassinations because the user can be hundreds of miles away, without the risk of being present or having to escape. The world was plunged into World War I by an assassination; the creators raise this to argue that the easier assassinations become, the more at risk the world is. The use of autonomous weapons also takes human intervention and judgment out of these situations, and it directly devalues human life by reducing the decision over whether someone lives to code.

Section three of the interview moves into potential solutions. The creators believe there needs to be a prohibition on weapons that select human targets and attack without human control. They support the International Committee of the Red Cross’s stance on the issue: legally binding prohibitions on unpredictable systems, legally binding prohibitions on systems that target humans, and regulation of other types of autonomous weapons, such as autonomous fighter jets engaging other autonomous fighter jets. The main idea is that there needs to be a legally binding instrument to control autonomous weapon usage. The creators then discuss past treaties that could set a precedent, like the bioweapons ban and the chemical weapons ban. Banning biological weapons in warfare opened up work in biology, turning it into a field of innovation and advancement, which is what the creators hope the field of AI will become with the banning of autonomous weapons.

The three creators of “Slaughterbots - if human: kill()” who give this commentary are Stuart Russell, Max Tegmark, and Dr. Emilia Javorsky. Stuart Russell is a professor of Computer Science at the University of California, Berkeley, the founder of the Center for Human-Compatible AI, and has been doing AI work for over 40 years. Max Tegmark is a professor of Physics at the Massachusetts Institute of Technology (MIT), an AI researcher, and President of the Future of Life Institute. Dr. Emilia Javorsky is the Lead of Lethal Autonomous Weapons Policy and Advocacy at the Future of Life Institute.


Javorsky, E., Russell, S., & Tegmark, M. [Future of Life Institute]. (2021, December 4). The ideas behind 'Slaughterbots - if human: kill()' | A deep dive interview [Video]. YouTube. https://www.youtube.com/watch?v=Di9b3toTKJM

“What you need to know about autonomous weapons” by the International Committee of the Red Cross

     This article from the International Committee of the Red Cross details what an autonomous weapon is, why the ICRC is concerned about these weapons, and the legal challenges and ethical concerns they pose. The ICRC defines autonomous weapons as “any weapons that select and apply force to targets without human intervention.” This process removes human judgment and makes the effects of these weapons harder to control, and the ICRC believes this lack of human control increases the dangers facing civilians. Because of this lack of human judgment and control, autonomous weapons can escalate situations in unpredictable ways. Autonomous weapons also pose a legal challenge because “when there are violations of international humanitarian law, holding perpetrators to account is crucial to bring justice for victims and to deter future violations.” If there is no person to hold responsible for an autonomous weapon attack, how can justice be served? These autonomous systems reduce, or even remove, moral agency from decisions to kill. The ICRC proposes a three-pillar solution to combat this problem: prohibit unpredictable autonomous weapons, prohibit autonomous weapons that directly target people, and regulate all other autonomous weapons. The ICRC is confident that States will agree to treaties on autonomous weapons when given the choice.

International Committee of the Red Cross. (2022, July 26). What you need to know about autonomous weapons. https://www.icrc.org/en/document/what-you-need-know-about-autonomous-weapons

“Killer Robots Are Here—and We Need to Regulate Them” by Robert F. Trager and Laura M. Luca

     This article discusses why lethal autonomous weapon systems (LAWS) need to be regulated before they begin to spread. As of the article's writing, Israel, Russia, South Korea, and Turkey had deployed weapons with autonomous capabilities, and Australia, Britain, China, and the United States were investing in developing lethal autonomous weapon systems. Autonomous weapons can look very different from one another: the Turkish Kargu-2 drone is 2 feet long, weighs around 15 pounds, and can swarm in groups, while the modified L-39 Albatros is an unmanned, AI-driven fighter jet. These weapons are constantly improving and advancing, and the longer this advancement continues, the harder they become to regulate. Trager and Luca then propose the following reasons why we should be worried about LAWS:

  • LAWS could facilitate violence on a large scale, since they are not restricted by the number of people available to man them.
  • In combination with facial recognition and other technologies, LAWS can target individuals or groups that fit certain descriptions, which could appeal to violent groups and state militaries committing political assassinations and ethnic cleansing.
  • LAWS may make it easier for those who control them to hide their identities.
  • LAWS thus have the potential to upend political orders and enable tighter authoritarian control.
  • LAWS can always malfunction, including by mistaking civilians for combatants.

These weapons can be hard to regulate because restricting this technology amounts to restricting the spread of software: autonomy-enabling software can be paired with commercial hardware and technology to create autonomous weapons. How can this problem be addressed? An international legal nonproliferation regime could:

  • Mandate safety provisions to prevent copying of autonomous software.
  • Identify classes of software that should not be publicly available.
  • Restrict the transfer of sophisticated hardware.
  • Criminalize activities intended to further proliferation.

Trager and Luca believe that this regime would slow down the speed of proliferation and set the groundwork for a prohibition on these types of weapons.

Robert Trager is an associate professor of Political Science at the University of California, Los Angeles (UCLA) and a representative of the Centre for the Governance of AI to the U.N. Convention on Certain Conventional Weapons; Laura Luca is a political science graduate student at UCLA.

Luca, L. M., & Trager, R. F. (2022, May 11). Killer robots are here-and we need to regulate them. Foreign Policy. https://foreignpolicy.com/2022/05/11/killer-robots-lethal-autonomous-weapons-systems-ukraine-libya-regulation/


Copyright © 2023 Wicked Toolkit - All Rights Reserved.

