Drones: Autonomy and AI for Targeting and Killing


Discover the technical and ethical implications of autonomy and artificial intelligence in military drones.

Military drones are increasingly integrating artificial intelligence technologies to enhance their autonomy. This raises technical questions about the level of control that can be entrusted to machines, especially in lethal scenarios. This article examines in detail the levels of autonomy, the role of AI in decision-making, and the concerns associated with the use of autonomous military drones.

Levels of Autonomy in Military Drones

Military drones, or Unmanned Aerial Vehicles (UAVs), are classified based on their level of autonomy. These levels range from full human remote control to total autonomy without human intervention. According to NATO, five levels of autonomy are generally distinguished:

  • Level 1: Full Teleoperation – the drone is controlled in real time by a human operator.
  • Level 2: Decision Assistance – the drone provides data and recommendations, but the human makes the final decisions.
  • Level 3: Conditional Autonomy – the drone performs certain tasks autonomously under human supervision.
  • Level 4: High Autonomy – the drone can accomplish complex missions with minimal human intervention.
  • Level 5: Total Autonomy – the drone operates independently without any human intervention.

Currently, most military drones operate between levels 2 and 3. For example, the MQ-9 Reaper used by the U.S. military can perform surveillance missions by following predefined paths while being supervised by an operator. However, critical decisions, such as engaging targets, require human validation.
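To make the classification concrete, here is a minimal sketch of how mission software might encode these levels and the rule that target engagement always requires human validation. The task categories and level thresholds are illustrative assumptions for this sketch, not a fielded standard:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """NATO-style autonomy levels, matching the list above."""
    TELEOPERATION = 1
    DECISION_ASSISTANCE = 2
    CONDITIONAL_AUTONOMY = 3
    HIGH_AUTONOMY = 4
    TOTAL_AUTONOMY = 5

# Hypothetical task categories, mapped to the minimum level at which
# a platform may perform them without real-time human validation.
AUTONOMOUS_TASKS = {
    "navigation": AutonomyLevel.CONDITIONAL_AUTONOMY,
    "surveillance": AutonomyLevel.CONDITIONAL_AUTONOMY,
    "mission_replanning": AutonomyLevel.HIGH_AUTONOMY,
    "target_engagement": AutonomyLevel.TOTAL_AUTONOMY,  # the contested case
}

def may_act_autonomously(level: AutonomyLevel, task: str) -> bool:
    """True if `task` may run without a human in the loop at `level`."""
    # Unknown tasks default to requiring the highest level: fail closed.
    return level >= AUTONOMOUS_TASKS.get(task, AutonomyLevel.TOTAL_AUTONOMY)

# A level-3 platform like the Reaper example above may fly surveillance
# routes on its own, but never engage a target without approval.
assert may_act_autonomously(AutonomyLevel.CONDITIONAL_AUTONOMY, "surveillance")
assert not may_act_autonomously(AutonomyLevel.CONDITIONAL_AUTONOMY, "target_engagement")
```

The fail-closed default on unknown tasks reflects the safety posture described in this article: anything not explicitly cleared for autonomy falls back to human control.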

With advances in AI, level 4 and 5 drones are under development. The European nEUROn combat drone demonstrator aims to test advanced autonomy capabilities, including the ability to adapt missions in real time. According to a report by the Teal Group, global military drone spending could reach €93 billion over the next decade, reflecting the growing interest in more autonomous systems.

Increasing the level of autonomy poses technical challenges. System reliability must be ensured in complex and unpredictable environments, and software safety, resistance to cyberattacks, and the ability to make lawful, ethically constrained decisions are crucial aspects to consider. One standard reliability mechanism is a defined fail-safe behavior when the command link or the software degrades, as sketched below.
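A minimal sketch of such a fail-safe, assuming a simplified link-status model and made-up timing thresholds:

```python
from enum import Enum, auto

class FlightMode(Enum):
    MISSION = auto()
    LOITER = auto()          # hold position and wait for the link to recover
    RETURN_TO_BASE = auto()

# Assumed thresholds for this sketch (seconds without telemetry),
# not values from any fielded system.
LOITER_AFTER_S = 5.0
RTB_AFTER_S = 30.0

def failsafe_mode(seconds_since_last_link: float) -> FlightMode:
    """Degrade gracefully the longer the command link stays silent."""
    if seconds_since_last_link >= RTB_AFTER_S:
        return FlightMode.RETURN_TO_BASE
    if seconds_since_last_link >= LOITER_AFTER_S:
        return FlightMode.LOITER
    return FlightMode.MISSION

assert failsafe_mode(1.0) is FlightMode.MISSION
assert failsafe_mode(10.0) is FlightMode.LOITER
assert failsafe_mode(60.0) is FlightMode.RETURN_TO_BASE
```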

The Role of AI in Target Identification and Decision-Making

Artificial intelligence is essential for enabling military drones to analyze the environment, identify targets, and assess threats without human intervention. Machine learning algorithms and computer vision allow drones to process large amounts of data in real time.

For example, drones can use neural networks to recognize specific military vehicles or detect suspicious movements. Advanced sensors combined with AI are intended to help distinguish military targets from civilians and thus reduce the risk of collateral damage, though this remains far from guaranteed in practice.
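In practice, such systems attach a confidence score to every detection, and anything ambiguous is deferred to a human operator rather than acted upon. A minimal sketch of that triage step, with a stubbed detection type standing in for a real neural network (the class names and the threshold are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "armored_vehicle", "civilian_vehicle"
    confidence: float  # model confidence in [0, 1]

# Assumed policy for this sketch: only high-confidence detections of
# known military classes are escalated for human review; nothing is
# ever engaged automatically.
MILITARY_CLASSES = {"armored_vehicle", "artillery"}
REVIEW_THRESHOLD = 0.90

def triage(detections: list[Detection]) -> list[Detection]:
    """Return only the detections worth escalating to a human operator."""
    return [d for d in detections
            if d.label in MILITARY_CLASSES and d.confidence >= REVIEW_THRESHOLD]

dets = [Detection("armored_vehicle", 0.95),
        Detection("civilian_vehicle", 0.97),
        Detection("armored_vehicle", 0.60)]
print(triage(dets))  # only the first detection is escalated
```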

However, the accuracy of these systems depends on the quality of the data and algorithms used. The MIT Media Lab's Gender Shades study found that commercial facial recognition systems misclassified some demographic groups at error rates of up to roughly 35%, highlighting the risk of bias and error in critical contexts.
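The operational consequence of such error rates is easy to underestimate because of base rates: when genuine targets are rare, even a seemingly accurate classifier produces mostly false alarms. A back-of-the-envelope Bayes calculation, with illustrative numbers rather than measured ones:

```python
# Illustrative assumptions: 1% of observed objects are genuine targets,
# the classifier finds 90% of them, and mislabels 5% of non-targets.
prevalence = 0.01
true_positive_rate = 0.90
false_positive_rate = 0.05

# P(genuine target | flagged), by Bayes' rule.
p_flagged = (true_positive_rate * prevalence
             + false_positive_rate * (1 - prevalence))
precision = true_positive_rate * prevalence / p_flagged
print(f"{precision:.1%} of flagged objects are genuine targets")  # ~15.4%
```

Under these assumptions, roughly five out of six objects the system flags are not targets at all, which is why human review of each detection matters so much.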

In terms of decision-making, some experimental drones are equipped with systems capable of autonomous planning. They can adjust their trajectory, avoid unexpected obstacles, and react to emerging threats. In 2013, the U.S. Navy's X-47B demonstrated autonomous landings on an aircraft carrier, a complex task requiring extreme precision.
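Trajectory adjustment around an unexpected obstacle is, at its core, a graph-search problem. A toy grid-based A* sketch illustrates the idea (real planners work in three dimensions with kinematic and threat constraints; the grid and unit costs here are assumptions for illustration):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid; grid[r][c] == 1 marks an obstacle."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
    frontier = [(h(start), 0, start, [start])]
    visited = set()
    while frontier:
        _, cost, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in visited:
            continue
        visited.add(pos)
        r, c = pos
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no route around the obstacle

grid = [[0, 0, 0],
        [1, 1, 0],  # an unexpected obstacle forces a detour
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # detours through the right-hand column
```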

Despite these advances, allowing drones to make lethal decisions without human supervision is controversial. Lethal autonomous weapon systems (LAWS) are the subject of debate within the international community, with some calling for a preventive ban.

Concerns Related to Machine Control in Lethal Scenarios

Delegating lethal decisions to machines raises ethical and legal concerns. An autonomous drone exercises no moral judgment, so in ambiguous situations it may take actions that a human operator would never have sanctioned.

One of the main issues is the difficulty in ensuring that drones comply with international humanitarian law, which requires distinction between combatants and non-combatants and proportionality in the use of force. Autonomous drones might not be capable of evaluating the complex context of combat situations.

Additionally, in case of error or malfunction, it is difficult to determine responsibility. Who is held accountable in case of violations of the laws of war: the software designer, the military commander, or the human operator?

The risks of cyberattack also grow with autonomy. Adversaries may attempt to hack or deceive AI systems to provoke undesirable behavior. In 2011, a U.S. RQ-170 Sentinel drone came down in Iran and was captured, reportedly after its navigation was spoofed.
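A common defensive measure against navigation spoofing is to cross-check the GPS fix against an independent dead-reckoning estimate from inertial sensors and flag large divergences as suspect. A minimal two-dimensional sketch, with a made-up divergence threshold:

```python
import math

def spoof_suspected(gps_fix, dead_reckoned, max_divergence_m=50.0):
    """Flag the GPS fix if it disagrees too much with inertial dead reckoning.

    Positions are (x, y) in meters in a local frame; the 50 m default
    threshold is an assumption for this sketch, not a fielded value.
    """
    return math.dist(gps_fix, dead_reckoned) > max_divergence_m

print(spoof_suspected((100.0, 200.0), (105.0, 198.0)))   # False: plausible fix
print(spoof_suspected((100.0, 200.0), (900.0, -300.0)))  # True: suspect fix
```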

These concerns have led to international calls to regulate or ban autonomous weapons. According to Human Rights Watch, more than 30 countries support a preventive ban on LAWS.


Consequences of Increasing Autonomy in Military Drones

The rise of autonomous drones has significant implications for global security and conflict dynamics. On one hand, these drones can reduce risks for soldiers by performing dangerous missions. They can also increase operational efficiency by quickly processing complex information.

On the other hand, the ease of deploying autonomous drones could lower the threshold for entering into conflict, making military engagements more likely. The proliferation of these technologies could also lead to an arms race, with nations seeking to surpass each other’s autonomous capabilities.

Autonomous drones could also be used by non-state actors. In 2018, modified commercial drones were used in an attack against the Venezuelan president, demonstrating the potential for these technologies to fall into the wrong hands.

Finally, there are concerns regarding the impact on human rights and civil liberties. Increased surveillance by autonomous drones could lead to privacy violations and excessive control.

Toward Regulation of Autonomy in Military Drones

In light of these challenges, it is essential to develop international regulations to govern the use of autonomous military drones. The United Nations has initiated discussions, notably under the Convention on Certain Conventional Weapons (CCW), to establish norms on LAWS.

Researchers and engineers also have a role to play by integrating ethical principles into the design of AI systems. Protocols for meaningful human control have been proposed to guarantee that lethal decisions are never entirely automated; one minimal form of such a protocol is sketched below.
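As an illustration of what meaningful human control can mean in software, here is a minimal sketch in which no engagement request can proceed without a recorded human decision, and every decision is appended to an audit trail. All names and structures here are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EngagementRequest:
    target_id: str
    rationale: str  # what the system believes about the target, and why

@dataclass
class HumanDecision:
    operator_id: str
    approved: bool
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

AUDIT_LOG: list[tuple[EngagementRequest, HumanDecision]] = []

def decide(request: EngagementRequest, decision: HumanDecision) -> bool:
    """Record the human decision; the system never proceeds without one."""
    AUDIT_LOG.append((request, decision))
    return decision.approved

req = EngagementRequest("T-042", "high-confidence detection, no civilians observed")
print(decide(req, HumanDecision("op-7", approved=False)))  # False: vetoed
print(len(AUDIT_LOG))  # 1: the veto itself is auditable
```

The point of the audit trail is accountability: it ties every decision to an identifiable human, which speaks directly to the responsibility gap raised earlier in this article.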

Simultaneously, international collaboration is necessary to prevent the uncontrolled proliferation of these technologies and to ensure they are used responsibly.

War Wings Daily is an independent magazine.