Autonomous drones and ethical dilemmas

Autonomous drones and autonomous weapons raise ethical questions. A frank technical analysis of the military, legal and moral issues at stake in 2025.

In April 2025, autonomous drones occupy a central place in military and civilian strategy. These systems, capable of operating without direct human control thanks to artificial intelligence (AI), are redefining the field of action. The United States is allocating 3.7 billion euros to its drone programs for 2026, while China is deploying swarms of CH-901 drones, each weighing 9 kg, for coordinated missions. In the civilian sector, Amazon is aiming for 500 million annual deliveries by autonomous drone by 2030. But this autonomy raises stark questions. Who is to blame if a drone kills by mistake? Can a machine make a life-and-death decision without betraying humanity? International humanitarian law (IHL) struggles to keep pace with these technologies: in 2021, a Turkish Kargu-2 drone reportedly struck in Libya without human command, according to the UN. Autonomous weapons amplify the risks, from misidentification to conflict escalation. This article offers a technical, unfiltered analysis of the ethical implications, aimed at experts in the field. Drawing on figures and concrete cases, it explores these systems' capabilities, their limitations, and the frameworks needed to govern them.

Technological advances in autonomous drones

The systems and their autonomy

Autonomous drones rely on advanced AI algorithms. The American MQ-9 Reaper, 11 meters long, cruises at 370 km/h over a range of 1,850 km, navigating with LIDAR and thermal sensors. Its next version, scheduled for 2027, is slated to integrate autonomous target selection. In China, CH-901 drones tested in swarms achieve 95% accuracy at 5 km, according to 2024 trials. These aircraft use lithium-sulphur batteries (500 Wh/kg), roughly doubling endurance over the 250 Wh/kg of lithium-ion packs. General Atomics, with 2.5 billion euros in revenue in 2024, is driving these innovations. The systems analyze data in real time, but their reliability varies: a 2023 study found a 5% error rate in urban environments, attributed to ambiguous thermal signatures. That margin calls their use in populated areas into question. Costs are also climbing: a Reaper costs around 30 million euros, and the autonomous versions are expected to reach 50 million. Specialists must weigh these figures against the operational risks.
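To put the battery figures in perspective, here is a back-of-the-envelope endurance calculation. Only the two specific energies (250 and 500 Wh/kg) come from the text; the pack mass and cruise power below are hypothetical placeholders, and real endurance also depends on aerodynamics, losses and reserves.

```python
# Back-of-the-envelope endurance comparison for the two battery chemistries
# cited above. Only the specific energies (250 vs 500 Wh/kg) come from the
# article; pack mass and cruise power are hypothetical placeholders.

BATTERY_MASS_KG = 4.0    # hypothetical battery pack mass
CRUISE_POWER_W = 400.0   # hypothetical average power draw in cruise

def endurance_hours(specific_energy_wh_per_kg: float) -> float:
    """Ideal endurance = stored energy / cruise power (no reserves, no losses)."""
    stored_energy_wh = specific_energy_wh_per_kg * BATTERY_MASS_KG
    return stored_energy_wh / CRUISE_POWER_W

li_ion = endurance_hours(250.0)  # lithium-ion baseline
li_s = endurance_hours(500.0)    # lithium-sulphur

print(f"Li-ion: {li_ion:.1f} h, Li-S: {li_s:.1f} h ({li_s / li_ion:.0f}x endurance)")
```

For a fixed airframe and power draw, endurance scales linearly with specific energy, which is why the 500 Wh/kg figure translates directly into the doubled range claimed above.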

Technical flaws

Autonomy is not absolute. In Ukraine, since 2022, 30% of autonomous drones have failed because of Russian jamming, according to Kiev. During a 2023 NATO test, a drone mistook a civilian truck for a military target owing to a miscalibrated sensor. These errors reveal a dependence on external conditions. Drone swarms, although promising, saturate communication networks: a 2024 American test with 50 units showed a 2-second latency, fatal in combat. The algorithms also lack flexibility: faced with an adaptive enemy, as in Syria in 2023 where thermal decoys fooled drones, their effectiveness drops. These limitations demand a human in the loop to catch such failures. Experts know that the technology, despite its progress, remains an imperfect tool in complex scenarios.
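The attrition figure can be made concrete with a toy model. The sketch below assumes, hypothetically, that the 30% jamming failure rate quoted for Ukraine applies independently to each drone in a 50-unit swarm of the kind used in the 2024 American test; real jamming failures are spatially correlated, so this is an optimistic bound.

```python
import math

# Toy attrition model for the figures quoted above: a 30% per-drone jamming
# failure rate (the Ukraine figure) applied to a 50-unit swarm (the 2024 US
# test size). The independence assumption is ours and is optimistic.

N = 50          # swarm size
P_FAIL = 0.30   # per-drone failure rate under jamming

def prob_at_least(k: int, n: int = N, p_fail: float = P_FAIL) -> float:
    """P(at least k of n drones survive), binomial with independent failures."""
    p_survive = 1.0 - p_fail
    return sum(
        math.comb(n, s) * p_survive**s * p_fail**(n - s)
        for s in range(k, n + 1)
    )

print(f"Expected survivors: {N * (1 - P_FAIL):.0f} of {N}")
print(f"P(at least 40 survive): {prob_at_least(40):.2f}")
```

Even under these generous assumptions, a mission planner should expect only about 35 of 50 drones to remain effective, and the odds of keeping 80% of the swarm alive are slim.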

The ethical dilemmas of autonomous weapons

The responsibility for lethal decisions

Autonomous weapons, such as the Turkish Kargu-2 (7 kg, with on-board explosives), are redefining war. In 2021, this drone allegedly killed without human supervision in Libya, a first according to the UN. IHL requires distinction, proportionality and necessity. But can an AI exercise that judgment? A 2024 US simulation found that an autonomous drone would accept 20% more civilian casualties than a human operator would tolerate. Some, like Ronald C. Arkin, argue that machines outperform humans under stress: a 2019 study puts human error at 12% versus 3% for AI. Yet a machine has neither conscience nor guilt. If it kills a civilian, who is to blame? The developer? The commander? This opacity undermines accountability: autonomous weapons dilute the chain of command, making justice all but impossible under current frameworks.
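The difficulty becomes visible the moment one tries to write the rule down. The deliberately crude sketch below reduces the IHL proportionality test to a numeric threshold; every value in it is hypothetical, and the point is precisely that the threshold embeds a moral judgment the code can neither justify nor refuse.

```python
from dataclasses import dataclass

# A deliberately crude reduction of the IHL proportionality test to a numeric
# threshold. Every value here is hypothetical; the point is that the threshold
# itself is a moral judgment the algorithm cannot justify or refuse.

@dataclass
class StrikeAssessment:
    military_value: float          # estimated target value, 0..1 (hypothetical scale)
    expected_civilian_harm: float  # estimated civilian casualties

ACCEPTABLE_HARM_PER_VALUE = 2.0    # who sets this number, and on what authority?

def proportionality_check(a: StrikeAssessment) -> bool:
    """True if harm per unit of military value is under the hard-coded ratio.
    A human reviewer can refuse an 'acceptable' strike; this function cannot."""
    if a.military_value <= 0:
        return False  # no lawful target, no strike
    return a.expected_civilian_harm / a.military_value <= ACCEPTABLE_HARM_PER_VALUE

print(proportionality_check(StrikeAssessment(1.0, 1.5)))  # True: strike proceeds
print(proportionality_check(StrikeAssessment(0.5, 1.5)))  # False: strike aborted
```

The code runs, but nothing in it answers the article's question: the 2.0 is exactly where responsibility hides, and no developer, commander or procurement office can point to the machine to escape it.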

The risks of proliferation

Autonomous drones lower the human cost of waging conflict, encouraging their use. In 2023, Azerbaijan deployed Israeli Harop drones, killing 300 people in 48 hours, according to Human Rights Watch. Full autonomy could have amplified that carnage. Non-state groups are also acquiring the technology: ISIS builds €1,000 drones rigged with explosives. Cheap AI, available on the dark web, could equip such devices outside any state control. Conflicts thus intensify without moral restraint. In 2025, Russia is testing Marker drones, capable of strikes at 15 km, sold to unstable allies. This dissemination threatens global stability, a challenge experts cannot ignore.

The legal and social frameworks in question

The existing laws

IHL governs weapons in general, but not autonomous drones specifically. The United States tolerates 10% civilian casualties, according to a 2016 memo, while the EU, through a 2014 resolution, wants to ban lethal autonomous systems. At the UN, negotiations on LAWS (lethal autonomous weapons systems) are stalling in 2025, blocked by Russia and China. Manufacturers press ahead regardless: an autonomous drone project costs 500 million euros in R&D, according to General Atomics. Without clear rules, the major powers dominate, marginalizing smaller states. In France, Safran’s Patroller drones (20 million euros) remain semi-autonomous, but pressure is mounting for lethal versions. Lawyers in the sector point to a glaring legal vacuum in the face of this technological acceleration.

Societal expectations

NGOs such as Stop Killer Robots are campaigning for a total ban, and in 2015 Stephen Hawking and Elon Musk signed an open letter against autonomous weapons. Armies, however, see these systems as an asset. In Europe, 60% of citizens reject lethal autonomous drones (2023 survey), compared with 35% in the United States. China, for its part, exports CH-5 drones (10,000 km range) without public debate. Manufacturers are thriving: General Atomics is targeting 15 military contracts by 2025. This divide between society, industry and governments complicates any regulation. Experts must anticipate these tensions to guide future policy.
