The Lanius (Latin for “butcher”) is a kamikaze drone with artificial intelligence.
Marguerite Meyer and Ariane Luthi
You see a shattered city. Soldiers are searching for enemy fighters. There's a bang; someone shouts: "We're under fire. Send in Lanius!" Eight small drones rise, close in on the enemy soldiers and kill them one by one.
These are not scenes from a war film, but from a new commercial for the Israeli arms manufacturer Elbit Systems. The Lanius (Latin for “butcher”) is an artificial intelligence (AI) kamikaze drone. It can navigate independently, even find its way through narrow building openings and recognize human targets.
Killing machine at your fingertips
The soldier only has to press a button and the drone becomes a killing machine: it fires at a target, or lunges at it and explodes – the seamless dovetailing of human combat decisions, drone technology and AI. One day, the man or woman at the console may no longer be needed at all. Today, the weapon is still configured so that a flesh-and-blood human being has to trigger the deadly action – and bears responsibility for it – but sooner or later this intermediate step could be programmed away.
What the weapons in the Elbit video clip can do is also being worked on in Switzerland: in basic research at Swiss universities. Davide Scaramuzza reports in his office at the University of Zurich in Oerlikon: "When my team and I saw the Elbit video, we were shocked." The professor heads the "Robotics and Perception" research group, and he and his team are world leaders in this field. Among other things, Scaramuzza and his team are developing artificial intelligence that teaches drones acrobatic maneuvers, making them usable even in complex environments. In a video from the university, one of these aircraft darts through a forest, weaving between trees, around buildings and in through windows. The resemblance to the technology seen in Elbit's videos is unmistakable.
Armies around the world are working on such technologies
In 2021, Israel's military sent a swarm of such drones into the Palestinian Gaza Strip for the first time. A human was still involved in the chain of action. The year before, a Turkish Kargu-2 autonomous drone was deployed in Libya. The exact circumstances are unclear, but it is possible that this drone was already operating without human intervention. In addition to Israel and Turkey, other armies are also working on such technologies, including those of the USA, China, Great Britain and India.
AI researchers have long been alarmed. They fear an arms race in autonomous weapons. In 2017, the so-called "Slaughterbots" video made the rounds, showing mini-drones hunting down specific people without anyone controlling them.
"It's about weapons of mass destruction that would be accessible to everyone." – Max Tegmark, professor of physics
Leading scientists from all over the world have been calling for years for more effective regulation of such research. They want to prevent autonomous weapons from becoming the Kalashnikovs of tomorrow. "Many politicians have not understood the technology well enough," says Max Tegmark, professor of physics at the Massachusetts Institute of Technology in Boston, the leading technical university in the USA. "It's about weapons of mass destruction that would be accessible to everyone." The American military expert Zachary Kallenborn compares the dangers of armed drone swarms with those of chemical and biological warfare agents.
Scientific appeal to politics
Eight Swiss researchers made an urgent appeal to politicians in November 2021. In their paper, which has since been published and is available to SonntagsBlick, they call on Federal Councillor Ignazio Cassis to ensure that algorithms are never allowed to decide over people's life and death. Without government regulation, lethal autonomous weapons could become a reality in as little as ten years. Cassis replied: "The Federal Council shares many of the legal, ethical and security concerns that scientists and researchers have expressed about such weapons." Switzerland wants to work internationally for appropriate regulation, of the kind that already exists worldwide for chemical and nuclear weapons.
The problem: an international agreement is currently unrealistic (see interview with Reto Wollenmann). The US military expert Kallenborn sees it the same way: "The major military powers do not want to give up any weapons that could be of use to them."
Switzerland, too, is not letting that stop it from pulling out all the stops to stay at the forefront as a prime location for drone development. The country's technical universities are among the best in the world. In terms of the quality of scientific publications and their influence on research, Switzerland even takes first place. The Zurich area is considered the "Silicon Valley of robotics" – thanks to Google and Co., but also to its top university laboratories.
Visiting the drone laboratory
SonntagsBlick visits Professor Scaramuzza in one such laboratory. Drones are displayed in showcases in the corridor. One room is set up as a flight hall, with obstacles on the floor and nets hanging from the ceiling – for safety, should a drone stray off course.
Scaramuzza co-signed the letter to Cassis. He's not one to shy away from debate. The head of research takes two hours to explain to us how his laboratory works – and how his applications, which seem so similar to those of the Israeli armaments company, function. As early as 2009, Scaramuzza's team achieved a breakthrough: the construction of a camera-equipped drone that can fly autonomously, without any GPS. Since then, one success has followed another. The professor led a European project that developed an autopilot – today the patent is used millions of times. One of his team members went to NASA and brought technology developed in Switzerland to Mars. The laboratory's first spin-off was bought by Facebook in 2016 and went on to develop the Oculus Quest, the leading virtual-reality headset.
Scaramuzza talks about his work with enthusiasm: his team is developing new sensors that will allow drones to fly even through smoke, and AI algorithms that enable robots to take over human tasks.
Challenge: Good technology in the wrong hands
"Of course, that raises a lot of ethical questions," says the professor. "Anything that can be used for good purposes can also be used for bad ones." This is a well-known challenge in robotics – and always has been. In the same breath, Scaramuzza clarifies: "The same algorithms we use to fly these drones were used for breast cancer screening. They have already saved millions of lives. Should we ban them? No."
The AI-controlled drones are still too imprecise to be used in war. But research is advancing. That is why Scaramuzza believes now is the right time to ask: "How do you make sure that the technology isn't misused?"
Scaramuzza himself worked on a project funded by the US military research agency DARPA. Pure basic research, as he emphasizes. He took part in a drone race organized by the armaments company Lockheed Martin and demonstrated in Dübendorf ZH in 2021 that an AI can fly faster than a human pilot. However, no software was made available to the American military; it was merely informed of the results before publication. Nor did his team provide any code to the armaments company.
No exchange between the University of Zurich and the armaments company
But: vision-based control, the kind he researches, plays a key role in military applications. So what is the connection between the new weapon systems from the arms company Elbit and his research? "They use similar algorithms," explains Scaramuzza.
The researcher says there is neither direct contact nor technology transfer between his laboratory and Elbit. "I condemn any military application of our technology," he adds unequivocally. Any cooperation with external companies is vetted by the university. Dual-use goods, i.e. goods that can be used for both civil and military purposes, require approval from the university management and the State Secretariat for Economic Affairs (SECO). The media office of the University of Zurich confirms this. However, Scaramuzza does see indirect connections: academic publications are mostly freely accessible, and employees take their know-how with them when they change jobs – even if they move to an armaments company.
How to mitigate risk
The researcher emphasizes that progress thrives on the free exchange of knowledge; any censorship is dangerous. But there are ways to mitigate the risks. "Researchers can keep parts of a code to themselves or only pass them on under license," says Scaramuzza. This is already being done today – for ethical or commercial reasons. However, there are still no structured processes for this in Switzerland.
Some universities abroad, by contrast, have already begun asking researchers about such risks. The professor: "That's good, because it opens researchers' eyes."
The research was supported by a grant from the Journafonds.