The Dual-Use Dilemma in Open-Source Robotics

Open-source technology developed in the civilian sector can also be used in military applications, or simply misused. Navigating this dual-use potential is becoming more important across engineering fields, because innovation flows in both directions. While the “openness” of open-source technology is part of what drives innovation and gives everyone access, it unfortunately also means the technology is just as accessible to others, including militaries and criminals.

What happens when a rogue state, a nonstate militia, or a school shooter displays the same creativity and innovation with open-source technology that engineers do? This is the question we are discussing here: How can we uphold our principles of open research and innovation to drive progress while mitigating the inherent risks that come with accessible technology?

Rather than treating this as an open-ended risk, let’s examine the specific challenges that open-source technology and its dual-use potential pose for robotics. Understanding these challenges can help engineers recognize what to look for in their own disciplines.

The Power and Peril of Openness

Open-access publications, software, and educational content are fundamental to advancing robotics. They have democratized access to knowledge, enabled reproducibility, and fostered a vibrant, collaborative international community of scientists. Platforms like arXiv and GitHub and open-source initiatives like the Robot Operating System (ROS) and the Open Dynamic Robot Initiative have been pivotal in accelerating robotics research and innovation, and there is no doubt that they should remain openly accessible. Losing access to these resources would be devastating to the robotics field.

However, robotics carries inherent dual-use risks, since most robotics technology can be repurposed for military or otherwise harmful purposes. The recent use of custom-made drones in ongoing conflicts is particularly instructive. The resourcefulness displayed by Ukrainian soldiers in repurposing, and sometimes augmenting, civilian drone technology has received worldwide, often admiring, news coverage. Their creativity has been made possible by the affordability of commercial drones, spare parts, and 3D printers, and by the availability of open-source software and hardware. These conditions allow people with little technological background or money to easily create, control, and repurpose robots for military applications. One can certainly argue that this has had an empowering effect on Ukrainians defending their country. However, these same conditions also present opportunities for a wide range of potential bad actors.

Openly available knowledge, designs, and software can be misused to enhance existing weapons systems with capabilities like vision-based navigation, autonomous targeting, or swarming. Additionally, unless proper security measures are taken, the public nature of open-source code makes it vulnerable to cyberattacks, potentially allowing malicious actors to gain control of robotic systems and cause them to malfunction or be used for malevolent purposes. Many ROS users already recognize that they do not invest enough in cybersecurity for their applications.

Guidance Is Necessary

Dual-use risks stemming from openness in research and innovation are a concern for many engineering fields. Did you know that engineering was originally a military-only activity? The word “engineer” was coined in the Middle Ages to describe “a designer and constructor of fortifications and weapons.” Some engineering specializations, especially those involving weapons of mass destruction (chemical, biological, radiological, and nuclear), have developed clear guidance and, in some cases, regulations for how research and innovation can be conducted and disseminated. They also have community-driven processes intended to mitigate the dual-use risks associated with spreading knowledge. For instance, bioRxiv and medRxiv, the preprint servers for biology and the health sciences, screen submissions for material that poses a biosecurity or health risk before publishing them.

The field of robotics, in comparison, offers no specific regulation and little guidance on how roboticists should think about and address the risks associated with openness. Dual-use risk is not taught in most universities, despite being something that students will likely face in their careers, such as when assessing whether their work is subject to export-control regulations on dual-use items.

As a result, roboticists may not feel they have an incentive, or are equipped, to evaluate and mitigate the dual-use risks associated with their work. This is a major problem, because harm from the misuse of open robotics research and innovation is arguably more likely than harm from the misuse of nuclear or biological research, both of which require significantly more resources. Producing “do-it-yourself” robotic weapon systems from open-source designs and software and off-the-shelf commercial components is relatively easy and accessible. With this in mind, we think it’s high time for the robotics community to work toward its own set of sector-specific guidance for how researchers and companies can best navigate the dual-use risks associated with the open diffusion of their work.

A Road Map for Responsible Robotics

Striking a balance between security and openness is a complex challenge, but one that the robotics community must embrace. We cannot afford to stifle innovation, nor can we ignore the potential for harm. A proactive, multipronged approach is needed to navigate this dual-use dilemma. Drawing lessons from other fields of engineering, we propose a road map focusing on four key areas: education, incentives, moderation, and red lines.

Education

Integrating responsible research and innovation into robotics education at all levels is paramount. This includes not only dedicated courses but also the systematic inclusion of dual-use and cybersecurity considerations within core robotics curricula. We must foster a culture of responsible innovation so that we can empower roboticists to make informed decisions and proactively address potential risks.

Educational initiatives could range from stand-alone courses on responsible research and innovation to dual-use and cybersecurity modules embedded within core robotics curricula.

Incentives

Everyone should be encouraged to assess the potential negative consequences of making their work fully or partially open. Funding agencies can mandate risk assessments as a condition for project funding, signaling their importance. Professional organizations, like the IEEE Robotics and Automation Society (RAS), can adopt and promote best practices, providing tools and frameworks for researchers to identify, assess, and mitigate risks. Such tools could include self-assessment checklists for individual researchers and guidance for how faculties and labs can set up ethical review boards. Academic journals and conferences can make risk assessment an integral part of the peer-review and publication process, especially for high-risk applications.

Additionally, incentives like awards and recognition programs can highlight exemplary contributions to risk assessment and mitigation, fostering a culture of responsibility within the community. Risk assessment can also be encouraged and rewarded in more informal ways. People in leadership positions, such as Ph.D. supervisors and heads of labs, could build ad hoc opportunities for students and researchers to discuss possible risks. They can hold seminars on the topic and provide introductions to external experts and stakeholders like social scientists and experts from NGOs.

Moderation

The robotics community can implement self-regulation mechanisms to moderate the diffusion of high-risk material. This could involve:

  • Screening work prior to publication to prevent the dissemination of content posing serious risks.
  • Implementing graduated access controls (“gating”) for certain source code or data on open-source repositories, potentially requiring users to identify themselves and specify their intended use (a minimal sketch of such a workflow appears after this list).
  • Establishing clear guidelines and community oversight to ensure transparency and prevent misuse of these moderation mechanisms. For example, organizations like RAS could define risk-level categories for robotics research and applications and create a monitoring committee to track and document real cases of misuse, helping the community understand the scale of the risks and develop better mitigation strategies.
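To make the idea of gating concrete, here is a minimal sketch, in Python, of what a maintainer-run access-request workflow might look like. Every name in it (AccessRequest, review_request, PROHIBITED_USES) is a hypothetical illustration rather than an existing tool; a real repository would back such a workflow with authenticated accounts, human review of flagged requests, and a durable audit trail rather than an in-memory list.

```python
"""Hypothetical sketch of a gated-release workflow for a high-risk artifact."""

from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessRequest:
    """Information a requester supplies before downloading a gated artifact."""
    requester: str          # verified identity, e.g. an institutional e-mail
    affiliation: str        # lab, company, or organization
    intended_use: str       # free-text statement of purpose
    submitted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Example policy terms that maintainers have decided are off-limits.
PROHIBITED_USES = ("weapon", "autonomous targeting", "surveillance of individuals")

# Every decision is recorded so a community oversight body can audit it later.
AUDIT_LOG: list[tuple[AccessRequest, bool]] = []


def review_request(req: AccessRequest) -> bool:
    """Approve a request only if it names an affiliation and no prohibited use."""
    statement = req.intended_use.lower()
    flagged = any(term in statement for term in PROHIBITED_USES)
    approved = not flagged and bool(req.affiliation.strip())
    AUDIT_LOG.append((req, approved))
    return approved


if __name__ == "__main__":
    req = AccessRequest(
        requester="researcher@university.edu",
        affiliation="Example Robotics Lab",
        intended_use="Benchmarking legged-robot controllers for search and rescue",
    )
    print("approved" if review_request(req) else "queued for manual maintainer review")
```

The point of the sketch is the pattern, not the specific checks: requesters state who they are and what they intend, uses that cross a stated red line are never silently approved, and every decision is logged so that the kind of monitoring committee described above has something to review.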

Red Lines

The robotics community should also seek to define and enforce red lines for the development and deployment of robotics technologies. Some efforts have already been made in that direction, notably in the context of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Companies including Boston Dynamics, Unitree, Agility Robotics, Clearpath Robotics, ANYbotics, and Open Robotics have also written an open letter calling for regulations on the weaponization of general-purpose robots. Unfortunately, these efforts have been narrow in scope, and there is a lot of value in further mapping the end uses of robotics that should be deemed off-limits or that demand extra caution.

It will certainly be difficult for the community to agree on standard red lines, because what is considered ethically acceptable or problematic is highly subjective. To support the process, individuals and companies can reflect on what they consider to be unacceptable uses of their work. This reflection could result in policies and terms of use that beneficiaries of open research, designs, and software would have to formally agree to (such as specific-use open-source licenses). These terms would provide a basis for revoking access, denying software updates, and potentially suing or blacklisting people who misuse the technology. Some companies, including Boston Dynamics, have already implemented such measures to some extent, and any person or company conducting open research could follow their example.

Openness is the key to innovation and the democratization of many engineering disciplines, including robotics, but it also amplifies the potential for misuse. The engineering community has a responsibility to proactively address the dual-use dilemma. By embracing responsible practices, from education and risk assessment to moderation and red lines, we can foster an ecosystem where openness and security coexist. The challenges are significant, but the stakes are too high to ignore. It is crucial to ensure that research and innovation benefit society globally and do not become a driver of instability in the world. This goal, we believe, aligns with the mission of the IEEE, which is to “advance technology for the benefit of humanity.” The engineering community, especially roboticists, needs to be proactive on these issues to prevent any backlash from society and to preempt potentially counterproductive measures or international regulations that could harm open science.
