Evaluating the risks posed by synthetic biology
Last week, nearly 70 experts from around the world gathered at EPFL for a workshop on potential threats arising from synthetic biology. Technologies developed in this field aim primarily at treating diseases and combating the effects of climate change, but they can also have unintended consequences or even be used for malicious purposes.
This four-day workshop was organized by EPFL’s International Risk Governance Center (IRGC@EPFL). We spoke with Marie-Valentine Florin, the Center’s executive director, about IRGC’s role at EPFL.
How can synthetic biology be used improperly or maliciously?
Synthetic biologists are able to construct new biological systems and functions with applications in energy, health care and farming. Yet these same technologies can also be hijacked to create potentially dangerous pathogens for which there is no known treatment.
So as researchers, we need to ask ourselves what we can do to advance science and technology for the common good while at the same time managing the risks of “dual-use” research, or research that can be turned against people or the environment.
The EPFL workshop, which was funded by NATO’s Science for Peace and Security Programme, gave scientists, national and international regulators, security agencies and businesses a chance to pool their expertise in order to clarify the causes and potential consequences of these risks and refine existing response strategies.
The working groups formed during the workshop will now turn their discussions into a report and a book covering the various problems, current and potential future solutions, existing obstacles and recommendations. Participants explored two kinds of risk: the failure of well-intentioned researchers to take necessary precautions, along with the consequences of that negligence, and the danger that knowledge or material will be used for malevolent purposes.
What is IRGC@EPFL?
IRGC was created by EPFL in 2017 to serve as a forum for cross-disciplinary dialogue on complex, uncertain and ambiguous risks – which are often a counterpart to opportunities. Our goal is to give policymakers the information they need to make decisions based on solid scientific and technological foundations.
Many technologies aim to reduce existing risks – think climate change, disease and natural disasters – and this is something public policies should promote. It is also essential to create positive incentives through targeted research programs, financial support, and standards that reward performance gains. But new dangers can always arise, and this is where IRGC comes in. Our recommendations are meant to highlight key risks and challenges and help identify possible strategies for dealing with them.
We have a dual mission. First, we seek to develop widely applicable risk governance concepts and instruments. Risk governance refers to the processes, standards, regulations and structures that come into play when risk-related decisions have to be made; it includes assessing, managing and communicating about risks with the involvement of the various stakeholders. Second, we issue recommendations, mainly for public policymakers, about how to manage the risks posed by certain emerging technologies.
Why does EPFL need IRGC?
All major universities around the world have an institute or center that studies the link between technology and public policy. The concept of risk is central because it is what justifies public intervention. Yet the risk governance approach we take encompasses more than simple risk management. For a university like EPFL, it means creating the conditions necessary for new technologies to be adopted. For example, we commonly recommend that new technological applications not only improve existing performance in some way but also be economically viable and generally responsible, i.e., socially acceptable and environmentally respectful.
Our role at EPFL is to answer researchers’ questions about such topics. We can also be more proactive in certain areas, such as insisting on the importance of taking cultural differences into account when assessing risk acceptability – in genome editing, for example – and raising awareness of the role and place of ethics in researchers’ work.
What is IRGC currently working on?
Our work is focused on two of EPFL’s areas of expertise: digitalization and life sciences. In digitalization, this year we are working on deepfakes – text, audio or video content that has been falsified in order to mislead or manipulate people. We’re looking at what steps can be taken to ensure machine learning is used to improve diagnostics, predictions and decisions, rather than to produce deepfakes. In the life sciences, we have set up a program to help decision-makers achieve the social, regulatory and economic conditions needed to promote fairness in the burgeoning field of precision medicine. This year, our focus is on value creation in this field.