May 28, 2015, by Brigitte Nerlich
Ta(l)king responsibility
In social science and policy circles there has been a lot of talk about Responsible Research and Innovation (RRI). However, nobody quite knows yet what this means and how it works in the context of harsh economic realities. In the meantime, natural scientists have taken responsibility for their research and innovations in the context of new developments in genomics and synthetic biology: gene editing using CRISPR.
This is not new. Forty years ago, scientists also took responsibility in the context of recombinant DNA, made their concerns public, sought public views and implemented guidelines, regulations and so on. In the following I’ll first explain how scientists took responsibility 40 years ago, then how they are doing so now and what this might mean for RRI and making science public. This is quite a long post. However, I think it needs to be, as it shows us how much still needs to be thought about and done to make RRI work, not only with ‘the public’ but also with ‘the scientists’.
Recombinant DNA
In 1974, molecular biologists expressed their concerns about possible dangers posed by new methods they had developed to splice and recombine DNA. They also temporarily stopped work using this new technology. A year later, in 1975, and together with the US National Institutes of Health (NIH) and the National Academy of Sciences, they organised a conference in Asilomar, California. At that conference “the biologists decided among themselves what restrictions should apply to various types of genetic manipulation” (Rasmussen, 2015). As scientists, they focused, of course, first and foremost “on laboratory safety but not wider social concerns” (ibid.). However, their reflections on the risks and benefits of this new biotechnology were very detailed, wide-ranging and thorough, as one can see when looking at a few letters published in Science in 1976 and 1977.
I came across these letters by accident after a Twitter exchange about designer babies and the slippery slope (topics that are quite popular in this era of CRISPR), when somebody tweeted something about ‘genetic meddling’ and the Daily Mail. I hadn’t come across this phrase before, so I put it into a certain search engine and almost the first item that came up was a 1976 letter to Science entitled “On the dangers of genetic meddling”, arguing that the Asilomar group, as one might call it, had neglected to think about the far-reaching dangers of recombinant DNA and just wanted, like “Dr Frankenstein”, to continue producing “little biological monsters” – this certainly sounds quite Daily Mailish, avant la lettre!
Immediately following that letter appears one by a member of Friends of the Earth calling for opening up debate about all sorts of issues the scientists had neglected to tackle, including “the imposition of complex medical decisions on individuals and society, and the inherent fallibility (not to mention corruptibility) of inspection, enforcement, and regulatory bodies”. A tall order!
More interestingly, two letters by those involved in Asilomar took up the gauntlet. One expressed surprise to find that what “began as an act of responsibility by scientists” had now become “the breeding ground for a horde of publicists” (one might call it hype today). The author, Stanley N. Cohen, tried to correct some misunderstandings about both the risks and benefits of recombinant DNA, but, more importantly, he tried to dispel some worrying myths which were circulating at the time. One myth relates to people thinking that scientists wanted to protect their freedom of inquiry and continue with experiments regardless of the dangers they may pose. “Instead”, Cohen writes, “the history of this issue is one of self-imposed restraint by scientists from the very start”. Importantly, “their concern was made public” so that those less well informed could also use restraint.
These actions of restraint, self-regulation and risk communication had some unforeseen and unanticipated consequences. “The scientific community’s response has been to establish increasingly elaborate procedures to police itself – but these very acts of scientific caution and responsibility have only served to perpetuate and strengthen the general belief that the hazards under discussion must be clear-cut and imminent in order for such steps to be necessary.”
This is interesting, as RRI is intended to anticipate the unexpected. So this is one aspect of the future that we need to keep a close eye on when anticipating the effects of RRI and its implementation.
I’ll now come to the last letter that appeared in Science in response to the ‘genetic meddling’ one, written by Maxine F. Singer and Paul Berg, key members of the Asilomar group. They again stress that “we were among those who first publicly expressed concern over the potential hazards of recombinant DNA experiments” and among the first to call for “a voluntary deferral of certain experiments”. They point out that they “intervened early and assumed responsibility” before any other agency did or could have done at that time, given that research into recombinant DNA was not widely known or understood. They also stress that “[a]cceptance of responsibility in this matter by the past and present directors of NIH was courageous, farseeing, and proper”.
What is important here is that they took responsibility as soon as ethical and regulatory concerns presented themselves to them – this is what one may call ‘upstream (ethical) engagement’. What about openness and making science (and responsibility) public? This is what they have to say on that point: “The discussions on recombinant DNA have been public since their beginning. The matter has been widely reported by the public press. The publicity permitted all concerned individuals and groups to enter the deliberations. No datum has been classified and no commentary has been withheld from the public. Indeed, most policy has been developed in public sessions.”
(For people interested in delving into the history of “Recombinant DNA Technologies and Researchers’ Responsibilities, 1973-1980”, these Paul Berg papers might be a good starting point.)
This little bit of historical research into early discussions around recombinant DNA (which involves “[p]recisely snipping bits of DNA from one organism and transposing them into others, using enzymes as molecular ‘scissors’”; Jasanoff, 2011), opens up a new vista on responsibility and openness and should give us pause for thought when dealing with similar issues today.
Gene editing
Scientists have now developed a new gene snipping technology called CRISPR, which allows them not only to recombine DNA but, in a sense, to precision-engineer it. Again scientists, taking responsibility, have called for a moratorium and a ‘prudent path’ forward. And not only that. As Sheila Jasanoff points out in an article published in The Guardian at the beginning of April 2015, the scientists also recommend four actions: “a hold on clinical applications; creation of expert forums; transparent research; and a globally representative group to recommend policy approaches”.
However, even before all this could be implemented, and well before the more democratic approach of public deliberation recommended by Jasanoff could get into gear, Chinese researchers had experimented with CRISPR to ‘edit’ (non-viable) human embryos. And again, they took responsibility. At the end of April they made their (mainly negative) results public, a decision that was, one can argue, quite courageous, honest and ethical.
They published their results in the online journal Protein & Cell, whose editor then wrote an editorial defending the decision to publish, while calling for restraint and ethical, social and legal reflection: “Until a consensus on new regulatory rules can be reached, it is in the best interest of all parties that the research field should voluntarily avoid any study that may pose potential safety and/or ethical risks. Only by holding themselves to the highest standards will scientists retain the public’s trust in biomedical research, and at the same time, provide the best service for the well-being of our society.” You can read the paper about the failed experiments here. Carl Zimmer has published an excellent summary of this affair here.
Conclusion
In a letter to Nature Filippa Lentzos of King’s College London said: “The original Asilomar meeting failed to engage the public in discussions, which we now know is crucial to the regulatory decision-making process. Had it done so, the resulting guidelines on recombinant DNA might have extended to legislation covering all users – including the military and commercial sectors – and not just those funded by the US National Institutes of Health.” (Lentzos, Nature, 21 May 2015, vol. 521, p. 289). This might be so.
However, is it realistic to ask scientists, who voluntarily call for guidelines to govern their research, which is their domain of expertise, and who call on the public to judge it, also to take responsibility for establishing guidelines that reach well beyond their domains of expertise and into all sectors of society? Would that not be arrogant and hubristic? Isn’t this rather the time and place for social scientists and policy makers, as well as ethics and forecasting experts, to step in and, instead of talking about responsibility and openness, to take responsibility, to ‘do’ RRI in the way they see fit, and to involve the public, and of course scientists too, in that enterprise, drawing on and learning from (some) scientists’ long experience in taking responsibility?
PS
I had just finished drafting this post (25 May) when the results of a survey of public opinion on gene editing were published by the Synthetic Biology Project. The findings are interesting. Also: “Many respondents initially did not feel they knew enough about the technology to have an opinion about it.”
[This Making Science Public post also contributes to my social science work on the BBSRC/EPSRC funded Synthetic Biology Research Centre. You can find other posts on synthetic biology here]
This is an excellent post, Brigitte, thank you.
I agree with your conclusion that RRI must be considered at all points in the innovation pathway, from research funding to recycling and reuse, and that all those involved must themselves be mindful of their responsibilities and the impact of their contribution – including scientists, governments, business, NGOs, media and the public ourselves.
I am struggling myself with the practicalities of that. I have an unfunded plan for adapting our Principles for Responsible Innovation, a quite practical document for business, into separate documents, under the same principles, for each of those stakeholders. I can’t see another way, to be honest.
However, the very big issue, as you have identified, is that there will always be different views of what beneficial innovations are and what trustworthiness looks like. You can’t engage all the people, all the NGOs, all the media, all the scientists and social scientists. Some won’t have read the paper or article, or seen the programme, which explains what has happened; some will disagree with the conclusions, which they will make abundantly clear. This makes ‘technology acceptance’ and uncontroversial decisions all but impossible in many of these areas.
There will always be disagreements about innovation pathways and applications, particularly in this type of area. These disagreements are based on world views, on values, on the information available, on inclination to consider others’ views, on assumptions of trustworthiness, on histories of trustworthiness, on funding, on PR, on grandstanding, and on all the worthy and perhaps less worthy emotions and influences in between – human nature, in short. This to me is the fundamental point of RRI: you will not get total public acceptance, because you won’t always be right and people won’t agree, for good and bad reasons, at different times in the process. So then what?
This is why I have come to the conclusion that what I am calling ‘radical transparency’ is the only way forward, so that everyone can see the evidence or the lack of it; they can see the decision processes and outcomes, including the involvement of the public and the actions of those who disagree. It probably won’t help persuade those who disagree, because the process won’t be perfect, but it should at least help build the trustworthiness of the system, give confidence to those having to make difficult choices about application pathways and to those using the innovations, and even provide quality evidence to commentators, which, you never know, may enrich the dialogue we all engage in at the very least. However, it won’t change the minds of those who disagree: you know the evidence which says that more evidence actually hardens people’s view of the rightness of their own position. So we mustn’t expect too much!
But more importantly, in my view, there will always be unforeseen and undesirable outcomes, whether we like it or not. Robert Winston’s Bad Ideas? An Arresting History of Our Inventions and the Late Lessons from Early Warnings project show that. So more attention must be focused on what we have called ‘Welcome Warnings’. This looks at the practicalities of ‘anticipation’ by considering cultural and institutional pressures, and cultural, psychological, methodological and institutional solutions, to the problem that, despite everyone’s best efforts, bad things happen but are very often ignored or brushed under the carpet for peculiar reasons. Finding ways to act in a timely fashion, to identify early warnings better, to reward whistleblowers and to laud those who point out problems is another essential component of the mix.
Sorry that my comment is longer than your post!
I totally agree with every word you say!! I am sorry my answer is shorter than your last sentence 😉
(We’ll have to have a longer chat about all this!)