May 28, 2015, by Brigitte Nerlich
In social science and policy circles there has been a lot of talk about Responsible Research and Innovation (RRI). However, nobody quite knows yet what this means and how it works in the context of harsh economic realities. In the meantime, natural scientists have taken responsibility for their research and innovations in the context of new developments in genomics and synthetic biology: gene editing using CRISPR.
This is not new. Forty years ago, scientists also took responsibility in the context of recombinant DNA, made their concerns public, sought public views and implemented guidelines, regulations and so on. In the following I’ll first explain how scientists took responsibility 40 years ago, then how they are doing so now and what this might mean for RRI and making science public. This is quite a long post. However, I think it needs to be, as it shows us how much still needs to be thought about and done to make RRI work, not only with ‘the public’ but also with ‘the scientists’.
In 1974, molecular biologists expressed their concerns about possible dangers posed by new methods they had developed to splice and recombine DNA. They also temporarily stopped work using this new technology. A year later, in 1975, and together with the US National Institutes of Health (NIH) and the National Academy of Sciences, they organised a conference in Asilomar, California. At that conference “the biologists decided among themselves what restrictions should apply to various types of genetic manipulation” (Rasmussen, 2015). As scientists, they focused, of course, first and foremost “on laboratory safety but not wider social concerns” (ibid.). However, their reflections on the risks and benefits of this new bio-technology were very detailed, wide-ranging and thorough, as one can see when looking at a few letters published in Science in 1976 and 1977.
I came across these letters by accident after a Twitter exchange about designer babies and the slippery slope (topics that are quite popular in this era of CRISPR), when somebody tweeted something about ‘genetic meddling’ and the Daily Mail. I hadn’t come across this phrase before, so I put it into a certain search engine and almost the first item that came up was a 1976 letter to Science entitled “On the dangers of genetic meddling”, arguing that the Asilomar group, as one might call it, had neglected to think about the far-reaching dangers of recombinant DNA and just wanted, like “Dr Frankenstein”, to continue producing “little biological monsters” – this certainly sounds quite Daily Mailish, avant la lettre!
Immediately following that letter appears one by a member of Friends of the Earth calling for opening up debate about all sorts of issues the scientists had neglected to tackle, including “the imposition of complex medical decisions on individuals and society, and the inherent fallibility (not to mention corruptibility) of inspection, enforcement, and regulatory bodies”. A tall order!
More interestingly, two letters by those involved in Asilomar took up the gauntlet. One expressed surprise to find that what “began as an act of responsibility by scientists” had now become “the breeding ground for a horde of publicists” (one might call it hype today). The author, Stanley N. Cohen, tried to correct some misunderstandings about both the risks and benefits of recombinant DNA, but, more importantly, he tried to dispel some worrying myths which were circulating at the time. One myth relates to people thinking that scientists wanted to protect their freedom of inquiry and continue with experiments regardless of the dangers they may pose. “Instead”, Cohen writes, “the history of this issue is one of self-imposed restraint by scientists from the very start”. Importantly “their concern was made public” so that those less well informed could also use restraint.
These actions of restraint, self-regulation and risk communication had some unforeseen consequences. “The scientific community’s response has been to establish increasingly elaborate procedures to police itself – but these very acts of scientific caution and responsibility have only served to perpetuate and strengthen the general belief that the hazards under discussion must be clear-cut and imminent in order for such steps to be necessary.”
This is interesting, as RRI is intended to anticipate the unexpected. So this is one aspect of the future that we need to keep a close eye on when anticipating the effects of RRI and its implementation.
I’ll now come to the last letter that appeared in Science in response to the ‘genetic meddling’ one, written by Maxine F. Singer and Paul Berg, key members of the Asilomar group. They again stress that “we were among those who first publicly expressed concern over the potential hazards of recombinant DNA experiments” and who called for “a voluntary deferral of certain experiments”. They point out that they “intervened early and assumed responsibility”, rather than leaving this to some other agency, which in any case could hardly have acted at that time, given that research into recombinant DNA was not widely known or understood. They also stress that “[a]cceptance of responsibility in this matter by the past and present directors of NIH was courageous, farseeing, and proper”.
What is important here is that they took responsibility as soon as ethical and regulatory concerns presented themselves to them – this is what one may call ‘upstream (ethical) engagement’. What about openness and ‘making science’ and responsibility public? This is what they have to say on that point: “The discussions on recombinant DNA have been public since their beginning. The matter has been widely reported by the public press. The publicity permitted all concerned individuals and groups to enter the deliberations. No datum has been classified and no commentary has been withheld from the public. Indeed, most policy has been developed in public sessions.”
(For people interested in delving into the history of “Recombinant DNA Technologies and Researchers’ Responsibilities, 1973-1980”, these Paul Berg papers might be a good starting point.)
This little bit of historical research into early discussions around recombinant DNA (which involves “[p]recisely snipping bits of DNA from one organism and transposing them into others, using enzymes as molecular ‘scissors’”; Jasanoff, 2011), opens up a new vista on responsibility and openness and should give us pause for thought when dealing with similar issues today.
Scientists have now developed a new gene snipping technology called CRISPR, which allows them not only to recombine DNA but, in a sense, to precision-engineer it. Again scientists, taking responsibility, have called for a moratorium and a ‘prudent path’ forward. And not only that. As Sheila Jasanoff points out in an article published in The Guardian at the beginning of April 2015, the scientists also recommend four actions: “a hold on clinical applications; creation of expert forums; transparent research; and a globally representative group to recommend policy approaches”.
However, even before all this could be implemented, and well before the more democratic approach of public deliberation recommended by Jasanoff could get into gear, Chinese researchers had experimented with CRISPR to ‘edit’ a human embryo. And again, they took responsibility. At the end of April they made their (mainly negative) results public, a decision that was, one can argue, quite courageous, honest and ethical. They published their results in the online journal Protein & Cell, whose editor then wrote an editorial defending the decision to publish but calling for restraint and for ethical, social and legal reflection: “Until a consensus on new regulatory rules can be reached, it is in the best interest of all parties that the research field should voluntarily avoid any study that may pose potential safety and/or ethical risks. Only by holding themselves to the highest standards will scientists retain the public’s trust in biomedical research, and at the same time, provide the best service for the well-being of our society.” You can read the paper about the failed experiments here. Carl Zimmer has published an excellent summary of this affair here.
In a letter to Nature Filippa Lentzos of King’s College London said: “The original Asilomar meeting failed to engage the public in discussions, which we now know is crucial to the regulatory decision-making process. Had it done so, the resulting guidelines on recombinant DNA might have extended to legislation covering all users – including the military and commercial sectors – and not just those funded by the US National Institutes of Health.” (Lentzos, Nature, 21 May 2015, vol. 521, p. 289). This might be so. However, is it realistic to ask scientists – who voluntarily call for guidelines to govern their research, which is their domain of expertise, and who call on the public to judge it – also to take responsibility for establishing guidelines that reach well beyond their domains of expertise and into all sectors of society? Would that not be arrogant and hubristic? Isn’t this rather the time and place for social scientists and policy makers, as well as ethics and forecasting experts, to step in and, instead of talking about responsibility and openness, to take responsibility, ‘do’ RRI in the way they see fit, and involve the public, and of course scientists too, in that enterprise, drawing on and learning from (some) scientists’ long experience of taking responsibility?
I had just finished drafting this post (25 May) when the results of a survey of public opinion on gene editing were published by the Synthetic Biology Project. The findings are interesting. Among other things: “Many respondents initially did not feel they knew enough about the technology to have an opinion about it.”