April 28, 2017, by Brigitte Nerlich
Public trust in science: Myths and realities
The March for Science has come and gone. There was no fuss, but there was wit and fun, solidarity and conviviality. The march did what it set out to do: it got people talking about science and politics.
During the march, some tweets jumped out at me. They focused on the thorny issue of trust. One tweet said: ‘Trust in science has to be hard-earned’; another asked ‘Can March for Science participants advocate without losing the public’s trust?’; another ‘Will the #marchforscience increase public support for science or entrench polarisation?’, and yet another ‘How much confidence do people have in scientists?’ This tweet also provided an answer: ‘Actually, a lot’ (an assertion supported by a graph).
That made me think, yet again, about the issue of ‘trust’ in science. Since entering the field of Science and Technology Studies about 15 years ago, I have heard again and again that ‘the public doesn’t trust science’, or that ‘there is growing public distrust in science’, or that there is ‘radical scepticism’, or that scientists have lost the trust of the public (this is also a recurring topic in the media). Many calls for remedial action have been based on this meme or myth of a ‘deficit of public trust in science’: calls to regain public trust in science, restore it, rebuild it and so on.
I am supportive of such actions, but I have often asked myself: What’s the evidence for the initial assumption that there is a loss of trust? What public are we talking about? What science? What loss? And what do we want more of? Let’s look at some data.
Here are some results of the latest (2014) British Public Attitudes to Science Survey:
- The UK public are as interested and enthusiastic about science as they have ever been in the last 25 years
- Science is increasingly seen as important to the economy and the public widely support continued government funding for science. […]
- The public think it is important for them to know about science and want to hear more from scientists, government and regulators
- Nine-in-ten (91%) agree that young people’s interest in science is essential for our future prosperity
- The public continue to support government funding of science. […]
- 81% of people agree that “science will make people’s lives easier” and over half (55%) think that the benefits of science outweigh any harmful effects
In short: “We really do quite like scientists.”
Indeed, Martin Bauer, former editor of the seminal STS journal Public Understanding of Science, blogged in 2015 that: “in the UK trust in scientists has [been] rising to a long-term high. This in a context of a country that was a major exporter of public debates and doubt over modern science over the past 30 years; suffice to mention the Public Understanding of Science report of the Royal Society of 1985 and the Loss of Public Confidence report of the House of Lords of 2000”. That’s an interesting comment!
Over in the United States, we find that in 2013: “Data on public confidence in institutions from General Social Survey demonstrates that confidence in the scientific community has remained relatively stable since 1973. ‘Trust does not look to be on the decline over time,’ Funk said. The GSS data also show that 95% of surveyed individuals agree that scientists are ‘helping to solve challenging problems,’ and 88% agreed that scientists are ‘dedicated people who work for the good of humanity’”.
An article on the science march reports that “trust in scientists remains high throughout the world. A survey from 2016, for example, found that Americans are more likely to have ‘a great deal of confidence’ in leaders of the scientific community than in leaders of any group except the military”.
Within such surveys there are, of course, nuances (gender, geographical, socio-economic) and changes in hopes and fears over time. Regarding the 2014 British survey, for example, Alice Bell commented: “We Brits quite like wind power and vaccination, but we’re a bit more divided on shale gas, genetically modified plants, animal testing and nuclear energy.”
The deficit model of trust and its dangers
But why has the ‘deficit model’ of public trust in science been so pervasive in the media and the social sciences? Why do scientists buy into this meme or cliché? One reason is probably that, in the UK at least, various learned society reports, informed and inspired by social science research (and, in turn, stimulating investment in more social science research), have made this model popular. In a 2008 book chapter, Martin Bauer himself spoke, for example, about a ‘trust deficit’, an ‘expert deficit’ and a ‘public crisis of confidence’ (p. 115). Such deficits were diagnosed by social scientists and acted upon by the Royal Society and the House of Lords, among others.
I feel, however, that perpetuating and entrenching this meme of public distrust in science, wherever it comes from, is dangerous, as it eats into the social fabric that holds society together and of which science is an essential part. Indeed, in the face of the data, this meme almost appears to be something like an alternative fact.
An important ingredient that holds our social fabric together is social trust or, to use a better phrase: ‘warranted belief’ – which is very different from ‘blind trust’. This type of trust is “quite literally, inseparable from the work of science, and the knowledge that comes from it”. That’s what science is good at and good for. That’s probably why, overall, people trust science and scientists. And that’s why sowing the meme of public distrust in science may be quite dangerous.
Social trust in science and society
This type of ‘social trust’, based on warranted belief, also upholds the work of institutions other than science (schools and universities, banks, churches, courts of law, and so on). It is here, however, that surveys have found a decline in trust – not in science per se, but in public institutions overall. The reasons why this might be happening are hinted at in a great article on the March for Science by Paul Rosenberg for Salon magazine – read it!
The article ends on a note of hope: “Science cannot undo the historical forces at work that have so weakened the bonds of trust, but it can help us respond more effectively. It can help us understand those forces”. To do that, we all – scientists, science communicators, teachers – need to work with the grain of social trust, not against it. This includes stepping away from easy memes and clichés that reiterate the deficit model of public trust in science.
We should not forget however (and Rosenberg stresses this just as much as the tweet I mentioned at the beginning of this post) that: “Trust should be earned, every day.” More importantly still, as Baroness Onora O’Neill has recently said so eloquently, we don’t actually need ‘more trust’, we need more trustworthiness. Let’s try to earn that. One step at a time. One conversation at a time.
PS 3 December 2021 (second year of the covid pandemic): Yet another survey, this time the 2020 Wellcome Trust Global Monitor, has shown that ‘people’ trust ‘scientists’ and that this trust went up even through a pandemic characterised by the spread of not-very-much-mutating misinformation alongside a mutating virus. There were, of course, differences between countries, differences that mapped partly onto other trust issues, such as trust in government. Here is a great thread by the Wellcome Trust, showcasing all the results. The Wellcome Trust also monitored trust in vaccines in 2018. These data have now been reanalysed and the findings published in the middle of this pandemic by Patrick Sturgis, Ian Brunton-Smith and Jonathan Jackson. They found that “In countries with a high level of consensus regarding the trustworthiness of science and scientists, the positive correlation between trust in science and vaccine confidence is stronger than it is in comparable countries where the level of social consensus is weaker.” Their results shed light on vaccine hesitancy in the context of covid. So if you are interested in that topic, read the article!
Image: Wikimedia Commons
It got people talking….
Did it? Who exactly? Did it get the general public talking? Because I really think not, not in public, nor in the media. Just the usual people talking to themselves/each other.
I went on the London science march and, despite reports, it was 3,000 people tops (quality not quantity? 🙂 ). The public (in London) did not really know it happened, beyond tourists taking pictures. The following march, against the Armenian genocide, was actually bigger, but equally ignored, with equally no impact.
On the plus side: my first march ever, great fun, some fun banners, lots of nice people, the organisers were great.. but, achievements?
My slogan
Science March 4
Open Access
Open Data
Open Review
All Trials
Let’s solve Replication Crisis
It got people talking – perhaps not everybody, but families, friends, associates, students, future scientists etc. It’s a start. It was a bit geeky, I confess. Let’s see how things go. http://www.sciencemag.org/news/2017/04/march-science-live-coverage
I do think the public trust science, as they trust engineering, medicine, etc., at a broad, ‘they just get on with it’ level. But as soon as it becomes a bit more hectoring, or insistent on something, a bit of caution/scepticism creeps in, e.g. a politician publicly feeding their children hamburgers.
Some stupid slogans/chants
What do we want?
Evidence-based policy making!
When do we want it?
After peer review!
As if they have never heard of the replication crisis…
The Wakefield MMR paper passed peer review. It was published in the prestigious Lancet journal. There were 12 prestigious scientists as co-authors. It took 12 years before that paper was retracted…
Yet, peer review…
Even though the paper was only fully retracted in 2010, as I understand it, 10 of the 12 authors published a retraction in 2004.
Only 6 years for the other authors to bail…! And the Lancet still didn’t retract for another 6…! Imagine if Wakefield had tried the ‘science under attack’, anti-science, ‘denier’ tactics: critics in the pay of ‘Big Pharma’, dark money, etc., etc., claiming he’d been harassed, and tried the Bishop/Lewandowsky warning-flag method, as the critic who pursued him was outside of academia.
Can you elaborate a bit more on the link between the argument re public trust in science and the deficit model? I don’t understand what the link is.
It’s not the usual deficit model – that people don’t know enough about science and that that deficit needs to be filled with more knowledge – but the deficit-in-trust model, where people are said to have no, or not enough, trust in science and that deficit in trust needs to be restored, rebuilt, acted upon etc. The solutions proposed for dealing with both deficits have been public engagement etc. (if I understand things correctly, which I might not!)
There is of course also the claim that giving people more knowledge (filling the knowledge deficit) will give them more trust (fill the trust deficit), which has been debated and disproved for a long time… but that’s another topic, I think.