Saturday, October 20, 2018

What should the public understand?

Section 2 of Stengers' first essay starts to ask why it is that scientists have ended up blind to their arrogance with respect to the public, and why they seem so keen to circle the wagons and rely on their supposed authority over the facts when pushed into public debate, rather than exercising the critical skills they so manifestly have.  She poses this at the outset of the section as a question of why more scientists don't step forward to challenge those scientists who say things like, "GMOs are the only safe scientific solution to a growing population".  This is clearly not a 'scientific' statement, no matter how much impressive molecular genetics went into it.  So why don't we frequently see scientists police their own by shouting down this sort of abuse of their authority?

In fact, why is it so surprising and controversial when one does?  For example, why does Cliff Mass receive threats from The Stranger when he tries to show us actual data that makes it seem pretty unlikely that global warming has anything to do with this year's Pacific Northwest forest fires?  He's not saying global warming doesn't exist.  He's not saying that there are no real effects yet.  He's not saying we should do nothing to change emissions regulations until we can prove that warming is causing more wildfires.  He's just pointing out that every time people in downtown Seattle see smoke, there will be another series of newspaper articles attributing the fires to global warming, but that this supposed connection, made in the name of the 'scientific consensus' around global warming, doesn't look to be very scientific at all and is not borne out by the actual facts (e.g. there seem to be fewer fires than there were 100 years ago).  For pointing out what he considers an abuse of the authority of science, and backing up his critique with a rational and scientific argument, he is regularly pilloried, rather than engaged with.  Why is that?

Stengers' explanation for this phenomenon is basically that scientists are not naive.  They know that there are all sorts of 'unscientific' forces that shape the agenda, conclusions, and ultimate impact of their work.  They know where the money comes from and what the governments and corporations and tenure review boards ask for in exchange.  They know how the sausage is made.

They can't, however, talk about that openly, because they fear that if the public were to become aware of the ways in which science 'is made', they would lose confidence and reduce scientific proposals to simple expressions of particular interests. 'People' must continue to believe in the fable of 'free' research, driven by curiosity alone towards the discovery of the mysteries of the world (the kind of candy that helps so many well-meaning scientists to set about seducing childish souls).
 
In short, scientists have good reason to be uneasy, but they can't say so. They can no more denounce those who feed them than parents can argue in front of their children. Nothing should upset the confident belief in Science, nor should 'people' be urged to get involved in questions they are not, in any case, capable of understanding.
 
In other words, the arrogance that culminates in the strict separation of scientific fact from mere public opinion is a defensive mechanism for scientists.  Their call to be left alone to do 'pure' research is a last-ditch effort to reclaim what they see as a lost golden era when science went its own way, heroically, but dispassionately, uncovering the mysteries of the universe.  I think she understands this longing for the past on a number of levels.  It's about a historical age when science was less bureaucratic and more open.  It's about an age of life (say, 2 months into the start of a PhD program) when you still don't know how the sausage is made and think you're setting off on a wonderful intellectual adventure to pursue the 'big questions'.  And it's even about an age when the boys had the run of the place and their dutiful wives didn't ask them to come home from the lab for dinner (she introduces this gender idea in the next essay; I'm not doing it justice here, but it's clearly thematically connected).  Basically, she thinks scientists circle the wagons because they feel threatened.  And guess what, she agrees: just because you're paranoid doesn't mean you don't have enemies!

Tuesday, October 2, 2018

Should 'the public' 'understand' the sciences?

Can you think of a more French title for the first section of your essay?  I guess she could have put 'the sciences' in quotes too.  Nevertheless, I think her point in this section (pg. 1-4) is actually pretty straightforward.  The reason she puts 'understand' in quotes is because of the way it seems to imply a one-way transmission of information from someone who has it to someone who needs it.  So when someone bemoans the fact that the public does not understand science in some context, they are very often thinking that science (as educated people -- i.e. scientists -- obviously know) is only concerned with the authority of established, true, legitimate facts, and nothing more.  In particular not with (mere) opinions.  The scientists have these facts, the public needs these facts, and understanding happens when the public accepts that these are the true facts, and that anything outside these facts is just a matter of (uninformed, unscientific) opinion.  Implicit in this notion of understanding as transmission of truth is the idea that, when properly understood, the facts will speak for themselves and compel anyone to accept that the scientists were speaking the truth.  In other words, the scientists will know they've been understood when you agree with them.  Insofar as you disagree with them, you didn't get it.  As Andre the Giant famously said, "Obey" (by the way, this is the best example I can think of to explain what Deleuze and Guattari called an 'order word' -- basically, something that doesn't communicate information so much as tell you what you're supposed to think is important).

As I said before, the problem here is arrogance, not error.  The idea is that scientists are often called in to circle the wagons and tell the public to shut up, do what we say, and quit worrying.  They do this because they are the authority, and because, as the authority, they are convinced that they are right.  Of course, as Stengers points out, this is not the way scientists treat each other, and not the way science actually happens in practice.  Remember the ether theory of the vacuum, the phlogiston theory of fire, and the 'central dogma' of molecular biology?  But this is the way scientists like to think about what they're doing -- just uncovering the facts, ma'am -- and it is certainly the way they like to present themselves to the public (in fairness, of course, this is not all on scientists; the public and its media have a strong taste for authority as well).

Thinking historically, I suppose that scientists came by this arrogance honestly.  They used the compelling authority of the facts to successfully take on the Catholic church in the court of public opinion.  Pretty impressive.  Notice, however, that this was accomplished in the context of a very particular and very narrow question about the movement of planets.  I'm not saying trivially narrow, but predicting the next eclipse is a very definite question, with a very definite answer, posed to what turns out to be a very simple system, that happens to be about as isolated from the influence of the rest of the cosmos as a thing can be.  That is to say that in some sense Galileo got lucky.  If he had been trying to argue for a mechanistic and scientific world view by trying to predict, say, next Thursday's weather ... well, let's just say that he would have found the reception significantly ... warmer.  Getting lucky doesn't make him less right about the planets or the weather.  And of course, we should all be grateful for the incredible string of luck science has had (imagine if the Haber-Bosch process required knowing particle physics, or if antibiotics could only be effectively produced by a heavily genetically engineered organism).  But getting lucky does tend to make people feel smarter and more skilled than they actually are.  If you've been investing in the stock market as long as I have, you've probably joined me in learning this lesson the hard way.  Getting lucky almost always leads to arrogance.

Unfortunately, we're not so lucky anymore.  It seems that we've picked a lot of nature's low-hanging fruit, both scientifically and as a society.  Most of the situations where science comes up against public opinion are pretty damn complicated now.  We want to know whether global warming is going to wipe us out, whether GMOs are going to wipe us out, whether the banking industry is going to wipe us out, whether robots are going to wipe us out, whether 5G cell phone radiation is going to wipe us out, whether eating cooked food is going to wipe us out ... (a full discussion of this obsession can be found in my latest book -- Anti-Messianism: Towards The End of The End of The World).  You may fear some of those and laugh at others, but the truth is that none of them are simple questions with quick and easy answers like, "next Tuesday at 2pm there will be a solar eclipse in Lisbon".

In fact, in a lot of these cases, we're not even sure what the important questions are.  What do we want to know about global warming?  What would it mean to call GMOs safe?  How should we handle an industry like finance that is inherently unstable if people believe it to be?  This is the sort of stuff that Stengers is calling "matters of concern".  Before you get down to the business of using the authority of scientific facts to investigate any of these issues, you have to stop and think about what you wanted to know, about what the important questions are.  This isn't because science is somehow inapplicable in these cases, but simply because these situations are complicated.  We're talking about physics and chemistry and biology interlocking with economics and politics in a bewildering variety of ways.  What are the right questions to ask, what are the important facts to establish, what definitions of the variables should we use, what measurements should we make?  It's just not clear at the outset how to approach these problems.

One thing that is clear, though, is that saying, "Trust me, I'm a scientist.  GMOs are safe.  I eat them every day in the lab and I'm fine.  Anyway, you don't understand how genetic modification works," is just not going to cut it.  Because we have no idea if this lab scientist has thought through the many dimensions of the problem, most of which he is not trained to be any more of an expert in than we are.  And we'll have to have a similar skepticism toward authority in the case of climate change, or any of the other problems that we're asking science to help with.  So we don't want to simply 'understand' the ready-made solutions that science proposes for one dimension of these problems, and go on to accept these as 'the solution'.  We want instead to have an intelligent relationship with some facts (not the facts) that scientists really can offer, as well as the way they fit together with each other and into an overall context.  And we want scientists to have an intelligent relationship with us, so that when we wonder if they've really addressed something that might become an issue, they can say to us, "that's a good question that I'm not sure I can answer right now, but I bet we could investigate it some more", and we'll believe them and give them some time to think about it.  I confess that Stengers' attempt to call this more two-way conversation "public intelligence", as opposed to the one-way authority of "understanding", does pretty much nothing for me rhetorically.  Even now, having a clear concept of the difference she's trying to get at, the terms are not going to help me encapsulate that difference in a memorable way.  But then, maybe some things really are simply lost in the translation to Plain English.

P.S.  Would it make sense to call this an appeal for "scientific literacy", broadly speaking?  Of course we want people (including some scientists expert in another field) to be able to read scientific results and make sense of them.  For example, to know how solidly established they are, what some of the premises and restrictions of the conclusions are, why this research is important and interesting, etc ...  But literacy can actually go way beyond this.  When I (didn't) read Moby Dick in high school, I was "literate" in the very narrow sense of being able to follow the plot.  When I read it 5 years ago, though, I had a much deeper relationship to it.  There was a lot more literature to read there.  The more literacy you gain, the more a great book has to say.  Yet the purpose of gaining literacy in this sense isn't to reach some level of authority.  Nor do we assume that an English professor who is verily an expert in the book is somehow thereby granted a monopoly on its true meaning.  The more literate we are, the more well-read, the more we will be able to read into the book and the expert both.  Authority here isn't closing off debate, but opening up new doors.