Follow the reluctant adventures in the life of a Welsh astrophysicist sent around the world for some reason, wherein I photograph potatoes and destroy galaxies in the name of science. And don't forget about my website, www.rhysy.net
There's a funny thing about voting. You'd think it would be nice and simple : here are some choices, pick which one you like best and whichever gets the most votes wins. Except that every single step of this procedure is hideously complicated.
Take this silly example, inspired by John Oliver's quite correctly pointing out that science isn't done by voting at all. It's a yes or no question... but wait ! There's a third option, which is funny ! So everyone votes for that option instead. There could have been legitimate third options ("unsure", "abstain") but in this case it's just silly. And the "no" option is clearly wrong, but people voted for it anyway.
This ridiculous poll has all the same legal standing as the non-binding, advisory Brexit referendum, by the way.
So the options to begin with may be flawed. The result may not mean or change anything. The voters may not understand the issue presented (if you didn't watch the John Oliver show you're probably baffled by the question). Their votes may be weighted differently or they might be allowed to give multiple votes for multiple options. Or they may vote not directly on the issue at hand, but in order to enact other consequences - e.g. if option 1 favours party A, maybe option 2 starts to look better.
None of this means democracy is a hopelessly stupid idea, just that like anything with great power, it needs to be handled very carefully.
When I was a teenager I discovered* the heroic tales of classical Greece, wherein the plucky Athenians discovered direct democracy. Herodotus was in no doubt that it was this that transformed Athens into a state capable of defeating the mighty Persian empire at Marathon and later in a spectacular naval victory at Salamis (though many other Greeks thought the uber-militaristic Spartans were responsible). Plato, on the other hand, thought that democracy just encouraged mob rule, but then he was biased by the state's execution of his mentor Socrates. Socrates himself didn't seem to have much truck with democracy either; though he respected the rule of law to a fault, there didn't seem to be much that could persuade him to do anything he didn't really want to do.

* Quite by myself, British history education in schools being about as useful as a pile of rat vomit.
Later on I read of the Peloponnesian War, which isn't talked about as much as the heroic feats of the Persian Wars (which really do deserve an epic film trilogy - yes, a trilogy, nothing less will do). In that far more sordid episode, the potential for mob rule became much more evident. There was no real reason for the war at all, the Athenians just got big-headed. Direct democracy may have saved them at Marathon, but it also led to a hugely destructive and completely pointless conflict. For all its many virtues, democracy doesn't lead to perfect decision making - and sometimes it results in absurdly stupid decisions.
Sadly, this is the case with Jeremy Corbyn. While there's much I respect about his policies and yes, much I like about his personal style (or lack thereof), unfortunately there are also aspects which are seriously worrying. I won't be voting for him. I might be voting for Labour, but that remains to be seen...
This is where I stand politically. For reference, this is where the major British political parties stand, here are some American politicians, and here are some other well-known figures.
I should be clear that I was rather pleased when Corbyn was elected. I thought Tony Blair's* opinion that Labour risked "annihilation" under Corbyn was worth listening to, but it seemed to me that the best option was to see how he was getting on after about a year. If badly, then he could be ditched with still enough time to find someone else before the next election. That's not such an uncommon practice in politics. Any sensible politician knows when they're losing that they should step down for at least a little while. Oh my youthful naivety...
* Yes, I know many of you think he's the devil incarnate, but he won three elections. Leave aside the morality issues : in pure political terms the man is a genius. His opinion on politics itself is well worth listening to, regardless of what you think of his actual policies.
When Corbyn was first elected the situation was, quite understandably, somewhat confused. He'd been thrown into the contest almost as a joke, with essentially no chance of winning. Sometimes it's nice to show diversity in the party, after all : "look, we have people who stand for your opinions too". And being on the far left of the party it seemed only natural that there would be friction, but I dismissed reports of "chaos and infighting" as being media exaggeration. I heartily approved of the idea of "politics by consensus", of forming policy based on agreement rather than diktat.
I even went so far as to say that the disgruntled elements in the party should just learn to accept it - the party members had voted for a left-wing leader, which is a clear sign they wanted a more left-wing approach. So, grin and bear it, wait and see what happens - the time for unity was now.
There is something very worrying when a leader elected with overwhelming support doesn't command much in the way of enthusiasm from the other MPs. Maybe they're right and know Corbyn can't win mass appeal. Maybe they've been in politics too long. In any case, methinks they haven't quite grasped the idea of a democracy.
Trying to oust Corbyn at this stage would do nothing except alienate everyone who voted for him. Now, if a year down the line it does prove that he hasn't got what it takes, then it would be understandable. But at this stage, even suggesting they can't work with the leader most people wanted makes them look foolish.
At that stage I was more worried about the behaviour of the other MPs than Corbyn. Labour is of the left, so having a leader of the left should have been a good thing. Although there were a few warning signs that the opposite was in fact true - that Corbyn was the one to worry about instead of the rest - it was hard to see these as anything more than the usual level of suspicion over politicians. Especially those who seem too good to be true. But anti-austerity, nationalising the railways, greater efforts for resolving conflicts peacefully, and the prospect of a universal basic income all seemed to be exactly the sort of things I wanted a Labour leader to say. My one major disagreement was over Trident, but that wasn't important enough for me to hold much of a grudge.
And anyway, there was that prospect of political consensus. Which I presumed meant give and take, not trying to ride roughshod over any disagreements.
As time went on, the media attacks against Corbyn became more vicious and less and less true. When they were fact-checked, they failed miserably. Corbyn called Hamas "friends" ? Yes, but he would say that to (almost) anyone to bring them to the negotiating table. Corbyn wants to negotiate with ISIS ? No, he's said that they are a case so unusually extreme that that's impossible. Corbyn wants to abolish the monarchy ? Yes, but he's explicitly said that's not the fight he wants to lead. Then there was Cameron's disgusting personal attack in the wake of pig-gate, which Corbyn, to his personal credit, pointedly ignored.
All this painted a very clear picture : a moral, principled, authentic politician being abused by the establishment. Much of the above may even still be true. And yet I've completely changed my mind about Corbyn as leader, for three quite simple reasons.
1) Corbyn claimed that all his rebellions as a backbench MP were done on principle
Yes. Good. I'm sure they were. But this was in response to accusations that Corbyn was trying to force other MPs to fall in line, so it's pretty hard to escape the conclusion that he thinks other MPs won't rebel because of their own moral principles. It's silly to assume that everyone else was following the party leader or official message just because they were told to - yes, fine, lots of politicians are awful, but just because someone disagrees with you doesn't mean they're unprincipled. They just have different moral principles. Maybe you don't like those principles, but you've got to be able to work with people you disagree with.
That's politics in a nutshell. It was not the slightest bit convincing to say that his rebellions were principled, as though everyone else's "obedience" was not. In fact people in one political party tend to vote the same way for the simple and buggeringly obvious reason that they joined that party because they have similar views ! He must think of himself as a special snowflake, uniquely qualified to pronounce moral judgements on the rest of the party. It was a statement and attitude that said, "I don't actually respect my 'valued colleagues' at all".
This was a warning sign, an indication that he didn't really know how to reach out or build bridges or do any of the nice things he promised. It was worrying, but not fatal because it was ambiguous (and still early days, at the time). He didn't directly say that all his opponents were just doing what they were told - but it's a clear implication. While Corbyn did initially seem to be willing to compromise (i.e. not working to overthrow the monarchy, some hints of room for negotiation on Trident), this statement hinted at a darker prospect : that "political consensus" really meant just doing whatever Jeremy said. Which is crazy, because Corbyn has rebelled so often one wonders why he's in Labour at all.
2) His poor showing during the Brexit campaign
With Labour being a party of the left, it's no surprise that they're pro-EU. Unfortunately Corbyn is one of those rare individuals on the left with an arse-backwards opinion of the EU, who thinks it's more or less the opposite of what it actually is. He seems to have some twisted notion that it exists to give business a free hand, or some such other bloody nonsense. Everyone has some stupid opinions about some things, but it was hugely unfortunate timing that Labour had a Euroskeptic (using the term "skeptic" in a literal fashion) leader during the Brexit campaign. His should have been the loudest voice for Remain of all, supporting freedom of movement, anti-discrimination, curbs on excesses of government power and support for human rights.
Instead we got... well not much of anything, really. At least no more than the absolute bare minimum at which he could claim to have shown up, famously rating the EU as "7.5/10". Yes, that's an assessment I would share. No, it's not a sane way to lead a campaign when your opponents are xenophobic idiots. Yes, I realise some of you reading this are anti-EU and not xenophobic or idiots, but xenophobic idiocy played a major part in the campaign. And leaving the EU is still a monumentally stupid thing to do whether you like it or not.
Still, in other times this would have been much less important. But after the vote, what did Jeremy do ? Still nothing, essentially. Now is the time Labour should be organising a massive resistance to the advisory referendum opinion poll, and that's exactly what it would be doing if Owen Smith was in charge. Instead Corbyn is capitulating, presumably because of his arse-backwards idea of what the EU is. Oh, yay, let's give a free hand to the far right, courtesy of the far left.
3) When you lose a vote of no confidence, you need to go home and rethink your life
The mere act of holding the vote isn't a death sentence - it's the result that matters. He could have won it, with the rebels revealed as a small group of malcontents. Or he could have lost it by a small margin, in which case it might be better to keep him and have him make more efforts to work with those in the party who are more toward the centre.
But he didn't win it, or only just lose it. He lost it massively, with 80% of his MPs against him. These are the people he's supposed to lead in government, and they don't want to work with him. Having nice policies is only part of his job. If he can't persuade people to work with him in opposition, what hope is there he could form a credible government ? None, that's what.
The only sensible response to this is to leave. Doesn't even matter why his MPs hate him, the fact is that they do. No individual is supposed to be bigger than the party, so the only sane response is to accept that you can't lead this group of people and bugger off. But, astonishingly, he didn't. This is madness. There are some who claim that this is all due to "Blairite" opposition. Well, sorry, but that's bollocks.
A vote of no confidence is supposed to be a near-nuclear option of last resort. Ignoring the result sends a very clear signal that he doesn't, for all his nice platitudes and promotion of a kinder politics, actually give a damn about what people think. "I have a responsibility to those who elected me" ? Like hell. You have a responsibility to lead your MPs. They're elected too, and by the general populace, not just party members. They represent the voice of the people every bit as much as the leader - more so, because there's more of them, and they were voted across the country by people of different political inclinations, not just the party faithful and activists. He's treating the mandate given by his success in the leadership election as an absurd absolute, every bit as much as Brexit supporters think their minor victory in the referendum means they can do whatever they like.
If you won't even leave when 80% of your MPs tell you to step down, what does that say about building bridges, a kinder politics, or forming a consensus ? To ignore this most extreme and extremely clear method of ostracisation is tantamount to declaring a dictatorship. A vote this far against you is a no-win situation : you can either leave with honour and everyone accepts that it's unfortunate (but you've still gone), or you can stay and reveal your true colours : every bit as unprincipled as any other politician.
In the movie The Postman, Will Patton plays the evil general Bethlehem in a post-apocalyptic, recovering society. Before the war, Bethlehem was a completely unremarkable photocopier salesman, whose hitherto unknown talents are only revealed by the extreme circumstances he finds himself in.
At this point Corbyn is beginning to remind me quite a lot of General Bethlehem. No, he's not evil... but he is very, very worrying. He never sought office - he's held no ministerial position before, but he finds himself in this position even so. If he won't even quit as leader of the opposition when his MPs tell him to, can you imagine how dangerous that would be if he ever became Prime Minister ?
Corbyn may or may not be a genuinely nice man, I don't know. But even pacifism can be extremely dangerous if you don't fight when fighting really is the only answer. I suspect - and I hesitate to go further than "suspect" - that something similar is at work here. He claims to be nice. He claims to want to work with everyone. But there have been too many whiffs of suspicion that all he really wants is to get things his own way, to deselect MPs who disagree with him, to turn a "see no evil" blind eye where it suits him, to give more power to activists - which is a politically disastrous move. We already know activists support him, but it doesn't appear that the rest of the country does. As with the leadership election, there's a massive selection effect at work here that doesn't come into play during a general election. Power base, anyone ?
I rather suspect that underneath his brown-coated veneer of a nice man who spends all his time in an allotment, Jeremy Corbyn may actually be a thoroughly nasty piece of work. Villains who twirl their moustaches... racists and xenophobes are easy to spot and (sadly) well-known in British politics. Marxists and dictators are far less common. Maybe I'm wrong, but there's no way I'd vote for someone with that suspicion lurking in the back of my mind.
So what next for Labour ? They should have elected someone like Owen Smith - someone who could seriously challenge Brexit and build support. They'd gain ground at the next election, but Labour are so far behind right now that a win is very unlikely (if they do win, nothing has been lost there). So don't waste one of the heavy-hitters on a battle you probably can't win. Use a credible candidate to win back ground, then next time bring up someone like Alan Johnson or Hilary Benn - someone the wider electorate actually would vote for.
So, a year on, I was wrong about Corbyn - but I was right to give him the test period. Unfortunately it's a test he's failed abysmally. A year is a long time in politics, and sometimes it's also very unpleasant.
Too long; didn't read version : "So that's the brain. Impressive, isn't it ? But, also, a bit stupid."
That's the afterword and essentially the short version of Dr Dean Burnett's hugely impressive book, The Idiot Brain. Anyone interested in rational thinking - and if you're reading this blog that's probably you - is going to want to buy this book and probably subscribe to the author's blog. Unless of course you're a neuroscientist yourself and/or you don't like jokes for some reason.
The Idiot Brain is a mixture of neuroscience and psychology exploring the many quirks of human behaviour, from the mundane but interesting (why do we go into a room and immediately forget why we went in ?) to the complex and annoying (why do we believe things despite overwhelming evidence to the contrary ?). Much of the pop psychology on the internet turns out to be basically correct, but where Burnett goes deeper is in exploring the physiological reasons why this happens. While never really tackling the issue of what consciousness actually is, Burnett takes a no-nonsense approach to any ideas about mystical woo-woo :
... if you feel the brain is a mysterious and ineffable object, some borderline-mystical construct... etc., then I'm sorry, you're really not going to like this book. Don't get me wrong, there really is nothing as baffling as the human brain; it is incredibly interesting. But there's also this bizarre impression that the brain is 'special', exempt from criticism, privileged in some way, and our understanding of it is so limited that we've barely scratched the surface of what it's capable of. With all due respect, this is nonsense. The brain is still an internal organ of the human body, and as such is a tangled mess of habits, traits, outdated processes and inefficient systems.
By the end of the book I emerged feeling astonished not that people believe in crazy things, but that they believe anything at all. That online conversations often degenerate into insulting some stranger's questionable parentage is nowhere near as impressive as the fact that we're even able to hold self-consistent conversations without constantly collapsing into a dribbling wreck and pooping everywhere.
The worst thing I can find to say about this book is that there are numerous obvious typos. It's not like it's War & Peace or anything, it took me about two days to read, so it should have been easy to find someone to proofread it and spot these. But who cares ? This dude has succeeded in making neuroscience funny. That he's also got a few typos is testament to the fact that the brain isn't a computer. So let's start with that. I'm going to keep this review relatively short, because I really think you should actually go out and buy this book. If you don't enjoy it, I personally will come to your house and bake you some cookies. Or not.
The Brain Is Not A Computer
Incidentally, it's true that we don't know what sleep is for. Although we know a lot more about the brain than is often reported, there's a huge amount we still don't know.
Or at the very least, it's certainly not like any artificial computer. Computers store memory in labelled files and access them flawlessly when requested. They process data at a fixed rate with limited programs. If you need them to process data faster, you'll need to get a (literally) more powerful computer. Run the same program on the same file a thousand times and you'll get the same result at the same speed, using exactly the same amount of power each time.
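That determinism is easy to demonstrate. A toy sketch (the "program" here is just a hash function, chosen purely for illustration) :

```python
import hashlib

def process(data: bytes) -> str:
    """A fixed 'program' : same input, same output, every single time."""
    return hashlib.sha256(data).hexdigest()

# Run the same program on the same data a thousand times and collect
# the distinct results - a computer never gets bored, tired or creative.
results = {process(b"the same file") for _ in range(1000)}

assert len(results) == 1  # only one distinct result, ever
```

No brain could pass that test : ask a person the same question a thousand times and you'll get boredom, irritation and eventually wildly different answers.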
Not so the brain. The brain doesn't even have any sort of "code" that's run, it has a series of neurons which are chemically linked with neurotransmitters. The level of these transmitters varies - it's not analogous to a computer where everything is linked via electrons along fixed connections. It doesn't even use the same amount of power each time - the more a task is done, the more the brain optimises itself to do that task. So the things the brain does best - even very complicated tasks - use the least power. A computer program that can solve one equation can't learn how to solve others by itself, or optimise itself to be more efficient. And it doesn't have any preferences and it certainly doesn't have a mood.
Mood matters a great deal to the brain, especially fear. Which makes a lot of evolutionary good sense, because in the not so distant past, we faced a lot of very real existential threats. The brain still thinks we're in a world full of sabre tooth tigers and enraged mammoths hell bent on turning us into something squidgy. Or, as Burnett puts it :
To our brains, daily life is like tightrope-walking over a vast pit full of furious honey badgers and broken glass; one wrong move and you'll end up as a gruesome mess in temporary but exquisite pain.
The brain treats everything unpleasant as a sort of threat. So being criticised activates (to some degree) the same areas that prepare the infamous "fight or flight" response, allowing quick, decisive action but not necessarily the most reasoned or logical choices. And different areas of the brain are interconnected in incredibly complex ways, leading to bizarre conclusions drawn up by the pattern recognition system : "it's either a dressing gown or a bloodthirsty axe murderer".
Then there's memory. Never mind that the brain processes sensory information in a totally different way to a computer, it also rewrites history, doesn't ever label the video tapes, and replays memories more or less at random :
Imagine a computer that decided that some information... was more important than other information, for reasons that were never made clear... Or that filed information in a way that didn't make any logical sense... Or that kept opening your more personal and embarrassing files, like the ones containing all your erotic Care Bears fan fiction, without being asked, at random times. Or a computer that decided it didn't really like the information you've stored, so it altered it for you to suit its preferences.
The brain likes itself. According to Burnett, it alters memories to appeal to the ego, making us more important in our own memories than in real life. A bit like this :
Well, not quite. In a normal functioning adult brain, it's more subtle than that - it doesn't create entirely false memories, it just tweaks them. But it does happen. Sometimes people aren't deliberately lying or exaggerating, they genuinely believe things happened a bit differently because that's how the brain works. Burnett says throughout the book that the only thing the brain really trusts is itself, so it's got to have self-confidence. It can't be re-evaluating everything all of the time... but that doesn't mean that twenty year old memories of past mistakes can't suddenly re-emerge and cause acute embarrassment and self-doubt, from time to time.
Finally, if you tell a computer to do something it will do it. The brain might not. The amount of attention it can give to a problem is very limited. While long-term memory capacity is unknowably vast, short-term memory (which is where all the hard processing happens) is very limited. It can hold maybe four "items" at once. Admittedly, the definition of an "item" is hard to pin down, and the brain has a lot of tricks for maximising this (as well as being able to draw on the formidable resources of long-term memory) :
If you were asked to remember the words, 'smells', 'mum', 'cheese', 'of' and 'your', that would be five items. However, if you were asked to remember the phrase, 'Your mum smells of cheese', that would be one item, and a possible fight with the experimenter.
Which explains how you can forget why you went into a room in just a few seconds. Any distraction can easily push your reason out of your tiny short-term memory (and it's very, very short : about a minute - anything that lasts longer is already in long term memory). Even if it got stored in long-term memory, there's no guarantee it will be easy to recall. That can happen by (amongst other methods) repetition of the item, which, it's thought, puts a sort of marker flag on the memory making it easier to find, like clearly labelling something in a filing system. It's possible that everything we experience is stored somewhere in this vast long-term memory, it's just very badly labelled.
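The chunking trick in Burnett's "your mum smells of cheese" example can be caricatured in a few lines of code. This is a deliberately silly toy model, not real neuroscience - the capacity limit of four is the book's rough figure, and everything else is made up for illustration :

```python
# Toy model of short-term memory : it can hold only ~4 "items" at once,
# but what counts as an item depends on how the information is chunked.
CAPACITY = 4

separate = ["your", "mum", "smells", "of", "cheese"]  # five items
chunked = ["your mum smells of cheese"]               # one item (and a fight)

def fits_in_short_term_memory(items: list) -> bool:
    """Crude check : does this list of items fit within the capacity ?"""
    return len(items) <= CAPACITY

assert not fits_in_short_term_memory(separate)  # five words overflow it
assert fits_in_short_term_memory(chunked)       # one phrase fits easily
```

Same information, wildly different load - which is exactly why mnemonics and phrases are easier to hold onto than lists of unrelated words.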
Circumstance also matters - it's easier to recall something in a similar situation. So to some extent, even drinking alcohol can help you access memories that formed when you were having a drink.
Why People Believe Stupid Things
As will now be obvious, the brain doesn't accept information at face value. It's evaluating the external evidence just as much as it's evaluating itself. It's egotistical and doesn't like calling its established ideas into question - and the more personally important the idea, the less it's likely to want to re-evaluate it. If a belief is under attack, the brain doesn't always distinguish that from a personal attack - even if the attacker had no personal hostile intent at all.
So when you're trying to persuade someone not merely that some particular fact is true, but that their core values are wrong, you're literally fighting against biology. They're resisting you not because they're a stubborn idiot (although they might be), but because millions of years of evolution are screaming inside them that they're basically right.
Raw intelligence is only part of this. Burnett basically confirms my speculation that intelligent (meaning raw processing power) people aren't always the easiest to persuade. There's some evidence - again limited - that there may be a specific area of the brain which attempts to do objective self-analysis. So there may not be a tight correlation between ability and self-evaluation - you can get intelligent people who think they're good at everything, even though their abilities are limited to specific areas. By and large though, the Dunning-Kruger effect is pretty strong.
Burnett describes how the nature of intelligence isn't settled. Do we have just one general sort of intelligence that we apply more vigorously in areas we're particularly interested in, or are there multiple, disparate sorts of intelligence - e.g. being good at painting versus being good at maths ? We don't yet know, although currently the prevailing view is the former. Intelligence seems to be dependent on how interconnected parts of the brain are. While it does seem to be possible to actually increase raw intelligence (so my notion that everyone has an upper limit of intelligence was wrong), Burnett says this takes a lot of effort and gains are limited, especially later in life. You can, however, become extremely good at certain tasks. "Brain training" programs help you get very good at specific games, but there's no evidence they really increase your overall intelligence.
Anyway, there's an even more fundamental reason why people can reach silly conclusions. Pattern recognition in the brain is very, very advanced, but of course it's better to have false positives and see connections that aren't there - the classic idea that it's better to run away from a tiger that isn't there than not run away from one which is. There's some evidence, albeit limited, that certain parts of the brain involved in this pattern recognition function differently in individuals. So conspiracy theorists may be more prone to wacky beliefs because at a physiological level, their brain isn't as good at checking whether correlations are meaningful. They have more difficulty accepting random coincidence as just random - the internal checks that normally prevent pattern recognition from going overboard aren't working so well.
If this is the case - and it's by no means certain - it's only part of the story. Another important element is fear. Again evolution has forced the development of a brain that's hyper-attuned to seeking out threats :
Consider how many superstitions are based on avoiding bad luck or misfortune. You never hear about conspiracies that are intended to help people. The mysterious elite don't organise charity bake sales.
So there's a squishy, egotistical blob of goop in our heads that's incredibly good at detecting patterns, not always so good at spotting meaningful connections, doesn't like uncertainty, and is convinced that everything it doesn't like is probably out to kill us - for the very good reason that in the not so distant past, this was true. Given all this, it's easy to see how people can come to believe in outlandish nonsense. Yes, raw intelligence can also play a part, but by no means does a belief in a conspiracy theory automatically equate with true stupidity.
Then again, it's not quite true that you never hear of helpful conspiracies. I've encountered people who are utterly convinced that aliens are deactivating our nuclear weapons. "I blame the parents", says Burnett. Well, he doesn't really, but almost. While the egotistical nature of our brain wants to blame other people for our misfortune, paradoxically it also likes to think that someone's in control. During our formative years, people are in control - most of us grow up in a world that's relatively well organised by powerful authority figures. So it's not so difficult to see how that could translate into an underlying belief in all-controlling lizard people. As a BBC article on impostor syndrome puts it :
Yet if it's terrifying to feel like the only fraud in your field or organisation, it's equally terrifying to confront the truth that everyone is winging it. That's another reason why it can be hard to accept that the impostor phenomenon is universal: we desperately want to believe that there are grown-ups in control - especially in fields such as government, medicine or law.
Indeed, it has been argued that this is one reason people believe in otherwise ridiculous conspiracy theories. In some sense, it's actually more reassuring to believe that a sinister cabal is manipulating the course of history than that they aren't: that way, at least someone would be indisputably in charge.
The final aspect that should be mentioned is groupthink - the unconscious need to agree with other people so as not to cause trouble. It's nicely illustrated in the following video, although I prefer this version :
Groupthink is something that pseudoscientists and conspiracy theorists routinely chant as a mantra as to why everyone else thinks they're mad. And in some circumstances (we'll get to that in a minute) it's very powerful... but it is not an absolute. It's doubtful the woman in the video really believes it's necessary to stand up when the alarm goes off. Similarly, Burnett describes an experiment in which participants were encouraged to give an obviously wrong answer to a simple question, but 25% of the participants gave the correct answer anyway.
Under normal conditions, groupthink only has so much influence. Science isn't immune to this, but it does have safeguards. Competitive collaborations combine the power of group intelligence with a desire to disprove each other. Sensible academic institutions foster an informal atmosphere where people of different disciplines talk to each other. While any one institute might fall victim to herd mentality, the idea that they all do, on a global scale, is pure farce. As far as I can tell, science has an amazingly good track record of considering radical alternatives, though the media don't like to report it like this. The "one man against the system" narrative is very powerful. And wrong.
Not, perhaps, that it will do me any good to tell people this - assuming for argument's sake that my position is right and the pseudoscientists are wrong. Burnett suffers no false modesty about intelligence - some people are just more intelligent than others, but intelligence is a hard thing for the brain to evaluate in others. Consequently, it's hard to trust. Strength ? Easy to judge, and easy to empathise with - anyone can become stronger through training. Intelligence ? Much harder to assess, and very much harder to increase.
Burnett, a neuroscientist by training, says he doesn't mention it unless directly asked, because he once got the response, "Oh, think you're clever, do you ?" Do other people get this ? If you tell people you're an Olympic sprinter, does anyone ever say, "Oh, think you're fast, do you ?" This seems unlikely... Someone who is more intelligent than you presents an unknowable quantity, and as such they could behave in ways you can't predict or understand. This means the brain cannot work out whether they present a danger or not, and in this situation the old, "better safe than sorry" instinct is activated, triggering suspicion and hostility.
Proclaiming yourself an Olympic sprinter is all well and good, but standing up and saying you're more intelligent than someone else ? That's considered to be the height of arrogance.
Which means my previous article on why people don't trust science is missing something very fundamental : intelligence is perceived as a threat. Trying to disprove experts may partly be a way to deal with that threat, to show that their intelligence isn't so great after all. So I was probably right to say that we shouldn't let specialists do outreach about any topic under the sun, but try and keep them focused on their specialist area. Don't present scientists as some sort of authoritarian philosopher kings, but as people with specialised abilities that have been honed through years of training. Just as an Olympic sprinter can run 100 m in 10 seconds but can't run a particle accelerator, so a scientist can operate a telescope but knows bugger all about home decorating or the fine details of social welfare programs.
Why People Aren't Nice
While pseudoscience is personally extremely irritating to me, the fact that people can be jerks is bad for everyone. But most of the time, contrary to the impression one gets from an internet comments section, they're not. The brain likes to be liked, it likes doing things that other people like, and as a social animal it likes communicating and forming relationships. But even these nicer aspects of the brain can have very nasty consequences indeed.
Even though there are plenty of people willing to go against groupthink, there are still plenty who will do terrible things just to conform. But in some situations, the need not just to fit in but to be valued by the group can lead to absurd extremes :
The brain is built for this. Other people are one of its primary sources of information about the world - if other people seem tense, then we pick up on this because there's probably a good reason for the tension. It saves us from having to do any extra processing ourselves and helps us avoid threats more quickly. It's an eminently sensible evolutionary strategy.
But more than that, individual identity can be subsumed by group membership. The same areas of the brain involved in giving us a sense of personal identity are also important in social interactions - we're hard-wired for group identity. A threat to the group becomes a threat to us. The Stanford Prison Experiment is surely the best known example of how quickly and severely this can happen to perfectly ordinary people. It's not exactly that people can't fight their inner natures, it's that they won't unless they're aware of what's happening.
Groupthink can be extremely powerful in an angry mob, but individuals can also be abhorrent. One of the weirder hypotheses as to how this can happen is that the brain also wants to believe that the world is basically just and fair. The brain is known to like certainty - it wants to be sure it's got things basically right, perhaps because altering all those neural pathways is extremely difficult and might drive us insane.
An extension of this is that it might also be prone to believing that the world is basically logical. Which makes sense : if actions resulted in random consequences, there wouldn't be much point in anything. But this sense of an underlying justice to the world can mean that when we see someone suffering, we automatically assume they must have done something to deserve it. Which is common in many religions - not because that's what religion teaches, but because that's a fundamental part of human nature.
People are also far more likely to blame a victim they strongly identify with. If you see someone of a different age/race/gender get hit by a falling tree, it's much easier to sympathise. But if you see someone of your age, height, build, gender, driving a car just like yours and colliding with a house like the one you live in, you're far more likely to blame that someone for being incompetent or stupid, despite having no evidence of this...
It seems that, despite all the inclinations towards being sociable and friendly, our brain is so concerned with preserving a sense of identity and peace of mind that it makes us willing to screw over anything and anyone that could endanger this. Charming.
The brain produces a mass of contradictions - on the one hand, people of another race are perceived as different so it might not blame them when bad things happen to them, but on the other, they can be seen as so different that it willingly treats them as sub-human.
And yet... while no-one is born racist, it seems that everyone is born with the mental equipment to become racist. Burnett mentions the cross-race effect, where people struggle to remember the faces of those of a different race (or as granny used to say - quite literally - "I don't know how people from China all tell each other apart"). Whether this is an inbuilt bias, possibly related to the egocentric memory, or simply an effect of being immersed in one particular race, is unknown.
Another effect is that although the brain has the egocentric bias, it also has ways of dealing with this to stop it getting out of hand. But this requires thinking, so if someone is forced to respond rapidly, they might come across as a much worse person than they really are. There's even a specific part of the brain which deals with empathy, and if this is disrupted, it's easy to see how people can become jerks.
External stimuli also matter, which brings us back to the brain not being a computer again. The bias compensation doesn't work so well if we're experiencing pleasure - we underestimate suffering not because we're fundamentally bad people, but because that's how we're wired.
The more privileged and comfortable someone's life is, the harder it is for them to appreciate the needs and issues of those worse off. But as long as we don't do something stupid like put the most pampered people in charge of running countries, we should be OK.
So there you have it - being rich really can make you a bastard.
I've cherry-picked the topics most interesting to me for this "review", but I've hardly scratched the surface. I haven't told you about the complicated ways the brain deals with ostensibly simple things like checking whether the stomach is full, or how cooking may have had an important role in the development of the human brain itself. Or what happens when things go wrong - how the brain can become unable to distinguish fantasy from reality. Or, fascinatingly, how clinical delusions are a relative state - a religious fanatic claiming to hear the voice of God wouldn't be considered delusional, but "an agnostic trainee accountant from Sunderland" claiming the same thing probably would.
I also haven't mentioned just how damn readable the book is, how it had me snorting out loud in the bookshop, or how it explains (yes, really) why Gordon Ramsay is so angry. It's very, very funny. Even more importantly, the book makes it very clear when and where there are uncertainties, sometimes huge gaping holes in our knowledge - like what on earth sleep is for, or the nature of intelligence. It doesn't try and brush these under the carpet or hide controversies. But nor have I mentioned any of the technical details about which chemicals cause which reactions in which part of the brain, at least some of which are well understood. Or the wonderful, empathetic chapter on mental health, which would make the world a happier place if more people read it.
Not all of human behaviour is due to these unconscious biases, but they have a huge effect on us. For the most part the brain does an astonishingly successful job of managing a bewildering assortment of external signals. That it can come up with bizarre conclusions isn't as impressive as the fact that it can come up with conclusions at all. So for goodness bloody sake, buy the book, try and understand that "the brain apparently thinks that logic is a precious resource to be used only sparingly" and maybe you'll end up with a bit more empathy for all those angry people on the internet. At the very least, it'll change the way you think about yourself. The only real problem with the book is that it's far too short.
Michael Gove's most famous quote is undeniably, "The British people have had enough of experts". But before we get carried away with this, let's remember that this is the moron who also said, "I set my personal ambition aside in my bid to become Prime Minister". God knows what the real limits of his ambition were. My money's either on being crowned king or becoming President of the Zoological Society.
Still, the good people of Britain did vote against all the expert advice. The anti-vaccine movement continues to be a thing. The internet is full of people who think the Earth is flat. What's going on here ? What happened to a basic sense of good old-fashioned trust ?
Science has many peculiarities. New ideas require criticism, but not too much. Established ideas are venerated, but don't get spared attacks when new evidence undermines them. Many ideas are totally discredited; few are seen as indisputable. Similarly, individual scientists often hold each other in varying degrees of regard, from, "that utter moron" up to, "the next Einstein", but most of us are aware that Einstein got things wrong. Science doesn't have authority figures, but it certainly does have more and less credible sources.
Science itself is not about trust. We don't leave our scientific pancakes (sorry, papers) out in the hope that the kitties (other scientists) will leave them well alone. If anything it's a fundamentally mistrustful process : the results are there (if people are doing things correctly) so that other people can hack them into little bloody pieces. Was the data properly measured ? Does the model work under these conditions too ? If I do things in exactly the same way will I get the same result, or was there something wrong ? What if I change things just a little bit ?
It's not that we assume every other scientist is lying, it's simply that reality is complicated. The history of science is not just one of learning more facts, but one of models being replaced with other models - usually better models that give more accurate, more precise predictions. This is profoundly different from discovering the model which is the correct, complete solution, which is a very much rarer event.
But while science itself is not about trust, science communication is a different story. Between experts trust is not usually such an issue - we may trust that a skilled theorist hasn't had any silly bugs in their code or that an observer hasn't measured the wrong thing, but we know their ideas may be incomplete or misinformed. In small communities we become aware - either through personal interaction or by reading papers - that some people are extremely skilled at some things but are biased in other areas. And everyone knows that everyone else is doing this to them, which is why the system is (to a large but by no means perfect extent) self-correcting.
Communication to the public is different. For me as a scientist it's helpful to remember just how incredibly ignorant I am as soon as I stray from my specialist area. I can tell you with some degree of expertise what the latest thinking is about gas stripping from galaxies or the formation of long tidal tails. But stellar spectroscopy ? Nope, never done that. Cratering studies ? Nope. Pulsar timing ? Nope. High redshift galaxies, even ? Nope. But when I listen, for interest's sake, to experts in these areas describing their findings, I expect them to be fundamentally truthful. I expect them to tell me what they all agree on and which bits are controversial. There's no way I can judge these for myself - I don't have the years of experience needed.
In many ways, in a vast number of areas, I'm not much better off than the general public even within astronomy.
Tell me again about baryon acoustic oscillations, no really, I'll definitely try to listen this time...
There are some advantages to being a specialist though. For starters I'm aware of the true complexities of my own field. If I hear experts in other fields making things sound simple, I get very suspicious. And more pertinently, I know when experts are doing a shoddy job of public communication in my own field.
The state of astronomy research is, in my opinion, in pretty good shape. Having worked at three different institutes and met researchers from countless others, I've encountered a diverse range of people - some are closed-minded idiots, some are open-minded idiots, most are in the happier middle ground of, "won't change my mind without evidence". But the job of public communication of astronomy is not always in such good shape, and it seems to me that in other areas of science it's in a positively dreadful situation.
Before we go any further, let me make one thing absolutely clear because I know you People Of The Internet just love to go to extremes. So listen very carefully, because I shall say this only once : not everything is dire. Got that ? Good. Now we can talk about the things which are dire without worrying that the world is going to hell, we had better weather in those days, get those kids off ma lawn and all that gubbins.
Press releases : I hate them. Not all of them, but I have to say I find the majority of them to be awful. They're prone to huge exaggeration to make everything sound exciting and new, and most of them are completely unnecessary cries for attention. Take this one. It opens with this wonderful gem :
The solar system could be thrown into disaster when the sun dies if the mysterious ‘Planet Nine’ exists, according to research from the University of Warwick.
Riiiight. Because the death of the Sun would just be a bit of a mild inconvenience, I suppose. There are too many things wrong with this opening statement so I stopped reading at that point. I have to add that one aspect of research that does worry me is fashion. As soon as a major discovery - or even not a discovery in this case but mere inference - is made, the next 18 months or so are chock-full of people trying to jump on the bandwagon to publish sexy, trendy papers. We don't even know Planet X exists, but now we're speculating on its long-term impact on the Solar System ? Come on.
But I digress. The communication problem we have here is that whenever a paper is published, there's a magic little checkbox : "do you want assistance with a press release ?". If you tick it, the journal will help you write a press release and distribute it to major news sites - regardless of the importance of the science content*. These sites being extremely large, people rely on them for their science news. And having an established, trusted source is a good thing, because it's damn difficult to trawl the internet for sensible blogs by sensible scientists**.
* I would hazard that a simulation of what a hypothetical planet might do to our Solar System billions of years from now is not really something that needs a press release. On the other hand there is a real, unavoidable need for researchers and their departments to make a name for themselves to attract funding. ** What ? I'm very sensible. Stop laughing. You shut your face.
Clearly, though, no-one stopped to think how potentially ridiculous this opening comment could sound. Still, at least in this case research was done and the paper published. Not so in other cases, like this very recent example of a supposed SETI signal. As far as I can tell, no paper was published at all, they simply decided to do a press release on a detection more than a year later. This "science by press release" is tantamount to pseudoscience, often done by people whose research is... well, sub-par. Informing the media before the paper is accepted is a massive warning sign, the kind you should take very seriously indeed.
No, not even to see what happens.
A final example I've already covered in somewhat more detail : Dragonfly 44, a supposedly Milky Way mass galaxy without many stars. Now I actually have no major problems with the original paper, but the press release annoys me intensely. First, we have this bizarre statement :
“Very soon after its discovery, we realised this galaxy had to be more than meets the eye. It has so few stars that it would quickly be ripped apart unless something was holding it together,” said Yale University astronomer Pieter van Dokkum, lead author of a paper in the Astrophysical Journal Letters.
Yeah, so, just like every other galaxy then. This is a bloomin' daft thing to say. It may be literally true as stated, but there's no way people aren't going to interpret this as meaning that's what's unusual about the galaxy. Which it isn't. Did no-one stop to think, "what does this statement imply in this context ?" instead of just thinking, "is this literally true ?". Because that seems like high school English lesson 101 to me.
Then we have a slightly subtler but perhaps more important miscommunication :
“Amazingly, the stars move at velocities that are far greater than expected for such a dim galaxy. It means that Dragonfly 44 has a huge amount of unseen mass,” said co-author Roberto Abraham of the University of Toronto... Dragonfly 44’s mass is estimated to be 1 trillion times the mass of the Sun, or 2 tredecillion kilograms (a 2 followed by 42 zeros), which is similar to the mass of the Milky Way."
Also literally true. But it completely misses out from the original paper the statement that this mass estimate is an extrapolation from the directly measured mass by a factor of 100 ! And that relies on models which are known to have severe problems. To bluntly state the mass estimate without any hint of the significant uncertainties is downright disingenuous.
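For the curious, the press release's own numbers are easy to sanity-check. This is just a quick sketch : the solar-mass constant is standard, and the factor-100 extrapolation figure comes from the original paper as described above.

```python
# Sanity check of the Dragonfly 44 press release numbers.
M_SUN_KG = 1.989e30              # mass of the Sun in kg (standard value)

quoted_mass_msun = 1e12          # "1 trillion times the mass of the Sun"
quoted_mass_kg = quoted_mass_msun * M_SUN_KG
print(f"{quoted_mass_kg:.1e} kg")        # prints 2.0e+42 kg - "a 2 followed by 42 zeros"

# The directly measured mass is ~100x smaller than the quoted total,
# so what was actually measured is roughly:
measured_mass_msun = quoted_mass_msun / 100
print(f"{measured_mass_msun:.0e} solar masses")
```

In other words, the headline figure rests on multiplying the measured value by two orders of magnitude using models that, as noted, have known problems.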
This Press Release Will Shock You
Which leads me on to the ugly and deformed cousin of press releases : clickbaiting. Or rather, secondary journalism where the journalist doesn't bother to check the details of the original source. "Mystery solved", "we finally know", "scientists baffled" are all staple headlines of this so-called reporting. They are all the more damaging when you realise that most people don't even read the article - all they see is the headline.
"Scientists baffled" is not all that bad by itself. It's good to emphasise that science is about uncertainty, about not knowing how the universe works and finding things out. "Baffled", though, is a rather strong word to use so often. If we were baffled half as much as the clickbait suggests, we'd have no friggin' clue about how things work at all. And that's not right, because while science indeed doesn't know everything, it most assuredly does know some things. To claim that we're baffled by the tiniest anomaly is almost to say we've just given up. Nope, can't solve any more problems, let's let the pseudoscientists have a go instead. We're all just a bunch of incompetents.
But far, far worse than this is, "mystery solved" and lately, "we finally know". No, the mystery bloody well has not been solved, and no, there's nothing "final" about the latest piece of evidence whatsoever. This is a hideous thing to say, because 99% of the time a new piece of evidence will come along and disprove the supposedly settled answer. Keep doing this - keep telling people we definitely know the answer then five minutes later tell them that answer was bunk - and it inevitably leads to mistrust. I know I wouldn't trust anyone who kept insisting that they were definitely, definitely right this time even though they'd said that fifteen times already.
The "experts were wrong" card becomes extremely, legitimately powerful if people are told that the mystery is decisively solved when it was really nothing of the sort - and that allows them to justify any ridiculous idea they want. The danger of this should be obvious : expert opinion loses the proper weight it should be given.
Worse than this, perhaps, is the combination of "scientists baffled" and "mystery solved". It gives such a starkly black and white picture of research - either something is baffling, or it's solved and therefore interesting for five minutes until we can find something else to be baffled about. And as we all know...
Research is nothing like the impression one gets from clickbait - or, for that matter, from rote learning in schools. There are only rarely secure, final answers, and most of the time the mysteries would be more fairly described as "a bit puzzling". Even the really major problems usually have plenty of possible solutions.
It's often said (or at least it was to me in schools) that scientists are like detectives, which is not entirely inaccurate. The difference is that in science you expect your conclusions to keep changing as new evidence is gathered - especially in something like astronomy or particle physics, which can never access the full information about the systems they study. The detective can prove whodunnit with a degree of confidence astronomers can only dream about. If they're very lucky the detective might have CCTV showing the entire murder. They don't expect that if they use a camera sensitive to another wavelength they'll find that there was a hitherto invisible elephant in the room, with murderous intent in his heart... but that happens in astronomy all the time. Only not with elephants.
Not stressing the uncertainties is hugely damaging for public confidence in science. The, "we're really very sure about this" card is one that should be played only very rarely. What you want is to emphasise the uncertainties, not brush them under the carpet. That gives the, "everyone agrees about this" card much, much more strength when you really need to play it. There's nothing wrong with an individual scientist saying, "I think this because..." but there are a hell of a lot of problems with them saying, "I know this is true because blah blah and all of my other colleagues are just wrong". They don't even have to say that directly, they can just imply it - as they do so, all too often in press releases - by failing to acknowledge other possible interpretations.
Compare and contrast :
1) The situation we have now, in which scientists say (or are presumed to say) that they're totally confident about every small detail - almost all of which are invariably discredited - while telling the public that they need to trust them.
2) Scientists stressing the uncertainties involved and the sometimes very strong conflicts with other scientists, but very occasionally united about a major issue. That's the real power of a scientific consensus, that's why it's trustworthy - precisely because most of the time scientists disagree with each other, those rare occasions when there's near-unanimous agreement should not be ignored. Yet that's precisely what's happening as "journalists" (i.e. clickbait writers) and some scientists abuse the idea of certainty for quick, attention-grabbing headlines.
Who Said That ?
While people seem all too happy to swallow clickbait thinking it comes from a credible source, there's another group that few people ever trust : politicians. Now, there are a few politicians out there who I think are fundamentally decent people. But would I trust them ? Hell no. It's in the nature of a politician's job to be untrustworthy. Winning votes requires appealing to people who want different things. I expect them to have an agenda because they damn well do. I expect them to use hyperbole and spin to make their point, because they damn well do that too.
Which is why although I rather like Al Gore, An Inconvenient Truth was an absolute disaster. We should never, ever, ever have to resort to getting politicians to explain science to the public. If you already believed in global warming, you probably took the main message to heart and got fired up to hear, at long last, a major politician really trying to do something. If you didn't... well, here was a message about science coming from one of the least scientific and most biased sources possible. Of course you wouldn't bloody trust it. And that's more than enough to turn an issue that should have been as detached from political reality as general relativity and electromagnetism into a major, divisive political issue.
Science, of course, has a role in politics, because science provides the facts (or more often the most likely possibilities) on which to base a decision. There's no getting around that. But a politician trying to tell us not how to act on the evidence, but what the evidence actually is ? No no no no no nononononono ! NOOOOO ! If you start to see one group of scientists as politically motivated, it becomes a hell of a lot easier to doubt the others as well.
Al Gore is an extreme example, but the use of "science advocates"* to explain complex issues is considered perfectly normal. For some reason Neil deGrasse Tyson, Bill Nye, Brian Cox, Stephen Hawking and the like are routinely trotted out not just to tell us about the wonders of the universe, but about climate change. This is crazy. Imagine if the situation were reversed and you only ever heard about astronomy from climatologists. Would that engender confidence in astronomy ? No. Of course it wouldn't, because that's ridiculous. You'd be thinking :
*I include real scientists in this list when they're speaking about areas they're not qualified in.
Climatology also suffers from (quite unintended) associations with environmental activists. Science is about uncertainty and learning. Greenpeace and the like, on the other hand, are about fanatical certainty : rather than examine the evidence, let's just burn down these genetically modified crops. Instead of investigating how dangerous nuclear power really is - which it basically isn't very much - let's just call for it to be banned. Every time a way is found to allow us to reduce the damage to the planet, nope, let's just insist we need to go back to living in the trees.
I am an environmentalist. I like cute fluffy animals and want biodiversity and to prevent climate change and all that stuff. But I also want to use a computer as often as I damn well please. I want to be able to travel to anywhere I want as often as I want without feeling guilty. I want to live my life the way I see fit, not the way some tree-hugging ecowacky wants me to behave. These people are perverting the course of science for their own agenda and ideologies, not out of any desire for my welfare or because the evidence forces them to that conclusion. With enough bullshitting, you can make evidence suggest anything you like.
One Of Us
Which brings me to my final point : tribalism. Politics is of course inherently tribal : politicians usually have to toe the line for the (important) sake of party unity. Science as a whole is strongly anti-tribal : competitive collaborations ensure that no one group dominates for very long, and there's a strong incentive to disprove theories, which means any long-held theory isn't merely entrenched thinking - it's just a damn good theory. Science advocates, on the other hand, are much more of a grey area, blurring the lines between science outreach and political campaigning.
A recent article in the Guardian illustrates this greyness quite well. On the one hand :
Most science communication isn’t about persuading people; it’s self-affirmation for those already on the inside. Look at us, it says, aren’t we clever? We are exclusive, we are a gang, we are family. That’s not communication. It’s not changing minds and it’s certainly not winning hearts and minds. It’s tribalism.
I don't necessarily agree that this is true of most science communication, but it's certainly true of some, and certainly of the majority of the most influential sources. On the other hand, this particular article picks the wrong target - to wit, Brian Cox, who produced a graph in a debate with a climate change denier. I'm no fan of Brian Cox*, and even though his defence of climatology as a particle physicist is the very thing I'm trying to persuade you is a very bad idea (more on that shortly), I don't accept some of the other points :
*Though not for the reasons I dislike other science activists, which are far more serious. I just don't like his silly accent and presentation style.
Most people simply want to know that someone is listening, that someone is taking their worries seriously; that someone cares for them. It’s more about who we are and our relationships than about what is right or true. This is why, when you bring data to a TV show, you run the risk of appearing supercilious and judgemental. Even – especially – if you’re actually right.
No. Discussing the facts isn't somehow elitist, and Cox's opponent was claiming that the evidence didn't support the science - leaving Cox with no choice but to discuss the evidence. The article also notes : "On the whole, I don’t think people who object to vaccines or GMOs are at heart anti-science." Fair enough, but if that's true then it's an absurdity to suggest we shouldn't discuss the science with people who don't accept it. You can't have it both ways. Don't like the science or the fact that the graph disproves something ? Tough. Ugly facts are the heart and soul of science.
But this doesn't change the fact that the debate was between two non-experts. As I've mentioned before many times, St Augustine understood why this was important as far back as the 5th century :
Now, it is a disgraceful and dangerous thing for an infidel to hear a Christian, presumably giving the meaning of Holy Scripture, talking nonsense on these topics; and we should take all means to prevent such an embarrassing situation, in which people show up vast ignorance in a Christian and laugh it to scorn.
There are many problems with relying so exclusively on science advocates, as we currently do. First, they can make mistakes that real experts wouldn't, as St Augustine understood. Sometimes, bizarrely, this is used to somehow try and defend science advocates : "well of course they got that wrong, they're not experts in that field." This is completely wrong-headed. Don't defend non-experts for making mistakes, get a bloody expert in to talk about it !
Second, because they're not directly involved in the details, they won't have the same knowledge of the uncertainties. Which leads to all the problems I've already discussed : "you scientists told us blah, but blah was wrong, again !". And because clear answers are easier to understand, science advocates get trotted out on a whole range of issues they really have no business speaking about as though they were the voice of all science. They become over-confident authority figures, which is the very last thing science needs. It also doesn't help when such incredibly influential figures speak in their professional capacity about purely political issues that have nothing much to do with science (as opposed to, say, a scientist with a blog written in his spare time with 28 followers) - they're seen as political figures, with all the mistrust that implies.
Third, there is a worrying, downright insane tendency among science advocates to dismiss philosophy as "useless". Neil deGrasse Tyson, Lawrence Krauss, Stephen Hawking, Bill Nye... all seem to have views on philosophy that are utterly moronic. Nye's response to a philosophy student, which borders on, "philosophy won't make you rich", is just so ridiculous I can think of only one appropriate reaction :
Screw you, Bill Nye. Screw you.
It should be self-evident to any half-witted scientist* that philosophy is not merely useful for science, it's indispensable. I don't have time to go into that today so I'll just leave this excellent little piece here for the enthusiasts. More importantly for now, this attitude reeks of tribalism : only science (read that link, it's a good one) provides the correct way of thinking, any other form of thought is irrational, anyone who disagrees with me isn't merely wrong - they're unable to think logically. That kind of attitude inevitably means they see philosophy as useless, because philosophy can question what logic itself is, while followers of scientism are certain they know the answer already.
*And in my experience, which is now pushing 10 years in professional academia, it is. There was a well-attended weekly discussion group on the philosophy of science in Cardiff, for example.
Why do so many science advocates have this absurd attitude ? Well, firstly there's probably a strong selection effect at work. It's a lot easier to be charismatic when you're convinced your position is right, because uncompromising men* are easy to admire - they have a dark charisma, and it feels good to have enemies because all your problems are now someone else's fault. Hence they make great public speakers, whereas the "mealy-mouthed" ordinary scientists, who know that everything is subject to errors and uncertainties, aren't always such good communicators. But if all you see of science are the arrogant bastards who get on TV, no wonder people think we're a bunch of arrogant bastards.
* That linked article was originally planned to target Neil deGrasse Tyson, whose incredibly smug, superior attitude is far and away the most tribal of all science popularisers. I ended up gearing it toward Donald Drumpf instead, who was much more (worryingly) popular at the time. Yes, I would go so far as to compare Tyson to Drumpf, one of them is just a little bit nastier than the other.
Scientism, the peculiar view of so many big-name science activists and advocates, has it that only its definitions of logical, rational thought provide meaningful insight into the world around us. This makes it all too easy to label anyone who has an unusual opinion about anything as "anti-science", even if they're not really anything of the sort. The most common variety of this I see online revolves around genetically modified foods, with a strong reaction on social media saying, "I agree with Tyson on most things but he's wrong about this one". Not that an astrophysicist deserves to be given the slightest bit of authority in pronouncing judgements on GMOs anyway. But are concerns about health automatically "anti-science" ?
No. Tell someone, "this food is unsafe !" and of course they'll react. It's also worth remembering the "flaw of averages" - hardly anyone is perfectly average, everyone's got some weird ideas. But if you invest an enormous amount of time and effort demonstrating, unequivocally, that vaccines are a damn good idea or that the Moon landings weren't faked or that the Earth isn't flat, if they listen carefully but still say, "scientists are all liars" or whatever - then yes, fair enough, it's hard to see those people as anything other than anti-science. It's probably a mistake to say there isn't any sort of war on science, but the majority of people who reject specific scientific findings tend, in my view, to be perverting the course of science - trying to use it to fit their ideology instead of accepting its most natural conclusion. That's very different to an all-out war against logic.
There's another cliché that I can't avoid here : in the age of information, ignorance is a choice. Or is it ? Despite all the problems I've described, there's nonetheless a wealth of great science information available on the internet. The problem is that it's drowned in bullshit. At the time of writing, the second search result in Google for, "Is climate change real ?" is a website, "Top Ten Reasons Climate Change is a Hoax". Type in, "vaccines are dangerous" and the whole page delivers total bullshit direct to your brain. You can find ludicrous claims to justify whatever idiotic belief you want. Rebuttals and counter-rebuttals... in the age of information overload, is ignorance really a choice ?
Summary and Conclusions
It's sensible to respect science and expertise in general. It's not good to view any scientist as an authority figure.
There's no single reason for the mistrust of science today, but if I had to pick just one it would be the failure of scientists to convey the rigours and difficulties of the scientific method. We don't talk publicly about the uncertainties, we just say, "we've done this, let's move on". But it's those uncertainties and controversies that make a scientific consensus one of the most powerful tools for discovery that we have. Where agreement does occur, it's for very, very good reasons.
Instead what we often seem to be doing (in public communication - within the expert community it's another matter entirely) is claiming that we're certain about everything, only to inevitably find that we were wrong. There's nothing wrong with being wrong, but it's a crazy mistake to present this unwarranted level of confidence the whole time. The most extreme form of this is the science activists, who are so confident that they're right that they take a highly divisive, tribal approach so vastly removed from real science it's not even funny.
This doesn't matter so much if you're already happy to trust the experts. But what you have to remember is the perspective of those who don't trust them. To have a bunch of non-experts in the field you're skeptical of just accusing you of being anti-science... I should imagine that's not really likely to win you over. And if you do already trust the experts, having your main figurehead telling you that nonbelievers (and yes, these figureheads do believe in science as a religion) are in effect your enemies... well what could be more polarising than that ?
Solutions : break the authority figure image. If a science activist wants to talk about something outside their specialist area, they should do so the way a good journalist would : by finding an expert who is (perhaps) not so skilled at public communication and helping them put things in a way the public can understand. They should be telling you what the experts are saying, not claiming (either explicitly or implicitly) to be an expert themselves. Stop saying, "Neil deGrasse Tyson knows the answer" and start saying, "Neil deGrasse Tyson knows someone who knows the answer."* Instead of defaulting to a regular friendly expert to talk about anything, get as many experts as possible to become more publicly visible. If 97% of climatologists agree that humans are the cause of climate change, stop getting astrophysicists to tell us this ! Find some damn climatologists and get them to do some public outreach ! Surely there must be at least one charismatic figure among them ?
* There, I've found a way to let you keep Tyson and Dawkins. Like keeping venomous snakes as pets, you gotta treat 'em responsibly and don't go letting them loose without proper precautions.
Which is not to say that other fields don't have communication problems too. Press releases, as we've seen, can be absolutely dire. I'm not really sure why : public communication isn't really that hard, and certainly not as hard as the science itself. Did basic language skills pass so many people by ? Are they only thinking of whether their statements are literally true, and not considering their direct implications ?
I am of course a strong proponent of education in the humanities subjects as well as the sciences, partly because poetry is just damn cool, partly because it's important to understand that people are emotionally-motivated and not robots, and partly because it's a great way to teach critical thinking skills. So too is basic statistics. But that will of course take a science journalist only so far : you do need to actually have at least a basic understanding of science as well. The thing is that by the time you get to PhD level science stuff you really ought to have a decent grasp of communication skills anyway. Yet apparently all the routine questions one must ask while doing research - what does this imply, am I certain of this, what are the other options - get thrown out of the window when writing a press release. They could probably do with some form of peer review.
It's great that society has some public figures celebrated for their scientific prowess, even if said prowess is not always what it appears. What's not good is that this is taken to mean, "these are some of the smartest guys on the planet". No they are not. If we're talking first-author publications, for example, then I have more than Tyson. They are often teachers and educators, but not researchers.
Science doesn't have authority figures, and the public don't need scientific authority figures either. What they need is a respect for the scientific method itself, to understand that most major breakthroughs happen not (just) because of lone geniuses but because of the actions of a great many individual researchers, to celebrate scientific achievement and yes, to make the occasional hero of individuals. But this constant, extreme veneration of a very small group of individuals, to give their opinions undue respect on issues they really know nothing about, to allow this tribal attitude and the dogma of scientism to flourish unchallenged (regular readers are well aware that I don't use the word "dogma" lightly) - that's not helping. Lay off the activists, give real scientists a voice instead.