Too long; didn't read version : "So that's the brain. Impressive, isn't it ? But, also, a bit stupid."
That's the afterword and essentially the short version of Dr Dean Burnett's hugely impressive book, The Idiot Brain. Anyone interested in rational thinking - and if you're reading this blog that's probably you - is going to want to buy this book and probably subscribe to the author's blog. Unless of course you're a neuroscientist yourself and/or you don't like jokes for some reason.
The Idiot Brain is a mixture of neuroscience and psychology exploring the many quirks of human behaviour, from the mundane but interesting (why do we go into a room and immediately forget why we went in ?) to the complex and annoying (why do we believe things despite overwhelming evidence to the contrary ?). Much of the pop psychology on the internet turns out to be basically correct, but where Burnett goes deeper is in exploring the physiological reasons why this happens. While never really tackling the issue of what consciousness actually is, Burnett takes a no-nonsense approach to any ideas about mystical woo-woo :
... if you feel the brain is a mysterious and ineffable object, some borderline-mystical construct... etc., then I'm sorry, you're really not going to like this book. Don't get me wrong, there really is nothing as baffling as the human brain; it is incredibly interesting. But there's also this bizarre impression that the brain is 'special', exempt from criticism, privileged in some way, and our understanding of it is so limited that we've barely scratched the surface of what it's capable of. With all due respect, this is nonsense. The brain is still an internal organ of the human body, and as such is a tangled mess of habits, traits, outdated processes and inefficient systems.

By the end of the book I emerged feeling astonished not that people believe in crazy things, but that they believe anything at all. That online conversations often degenerate into insulting some stranger's questionable parentage is nowhere near as impressive as the fact that we're even able to hold self-consistent conversations without constantly collapsing into a dribbling wreck and pooping everywhere.
The worst thing I can find to say about this book is that there are numerous obvious typos. It's not like it's War & Peace or anything; it took me about two days to read, so it should have been easy to find someone to proofread it and spot these. But who cares ? This dude has succeeded in making neuroscience funny. That he's also got a few typos is testament to the fact that the brain isn't a computer. So let's start with that. I'm going to keep this review relatively short, because I really think you should actually go out and buy this book. If you don't enjoy it, I personally will come to your house and bake you some cookies. Or not.
The Brain Is Not A Computer
|Incidentally, it's true that we don't know what sleep is for. Although we know a lot more about the brain than is often reported, there's a huge amount we still don't know.|
A computer runs fixed code through fixed circuitry. Not so the brain. The brain doesn't even have any sort of "code" that's run; it has a series of neurons which are chemically linked with neurotransmitters. The levels of these transmitters vary - it's not analogous to a computer where everything is linked via electrons along fixed connections. It doesn't even use the same amount of power each time : the more a task is done, the more the brain optimises itself to do that task. So the things the brain does best - even very complicated tasks - use the least power. A computer program that can solve one equation can't learn how to solve others by itself, or optimise itself to be more efficient. It doesn't have any preferences, and it certainly doesn't have a mood.
Mood matters a great deal to the brain, especially fear. Which makes a lot of good evolutionary sense, because in the not so distant past, we faced a lot of very real existential threats. The brain still thinks we're in a world full of sabre tooth tigers and enraged mammoths hell bent on turning us into something squidgy. Or, as Burnett puts it :
To our brains, daily life is like tightrope-walking over a vast pit full of furious honey badgers and broken glass; one wrong move and you'll end up as a gruesome mess in temporary but exquisite pain.

The brain treats everything unpleasant as a sort of threat. So being criticised activates (to some degree) the same areas that prepare the infamous "fight or flight" response, allowing quick, decisive action but not necessarily the most reasoned or logical choices. And different areas of the brain are interconnected in incredibly complex ways, leading to bizarre conclusions drawn up by the pattern recognition system : "it's either a dressing gown or a bloodthirsty axe murderer".
Then there's memory. Never mind that the brain processes sensory information in a totally different way to a computer, it also rewrites history, doesn't ever label the video tapes, and replays memories more or less at random :
Imagine a computer that decided that some information... was more important than other information, for reasons that were never made clear... Or that filed information in a way that didn't make any logical sense... Or that kept opening your more personal and embarrassing files, like the ones containing all your erotic Care Bears fan fiction, without being asked, at random times. Or a computer that decided it didn't really like the information you've stored, so it altered it for you to suit its preferences.

The brain likes itself. According to Burnett, it alters memories to appeal to the ego, making us more important in our own memories than in real life. A bit like this :
Well, not quite. In a normal functioning adult brain, it's more subtle than that - it doesn't create entirely false memories, it just tweaks them. But it does happen. Sometimes people aren't deliberately lying or exaggerating, they genuinely believe things happened a bit differently because that's how the brain works. Burnett says throughout the book that the only thing the brain really trusts is itself, so it's got to have self-confidence. It can't be re-evaluating everything all of the time... but that doesn't mean that twenty year old memories of past mistakes can't suddenly re-emerge and cause acute embarrassment and self-doubt, from time to time.
Finally, if you tell a computer to do something it will do it. The brain might not. The amount of attention it can give to a problem is very limited. While long-term memory capacity is unknowably vast, short-term memory (which is where all the hard processing happens) is very limited. It can hold maybe four "items" at once. Admittedly, the definition of an "item" is hard to pin down, and the brain has a lot of tricks for maximising this (as well as being able to draw on the formidable resources of long-term memory) :
If you were asked to remember the words, 'smells', 'mum', 'cheese', 'of' and 'your', that would be five items. However, if you were asked to remember the phrase, 'Your mum smells of cheese', that would be one item, and a possible fight with the experimenter.

Which explains how you can forget why you went into a room in just a few seconds. Any distraction can easily push your reason out of your tiny short-term memory (and it's very, very short : about a minute - anything that lasts longer is already in long term memory). Even if it got stored in long-term memory, there's no guarantee it will be easy to recall. That can happen by (amongst other methods) repetition of the item, which, it's thought, puts a sort of marker flag on the memory making it easier to find, like clearly labelling something in a filing system. It's possible that everything we experience is stored somewhere in this vast long-term memory, it's just very badly labelled.
|Circumstance also matters - it's easier to recall something in a similar situation. So to some extent, even drinking alcohol can help you access memories that formed when you were having a drink.|
Why People Believe Stupid Things
As will now be obvious, the brain doesn't accept information at face value. It's evaluating the external evidence just as much as it's evaluating itself. It's egotistical and doesn't like calling its established ideas into question - and the more personally important the idea, the less it's likely to want to re-evaluate it. If a belief is under attack, the brain doesn't always distinguish that from a personal attack - even if the attacker had no personal hostile intent at all.
So when you're trying to persuade someone not merely that some particular fact is true, but that their core values are wrong, you're literally fighting against biology. They're resisting you not because they're a stubborn idiot (although they might be), but because millions of years of evolution are screaming inside them that they're basically right.
Raw intelligence is only part of this. Burnett basically confirms my speculation that intelligent (meaning raw processing power) people aren't always the easiest to persuade. There's some evidence - again limited - that there may be a specific area of the brain which attempts to do objective self-analysis. So there may not be a tight correlation between ability and self-evaluation - you can get intelligent people who think they're good at everything, even though their abilities are limited to specific areas. By and large though, the Dunning-Kruger effect is pretty strong.
Burnett describes how the nature of intelligence isn't settled. Do we have just one general sort of intelligence that we apply more vigorously in areas we're particularly interested in, or are there multiple, disparate sorts of intelligence - e.g. being good at painting versus being good at maths ? We don't yet know, although currently the prevailing view is the former. Intelligence seems to be dependent on how interconnected parts of the brain are. While it does seem to be possible to actually increase raw intelligence (so my notion that everyone has an upper limit of intelligence was wrong), Burnett says this takes a lot of effort and gains are limited, especially later in life. You can, however, become extremely good at certain tasks. "Brain training" programs help you get very good at specific games, but there's no evidence they really increase your overall intelligence.
Anyway, there's an even more fundamental reason why people can reach silly conclusions. Pattern recognition in the brain is very, very advanced, but of course it's better to have false positives and see connections that aren't there - the classic idea that it's better to run away from a tiger that isn't there than not run away from one which is. There's some evidence, albeit limited, that certain parts of the brain involved in this pattern recognition function differently in individuals. So conspiracy theorists may be more prone to wacky beliefs because at a physiological level, their brain isn't as good at checking whether correlations are meaningful. They have more difficulty accepting random coincidence as just random - the internal checks that normally prevent pattern recognition from going overboard aren't working so well.
If this is the case - and it's by no means certain - it's only part of the story. Another important element is fear. Again evolution has forced the development of a brain that's hyper-attuned to seeking out threats :
Consider how many superstitions are based on avoiding bad luck or misfortune. You never hear about conspiracies that are intended to help people. The mysterious elite don't organise charity bake sales.

So there's a squishy, egotistical blob of goop in our heads that's incredibly good at detecting patterns, not always so good at spotting meaningful connections, doesn't like uncertainty, and is convinced that everything it doesn't like is probably out to kill us - for the very good reason that in the not so distant past, this was true. Given all this, it's easy to see how people can come to believe in outlandish nonsense. Yes, raw intelligence can also play a part, but by no means does a belief in a conspiracy theory automatically equate with true stupidity.
Then again, it's not quite true that you never hear of helpful conspiracies. I've encountered people who are utterly convinced that aliens are deactivating our nuclear weapons. "I blame the parents", says Burnett. Well, he doesn't really, but almost. While the egotistical nature of our brain wants to blame other people for our misfortune, paradoxically it also likes to think that someone's in control. During our formative years, people are in control - most of us grow up in a world that's relatively well organised by powerful authority figures. So it's not so difficult to see how that could translate into an underlying belief in all-controlling lizard people. As a BBC article on impostor syndrome puts it :
Yet if it's terrifying to feel like the only fraud in your field or organisation, it's equally terrifying to confront the truth that everyone is winging it. That's another reason why it can be hard to accept that the impostor phenomenon is universal: we desperately want to believe that there are grown-ups in control - especially in fields such as government, medicine or law. Indeed, it has been argued that this is one reason people believe in otherwise ridiculous conspiracy theories. In some sense, it's actually more reassuring to believe that a sinister cabal is manipulating the course of history than that they aren't: that way, at least someone would be indisputably in charge.

The final aspect that should be mentioned is groupthink - the unconscious need to agree with other people so as not to cause trouble. It's nicely illustrated in the following video, although I prefer this version.
Groupthink is a mantra that pseudoscientists and conspiracy theorists routinely chant to explain why everyone else thinks they're mad. And in some circumstances (we'll get to that in a minute) it's very powerful... but it is not an absolute. It's doubtful the woman in the video really believes it's necessary to stand up when the alarm goes off. Similarly, Burnett describes an experiment in which participants were encouraged to give an obviously wrong answer to a simple question, but 25% of the participants gave the correct answer anyway.
Under normal conditions, groupthink only has so much influence. Science isn't immune to this, but it does have safeguards. Competitive collaborations combine the power of group intelligence with a desire to disprove each other. Sensible academic institutions foster an informal atmosphere where people of different disciplines talk to each other. While any one institute might fall victim to herd mentality, the idea that they all do, on a global scale, is pure farce. As far as I can tell, science has an amazingly good track record of considering radical alternatives, though the media don't like to report it like this. The "one man against the system" narrative is very powerful. And wrong.
Not, perhaps, that it will do me any good to tell people this - assuming for argument's sake that my position is right and the pseudoscientists are wrong. Burnett suffers no false modesty about intelligence - some people are just more intelligent than others, but intelligence is a hard thing for the brain to evaluate in others. Consequently, it's hard to trust. Strength ? Easy to judge, and easy to empathise with - anyone can become stronger through training. Intelligence ? Much harder to assess, and very much harder to increase.
I'm a neuroscientist by training, but I don't tell people this unless directly asked, because I once got the response, "Oh, think you're clever, do you ?" Do other people get this ? If you tell people you're an Olympic sprinter, does anyone ever say, "Oh, think you're fast, do you ?" This seems unlikely... Someone who is more intelligent than you presents an unknowable quantity, and as such they could behave in ways you can't predict or understand. This means the brain cannot work out whether they present a danger or not, and in this situation the old "better safe than sorry" instinct is activated, triggering suspicion and hostility.

Proclaiming yourself an Olympic sprinter is all well and good, but standing up and saying you're more intelligent than someone else ? That's considered to be the height of arrogance.
Which means my previous article on why people don't trust science is missing something very fundamental : intelligence is perceived as a threat. Trying to disprove experts may partly be a way to deal with that threat, to show that their intelligence isn't so great after all. So I was probably right to say that we shouldn't let specialists do outreach about any topic under the sun, but try and keep them focused on their specialist area. Don't present scientists as some sort of authoritarian philosopher kings, but as people with specialised abilities that have been honed through years of training. Just as an Olympic sprinter can run 100 m in 10 seconds but can't run a particle accelerator, so a scientist can operate a telescope but knows bugger all about home decorating or the fine details of social welfare programs.
Why People Aren't Nice
While pseudoscience is personally extremely irritating to me, the fact that people can be jerks is bad for everyone. But most of the time, contrary to the impression one gets from an internet comments section, they're not. The brain likes to be liked, it likes doing things that other people like, and as a social animal it likes communicating and forming relationships. But even these nicer aspects of the brain can have very nasty consequences indeed.
Even though there are plenty of people willing to go against groupthink, there are still enough who are willing to do terrible things just to conform. But in some situations, the need to not just fit in but be valued by the group can lead to absurd extremes.
The brain is built for this. Other people are one of its primary sources of information about the world - if other people seem tense, then we pick up on this because there's probably a good reason for the tension. It saves us from having to do any extra processing ourselves and helps us avoid threats more quickly. It's an eminently sensible evolutionary strategy.
But more than that, individual identity can be subsumed by group membership. The same areas of the brain involved in giving us a sense of personal identity are also important in social interactions - we're hard-wired for group identity. A threat to the group becomes a threat to us. The Stanford Prison Experiment is surely the best known example of how quickly and severely this can happen to perfectly ordinary people. It's not exactly that people can't fight their inner natures, it's that they won't unless they're aware of what's happening.
Groupthink can be extremely powerful in an angry mob, but individuals can also be abhorrent. One of the weirder hypotheses as to why this can happen is that the brain also wants to believe that the world is basically just and fair. The brain is known to like certainty - it wants to be sure it's got things basically right, perhaps because altering all those neural pathways is extremely difficult and might drive us insane.
An extension of this is that it might also be prone to believing that the world is basically logical. Which makes sense : if actions resulted in random consequences, there wouldn't be much point in anything. But this sense of an underlying justice to the world can mean that when we see someone suffering, we automatically assume they must have done something to deserve it. Which is common in many religions - not because that's what religion teaches, but because that's a fundamental part of human nature.
People are also far more likely to blame a victim they strongly identify with. If you see someone of a different age/race/gender get hit by a falling tree, it's much easier to sympathise. But if you see someone of your age, height, build, gender, driving a car just like yours and colliding with a house like the one you live in, you're far more likely to blame that someone for being incompetent or stupid, despite having no evidence of this...
It seems that, despite all the inclinations towards being sociable and friendly, our brain is so concerned with preserving a sense of identity and peace of mind that it makes us willing to screw over anything and anyone that could endanger this. Charming.

The brain produces a mass of contradictions - on the one hand, people of another race are perceived as different so it might not blame them when bad things happen to them, but on the other, they can be seen as so different that it willingly treats them as sub-human.
And yet... while no-one is born racist, it seems that everyone is born with the mental equipment to become racist. Burnett mentions the cross-race effect, where people struggle to remember the faces of those of a different race (or as granny used to say - quite literally - "I don't know how people from China all tell each other apart"). Whether this is an inbuilt bias, possibly related to the egocentric memory, or simply an effect of being immersed in one particular race, is unknown.
Another effect is that although the brain has the egocentric bias, it also has ways of dealing with this to stop it getting out of hand. But this requires thinking, so if someone is forced to respond rapidly, they might come across as a much worse person than they really are. There's even a specific part of the brain which deals with empathy, and if this is disrupted, it's easy to see how people can become jerks.
External stimuli also matter, which brings us back to the brain not being a computer. The bias compensation doesn't work so well if we're experiencing pleasure - we underestimate suffering not because we're fundamentally bad people, but because that's how we're wired.
The more privileged and comfortable someone's life is, the harder it is for them to appreciate the needs and issues of those worse off. But as long as we don't do something stupid like put the most pampered people in charge of running countries, we should be OK.

So there you have it - being rich really can make you a bastard.
I've cherry-picked the topics most interesting to me for this "review", but I've hardly scratched the surface. I haven't told you about the complicated ways the brain deals with ostensibly simple things like checking whether the stomach is full, or how cooking may have had an important role in the development of the human brain itself. Or what happens when things go wrong - how the brain can become unable to distinguish fantasy from reality. Or, fascinatingly, how clinical delusions are a relative state - a religious fanatic claiming to hear the voice of God wouldn't be considered delusional, but "an agnostic trainee accountant from Sunderland" claiming the same thing probably would.
I also haven't mentioned just how damn readable the book is, or how it had me snorting out loud in the bookshop, or how it explains (yes, really) why Gordon Ramsay is so angry. It's very, very funny. Even more importantly, the book makes it very clear when and where there are uncertainties, sometimes huge gaping holes in our knowledge - like what on earth sleep is for, or the nature of intelligence. It doesn't try and brush these under the carpet or hide controversies. But nor have I mentioned any of the technical details about which chemicals cause which reactions in which part of the brain, at least some of which are well understood. Or the wonderful, empathetic chapter on mental health, which would make the world a happier place if more people read it.
Not all of human behaviour is due to these unconscious biases, but they have a huge effect on us. For the most part the brain does an astonishingly successful job of managing a bewildering assortment of external signals. That it can come up with bizarre conclusions isn't as impressive as the fact that it can come up with conclusions at all. So for goodness bloody sake, buy the book, try and understand that "the brain apparently thinks that logic is a precious resource to be used only sparingly" and maybe you'll end up with a bit more empathy for all those angry people on the internet. At the very least, it'll change the way you think about yourself. The only real problem with the book is that it's far too short.