Follow the reluctant adventures in the life of a Welsh astrophysicist sent around the world for some reason, wherein I photograph potatoes and destroy galaxies in the name of science. And don't forget about my website, www.rhysy.net



Thursday, 3 December 2015

The Value Of Nothing

Or, What's The Point Of Knowing Useless Things ?

It's not the greatest movie ever made, but there is at least one valuable scene in Kingdom of Heaven :


We might, perhaps, say the same about knowledge. Not knowledge of any particular topic, but knowledge in general - and more than that, understanding and wisdom.

There are some who claim a monopoly on truth - that their theory trumps that of mainstream science and that conventional research is nothing but a waste. They utterly fail to understand that progress without disagreement, without permitting and encouraging the exploration of the other point of view, is impossible. Worse, they insist that anyone who disagrees with them must be a closed-minded fool, and fail to appreciate the massive irony of the situation since their own theory has produced no tangible consequences whatsoever. That their precious idea might be dismissed by so many people because it is simply wrong never enters their darkest dreams.

I have dealt with such notions at great length here, here, here, here, here and also here. This post is a bit different. I'm not going to look at why people are wrong to assume science is dogmatic. Instead I'm going to look at a particular and very widespread type of dogmatic thinking and its consequences.

There are those who believe that we should concentrate only on researching specific topics : tackling climate change, curing cancer, superconductivity... things with obvious, immediate and above all practical benefits. Who cares about some distant galaxy ? What possible use is knowledge of the mating habits of flamingos ? One of my grandmothers couldn't see the point in spending so much money on the Cassini mission. Even my own mother, who in most other respects I will proudly defend as one of the wisest individuals you could ever hope to meet, doesn't think we should bother with space colonization "until we've solved the problems on this planet".

Well mum, I hope you're reading.


Spin-offs

Technologies which do have direct social benefits are by far the easiest way of selling the value of pure, blue-skies research. Unexpected developments certainly do happen, and the consequences can be literally unimaginable until they actually occur. You're reading this thanks to the internet, partially developed during research into particle physics. Twenty years ago the net was mediocre at best. Today, it's probably impossible to calculate its contribution to the global economy, let alone the societal changes it's brought about. Think about that next time you want to say, "this research can't possibly produce anything useful". You're not psychic, you twerp.

Sometimes, science does not so much improve the economy (though it certainly does that) as transform it out of all recognition. Can you imagine the world without modern medicine, electricity, fertilizers, disinfectants, mass transit, or telecommunications ? It would be, in that famous phrase, nasty, brutish and short.


And yet in part because people fail to think statistically and take the long view, there is a belief that "chemicals" are bad for you, that anything "natural" is more beneficial... a damn stupid idea that takes all of three seconds to disprove.

Snake therapy : because acupuncture is too mainstream.
But I digress. Sorry. Anyway, while it's true that not all major technological breakthroughs have occurred as spin-offs from pure research, many have. Many that have occurred through more deliberate projects have still relied on theory developed through pure research. Even the value of spin-off discoveries from scientific endeavours is impossible to calculate - yes, even from the space program we should apparently ignore until we've solved everything down here.


It is true that for poor countries, developing a space program should hardly be their first priority - just as it might not have been sensible for them to have developed programs of world exploration back in the Middle Ages. But for those who can afford it, the riches up for grabs are almost literally beyond imagining.


Leaving aside the highly dubious issue of whether we can or will ever solve all the social problems here on Earth, space exploration is a means to improving life here on Earth - not an excuse to avoid its difficulties.


No, don't do that research ! Do this research instead !

OK, maybe spin-offs aren't the most efficient possible way of producing those practical technologies. That's very hard to prove either way. We already have - indeed have always had - a mixture of public and private research, sometimes focused on practical goals and sometimes on blue-skies projects. But because of the interplay between theoretical development (which is largely the province of pure research) and practical development (which is often done by companies), it's extremely hard to judge whether things could be made more efficient or not.

The problem with the idea that we should focus only on "important, practical" research is that I have neither the interest nor the ability to solve medical problems. Non-scientists sometimes seem to think that "science" is some generic discipline, and that if you can do one sort of science you can do another. This is like saying that if you can row a boat, you can drive a Formula One racing car - or that you'd want to drive a Formula One racing car.


Medical problems bore me. Chemical reactions are incredibly tedious. Atmospheric physics leaves me cold. And to be honest, there are even huge sections of astronomy that leave me yawning. I'm just not interested in them, and I'm not going to become interested in them just because someone else thinks they're important. They are important, but that doesn't mean I want to research them any more than it means I want to become a pilot, an ambulance driver, a fireman or a policeman or any of the other very important professions available. We all have the right to pursue happiness.

(And while we're at it, how often do you see astronomers saying, "You mid-level accountants should stop being accountants and concentrate on fighting cancer ?")

So if all you're interested in is the economics, you can stop reading. Science's contribution to the economy cannot be overestimated. Prioritising specific areas of research is an incredibly arrogant approach, because you cannot possibly know which area will generate an unpredictable spin-off or discovery that might be important elsewhere. And sometimes those spin-offs, like the internet, are quite literally invaluable.

But while the technological benefits of science are innumerable, there are other reasons for pure research which are harder to sell but, perhaps, even more important. The great Arabic philosopher Al-Biruni summarised it thus :
"The stubborn critic would say : 'What is the benefit of these sciences ?' He does not know the virtue that distinguishes mankind from all the animals : it is knowledge, in general, which is pursued solely by man, and which is pursued for the sake of knowledge itself, because its acquisition is truly delightful, and is unlike the pleasures desirable from other pursuits. For the good cannot be brought forth, and evil cannot be avoided, except by knowledge. What benefit then is more vivid ? What use is more abundant ?"


Humility

If there's one inescapable lesson from astronomy, it's that we are small. No, not small. Pathetic.

Or rather, astronomy should be a humbling and character-building experience. I, for one, can't stand character-building experiences.

Not just in space, but in time. Not only is our pale blue dot nothing more than a speck of dust, but we've inhabited it for only the briefest moment. To think - or rather, to be certain - that we are the pinnacle of creation or that the Universe was created especially and only for us is an act of monumental, breathtaking arrogance. To the Universe, we are nothing.


And yet while our total insignificance in the face of this vast, cool and unsympathetic cosmos demands our humility, and probably also means we shouldn't take life too seriously, it doesn't mean we have to cower in fear or be stupefied into inaction.


So what's the value of astronomy ? Nothing ! Everything. If you don't think humility is an important lesson, then I can't help you.


So should we stop there ? We already know the Universe is unimaginably big and incomprehensibly old, so what's the point of studying the boring details ? Well, we also know that most of the Universe is (most likely) made of a material about which we know almost nothing, we know an unknown force is driving its expansion, there are places in the Universe where time stands still, where mountains can't grow a millimetre high because the gravity is so strong, planets where it rains sulphuric acid and moons that tear themselves inside-out, we suspect that ours is not the only Universe and there might, just might, be connections between other realities, and we have absolutely no clue if we're alone in all this or if there are hordes of other intelligences waiting to be discovered. So, should we stop there ? Or shall we keep going, and find out how deep the rabbit hole goes ?


All we are is a kilo or so of warm blood-soaked goop sitting inside a skull. It is entirely possible that the true nature of the Universe is forever beyond our comprehension. But if we'd given up trying because the magnitude of the problem seemed too vast, we'd still be swinging through the trees.


To slightly modify an earlier quote of mine :
The thing is though, the real universe is full of giraffes, exploding stars, Scarlett Johansson, worlds covered in methane, pandas, stars so dense they slow down time, and cabbage. We can only survive in a minuscule region of space barely five miles thick, on top of a rock hurtling around an almost million-mile-wide ball of plasma, and we think this is normal. Anyone who thinks astronomy is nothing but escapism should have their head shoved into a telescope until they realise just how dreadfully, pitiably small the so-called "real world" is.
Teaching everyone to be humble, to step back and remember that they're just another stupid ape, is a lesson an awful lot of people badly need. But there is perhaps another aspect of knowledge which is even more important. To really understand the value of knowledge, you must first be aware of the price of ignorance. There is, however, one final pragmatic concern to address first.


Won't someone please think of the children !

Specifically, the starving children. How, one might ask, can we afford to spend money on space probes when there are people dying of starvation in the world ? Is our "humility" really worth that much ? It's a legitimate question. I've already hinted at the answer, but just in case, let me spell it out.


First, research has produced the farming techniques which are so essential to modern life : fertilizers, pesticides, harvesting technologies, etc. Second, food production has been growing over time, and yet there are still starving people in the world. That alone proves that the problem isn't one of prioritizing the wrong research or spending money in the wrong areas - starvation is a political problem. Science is responsible for generating greater food production, better medical care, and longer lifespans - bringing those benefits to everyone is the mandate of politicians, not that of researchers. Scientists are in no better a position to effect political change than anyone else.

And again, how many astronomers do you see saying, "You mid-level accountants are just a waste of resources and we should give all your money to starving children" ?


Third, there's the specific charge that money is being "wasted" on frivolous research like geological and space exploration. Yes, in principle, you could stop spending money on rockets and spend it on food instead. But it makes no sense to pick on astronomers or geologists. Neither group is particularly well-paid (except possibly privately-funded geologists who discover oil or valuable minerals). And the cost of the space program, even in America, is utterly dwarfed by other costs :

America spends less than $20 billion on NASA annually, but over $600 billion on defence. What's more sensible - scrapping NASA completely, or "slashing" the defence budget by 3% ? Privately, Americans spend more on tobacco, alcohol, illegal drugs, treatment for all those drugs, gambling, and even pizza than they do on space - and this is by far the largest space program in the world. Anyone who's genuinely concerned with solving social problems would do far better to cut down on junk food, booze and drugs than to scrap the technology-generating space program. That, at a stroke, would simultaneously prevent a lot of health problems and free up money for the needy.


So pure research with little or no obvious practical benefits costs very little, doesn't prevent us from solving more urgent problems, and in fact offers solutions to such problems provided we have the wit to use its unintended breakthroughs correctly. And it teaches us humility, cautions us against arrogance, and warns us that our most deeply-held views could be overturned at any moment. Perhaps most importantly of all, we should remember what happens in a world without such free-thinking inquiry.


The Demon-Haunted World

There are some mistaken beliefs about the medieval world that refuse to die - they never believed the Earth was flat, for example. They did think it was the centre of the Universe, but not because they believed it was the most important place. Actually it was partly tradition and partly a curious way of emphasising human mediocrity : close neither to heaven nor to hell, or as Terry Pratchett put it, "the place where the falling angel meets the rising ape."

Yet they did have some truly weird ideas. They didn't believe in parabolic trajectories : they thought that arrows kept going in a straight line until they ran out of energy. They thought the equator was uncrossable because of a ring of fire around the world. And they also believed in evil spirits, witches, and demons.



Science is the means by which we exorcise this demon-haunted world, as Carl Sagan so eloquently put it. We have no fear of crossing the equator any more, or that lonely old women might turn us into frogs or eat our children. We aren't afraid of fairies or goblins or werewolves or dragons. We don't throw human faeces into the street and we don't try and bleed people when they're unwell, or drill holes in their heads, or give them something to bite on while we saw their legs off. And we don't enslave people on the grounds that they are sub-human.

It's a shame about the dragons though.
That is the price of ignorance, and it's a high one : a world of magic and miracles, rituals and superstition. When knowledge is not required, there is no check on stupidity. Even in modern times there is no limit on how stupid even very intelligent people can become - but the safeguard is that we no longer have a handful of intellectuals trying to solve problems, we've got thousands.

When you insist that something isn't worth knowing, you allow truth to be replaced with doctrine. "No, of course the other side of the world isn't inhabited, how could anyone have crossed the ring of fire at the equator ?" You don't know what the world is really like unless you check. Many ancient Greek philosophers rejected the idea of atoms because the space between them would be a vacuum, which "obviously" couldn't exist. The truth is that there is absolutely no obligation for the Universe to be what a blood-soaked lump of goop thinks it should be.
You could say to the Universe, this is not fair. And the Universe would say, "Oh, isn't it ? Sorry." - Terry Pratchett, Interesting Times
If you think some piece of research is "obviously" wrong or useless without having studied it, you are a moron. You are here at the behest of the Universe, not the other way around. You do not know the truth just because your intuition tells you so. You have the right to your opinion, but fortunately for the rest of us your opinion doesn't count for turnips unless you produce actual evidence.


Even when knowledge doesn't produce direct, tangible results, it can still bring great benefits. Knowing that earthquakes are natural phenomena hasn't helped us predict them yet, but we no longer fear them as the will of the gods. We aren't sacrificing people any more whenever catastrophes threaten. Sometimes, knowing what doesn't work is at least as valuable as knowing what does.

This, then, is the value of knowledge for knowledge's sake : preventing ignorance and all its terrible consequences. Knowledge isn't an impenetrable shield against fear, because humans are irrational, but it is a pretty effective one nonetheless - and a damn sight better than the alternative.


Clarity Without Certainty

"Knowledge" is a loaded term. Actually what it's really all about is not knowing facts, but thinking. Simply thinking. "I think this is true" rather than, "I know this is true". Knowing facts is only the beginning - being able to derive Maxwell's equations in your sleep won't help you one bit to be compassionate. But it is a vital first step. Facts lead to models about how the Universe works, allowing you to make predictions and therefore the really interesting part : choices.

I recently attempted to explain to someone that science doesn't claim a monopoly on truth, that it simply tries to get the best information available at the current time. They were quite unable to understand that science does not - almost all of the time - deal in certainties. It only deals with the most likely explanations given the available evidence. Emphasis on "most". Science can't tell you which decision is correct, but it can help you (and only help you) make the best decision possible at the time.

Certainty can even be dangerous. Frank Herbert once wrote that, "Knowledge is the most perfect barrier against learning." Actually, it's certain knowledge. What science provides is not unchangeable knowledge, it's information. Without evidence to disprove it, it's as unyielding as stone. With firm evidence against it, it melts away like... umm... butter ? Yeah. Slowly, and it's quite sticky, and some people would rather it didn't go at all, but ultimately it's gone.

There have been many martyrs who believed in the certainty of their cause. One of my long-standing heroes is Socrates, the philosopher who died not for knowledge but for the right to think, to question, to admit ignorance where it exists and not proclaim false certainty, in short - to doubt. Doubt, in moderation, is not a weakness - it is the most potent weapon available in the arsenal of human intelligence. It is the force that has taken us from an era in which we were certain that we should bleed people who were half-dead into one in which the only certainty is that we can create better and better treatments. It is the means by which we improve ourselves and the world around us.

And so I ask again : what is the value of learning useless information ? What worth is there in learning for the sake of learning ?

Wednesday, 2 December 2015

Because I Said So


No-one is immune to being irrational. From time to time, everyone - without exception - commits logical errors. Some of these are trickier to understand than others, and one of them made little real sense to me until yesterday. In the hope that this will help anyone else who's confused about it, here's my short interpretation of the "appeal to authority" fallacy.

Asking for an expert opinion is not a fallacy in and of itself, but there are several ways in which it can become a fallacy :

1) They're not really an expert
Asking an expert for their opinion about something well outside their specialist field (sciolism). Stephen Hawking isn't especially qualified to talk about global warming, Richard Dawkins doesn't know jack about theology, and generally speaking engineers know nothing about flower arranging. You can certainly feel free to ask their opinion anyway, but there's not really any good reason to give Hawking's opinion on genetic engineering any more weight than that of Boris Johnson. It's an easy mistake to make, and possibly the most common form of the fallacy.

2) They're not an authority
If you want to build an aircraft, you hire an engineer, not a florist. When someone has a proven track record of successfully building aircraft, it makes sense to consider them to be both an expert and an authority on their subject. Engineers might not be correct 100% of the time, but they are vastly more reliable in their knowledge of what works and what doesn't than a team of a thousand florists.

But there's a difference between expertise and authority, especially when it comes to active research. Aircraft design is an incredibly well-tested and proven field. Meteorologists, on the other hand, make mistakes about tomorrow's weather all the time - there are no authority figures in meteorology. Of course, there are experts, and their opinions still count for more than those of pigeon fanciers. It's just that you can't use the opinion of a meteorologist to be certain of tomorrow's weather in the same way you can use an engineer's opinion as to whether a plane will be able to fly. An engineer can give you something close enough to certainty, a meteorologist deals in probabilities.

3) Because I said so
This seems to be the way the fallacy is most often stated, but also the most badly explained. If you ask an expert for an opinion and they say, "well, I'm an expert, so I must be right", then they have committed a serious error. Their expertise doesn't by itself guarantee that they are correct. What their expertise should be able to do is allow them to rigorously justify their assertion - it does not entitle them to make sweeping claims based solely on their expert status and have everyone else kowtow to their infinite wisdom.

EDIT : Long after I originally posted this, I encountered a curious phenomenon which may explain why people make the above error. It seems they sometimes confuse being asked (in good faith) for a demonstration of their expertise with having their expertise questioned. Of course no-one is obligated to participate in all debates, but when they choose to do so they should be careful to work out whether someone is just trying to learn something or is using the question as a veil for an attack.

Sometimes the experts themselves do this, worst of all when it's from one expert to another. Sometimes it's laymen looking to justify their own argument. "Well, this astronomer told me that the Moon isn't made of cheese, so it can't be." If the explanation stops there, then technically this is a fallacy. It amounts to saying, "I know this isn't true, so it can't be true" (or even more succinctly, "I must be right") which is obviously circular and doesn't even admit the possibility of error.

If, on the other hand, the explanation continues : "They said they measured the spectrum of the Moon and found it was inconsistent with that of any known cheese and actually is pretty similar to rocks, and I think they mentioned something about some sample return mission thingy..." then no fallacy has been committed. The astronomer has used their expert knowledge to rule out the Lunar Cheese Hypothesis, not relied on their status as an expert to quash the discussion.


The Appeal To Stupidity


At the other extreme, I also see people claiming the exact opposite : "they're an expert, so they must be wrong." Well, almost. The "I'm not a scientist, but..." defence is the idea that non-expert opinion can trump that of an expert in their own specialist field. Reality check time : unless that expert is exceptionally biased or has made a terrible mistake, it can't. True, experts might be wrong, and make mistakes just like anyone else. But though not infallible, they are far more likely to be right than any non-expert. That is, after all, their job. It's why we have them. There are cases where it is absurd to give equal weight to non-expert opinions.

So I rather disagree with the above meme, provisionally. There are some areas in which not being an expert does automatically make your point invalid. So you think you can build a rocket, and your qualification is, what... a degree specialising in Mesopotamian art ? You literally can't do basic arithmetic ? Then nope, sorry, your opinion is invalid. Of course there are areas in which intelligent non-experts can make valuable contributions and sometimes outdo the specialists, but sometimes it really is appropriate to dismiss non-expert opinions. It depends heavily on the details of the situation.


Summary

The "appeal to authority" argument, then, is somewhat subtle. When a subject is so well-established that there are figures who can be considered authorities, it's not a fallacy... as long as they're speaking about their specialist area. If the subject is controversial, then you should still give more weight to expert opinion, but you can't rely on one expert to be correct. This is especially true when the expert refuses to explain their reasoning. Being an expert is supposed to enable you to reason, not entitle you to pronounce judgement.

The "appeal to the consensus" argument is only a fallacy when experts do it in their own discipline. This creates a false consensus : it must be right because everyone else agrees with it, whereas a true consensus should be everyone coming to the same conclusion independently. Assuming the consensus has been arrived at properly, it's not a fallacy for non-experts to place more weight on the expert consensus than on the opinions of non-experts - but it is usually wrong to assume it's 100% certain.

The "appeal to stupidity" argument is however very much worse than appeals to authority or the consensus. That experts are fallible in no way means that non-experts are somehow magically able to skip years of hard work. This doesn't mean that experts can use the "because I said so" fallacy to avoid discussing why they've come to their conclusions. Nor does it mean that when they take the time to explain the details to an intelligent non-expert, they can dismiss their opinion out of hand. It only means that if it's a choice between the opinion of an expert and an uninformed non-expert, choosing the latter really is an appeal to stupidity.

Friday, 27 November 2015

Sense and Sensible Statistics

Recently I wrote about the value of the humanities classes in teaching rational thinking. Understanding why something evokes an emotional response is critical to understanding the intent of an article, i.e. how the author is trying to manipulate you. And once you have that, you can begin to step outside the emotions and think logically.

But that's only the beginning. While some of the humanities courses in schools are very good, what we really seem to be lacking almost entirely is a good course on statistics for under-16s. Because if you take a statistical view of the world, it's really not as bad as you might think.

Of course I don't mean that we need more emphasis on teaching mathematical techniques - yes, it's important to understand the mean and the median and why the median is usually the better measure, but those are hardly the most important aspects. What I'm on about are the much less mathematical, more philosophical aspects of statistical measurements. You don't need any maths to understand them, but I see a great many people who just don't seem to get them at all. Yet while they're really a lot simpler to explain than the mathematical aspects, they're probably even more important. So, in no particular order :
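(A quick aside before the list, for anyone who wants the mean/median point made concrete. The salary figures below are entirely made up for illustration :)

```python
# Hypothetical salaries, in thousands of pounds : nine ordinary
# people and one millionaire outlier.
salaries = [20, 22, 23, 25, 25, 26, 28, 30, 31, 1000]

# The mean adds everything up and divides by the count.
mean = sum(salaries) / len(salaries)

# The median is the middle value once the data are sorted
# (the average of the two middle values for an even-sized sample).
ordered = sorted(salaries)
n = len(ordered)
if n % 2 == 1:
    median = ordered[n // 2]
else:
    median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2

print(mean)    # 123.0 - wildly inflated by the single outlier
print(median)  # 25.5 - a far better summary of a "typical" salary
```

One millionaire drags the mean far above anything a typical person earns, while the median barely notices - which is why the median is usually the better measure for skewed quantities like income.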


Anecdotes Are Not Evidence

I just said, "I see a great many people". But what does that mean ? How many people ? Where ? Did I actually talk to them in detail or just form a snap judgement based on one short quote ? You see, by itself, my observation that I personally have witnessed some number of people not understanding statistical methods proves precisely nothing about how many people overall really do not understand statistical methods. Without more details (which I'll get on to in a minute) there could be any number of reasons why my casual observation is meaningless.

But anecdotes aren't unrelated to evidence. For statistics they're a sort of base unit of evidence. As long as the witness isn't lying or delusional, they do prove individual things happen. The problem is that my statement, "I see lots of people who..." strongly implies that I think there's a majority of people who behave in a certain way. And I might think that. I might very well think that. But as to whether it's really true...

Ian Richardson is infinitely more charismatic than Kevin Spacey and anyone who disagrees is objectively wrong.
All my observation can tell us is that some number of people don't understand statistical methods. Sometimes that's good enough : "I saw that man attacking that adorable kitten, officer !". You can use anecdotes to refute sweeping, overstated generalizations ("All men attack kittens.", "Ah, but I knew a man who never attacked a kitten at all !"), but not to support generalizations ("It must be true, I've seen lots of men attacking kittens"). As an estimate of what fraction of men really attack adorable kittens, it's useless. It's extremely difficult to overcome a personal bias when we witness something happening first-hand for ourselves, especially if we see it repeatedly. Which brings us neatly on to...


Selection Effects

It's almost impossible to get everyone's opinion on any topic or analyse everything in any sample, which is why we need statistics in the first place. But questioning as many people as you can could be utterly useless if you question only specific people. If you're trying to find out how many people enjoy reading, you don't go and only ask people visiting the library, because that's just plain silly.

Of course, you can't ignore the evidence of your own eyes. You see youths being aggressive day in, day out, and it's easy to conclude that young people are aggressive. And it's even true in your experience. But if you're walking the same route each day and see the same youths, you've limited your sample. Or you might be going through a park where the local ruffians choose to congregate, so you're only seeing the dregs and not the far greater numbers of young people who are busy in school.

It's very difficult to eliminate all selection effects when collecting data. But if you don't try to do this at all, you'll end up with a very warped view of the world. Which is probably why people seem to think that most immigrants in Britain are Polish plumbers, even though they're actually highly skilled twenty-somethings.


Think Of The Big Picture

Statistical thinking cautions us to remember that while what we observe is always true, it isn't necessarily representative of what's going on everywhere. Maybe most teenagers in parks are hooligans, but most teenagers overall are just lovely. The point is that you have to be very careful about generalizing from specific observations. The more data you've got, and the wider variety of sources it comes from, the better.

For example, the media often focus on stories about individuals. Being basically empathetic creatures, we react strongly to emotional, personal stories. The trouble is that it doesn't matter how many "violent immigrant" stories you report, all you're doing is picking out anecdotes without reporting the full story. It's a bit like only reporting plane crashes - obviously, there's no story when a plane doesn't crash, because people aren't interested. But we all know planes are safe enough, we accept the small risk that comes with flying because we're at least broadly aware of how many planes don't crash.

It's harder to escape the emotional impact of a violent, personal attack. Our pattern recognition skills tell us, "That dangerous person is in some way different, therefore anyone who shares that difference might also be dangerous." But unless you also consider the number of such people who aren't dangerous, you are not thinking statistically. You also have to consider how people in other groups behave, otherwise you have nothing to compare with. For instance, 98% of all terrorism in the Western hemisphere is carried out by non-Muslims - which is hardly the view one gets from the media*. People like fear because it saves them from the more difficult task of actually thinking. That sells newspapers, but it doesn't tell you what's really going on.

* Strictly speaking if you want to find out if Muslims are more violent, you should look at how many violent acts occur per capita by Muslims compared to all other demographics.

It also puts into perspective how some awful crimes just really aren't that much of a problem. Statistically, terrorism causes about as many annual fatalities as pregnancy, and no-one is talking about a "war on babies".

Underlying Causes

Your local observation of gangs of young ruffians consistently appearing in a local park tells you that you should be wary of those ruffians in that park. It does not, by itself, tell you that you should be wary of all parks or all youngsters. What if that park is in a city with a very high crime rate anyway ? It could be that you're seeing kids in a park because kids hang out in parks; that they're also violent could be related to the fact that the whole area has a high rate of violence in all age groups. There's nothing wrong with your specific knowledge of the area, but you're jumping the gun to assume that all parks (or all youths) are the same everywhere.

It's easy to see why we think this way : we have monkey minds in a modern world. It makes sense to run away from all tigers, because all tigers really are dangerous. The trouble is that we try to apply this thinking to far more complex, modern situations, and it's failing miserably. Instead of making us safer, it's making things more dangerous - our unfounded fears about certain groups cause us to hate them, which causes them to hate us, and the cycle of hate and violence can be difficult to break.

But even when your specific observation is borne out in more general trends, that doesn't necessarily mean anything either. Even if you did see that all kids in parks were violent, it would be silly to conclude that parks make them violent. Similarly it's plainly ridiculous to say that being tall and skinny is a sign of intelligence. Doing intelligent things, like completing a degree in mathematics, is a sign of intelligence - that you're a sexy partygoer is completely and utterly irrelevant, and really quite insulting.

The excellent Spurious Correlations website is full of examples of this, although my favourite has to be this one :


Does eating more chocolate increase your chances of winning a Nobel prize ? Probably not. First, we could turn it around. It doesn't make a lot of sense to say that a few academics winning Nobel prizes causes the whole populace to eat more chocolate, so you can't simply assume the reverse is true either. There could be any number of common reasons why both chocolate consumption and Nobel prize wins increase simultaneously. In poor countries the population is starved of all foods, so people aren't healthy and have little time to spend on science, while the reverse is true in richer countries. There's also a very strong selection effect at work here : why only look at chocolate ? What's the correlation like with other foods ?

Determining what the underlying cause really is is difficult. Ideally you perform an analysis where you see how one variable correlates with lots of other variables, not just one. If you find a correlation and there's a physical mechanism to cause it and no other variable seems to correlate as well, then maybe you've found something interesting.
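As a rough sketch of that kind of check, here's some Python using invented data in which national wealth is the hidden common cause : both chocolate consumption and Nobel wins track wealth, so they correlate strongly with each other, while an unrelated food (cheese, here) doesn't.

```python
import random
import statistics

random.seed(1)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented data: wealth is the hidden common cause. Chocolate
# consumption and Nobel wins both rise with wealth, plus noise.
wealth    = [random.gauss(50, 15) for _ in range(200)]
chocolate = [w * 0.10 + random.gauss(0, 1.0) for w in wealth]
nobels    = [w * 0.05 + random.gauss(0, 0.5) for w in wealth]
cheese    = [random.gauss(10, 3) for _ in wealth]  # unrelated food

print(pearson(chocolate, nobels))  # strong: shared cause (wealth)
print(pearson(cheese, nobels))     # weak: no shared cause
print(pearson(wealth, nobels))     # strongest: the real driver
```

The chocolate-Nobel correlation is real, but the wealth-Nobel correlation is stronger - exactly the pattern you'd expect if wealth, not chocolate, is doing the work.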


Ask ALL The Questions !

If a study is focused on a very narrow area, you might think that you can get away with asking very short, simple questions. Not necessarily. You might be introducing a selection effect and miss something very important that's going on. If you're monitoring library usage and find that it's dropping, you don't just ask people whether the chairs are uncomfortable. Ideally you want as much data as possible, so that you can weigh the causes you consider likely and unlikely on an equal footing.

Then there's the hugely complicated topic of asking the right questions in the right way. I'm not going to go into this one save to mention this brilliant Yes Minister scene which shows why it's so important :


You've also got to ask the same questions, even if they're not ideal. If you ask people, "Do you like pork pies ?" in Hull and, "Are pork pies your favourite food ?" in Doncaster, you will inevitably get different results. This is an even bigger problem when it comes to international studies, since different countries don't always gather public opinion in the same way. For example, there was recently a claim that America is a less violent place than Britain, which is revealed as pure nonsense when you realise that the two countries have very different definitions of violent crime.



Outliers Are More Noticeable

Selection effects are constantly at work in human memory. We only notice events, we don't notice non-events. A plane that doesn't crash isn't memorable. An immigrant who never breaks the law doesn't stand out. Negative outliers are perhaps even more memorable, because it's safer to remember danger than it is to remember the examples of success. The thing to remember with media stories is that in general, stories only make the news because they're unusual. For that reason, be extremely wary of judging whether anything the media is reporting is typical of what's usually going on. And also be acutely aware that because of this, the media are feeding you a series of unusual events, which will inevitably bias your memory and impressions of what usually happens.


Oddly enough, while it's always possible to point to events in which people were killed, it's not always possible to say when lives are saved - at least not specific, personal examples. It's easy to say when someone dies of heart disease. It's impossible to point to individuals who never get heart disease in the first place because, say, of changes in food regulations or campaigns for healthy eating.

One of my schoolteachers taught me a classic example of what happened when the British government decided to stop moving the clocks back an hour in winter. Campaigners said this would prevent unnecessary deaths in the evening (i.e. schoolchildren walking home in the dark getting hit by cars). And it did. There was also an increase in the number of deaths in the morning, but it was less than the decrease in the evening. So statistically, lives were saved.

But was the media full of stories of children who were, inexplicably, not dead ? No, because you have no idea who was saved by this, but it's very easy to find examples of children killed in the morning, when it was now darker. Of course, you really have no idea who exactly was killed as a result of this either - they might have been run over anyway. Really it makes no more sense to interview the parents of one dead child and say, "this is an example of this law killing children" than it does to interview the parents of one child who's not dead and say, "this is an example of the law saving lives".

This kind of statistical thinking can seem cold, even cruel and inhuman. In situations like this it's important to remember that we're dealing with probabilities and risk, not individuals. You might think it's a choice between saying, "I want to kill lots more children in the evening and a few fewer in the morning" and "I want to kill far fewer children in the evening and a few more in the morning", so that basically it boils down to how much killing you want to do - you do not have the luxury of a good choice here.


It's true, but of course altering risk is not the same as either lining up people for a firing squad or rescuing them from a hungry shark. You'll never know who was saved and who was not - you have to go on the numbers, because that's all you've got. You can't avoid taking risks. You can only control which risks you take.


Significance

Unlikely events still happen by chance. If something has a 1 in 10,000 chance of occurring, you can expect it to occur if the requisite scenario actually does happen 10,000 times. So if you get 10,000 people to each flip a coin ten times, around ten of them will get heads ten times in a row (the chance per person is 1 in 1,024) - not because any of them are psychic, but simply because so many people tried.
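You can check this with a quick simulation - nothing here but a fair coin and the 10,000 people from the example :

```python
import random

random.seed(0)

flips_per_person = 10
people = 10_000

# Each "person" flips a fair coin ten times; count how many of them
# get ten heads in a row purely by chance.
all_heads = sum(
    all(random.random() < 0.5 for _ in range(flips_per_person))
    for _ in range(people)
)
print(all_heads)  # expect somewhere around 10_000 / 2**10, i.e. ~10
```

Run it a few times with different seeds and you'll reliably get a handful of "psychics" - which is to say, none at all.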

Measuring how statistically significant an event is - roughly, how unlikely it is to have occurred by chance alone - can be mathematically complicated. It also relies on accounting for those all-important selection effects. Be especially wary of the phenomenon of small number statistics. The smaller your sample, the greater the chance that it will appear to show a trend where actually none exists. For example, if you ask a million people if they like cheese and 600,000 say yes, the result is far more decisive than if you poll ten people and nine of them say yes.
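One standard back-of-the-envelope way to quantify this (a generic textbook formula, not anything specific to these made-up polls) is the standard error of an estimated proportion, which shrinks as the sample grows :

```python
def proportion_standard_error(successes, n):
    """Standard error of an estimated proportion p = successes / n."""
    p = successes / n
    return (p * (1 - p) / n) ** 0.5

# The million-person poll: 600,000 of 1,000,000 like cheese.
big_p = 600_000 / 1_000_000
big_se = proportion_standard_error(600_000, 1_000_000)

# The ten-person poll: 9 of 10 like cheese.
small_p = 9 / 10
small_se = proportion_standard_error(9, 10)

# A rough 95% margin of error is about two standard errors.
print(f"big poll:   {big_p:.2f} +/- {2 * big_se:.4f}")
print(f"small poll: {small_p:.2f} +/- {2 * small_se:.4f}")
```

The big poll pins the answer down to a fraction of a percent; the small one has a margin of error of nearly twenty percentage points, so "nine out of ten" tells you very little.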

The point is that you should be cautious when an unlikely event happens, because it might have happened by chance anyway. Climate is a good example. You can never attribute an individual very hot summer to global warming, because it might have happened anyway. A run of three hot summers is also difficult, because the Earth has been around an awfully long time and has probably had umpteen bouts of three hot summers in a row. As for the point at which you can start to say, "this is significant" - for that you'll need to call in a climate expert.





Conclusions


So can these few simple lessons really make you happier ? Possibly. They certainly teach caution : just because you feel something is true from your own experience doesn't mean that it really is. To be more accurate, personal experience is not a good guide to general trends. Without having experienced other situations, you don't know what selection effects are at work. So anecdotes can tell you something about individual situations, but they're nearly useless when it comes to the big picture. Even your general impressions about what's going on in the world at large are coloured by a media for whom selling emotional but statistically irrelevant stories is de rigueur.


I shall return to the "generation of spoiled idiots" in a moment, but the main point is that we don't often stop to think about how much worse things could be. This is what I call the Grandparent Paradox (nothing to do with the time-travelling Grandfather Paradox). Grandparents are old, experienced people, and experience is valuable. Yet despite having survived World War II, my grandmother was convinced the world was getting worse. Yeah, really, worse than say, being bombed by Nazi Germany, or having to live in a world where women couldn't do equal work for equal pay, homosexuality was illegal, and being an abject racist was just normal.

 "More bad news", she'd say, having just read the latest edition of the Daily "we supported Hitler" Mail. As though the assassination of the Tazenikistani Royal Family (or some such) was somehow impinging on her own need to turn up the heating to a level able to roast a turkey, set the telly on full volume and go off on racist rants for no apparent reason (with hindsight the Daily Mail probably had a lot to do with that).

Statistical thinking means you don't lose your head because something awful happened. Awful things happen all the time, and they probably always will. What they don't indicate is that things are getting worse - you have to look at numbers over time for that. "I see this happening more and more" means precisely diddly-squat without numbers.

Over the last century we've gone from a situation in which racism was normal to, if far from gone, then at least being hated by the majority of people. We're living longer. Our standard of living is immeasurably higher. Women and minorities have equal rights, if not yet equal treatment. And yes, progress hasn't been linear, but inferring that the apocalypse is round the corner because something bad just happened is just dumb.

The other notable quote my gran used to say was that my generation couldn't possibly have fought the Nazis. How she came to this conclusion was anyone's guess, never mind the ridiculous notion about how people "were nicer in those days." It's not my generation that put up signs in pubs saying, "No blacks or Irish". Similarly, the idea that we're a generation of "spoiled idiots" does have a base emotional appeal, but is it really true ?



That's probably the internet meme I hate the most, because every single damn thing about it is wrong. First, I've never seen this in any school playground*. Cardiff, Arecibo, Prague... nope, not happening. Show me the evidence that this does happen and I'll believe you. Second, what's wrong with kids playing with smartphones and not playing sport ? I freakin' HATED sport ! I'd have wounded small puppies for the chance to get out of  P.E. lessons and play with a handheld computer. Why the hell would it have been better for me to suffer ? Kids having access to more advanced technology is a good thing. That's how technological progress is supposed to work.

* Of course, all my anecdote proves is that kids don't avoid sports in every school.

Our continual raising of our own expectations is a mixed blessing. We're continually driven for self-improvement : racism isn't considered normal any more, and homophobia is heading the same way. The downside is that whenever things do get worse we tend to forget how far we've come overall, and because we continually shift the goalposts we're never happy.

Grandparents, in my experience, also seem to have a view that the modern world is a more dangerous place. "Oh we were allowed to wander off by ourselves when I was six, and we could eat dog poop if we wanted and it never did me any harm", they say. Sure you were. That doesn't mean the world or dog poop was safer (or that medical advice was wrong*) - it could just mean that your parents were stupider, or less well-informed, or didn't love you as much, or just weren't paying attention, or in fact were better at parenting because they let you have more fun. The idea that the world has become more dangerous is only one possible explanation, and just because you see a lot more media reports about murderers doesn't mean there are more murderers around.

* Yeah, you may have gotten away with not washing your hands once in a while. Cleanliness doesn't guarantee survival any more than dirtiness guarantees death - all you're doing is altering the risk. Some changes can't be seen on an individual level - they require a much larger statistical view. There's also a selection effect here : of course it did you no harm, because those people who it did harm are dead and can't complain about it.


The main lesson of statistics isn't that you should jump around in some kind of orgasmic ecstasy about how great the world is, because large aspects of the world are pretty darn crappy. But it reminds us that while we don't live in a Utopia, we do live in something that's infinitely better than many alternatives. We've escaped warlords and petty kings and absolute monarchies and emperors, serfdom and slavery and the workhouse. True, the world isn't what it once was - the only constant is change. But in so many ways, that change has been for the better. I think if we grew up learning about these basic statistical effects, we'd all be a lot more rational. And maybe - just maybe - we'd all be a little bit happier too.