How to detect bullshit
People lie for three reasons; the first is to protect themselves. They may wish to protect something they want or need or a concept they cherish, or to prevent something they fear, like confrontation. A clear psychological need often motivates a lie.
A well-known fib, “the dog ate my homework”, fits the BS model. In the desperate, fear-driven attempt not to be caught, children’s imaginations conceive amazing improbabilities. Fires, plagues, revolutions, curses, illnesses and absurd reinventions of the laws of physics and space-time have all been summoned by children around the world on the fateful mornings when they find themselves at school, sans homework. It’s an emotional experience, this need to BS: logically speaking, the stress of inventing and maintaining a lie is rarely easier than accepting the consequences of the truth.
The second reason people lie: sometimes it works.
The third reason people lie is a truth saints and sinners have known for ages: we want to be seen as better than we see ourselves.
The first rule of BS is to expect it. Fire detectors are designed to expect a fire at any moment: they’re not optimists. They fixate on the possibility of fires, and that’s why they save lives. If you want to detect BS you have to swallow some cynicism and add some internal doubt to everything you hear. Socrates, the father of western wisdom, based his philosophy on the recognition, and expectation, of ignorance. It’s far more dangerous to assume people know what they’re talking about than it is to assume they don’t and let them prove you wrong. Be like Socrates: assume people are unaware of their own ignorance (yourself included) and politely, warmly, probe to sort out the difference.
The first detection tool is a question: How do you know what you know?
Throw this question down when someone force-feeds you an idea, an argument, or a reference to a study, or over-confidently suggests a course of action. People so rarely have their claims challenged that asking someone to explain how they know sheds light on whatever ignorance they’re hiding. It instantly diminishes the force of a BS-driven opinion.
The second tool is also a question: What is the counter-argument? Anyone who has seriously considered an issue will have seen enough facts to fit their current argument as well as alternative positions: ask for them.
A good thought holds together. Its solid conceptual mass maintains its shape no matter how much you poke, probe, test and examine. But BS is all surface. Like a magician’s bouquet of flowers, it’s pretty as it flashes past your eyes, but its absence of integrity becomes obvious when you hold it in your hands. Anyone creating BS knows this, and will tend towards urgency. They’ll resist reviews, breaks, consultations or the suggestion of sleeping on decisions before they’re made.
Use time & pressure, the third tool of BS detection, in your favor: never allow big decisions to be mismanaged to the point where they must be made urgently. Ask to withhold judgment for a day, and watch the response.
Especially in business and technology, jargon and obfuscation hide huge quantities of BS. Inflated language is a technique of intimidation. The bet is that if you don’t understand what they’re talking about, you’ll feel stupid, or distracted, and give in to the appearance of their superior knowledge. This is, of course, entirely bullshit. To withstand BS you have to have an inner core of self-reliance, holding on to your doubts longer than the BS’er holds onto their charade.
For example: Our dynamic flow capacity matrix has unprecedented downtime resistance protocols.
If you don’t understand what the hell this means, err on your own side. Don’t assume you’re missing something: assume they are. They’re either hiding something, communicating poorly, or don’t themselves understand what they’re talking about. BS deflating responses include:
I refuse to accept this proposal until I, or someone I trust, fully understands it.
Explain this in simpler terms I can understand (repeat if necessary).
Break this into pieces you can verify, prove, compare, or demonstrate for me.
Are you trying to say “our network server has a backup power supply”? If so, can you speak plainly next time?
The fourth tool of BS detection (derived from the rule of expecting BS) is careful assignment of your trust. Never agree to more than your trust allows. Who cares how confident they are: the question is how confident are you in them? It’s rare that there isn’t time for trust to be earned. Divide requests, projects or commitments into pieces. It’s not offensive to refuse to take someone’s word if they have no history of living up to it before (especially if they’re trying to sell you something).
And trust can be delegated. I don’t need to trust you, if you’ve earned the trust of people I trust. Anyone skilled in the BS arts has obtained that skill through practice, diminishing the odds that many BS-proof people have been successfully deceived by them in the past. Nothing defuses BS faster than a collective of people that help each other detect and eliminate BS. If a team of people witnesses the complete evisceration of someone’s BS few will attempt it again: they’ll know your world is a BS free zone. Great teams and families help each other detect bullshit, both in others and themselves, as sometimes the real BS we need to fear is our own.
How to Tell What's Bullshit and What's True - Part 1
I’ve been shocked lately to notice how many of my friends (who are otherwise very intelligent) believe things which are totally false.
There are hundreds of ... myths floating around in conventional wisdom and email inboxes, and I think it’s important to understand how to be a rational human being in today’s world.
How do you decide what’s true and what’s not?
The first lesson here is to realize that just because something intuitively SEEMS true, does not mean you should believe it. Our brains have the remarkable ability to learn from the world around us, but they can also play tricks on us.
For example, it’s very INTUITIVE to our brains that if you dropped a 100lb. ball and a 10lb. ball at the same time, the heavier one would hit the ground first. But thanks to Galileo, we now know that to be false.
Similarly, it’s very INTUITIVE for our brains to think of the earth as the center of the solar system with the sun revolving around it. It “just looks” that way because the sun comes up in the morning and goes across the sky. Copernicus eventually convinced the world the sun was at the center of the solar system.
By the way, both of these ideas were met with ridicule (and sometimes worse) when they came out. Why? It upset people because it just didn’t SEEM right. It broke their view of the world, and people get upset when you challenge their closely held beliefs.
My point here is simple: just because something SEEMS true isn’t a good reason to believe it.
Some Common Mistakes
1. Using Anecdotal Evidence This is the most common one I see. It basically means using a personal experience or observation to draw a much larger conclusion. For example, “my grandfather smoked a pack a day and lived to be 95, so smoking isn’t dangerous!”. This isn’t evidence at all, because it’s not a controlled study with a large enough sample size to support any reasonable conclusion.
To prove this to yourself, take something you know to be true. For example: eating tons of unhealthy food makes you fat. Now ask yourself if there is any specific case where this is not true. Well, yes, there is probably SOME lucky person out there who eats a ton of food and isn’t fat. Now if you knew nothing about food and had heard this one story (anecdote), it could be used (incorrectly) as evidence to suggest that eating lots of food does NOT make you fat.
Sure, in this case you would know better, but what about in more complex subjects like medicine or health? Hopefully you can see the danger of anecdotal evidence.
Opinions, observations, and stories are no substitute for a carefully controlled experiment (more on this later).
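A back-of-the-envelope calculation shows why a single counter-example is statistically worthless. All the numbers below are invented for illustration, not real epidemiology: even if smoking sharply cut the odds of reaching 95, the sheer number of smokers would still guarantee a huge supply of 95-year-old smokers.

```python
# Hypothetical rates, purely for illustration -- not real epidemiology.
smokers = 20_000_000            # assumed number of lifelong smokers
p_reach_95_smoker = 0.02        # assumed chance a smoker reaches age 95
p_reach_95_nonsmoker = 0.06     # assumed chance a non-smoker does

# In this toy model a non-smoker is three times as likely to reach 95,
# yet "healthy smoker" anecdotes still abound:
expected_anecdotes = int(smokers * p_reach_95_smoker)
print(expected_anecdotes)       # 400000 grandfathers who "prove" smoking is safe
```

Even with a threefold difference in survival odds, chance alone produces an endless supply of grandfathers who “prove” smoking is harmless; only comparing rates across whole groups reveals the real effect.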
2. Assuming Correlation Proves Causation People also make the mistake of assuming that because A happened, and then B happened, A must have CAUSED B. This is also known as the post hoc fallacy.
Example: Kids are listening to rap music now (playing more video games, smoking pot, etc - take your pick) and violence is up, so obviously the rap music caused it. Well, maybe it did and maybe it didn’t. But merely noticing that when one went up the other went up proves nothing on its own. Maybe B caused A: the violence caused the rap music. Or maybe they aren’t related at all. Stating a correlation proves nothing about causation.
As stated on Skeptic.com: “You have a cold, so you drink fluids and two weeks later your cold goes away. You have a headache so you stand on your head and six hours later your headache goes away. You put acne medication on a pimple and three weeks later the pimple goes away. You perform some task exceptionally well after forgetting to bathe, so the next time you have to perform the same task you don’t bathe. A solar eclipse occurs so you beat your drums to make the gods spit back the sun. The sun returns, proving to you the efficacy of your action.”
Only a controlled study can show causation; correlation on its own cannot.
3. Thinking It’s True Because It Can’t Be Proved False This is called an argument from ignorance, a logical fallacy where you assume that something is true just because it can’t be proved false.
It can also take the opposite form where you assume something is false just because it can’t be proved true.
Example: “You can’t prove God doesn’t exist, so God must exist.”
You can, again, prove to yourself that this isn’t evidence at all. Some things are just impossible to prove false. For example, what if I said there is a teapot orbiting the sun somewhere in our solar system? This is ridiculous, of course, because even if you looked long and hard for it, I could always say, “well, you just haven’t found it yet… space is a big place!” It is impossible to disprove, and clearly this provides no evidence that a teapot does in fact orbit the sun.
When you hear people say one of these, it should set off a BS alarm in your head. Here are some phrases to watch out for:
“I know it’s real because it happened to me” (Anecdotal)
“Everyone I talked to said it’s true” (Anecdotal)
“Of course it happened after [BLANK] came in!” (Correlation/Causation)
“Well you can’t prove that it doesn’t!” (Argument from ignorance)
How to spot bullshit
Tips for distinguishing between disguised ignorance and insight.
1) ‘Studies show’ – what studies?
Be cautious of claims made with confidence but without evidence. Some writers will try to lend their views credibility by referring vaguely to science or research, or by quoting unreferenced statistics.
2) Anecdotal evidence can disprove a generalisation, but cannot prove it
If someone says that all Irish men are alcoholics, and you meet an Irish man who is not an alcoholic, your anecdotal evidence is enough to disprove this claim.
If you meet an Irish man who is an alcoholic, however, it does not prove the claim, since there are millions of other Irish men. Be wary of people who use anecdotes as evidence to prove a theory. Anecdotes can disprove theories, and can help add colour and insight to a situation, but are not enough in themselves.
For example, remember this Australian clip depicting ‘stupid Americans’ interviewed on the street? The video shows Americans who cannot answer questions like ‘name a country beginning with U’. (‘Yugoslavia,’ one replied. Another suggested ‘Utah.’) Quite funny stuff, but not indicative of what the other 300 million Americans would have answered.
3) Watch out for overkill
Sometimes writers make claims so outlandish and extreme, so far removed from what most people already believe, and with such confidence, that there is a temptation to say, ‘well, if only half of this is true, then I’m convinced’. This is weak logic: don’t let wild accusations and claims move you half-way to acceptance when none of it is backed up.
4) Cherry-picked statistics are useless
Statistics must be placed in context. A ‘100% rise in crime’ may mean two incidents instead of one, and ‘record revenue’ may sit alongside a record loss; a number stripped of its base rate or its surroundings can be made to say almost anything.
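A minimal sketch of why context matters, with invented numbers: the same data support both a scary relative-risk headline and a reassuring absolute-risk one.

```python
# Hypothetical side-effect rates, invented for illustration:
base_rate = 1 / 10_000      # risk without the drug
treated_rate = 2 / 10_000   # risk with the drug

relative_increase = (treated_rate - base_rate) / base_rate  # ~1.0, i.e. "+100%!"
absolute_increase = treated_rate - base_rate                # one extra case per 10,000

print(f"relative: +{relative_increase:.0%}, absolute: +{absolute_increase:.4%}")
```

“Risk doubles!” and “one extra case in ten thousand” describe exactly the same numbers; a statistic quoted without its base rate is a cherry waiting to be picked.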
5) Generalisations about ‘Europe’ or ‘The West’ conceal the truth
Europe, remember, is a continent ending, arbitrarily, at the Urals and Caucasus mountains, and including Iceland even though it is geographically closer to North America. Europe includes a socialist autocracy (Belarus), a theocracy (Vatican City), and several traditionally Muslim-majority countries (Albania, Kosovo and Turkey, since a small section of Turkish territory lies inside Europe). Most Russians are also Europeans, living west of the Urals. Most European countries never had colonies, and many of them were colonised by their European neighbours. Some European countries are in the EU but not NATO, some in NATO but not the EU, some in neither.
Taking all this into account, it is senseless to talk of ‘European policies’ or ‘European culture’, yet serious commentators do this all the time. This can be deeply misleading... There is wide variation between European countries. Thus there is no European policy or European culture that is not also shared with non-European countries.
Commentators and politicians blur the edges to hide the truth. It is not difficult to be specific. Say ‘EU’, or ‘France and Germany,’ or ‘NATO’ when you mean those things, not Europe.
Europe is just one example, but this kind of generalisation is common for many groups, not least ‘the West’, ‘the East’, Africa, ‘the Muslim World’ and so on. Don’t take for granted that such collections of diverse nations are really accurate or useful, that there is really such a thing as a ‘Western worldview’ or an ‘Islamic policy’. It’s lazy, and deceptive.
6) Data, not celebrity, should rule
Trust no commentators, however likeable or well-respected: always expect data to back up their claims. Journalists sometimes introduce a commentator by referencing their achievements to lend credibility.
The same goes for people or organisations one usually disagrees with. Sometimes inconvenient arguments are dismissed because of the person they come from: ‘typical right-wing propaganda’, ‘known Communist-apologist’, etc. Arguments should not be dismissed because of the person who makes them, but rather because of their weak evidence or logic.
In practice, to be fair, we don’t have time to listen to every argument going, so we filter out those arguments coming from people with a long-standing lack of credibility. But just be cautious about it – sometimes the wrong people believe the right thing.
7) ‘Yeah but…’
One common technique for escaping legitimate blame for some abuse or atrocity is to confuse the issue by pointing at the accuser. For example, the Chinese government released a Human Rights Record of the US in 2004. American governments have for years criticised China for its human rights abuses. The Chinese government’s reply: ‘yeah but…’
Yeah but, ‘The United States should take its own human rights problems seriously, reflect on its erroneous position and behavior on human rights, and stop its unpopular interference with other countries' internal affairs under the pretext of promoting human rights.’ So does that mean there are no human rights abuses in China? Not at all. In fact it is irrelevant to the discussion of Chinese government human rights abuses. This sleight of hand flips the debate away from sensitive topics by focusing on irrelevant external issues: a useful way to bullshit people.
8) Don't take my word for it
These are useful ways to detect deception, but don’t take my word for it. Access data directly, skipping the middle men of media and politics. The internet makes this ever more convenient with amazing sites like Gapminder. Go forth and research for yourself.