10 Types of Evidence and How to Interrogate Them
You may well have learned that an argument is composed of a point, an example or piece of evidence, and an explanation. Perhaps you've also mastered the art of coming up with a strong point, choosing an apposite piece of evidence, and explaining it clearly and convincingly. But how can you be sure that the evidence you've chosen is adequate to the point you wish to make?
This matters even more when you're analysing the arguments of others and trying to find a weak spot. The evidence may seem to back up their point and their explanation may be rational, but the whole thing falls down if you can find flaws in the evidence itself. Here are some of the things to look out for.
1. Statistics that ignore population growth or inflation
When looking at any statistic about how many more people are doing an activity (from getting involved in politics to dying) or how much something costs (from the national debt to cinema box office takings), it's really important to remember that inflation exists and that the world population has been increasing rapidly since 1800. You can see the impact of inflation very easily by comparing the top ten highest-grossing movies of all time (oldest movie on the list: Titanic, from 1997) with the same statistic adjusted for inflation (oldest movie on the list: Gone with the Wind, from 1939). The two lists have just three films in common, and the inflation-adjusted version gives you a much better idea of which films were really the most financially successful.
Population growth has a similar effect. Any statistic that claims “more people than ever before…” needs to take into account that there are now more people than ever before, full stop. The corollary to this, of course, is that any statistic that states fewer people are doing something is more significant than it might first seem. To check this, look for statistics that are expressed as a percentage of the population as a whole, rather than the raw number.
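As a sketch of the checks above (using invented figures, not real data), both normalisations – per capita and inflation-adjusted – come down to simple arithmetic:

```python
# Invented figures throughout; this only illustrates the arithmetic.

def per_capita_rate(count, population, per=100_000):
    """Express a raw count as a rate per `per` people."""
    return count / population * per

def inflation_adjust(amount, index_then, index_now):
    """Restate a historical amount in today's money via a price index."""
    return amount * index_now / index_then

# A hypothetical town: 1,000 incidents among 50,000 people in 1990,
# 1,500 incidents among 100,000 people today.
rate_then = per_capita_rate(1_000, 50_000)    # 2,000 per 100,000
rate_now = per_capita_rate(1_500, 100_000)    # 1,500 per 100,000
# The raw count rose by 50%, yet the per-capita rate actually fell.

# A hypothetical £10m gross when the price index stood at 50,
# restated for an index of 150 today: £30m in today's money.
adjusted = inflation_adjust(10_000_000, index_then=50, index_now=150)
```

The same pattern works for either check: divide by the relevant population to compare rates across time, or scale by a price index to compare amounts of money.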
2. Statistics that address an implausible number of people
The opposite problem is when the numbers cited are simply too big. There are all sorts of ways to artificially inflate a statistic to make it sound more impressive.
For instance, imagine you’re talking about a building collapse that killed three people. If three sounds like too small a number, you could list the killed and the injured, bringing it up to – let’s say – 20. But if that still sounds too small, you could list all those affected – the killed, the injured, the emergency services who were over-stretched that day, the office workers who saw the collapse, the landlords whose property values will be affected and the builders who will wonder if they might have been responsible for the poor construction. You could end up being able to claim that hundreds of people were affected, even though just three were killed.
If you see a statistic where the number seems too large to be plausible, take a moment to check it against factors like local population, and keep an eye out for phrases like “affected by” and “up to”, which can conceal that a number has been inflated beyond accuracy.
3. Evidence that takes popular misconceptions as fact
There are a remarkable number of things that are widely held to be fact, but are, in fact, nothing of the sort. Wikipedia's list of common misconceptions rounds up many of them. Yet these misconceptions are so widely believed that you will frequently find them used as evidence. This occurs particularly often in history and mythology, where popular misconceptions can be especially strong.
On the one hand are the cases where the popular view is straightforwardly wrong (for instance, the claim that women are always disempowered in traditional fairy tales); on the other are plenty more where we simply don't know the facts either way – almost any claim about pre-Roman religious practices in the UK will fall foul of this problem, as there simply isn't enough evidence to make definitive statements about how people worshipped or what they believed. The latter can be among the hardest types of evidence to challenge: it's much easier to demonstrate that someone is wrong because you have countervailing evidence, and much harder to show that no one has the evidence to prove or disprove their claim.
4. Facts with misleading causation
You've undoubtedly come across the maxim that correlation doesn't equal causation. For instance, if Hannah, aged 11, takes up football, plays for two years and grows half a foot, it doesn't follow that playing football causes children to grow. But there are other ways in which causation can be misleading.
One example is when something has become the “leading cause” of something. It’s important here to remember that while one cause can increase, another cause can decrease. For example, dementia is now the leading cause of death in England and Wales. This sounds like bad news – but what it hides is that the number of people dying of stroke and heart disease has plummeted in the past 20 years. People are dying of dementia partly because it is more likely to be recognised and formally diagnosed than previously, but chiefly because they’re living long enough to develop dementia, by virtue of not having died of stroke or heart disease. What seems like a bad news story is actually a story of better diagnosis and increased life expectancy.
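The "leading cause" effect can be illustrated with a toy calculation (all figures invented): a cause can rise to the top of the table chiefly because the causes that used to sit above it have fallen away.

```python
# Invented annual death counts, purely to illustrate the ranking effect.
deaths_20_years_ago = {"heart disease": 150_000, "stroke": 60_000, "dementia": 45_000}
deaths_today = {"heart disease": 55_000, "stroke": 28_000, "dementia": 70_000}

# Which cause tops the table in each period?
leading_then = max(deaths_20_years_ago, key=deaths_20_years_ago.get)
leading_now = max(deaths_today, key=deaths_today.get)
print(leading_then)  # heart disease
print(leading_now)   # dementia

# Yet total deaths from these three causes fell sharply overall.
print(sum(deaths_20_years_ago.values()))  # 255000
print(sum(deaths_today.values()))         # 153000
```

The headline "dementia is now the leading cause of death" is true in both senses here, but the underlying picture is one of far fewer deaths overall.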
5. Statements that are only technically true
A “technically true” statement is one that you might instinctively answer with “yes, but…”. “The United Kingdom has a larger population than France.” (Yes, but only by about a million people). “You receive a dose of radiation from eating bananas.” (Yes, but the dose is so small that you would have to eat thousands of bananas in a short space of time to feel any negative effects). You can undoubtedly think of more examples yourself.
These statements can be hard to debunk because ultimately they are true; they are just misleading if used without further qualification or context. "The United Kingdom has a larger population than France" is useful if you have a geography test on population sizes, but close to meaningless if your point is that the UK should have more influence than France on global affairs. You can overcome this problem by demanding the details that are missing, for instance by asking why a technically true statement is relevant.
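The banana example can be checked with back-of-the-envelope arithmetic, using the commonly quoted "banana equivalent dose" of roughly 0.1 microsieverts and a typical annual background dose of a few thousand microsieverts. Both figures are rough approximations, not precise measurements:

```python
# Back-of-the-envelope check on the banana claim. Both constants are
# rough, commonly quoted approximations, not precise measurements.
BANANA_DOSE_USV = 0.1          # microsieverts per banana (approximate)
ANNUAL_BACKGROUND_USV = 2_700  # typical yearly background dose (approximate)

bananas_per_year_of_background = ANNUAL_BACKGROUND_USV / BANANA_DOSE_USV
print(bananas_per_year_of_background)  # roughly 27,000 bananas
```

In other words, you'd need to eat tens of thousands of bananas just to match a single year of ordinary background radiation – which is why the "technically true" banana statement carries so little practical weight.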
6. Statements about time that are less impressive than they sound
Time is another area where context is important. A key misleading statement is “since records began” – whether that’s the coldest winter since records began, the longest-serving vicar since records began, or anything else that might be superlative since records began. It’s very impressive if records began in 1066, but not so much if it turns out that they began in 2003.
Similar, but less easy to spot, is when the way that something is measured has changed. For instance, if you measure whether it's snowed on Christmas in the UK by whether there's snow on the top of Ben Nevis, you'll get a lot more white Christmases than if you measure by whether there's snow on the dome of St Paul's Cathedral in London – the measure typically used by bookmakers when taking bets on whether a given Christmas will be snowy. If you previously measured based on Ben Nevis, then switched to St Paul's, it's going to seem as if Christmases have abruptly got less snowy, even if not much has actually changed.
7. Anchoring words
"Anchoring" is a technique in psychology whereby, once you've been told a number, you automatically begin to think within the range of that number. For instance, if someone launching a new chocolate bar told you it was going to cost £5 but had been reduced to £2 for the launch, you might find yourself thinking that £2 sounds reasonable, even though it's still quite expensive for a chocolate bar.
The language we use to frame evidence can have the same effect. Words and phrases that do this include "just", "only" and "an astonishing". Think of how it feels to read "just 10,000 people showed up to the march" compared with "an astonishing 10,000 people showed up to the march". The former primes you to think that this is a small number; the latter, a big one. Again, context is what matters. Protest marches are frequent enough in the capital cities of most democracies that there should be other marches you can compare the number to, and so figure out just how impressive it really is.
8. Zombie polls
Zombie polls are – unfortunately – not quite as cool as they sound. They're polls taken in a way that makes them statistically meaningless. Most polls that you'll see on Twitter fall into this category, even if they say "retweet for a larger sample size"; because we tend to follow the people on Twitter whose views we agree with, or at the very least whose interests we share, the audience for these polls is inevitably skewed. That's before you consider that any sufficiently determined pressure group can mobilise its members to take a zombie poll in large enough numbers to make the result misleading. This effect isn't restricted to Twitter; you can also see it at work on polls taken on local newspaper websites, for instance.
These polls can be hard to spot because while most people know to look for a small sample size (for instance, a poll of 10 is not going to be meaningful), zombie polls can have a very large sample size, potentially running into the tens of thousands if they're run by a sufficiently popular site or account. That doesn't make them any more applicable to the general population. What you're learning from a zombie poll is typically only which side of an argument is more able and willing to mobilise to skew polls.
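A small simulation (with invented numbers) shows why a huge self-selected poll can be less informative than a modest random one. Suppose 30% of a population genuinely hold opinion A, and a pressure group floods an online poll with guaranteed votes:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Invented population of 1,000,000 people, 30% of whom hold opinion A.
population = [True] * 300_000 + [False] * 700_000

# A random sample of just 1,000 people lands close to the true 30%.
random_sample = random.sample(population, 1_000)
random_rate = sum(random_sample) / len(random_sample)

# A "zombie poll": 5,000 ordinary visitors respond, then a mobilised
# pressure group adds 20,000 guaranteed votes for opinion A.
visitors = random.sample(population, 5_000)
zombie_poll = visitors + [True] * 20_000
zombie_rate = sum(zombie_poll) / len(zombie_poll)

print(random_rate)  # close to 0.3
print(zombie_rate)  # far above 0.3, despite 25,000 "respondents"
```

The zombie poll has 25 times the sample size, yet its result mostly measures the pressure group's enthusiasm, not public opinion.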
9. Policies that kill
It's a sad part of the responsibility placed on politicians that almost every policy that carries financial backing has the potential to kill. This might sound dramatic, but as soon as you think about how complex the governing of a country can be, it's also logically true. Any money that the government spends has to come from somewhere, whether it's increased taxes, borrowing, or cuts in other areas, and even the most harmless-seeming cut can have an impact on someone's life. For instance, an upgrade to a school crossing, designed to save children's lives, might make a lollipop lady redundant, with consequences for her mental wellbeing and that of her family. This remains the case even where government is more hands-off, as choosing not to act is also a decision with ramifications for people's lives.
This means that with sufficient research and determination, it’s possible to oppose almost any government action by pointing to a possible resulting loss of life. It’s important to look at the counterfactual, and ask what the result would be of not pursuing the policy, or any alternatives that have been raised.
10. Individual stories used as evidence
It’s not unusual to find an anecdote being used as evidence. For instance, in discussions on dangerous dogs, you’ll find lots of anecdotes in both directions: someone who was savaged by a pitbull terrier as a child, and another who grew up with a mastiff that was the sweetest, gentlest dog you’ll ever meet. In a world of 7.5 billion people, you can find an anecdote to demonstrate almost anything – hence the maxim, “the plural of anecdote is not data.”
Anecdotes can be among the hardest kinds of evidence to disprove, because challenging one can feel like telling someone that their experiences are not important. But unless the argument is based on absolutes (e.g. if someone claims all mastiffs are violent and frightening, one exception is enough to disprove the claim), an anecdote is simply not sufficient to prove a point; instead you need to show a broader trend.