Facts v truth
A view from Dave Trott

In Los Angeles, in 1964, Juanita Brooks was robbed by a blonde woman with a ponytail.

The woman grabbed her handbag, and a witness saw her get into a yellow car driven by a black man with a beard and moustache.

So the police were looking for an interracial couple fitting that description.

Several days later, they arrested Janet and Malcolm Collins, who were a match.

Unfortunately, neither the victim nor the witness got a good enough look at the robbers’ faces to positively identify them.

So the prosecution brought in an expert witness: a university mathematics professor.

The professor explained how statistics worked to the jury.

He said you can narrow down the odds by multiplying the probabilities of the individual events together.

So he ran through the individual probabilities like this (remember this was 1964):

Probability of a woman with blonde hair: one in three.

Probability of a woman with a ponytail: one in 10.

Probability of a black man with a moustache: one in four.

Probability of a black man with a beard: one in 10.

Probability of a yellow car: one in 10.

Probability of interracial couple in a car: one in 1,000.

So, according to the mathematician’s logic, they needed to multiply all those probabilities together to find the chance of another couple fitting that description.

By his calculation, there was a one in 12 million chance of such a couple existing.
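
To see where the one in 12 million came from, here is a minimal sketch of that multiplication in Python, using the professor’s figures (the individual odds are his claims, not verified facts):

```python
# The professor's claimed odds for each characteristic, as "1 in N".
odds = {
    "woman with blonde hair": 3,
    "woman with a ponytail": 10,
    "black man with a moustache": 4,
    "black man with a beard": 10,
    "yellow car": 10,
    "interracial couple in a car": 1_000,
}

combined = 1
for n in odds.values():
    combined *= n  # multiply the "1 in N" figures together

print(f"Combined odds: 1 in {combined:,}")  # Combined odds: 1 in 12,000,000
```

Note that multiplying like this also quietly assumes the characteristics are independent of one another, which beards and moustaches plainly are not.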

Naturally, the jury was convinced, and Janet and Malcolm Collins were found guilty.

Four years later, the California Supreme Court reversed that decision.

It said the numbers may have been right, but the thinking was wrong.

This thinking has become known as "The Prosecutor’s Fallacy".

The Supreme Court said it was wrong to compare them with the entire population of the US.

If they had compared them with the population of LA, there might only be three other couples that matched that description.

In which case, there would be just a one in four chance that they were guilty.

Or, more importantly, a three in four chance that they were innocent.
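
To make that arithmetic concrete, here is a minimal sketch assuming, purely for illustration, that exactly four couples in Los Angeles matched the description (the Collinses plus three others):

```python
# Illustrative only: if k couples match the description and nothing else
# distinguishes them, each matching couple is equally likely to be guilty.
matching_couples = 4  # assumed: the Collinses plus three others

p_guilty = 1 / matching_couples
p_innocent = 1 - p_guilty

print(f"Chance a given matching couple is guilty:   {p_guilty:.0%}")    # 25%
print(f"Chance a given matching couple is innocent: {p_innocent:.0%}")  # 75%
```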

This is why the Supreme Court ruled the maths and statistics inadmissible as evidence in that case, and regarded such evidence as generally dubious in court.

Statistics and data are very easy to manipulate, and what makes them particularly dangerous is that they have the appearance of scientific truth.

Numbers look like facts, so data looks like reality.

But like anything, it’s all in how it’s presented.

In the UK, in 1995, the Committee on Safety of Medicines released a report on birth-control pills.

It said the data showed the risk of blood clots in women taking the pill had risen from one in 7,000 to two in 7,000.

So a rise from roughly 0.014% to 0.029%, which is nothing to worry about.

But this wouldn’t sell papers, so the news media reported the same data differently.

They said the data showed the rate of blood clots had doubled.

It wasn’t a lie, but it was a misleading interpretation.
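
Both stories come from the same two numbers; here is a minimal sketch of the absolute and relative framings, using the report’s figures:

```python
# The same data, presented two ways.
old_rate = 1 / 7_000  # risk before: 1 in 7,000
new_rate = 2 / 7_000  # risk after:  2 in 7,000

# Absolute framing: the change in the underlying percentage.
print(f"Absolute risk: {old_rate:.3%} -> {new_rate:.3%}")  # 0.014% -> 0.029%

# Relative framing: the headline version.
print(f'Relative risk: {new_rate / old_rate:.0f}x ("doubled")')
```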

In the year after the scare, unwanted pregnancies rose by around 13,000.

Because many girls and women, terrified of blood clots, stopped using the pill.

Five years later, a report in the British Medical Journal admitted the scare was unfounded.

Remember that next time you’re seduced by a presentation full of numbers.

Remember that every time a media company tries to bury you in reams of data.

Data may be a fact, but it isn’t the truth.

Dave Trott is the author of Creative Blindness and How to Cure It, Creative Mischief, Predatory Thinking and One Plus One Equals Three
