Handling Statistics

Statistics can be really useful. For those of us who like to wrestle with cultural issues, they can be a great way of getting a gauge for what’s going on around us. But as well as being useful, statistics can also be dangerous; they are easy to misunderstand and easy to misapply. So here are a few lessons I have learnt to apply when I’m looking at and using statistics.

Look to the Sources

Most uses of statistics are not neutral. First, the form of the study itself can be biased (e.g. in the specific questions asked and the way the research is conducted). Second, journalists and writers can carefully pick the figures which support their point or achieve their aim. What we read in newspaper reports or online articles is often only part of the story (other relevant figures are sometimes omitted) or is not a completely accurate report of the findings (for instance, there may be ambiguity about the wording of the questions actually asked). It is therefore important to look for the original source of the statistics. Even if an article doesn’t cite its sources, they are usually not too hard to find after a bit of searching.

For example, I was looking for stats on polyamory in the UK. I quickly found this article which claims that nearly one fifth of Brits are polyamorous. Later in the article, we are told that ‘Northern Ireland is home to the largest number of people in polyamorous relationships across all regions that were surveyed.’ These statistics surprised me, so I dug a bit further. When I looked at the source of the statistics, I found that the survey actually asked, ‘Do you identify as polyamorous?’ This says nothing about whether the respondents are actually in a polyamorous relationship. I then spotted that the survey defined polyamorous as ‘being capable of having more than one romantic relationship’. So, despite how the question makes it sound, the survey is not really about those who would commonly take this as an identity marker; it is identifying those who think, perhaps only hypothetically, that they could live in a polyamorous relationship. To say on the basis of this data that one fifth of the UK is polyamorous, or to speak of people in polyamorous relationships, is therefore rather misleading.

As a preacher, it would be easy to see this article and use the stats to decry the crumbling of traditional, Christian ethics in our nation, yet a little bit of digging reveals that the situation is probably far less serious. It’s important to look at the sources.

Look at the Methodology

It is also important to consider how the research was undertaken. What methodology was used, and is it likely to give accurate results? The reality is that most statistics, perhaps with the partial exception of those based on census data or those exploring smaller groups, are estimates. Researchers survey a certain number of people and then use those responses to estimate what the results would be for a wider group, such as the population of the UK. This can be done well, and it can be done badly. To be done well, the sample group must be carefully chosen to reflect the wider group, and the estimates must then take various factors into consideration to give a fair picture of what the results would look like in the larger group.
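To make the idea of adjusting for ‘various factors’ a little more concrete, here is a minimal, illustrative sketch in Python. The groups and numbers are entirely made up and are not taken from any of the surveys mentioned in this article; the point is simply to show how re-weighting a sample towards known population shares can change an estimate.

```python
# A toy sketch of survey re-weighting, using made-up data.
# Suppose our hypothetical sample over-represents younger adults compared
# with the population we want to describe. Weighting each respondent by
# (population share / sample share) for their group gives a fairer estimate.

sample = [
    # (age_group, answered_yes) -- entirely made-up toy data
    ("18-34", True), ("18-34", True), ("18-34", False), ("18-34", True),
    ("35-54", False), ("35-54", True),
    ("55+", False), ("55+", False),
]

# Made-up population shares, for illustration only.
population_share = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}

# Share of each group within the sample itself.
sample_share = {g: sum(1 for a, _ in sample if a == g) / len(sample)
                for g in population_share}

# Weight each respondent so the weighted sample matches the population mix.
weights = {g: population_share[g] / sample_share[g] for g in population_share}

raw_estimate = sum(1 for _, yes in sample if yes) / len(sample)
weighted_estimate = (sum(weights[g] for g, yes in sample if yes)
                     / sum(weights[g] for g, _ in sample))

print(f"Unweighted: {raw_estimate:.0%}, weighted: {weighted_estimate:.0%}")
# Unweighted: 50%, weighted: 38% -- same answers, different headline figure.
```

Real pollsters do this far more carefully, of course, but even this toy example shows why a survey that skips such adjustments can give a misleading headline figure.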

So, for example, as I looked at the source of the stats on polyamory, I noticed that the original survey consulted 2,000 UK adults, but I couldn’t find anything more about the sample group. These statistics may therefore not offer a fair reflection of the UK as a whole (and, to their credit, the authors don’t claim that they do). This led me to search a little further, and I quickly found stats about polyamory in the UK from YouGov (both a summary article and the survey results) for which I could also check the methodology. Here the methodology shows the use of a carefully picked sample and adjustments that take relevant factors into account, giving an estimate which is as accurate as possible. These figures are therefore much more likely to be reliable.

Look at the Detail

When you look at the sources, you are also able to look at the detail. Details, such as the exact questions that were asked (as in the example above) or how the data was collected, can be really important. Even summary articles about the data produced by those who conducted the research are not always fully accurate.

Good sources will also include warnings about where the data may be inaccurate and about possible factors which could change the results. For example, an ONS study which looked into personal well-being in relation to sexual identity found that ‘those who identify as gay or lesbian, or bisexual report lower well-being than the UK average for all personal well-being measures.’ However, they also note that earlier research has shown that you are more likely to identify as LGB if you live in London and that personal well-being is often lower in London. They therefore admit that there could be factors beyond sexual identity which are influencing these results. These little details are important.

Look at the Critics

Finally, it’s always worth checking whether there have been any responses to or criticisms of the stats. Obviously, even if there are, we shouldn’t assume that the critics are right, but we should listen to what they have to say and seek to evaluate their assessment to the best of our abilities. This is particularly relevant for significant statistics which are commonly repeated, as they are more likely to have been checked by other people.

So, for example, within the current debate over transgender identity, statistics showing very high rates of self-harm and suicide among people who identify as transgender are commonly used. However, some have looked into the studies and surveys behind these stats and offered thorough critiques of them (e.g. here and here).1 A good critique will help you look at the sources, the methodology and the detail in order to evaluate the reliability of the stats. Another example concerns the prevalence of intersex conditions. The figure of 1.7% is commonly repeated, and you can find various critiques and defences of that stat. A quick look at these reveals that the disagreement actually comes down to the definition of intersex. So, rightly defined, it may be true to say that 1.7% of people are born with an intersex trait, but that does not mean that 1.7% of babies are born with truly ambiguous biological sex, as the stat might suggest to many. The critics thus remind us of the need for clarity when sharing these figures.

To Stat, or not to Stat

To be honest, as I’ve learnt more about handling stats well, I have looked back with regret on several things I have taught and written. It is so easy for us to pick the figures which support the point we want to make before we have really checked whether they are reliable. Applying these principles will, it’s true, take a bit more time, and will probably mean that we are less often able to use statistics to aid our teaching. But if we’re not teaching the truth, then what’s the point anyway?

Footnotes

  • 1 On the topic of suicide in particular, the Samaritans have some useful advice about suicide reporting which is designed for the media but is also useful to preachers.
