BBC’s Magic 96%

How often do you tune in to the BBC?

According to various bodies, the National Average Weekly Reach in 2014 [1] was as follows:

65.8% listened to BBC radio for at least 5 consecutive minutes in a week.

81.5% watched BBC television for at least 15 minutes in a week.

49.5% used BBC Online weekly.

Since 2011, the BBC has been boasting of a combined 96% weekly reach, and has been using this statistic to justify why every household with a TV (used to receive or record live broadcasts) should be subject to the £145.50 licence fee.


Nobody seemed to question this assertion.

Until now.

As the BBC quotes a “Cross-Media Insight Survey”, I set my sights on finding one. I’m afraid that, despite my best efforts, I have not yet succeeded. But I did find the BBC’s definition of the survey, and that’s striking gold in itself, as you’ll see.

“Cross-Media Insight (CMI) is a BBC survey designed to look at consumption across a wide range of media, including television, radio and online. The survey is designed to be a single-source measurement system […] administered by GfK NOP.

CMI is a weekly survey of 500 respondents, 450 of whom are on-line and 50 who are recruited offline so the total sample is designed to be representative of the UK by age, sex, social grade and region. In addition, the results are weighted to known proportions in the population so that the results are reliable at a total level and are not subject to sampling fluctuations.

Each respondent answers the CMI survey for a week – they fill in a daily questionnaire which identifies the TV programmes they have watched, the radio stations they have listened to and the websites they have visited across a wide range of channels, both BBC and non-BBC.” [2]

In short:

  • The Average Weekly Reach is based on a microscopic sample of people.
  • The length of time qualifying as “reach” is trivially short.
  • Wild extrapolations are made from those data.
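
Incidentally, “weighted to known proportions in the population” normally refers to post-stratification: each respondent is counted more or less heavily so that the sample’s demographic mix matches the census. Here is a minimal sketch of the idea in Python, with hypothetical groups and shares rather than GfK NOP’s actual cells:

    # Post-stratification sketch (hypothetical cells, not GfK NOP's actual scheme).
    # Each respondent's weight = population share / sample share of their group,
    # so an under-represented group counts for more in the weighted results.

    population_share = {"16-34": 0.30, "35-54": 0.35, "55+": 0.35}  # assumed census mix
    sample_share     = {"16-34": 0.50, "35-54": 0.30, "55+": 0.20}  # assumed online-skewed panel

    weights = {g: population_share[g] / sample_share[g] for g in population_share}
    print(weights)  # -> roughly {'16-34': 0.6, '35-54': 1.17, '55+': 1.75}

Weighting can correct a skewed mix, but it cannot conjure up precision that 500 respondents don’t have, and it certainly does not make the results immune to sampling fluctuations, whatever the BBC’s definition claims.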

DEFINITION OF REACH

One thing to challenge is the minimum length of time that qualifies as “reach”. Why is the industry happy to accept anything less than watching or listening to a complete programme? To put that in context, listening to 5 minutes of radio is listening to one song and tuning out as soon as the DJ speaks. And who honestly watches half or a quarter of a TV programme? Can we safely assume that viewers who tune in for only 15 minutes didn’t enjoy the show? On what basis are 5 radio minutes and 15 TV minutes statistically significant? I suspect most of us would clock a higher Weekly Average in the loo.

ACCURACY

Another thing I find questionable is the leap from “96% of 500 people use the BBC weekly” to “47 million people use the BBC weekly”. I’m afraid it’s a stretch too far and requires too much wishful thinking for my taste. The smaller your sample, the larger your margin of error and the lower your confidence level, to the point where hard data becomes pure fantasy (I’ll put numbers on this just below). So why anyone would believe the BBC’s lazy calculation is a mystery to me. I guess people just assume a proper survey is conducted because:

a) being impartial and trustworthy is at the core of the BBC’s official values and,

b) with a budget in excess of £5 billion, money is no object.
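
To put numbers on it, here is the standard margin-of-error formula for a simple random sample, worked through in a few lines of Python. The 95% confidence level and the worst-case p = 0.5 are conventional assumptions of mine, not BBC figures:

    import math

    # Margin of error for a simple random sample.
    # z = 1.96 corresponds to 95% confidence; p = 0.5 is the conservative worst case.
    def margin_of_error(n, p=0.5, z=1.96):
        return z * math.sqrt(p * (1 - p) / n)

    print(f"n = 500:    +/- {margin_of_error(500):.1%}")    # ~ +/- 4.4%
    print(f"n = 25000:  +/- {margin_of_error(25000):.1%}")  # ~ +/- 0.6%

And that is the best case: the formula assumes a genuinely random sample, which a largely self-recruited online panel is not.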

According to SurveyMonkey.com, the sample size needed for a population the size of the UK is, depending on the level of accuracy wanted, between 6,000 and 41,000. But let’s not talk in the abstract. Let’s put things in perspective and compare the BBC’s samples with real samples used by other organisations.

  • RAJAR (Radio Joint Audience Research), the official body in charge of measuring radio audiences in the United Kingdom, interviews, over 50 weeks, approximately 110,000 respondents aged 15+, plus roughly 4,000 children between 10 and 15 years of age. The quarterly population figure used recently was 53,502. RAJAR operates a sweep, which means that its respondents only participate for one week.
  • BARB (Broadcasters’ Audience Research Board) is the organisation responsible for providing the official measurement of UK television audiences. It uses a panel: around 5,100 homes, selected for representativeness, are fitted with a meter that reports the TV viewing of each person in the household. This means around 11,300 individual respondents are monitored daily.

So why are the BBC’s samples 228 times smaller than RAJAR’s and 22 times smaller than BARB’s? A wild guess is that accuracy is not a very high priority for the BBC.
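
For the curious, figures like SurveyMonkey’s fall out of a textbook calculation (Cochran’s sample-size formula with a finite-population correction). A quick sketch, assuming the worst-case p = 0.5; the margins of error below are my examples, not SurveyMonkey’s exact settings:

    import math

    # Cochran's sample-size formula with finite-population correction.
    # e: desired margin of error, z: z-score for the confidence level,
    # N: population size (the UK was roughly 64 million in 2014).
    def sample_size(e, z, N=64_000_000, p=0.5):
        n0 = (z ** 2) * p * (1 - p) / (e ** 2)
        return math.ceil(n0 / (1 + (n0 - 1) / N))

    print(sample_size(e=0.01, z=1.96))   # ~9,600 for +/-1% at 95% confidence
    print(sample_size(e=0.005, z=1.96))  # ~38,400 for +/-0.5% at 95% confidence

Either way, 500 respondents a week sits an order of magnitude below any of these figures.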

REPRESENTATIVENESS

But what about representativeness? What guarantee do we have that the BBC’s ridiculously small samples are, at the very least, randomly picked (as opposed to subjectively selected) and genuinely representative of the population? None, except the BBC’s word for it (see the definition of the Cross-Media Insight Survey above), and that’s just not good enough for me.

Considering that nearly 19% of the population don’t have a TV licence, how many people in this category are included in the Cross-Media Insight Survey? This group is far from negligible, in numbers and in effect, if its television habits are anything to go by. If these people ignore the BBC by choice (including its radio network), their answers would more than likely drive the National Average Weekly Reach down dramatically. Mine would.
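
A back-of-the-envelope check shows how much leverage this group has. The overall reach is just a weighted average of the two groups; the 40% reach I assume below for licence-free households is purely hypothetical, for illustration:

    # Overall reach as a weighted average of licence payers and non-payers.
    # The 19% share is the figure quoted above; the 40% reach assumed for
    # licence-free households is hypothetical, for illustration only.
    licence_free_share = 0.19
    licence_free_reach = 0.40

    overall_claimed = 0.96
    # Reach that licence payers would need for the overall claim to hold:
    required = (overall_claimed - licence_free_share * licence_free_reach) / (1 - licence_free_share)
    print(f"{required:.1%}")  # ~109.1% -- impossible

In other words, the 96% claim can only hold if nearly everyone in the licence-free 19% is nevertheless “reached” every week, and that is precisely the group an online panel is least likely to capture.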

TRANSPARENCY

When it published its strategy “Putting Quality First” in December 2010, the BBC pledged to set new standards of openness and transparency. So it started producing neat little tables, quarterly. (See the sample reproduced below.)

[Image: sample quarterly table of BBC weekly reach figures]

Considering that those tables lack:

  • a proper link to the full results
  • a description of the type of sample (sweep versus panel)
  • criteria used for the selection of the sample
  • details of the composition of the sample (gender, age, location, social position)
  • how the surveys are conducted, and
  • how the data is weighted

I have come to the conclusion that our definitions of openness and transparency plainly do not align.

SHENANIGANS

I also discovered that the BBC changed the way it compiles the data, which explains the jump from 93% in 2008 to 97% in 2009. The footnote in the Annual Report is very enlightening.

[Image: Annual Report footnote on the change of methodology behind the jump from 93% to 97%]

The BBC also seems to think old data can be forged, as demonstrated in the document British Bold Creative (here), where the BBC states that the reach in 2007/08 was 96.9% when it was only 93% according to the relevant Annual Report.

[Image: the 2007/08 reach restated from 93% to 96.9%]

 

UPDATE

The BBC now claims a 99% weekly reach (see here).

SECOND UPDATE

It’s now “around 95%” (click here).

 

My reply to this has been eloquently written by George Orwell in ‘1984’ (page 48; I have only replaced the words “Ministry of Plenty” with the BBC’s logo).

[Image: excerpt from ‘1984’, page 48, with “Ministry of Plenty” replaced by the BBC logo]

As they say, “There are lies, damned lies and statistics”. And on this bombshell…

 


[1] April to June 2014. http://downloads.bbc.co.uk/aboutthebbc/insidethebbc/howwework/accountability/pdf/summary_audience_information_apr_jun_2014.pdf

[2] http://downloads.bbc.co.uk/aboutthebbc/insidethebbc/howwework/accountability/pdf/context_document_january_march_2013.pdf
