As usual, I attended the Broadcast Research Council’s presentation of the BRC RAM and SEM results in Johannesburg. It seemed a particularly opportune time to catch up with some of the users after the presentation to find out their reactions to the data.
The first presentation of the year (February ’18, in this case) was a particularly interesting one to attend, because it covered both the last six months of data (July – December 2017) and the full year data (January – December 2017).
The addition of the annual report was an innovation introduced by the BRC. Clare O’Neil, CEO of the BRC, reminded the audience that its purpose was to allow small base and community stations to build up their samples over a 12-month period, in order to meet the required minimum threshold of 40 respondents for data publication. This facilitated the release of results for 51 additional community stations in this report. O’Neil also pointed out that the large randomly selected sample of 30 014 households, or 61 276 individuals, achieved through flooding, also made this report ideal for in-depth station profiling and deep dives into various aspects of listenership, such as device usage by station and time.
While the format of the presentation of both the six-month and 12-month data was familiar – not surprisingly, given that this is the seventh release of the data – O’Neil indicated that there were some points that she needed to reiterate and highlight. These answered questions that data users had raised with the BRC. The first of these concerned the sample: the approximately 30 000 households sampled annually allowed the data to be representative of the 15+ population at a national and provincial level.
Because queries had been raised about the suitability of the sample for reading community stations, she explained that the sample would have to expand to 300 000 households if it were to be representative of all the small areas defined by Statistics SA. This would clearly place the study beyond the point of affordability for the commercial and public service broadcasters, which take the lion’s share of advertising revenue. After the presentation, consultant strategist Karen Dyke and I discussed how compromise is inevitably inherent in all research decisions, and how the funding model of media currency research limits its ability to cover community media.
What media planners should know
Also on the topic of the sample, O’Neil reiterated that because it is designed to be representative of the adult population at a provincial level, the sample was not built up on the basis of individual station footprints. (This would, of course, bias the results.) It was points like these that independent media consultant, Elana de Swardt, felt were important, adding that she was sure there were a lot of media planners who did not know this.
Similarly, she believes most data users were not familiar with the concept of ‘flooding’, nor were they aware that the questions on programme preferences were asked of only the main household respondents rather than the sample as a whole. Her observations certainly reinforced my belief that more of the media agency data users should attend the release presentations as they provide the information needed to work intelligently with the data.
There are a few additional points that data users would do well to remember. The BRC RAM sample frame does not directly mirror that of the Establishment Survey; it has a 60% metropolitan bias vs the ES’s 41%. There is an annual population update based on IHS data: the next will be in November 2018.
Station performance: Community radio shines
Reporting on station performance, O’Neil and research director Setshwano Setshogo underlined the stability of the results. Four commercial and PBS stations had shown significant growth for the period July ’17 – December ’17 over April ’17 – September ’17: Gagasi (up 11%), Ligwalagwala (up 20%), Phalaphala (up 16%) and Vuma (up 29%).
The roundup of provincial top performers, based on audience and listening share, showed the strong performance of some community stations. For example, in both the Western Cape and KwaZulu-Natal, three community stations ranked in the top 10: Radio Zibonele, Radio Tygerberg and CCFM in the former and Izwi LoMzansi, Nongoma and Icora in the latter.
De Swardt told me she was delighted to see that many big community radio stations’ numbers are growing, and confirmed that she was “going to use more of those!” In Gauteng, Jozi is the only community station to rank in the top 10, but as Karen Dyke pointed out, that sets it ahead of 702. Given that 702 features in the top five advertising revenue takers nationally, it does suggest that some advertising decision-makers should familiarise themselves with the BRC RAM data!
When the BRC presents the key station measures, it shows both yesterday and average week audiences, in addition to exclusive listenership and time listened. It is probably simplistic to say that broadcasters favour talking about week cumes (the bigger number), while canny buyers might prefer to look at average day figures. (Of course, what advertisers actually buy, when they select spots, are quarter-hour audiences.) It does seem that it would simplify matters if both sides of the industry agreed on one measure, but that could limit the latitude for negotiation, or even be construed as collusion.
High station loyalty
Before the presentation started, I had met a relative newcomer to our shores, Vineel Agarwal, regional director for Havas Africa (East & Southern), and I was pleased to hear her positive reaction to the presentation. She was surprised and impressed by the stability in the data and high station loyalty. The latter was certainly an area that she believed deserved exploration to better understand how it played out across demographics. She commented that long listening times added to loyalty, and suggested that the medium warranted usage outside of traditional drive-time.
While it was reassuring to hear positive comments from someone who had come afresh to the data, I certainly sensed that there is now a significant level of comfort with the data among media agency users as they become accustomed to it, and understand the decisions and trade-offs that have to be made.
I would have left the presentation feeling thoroughly positive about radio’s fortunes, were it not for a disquieting comment from one of the agency representatives, which resonated with other users. The point made was that it was one thing for the BRC to commission and deliver the data; it was quite another for compelling stories to be woven from them. That task lies not with the BRC, but with the broadcasters, with their marketing intelligence and sales teams. Now that agency decision-makers are growing more accustomed to the data, this is the next challenge.
Having spent some decades working in the media agencies, Britta Reid now relishes the opportunity to take an independent perspective on the South African media world, especially during this time of radical research transformation.