Digital audience measurement has a long way to go in South Africa, says Byron John.
Over the last few months I’ve been engaged in the ongoing debate over what will happen to the South African Audience Research Foundation (Saarf). Opinion is divided between those in favour of Saarf and those who would prefer to start afresh.
Needless to say it’s drenched in politics, which makes it even more complex and confusing. I find it fascinating how people vehemently defend their argument depending on which ‘hat’ they’re wearing and whose interests they are protecting. I would argue that online audience measurement in South Africa is on a similar trajectory if the conversations with all parties (digital agencies, publishers, advertisers, advertising bodies and the research fraternity) aren’t encouraged and transparency enforced.
Are we being shown the full picture when it comes to digital measurement in South Africa?
Take a look at the Top 10 websites browsed by South African online audiences during August 2014, according to Effective Measure:
[Top 10 table not reproduced here; entry 4: BBC sites.]
When I first moved to the digital industry about two years ago, something didn’t seem right with this data and I began asking questions. My first question was, “Does this accurately reflect the online audience universe?” I’m not asking for perfection, but I want to be sure that it gives the most accurate reflection of the online audience in South Africa.
Now let’s compare the list of the Top 10 websites browsed by South African audiences online according to Alexa.com, accessed on the last day of August 2014:
This shows major inconsistencies between the two. Which top 10 list is more accurate? When I questioned those at Effective Measure about the discrepancy, they indicated that I was only able to see the tagged sites in South Africa. They said if I wanted to see a more accurate view of the sites browsed by the South African online audience, the panel data would need to be enabled as it caters for all sites that have not been tagged. This is all well and good, but I was not allowed to see the panel data for a number of reasons that just didn’t make sense.
On the basis of this, I had to examine the differences and merits of both the tagged and the panel methodologies to understand what was going on with online audience measurement in South Africa.
Tag-based measurement counts unique browsers (UBs), not people. What this means is that if you visit Facebook.com on your desktop, then go to the kitchen to make coffee, quickly browse a few more pages of Facebook.com on your phone, and then later in the evening visit Facebook.com on your tablet, you’ve been counted three times in one day, although only one person actually viewed the webpage.
This is where the UB figure is flawed: it is over-counted. This can of course be mitigated if you have a panel that works in collaboration with tags to give you a view of individuals and their browsing behaviour.
This is one of the limitations of ‘tagged-only’ technology as it does not allow for any human intervention.
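The over-counting described above can be sketched as a toy example (the cookies, sites and panelist names here are hypothetical, purely for illustration):

```python
# Toy illustration: how tag-based counting inflates "unique browsers" (UBs)
# relative to the real number of people behind them.

# Each visit is logged by the tag as a (browser_cookie, site) pair.
visits = [
    ("desktop-cookie-1", "facebook.com"),
    ("phone-cookie-7",   "facebook.com"),
    ("tablet-cookie-3",  "facebook.com"),
]

# Tag-only view: every distinct cookie counts as one unique browser.
unique_browsers = len({cookie for cookie, _ in visits})

# Panel view: a (hypothetical) mapping from browser to panelist lets us
# de-duplicate devices back to people.
panel = {
    "desktop-cookie-1": "person-A",
    "phone-cookie-7":   "person-A",
    "tablet-cookie-3":  "person-A",
}
unique_people = len({panel[cookie] for cookie, _ in visits})

print(unique_browsers)  # 3 - the over-counted UB figure
print(unique_people)    # 1 - the single person behind all three devices
```

Without the panel mapping, a tag-only dashboard can report only the three browsers; the panel is what collapses them back into one person.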
Another major limitation is in the sampling universe. The way it currently stands is that publishers have to agree to have their sites tagged and, while local sites are very willing to be tagged and measured, larger international sites such as Facebook, Google and LinkedIn (among many other major international sites) are not.
There are several good reasons for this. One is that publishers are concerned that these tags will slow down their sites, leading to an inferior user experience. A second is that these major publishers, for the most part, already have contracts in place with other online audience measurement partners in much bigger markets across the globe. This conundrum leaves us with a mass of websites that are simply not tagged and therefore not visible in our ‘tagged-only’ dashboard in Effective Measure.
Given the infinite chaos that is the internet, having every website on the planet tagged is just ludicrous. But a skewed, local-only view of websites in our Effective Measure dashboard is also ludicrous. Who are we kidding? I would argue that this in fact does more harm to our digital industry. Why? Because of a simple survival tactic most strategists use when planning their media plans: “We don’t see the world the way things are, we see the world the way we are.”
Here’s a simple question: ‘Of those who accessed the internet in South Africa in August 2014, how many would you say accessed Google?’ Answers vary, but I very rarely hear a figure below 70%, with most saying 90% or 100%. The Effective Measure panel data, however, shows that for the month of August 2014, Google reached a mere 50% of the online population (still the largest reach of any single site in the country, of course).
The problem is that when digital planners need to allocate their digital budgets according to site traffic, it’s based on:
- What they are told directly about the site traffic by the international publishers;
- What they are given as a subscriber number; or
- Taking a flyer and spending a disproportionate amount of money on these major platforms.
So then how accurate is the panel data?
Let’s put this into perspective. The Amps data is based on approximately 25 000 respondents. I understand it’s a probability sample, so it naturally gets more credence than online measurement, which is by its very nature a convenience sample: it is sampled only online and based only on tagged sites, or ‘volunteer publishers’. That said, Effective Measure bases its panel data on a 10 000-strong active panel with a confidence level of 98% and an error margin below 5%. In addition, it is still the technology doing the measuring: a ‘plugin’ installed in the browsers of 10 000 panelists, which effectively does exactly what a tag does. The plugin is client-based (browser), so there is no load on the server side (website).
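As a rough sanity check on those figures (a sketch that assumes a simple random sample, which a managed online panel only approximates), the worst-case margin of error for a 10 000-strong panel at a 98% confidence level follows from the standard formula:

```python
import math

# Margin of error for a proportion under simple random sampling:
#   moe = z * sqrt(p * (1 - p) / n), worst case at p = 0.5.
n = 10_000   # panel size cited in the article
z = 2.326    # z-score for a 98% two-sided confidence level
p = 0.5      # worst-case proportion

moe = z * math.sqrt(p * (1 - p) / n)
print(f"{moe:.3%}")  # 1.163% - comfortably below the 5% cited
```

At n = 10 000 the worst-case margin is roughly 1.2%, so under these assumptions the ‘below 5%’ claim is, if anything, conservative.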
The existing TV panel (even the DStv-i one), with approximately 3 500 panelists, is considered reliable enough for over 40% of all ad spend in the country to be based on it. Yet there is a debate as to whether we should allow a 10 000-strong online panel to exist in our industry. I find this hard to understand. Furthermore, some believe that the sampling methodology for the panel is skewed. Well, if you’re going to argue that the sampling is skewed, then you have to question the entire tagged universe too.
You see, the sampling methodology is directly related to which sites are tagged and which are not. And despite any inherent panel skew, the technology caters for this: it allows in-depth analysis of where the demographic profile of the panel is weak, and corrects it by recruiting new browser plugins to fill the demographic ‘gaps’ or ‘holes’.
The digital universe has no boundaries, so there are many more international sites out there that South Africans are browsing. We are unable to see them in Effective Measure because we don’t have access to the panel data. The South African market is shooting itself in the foot by ringfencing mostly local publisher sites, leaving us with the impression that the digital market is much smaller than it actually is.
If we’re going to shift ad spend from print, radio, out-of-home media and TV to digital, we need to show how large digital reach actually is in this country. Only then will marketers be convinced of how disproportionate digital spend is in South Africa. When we are ‘allowed’ to see the ‘full picture’, digital planners will be able to make informed decisions about the proportions they should be spending on search, social or any other major site categories South Africans love to browse. More than that, before we paint sites like Google and Facebook as the enemy, I’m of the view that they are helping grow our local traffic. Many local publishers are discovered, and discoverable, through social networks and/or search.
Unfortunately, the reception Effective Measure gets in the digital industry has been one of unfounded statements, including the likes of “EM is inaccurate”, “EM doesn’t represent the online universe in SA” and “EM favours certain sites”. These statements can’t go unchallenged, and it is the responsibility of the entire digital industry, Effective Measure and the Interactive Advertising Bureau (IAB) to question why the panel data is not available in South Africa. EM works in 44 countries, and South Africa is the only market where the panel view is limited. We need answers and direction, for the common good of the industry. The digital industry needs to decide what that is.
Byron John is insights and innovations director at Habari Media.
This story was first published in the November 2014 issue of The Media magazine.