Thanks to the release of Steven Spielberg’s movie of the same name, the term Artificial Intelligence became something of a pop culture motif in 2001. But while the film’s premise of the first robot programmed to love remains the stuff of imagination, Artificial Intelligence is a buzzword increasingly used in the mainstream.
The truth is that breakthroughs in AI are happening thick and fast, and major tech companies like Google, Amazon and Apple are implementing AI in production right under our noses. Isn’t it suspicious how you just Googled ‘Nike’ and the brand’s latest shoes are appearing in your Facebook feed? The wondrous algorithms at work!
While the average Joe does their best to wrap their head around the magic algorithm, journalists are having to keep up, adapt and rethink the way they gather news.
But what’s an algorithm?
You’d be forgiven for not knowing the answer. Think of an algorithm as a recipe: essentially a series of steps taken to solve a problem. Algorithms manipulate data in various ways. An algorithm can mine data, perform a calculation, rank the best results in a Google search, or recommend related links and articles. And, as on Facebook, algorithms also filter information tailored to your searches.
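To make the recipe idea concrete, here is a toy search “algorithm” in Python. The articles and the scoring rule are invented for illustration; real search engines like Google’s use vastly more sophisticated signals.

```python
# A toy "recipe": rank a list of articles by how well they match a search query.
# Step 1: score every article. Step 2: sort by score, best first.

def score(article, query_words):
    """Count how many of the query's words appear in the article's title."""
    title_words = article["title"].lower().split()
    return sum(word in title_words for word in query_words)

def search(articles, query):
    """Return article titles ranked from most to least relevant."""
    query_words = query.lower().split()
    ranked = sorted(articles, key=lambda a: score(a, query_words), reverse=True)
    return [a["title"] for a in ranked]

articles = [
    {"title": "Nike releases new running shoes"},
    {"title": "Local weather update"},
    {"title": "Nike shoes review"},
]
print(search(articles, "nike shoes"))
```

Every step is fixed in advance by the programmer; the “intelligence” is just the recipe being followed very quickly over a lot of data.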
But what’s really interesting is that these days algorithms are actually producing automated news content. This has given rise to a new form of reporting called automated journalism. Author Matt Carlson writes that “[automated journalism] denotes algorithmic processes that convert data into narrative news texts with limited to no human intervention beyond the initial programming choices”. In other words, humans write the code and insert the data, and the machine answers with news text.
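At its simplest, that conversion can be a template filled with structured data. The sketch below is invented for illustration; the company, figures and wording are hypothetical, and real systems such as the AP’s are far more sophisticated.

```python
# A minimal sketch of data-to-narrative news: structured figures go in,
# a readable sentence comes out.

TEMPLATE = (
    "{company} reported revenue of ${revenue}m for Q{quarter}, "
    "{direction} {change}% compared with the same quarter last year."
)

def write_report(data):
    """Fill the template with structured data to produce a news snippet."""
    direction = "up" if data["change"] >= 0 else "down"
    return TEMPLATE.format(
        company=data["company"],
        revenue=data["revenue"],
        quarter=data["quarter"],
        direction=direction,
        change=abs(data["change"]),
    )

filing = {"company": "Acme Corp", "revenue": 120, "quarter": 2, "change": -4}
print(write_report(filing))
# Acme Corp reported revenue of $120m for Q2, down 4% compared with the same quarter last year.
```

The “initial programming choices” Carlson mentions are exactly the template and the rules; change the data and the machine writes a new story.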
Robot reporting
You’d be surprised to hear how many news outlets are doing this. Associated Press was among the first to use AI to produce automated financial reports of public companies. Bloomberg News has done something very similar with its project Cyborg. Meanwhile, Forbes has built its own smart content management system called Bertie. This system recommends ways to make headlines more interesting, and even which images to use. And in Sweden, United Robots has been automating sports reports for years now.
There are, of course, natural concerns that automated journalism may cause newsrooms to shrink even further. But so far the opposite seems true. Humans still outclass machines in social intelligence and expert thinking, which means human reporting remains critical: you need writers to correct and spruce up automated content. Most media outlets using AI today say it complements their journalists’ work, giving them more space to focus on the stories that really matter.
Like anything in life, automated journalism has its pros and cons. It’s fast and well suited to short breaking news reports, such as announcing the magnitude of an earthquake. The LA Times does exactly this with its Quakebot. What’s more, algorithms can churn through large sets of data over lengthy periods. But, and this is a considerable but, finding enough rich data remains a major challenge. Automated content also lacks human, social and legal reasoning, and the writing is generally dull and straightforward.
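A Quakebot-style report can be sketched in a few lines. The event data and phrasing below are hypothetical, not the LA Times’s actual code, but they show why this kind of bot is fast and why the prose stays plain.

```python
# A Quakebot-style sketch: turn a structured earthquake alert into a short,
# automatically generated breaking news item.

def quake_report(event):
    """Convert one earthquake event record into a one-sentence news item."""
    return (
        f"A magnitude {event['magnitude']} earthquake struck "
        f"{event['distance_km']} km from {event['place']} at {event['time']}, "
        f"according to preliminary data. This post was generated automatically."
    )

event = {
    "magnitude": 4.2,
    "place": "Cape Town",
    "distance_km": 12,
    "time": "06:14",
}
print(quake_report(event))
```

The bot can publish seconds after the data arrives, but it can only say what the template allows, which is exactly the dull, straightforward writing described above.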
But this is just the start of something exciting.
Accountable algorithms
Spider-Man’s Uncle Ben once said: “With great power comes great responsibility.” The flip side to all of this is that algorithms have to be transparent, and the media accountable. Media organisations have a responsibility to reveal the data they use and how their algorithms arrive at their conclusions. This can take the form of a byline. But right now almost every media group does this differently.
While being transparent is one thing, journalists are now also investigating algorithms themselves. In 2016, ProPublica revealed how a risk assessment algorithm used in the US criminal justice system was biased against black defendants. Journalists have another duty: to spot bad algorithms and report errors.
For example, we spent some time on Google trying to find dodgy autocomplete sentences. It’s quite unbelievable that when you type the phrase ‘South Africa is a…’ into Google, the engine’s algorithmic autocomplete brings up: ‘South Africa is a dump’. In fact, three out of the top five recommended autocomplete sentences are negative, describing the country as a ‘mess’ and a ‘failed state’.
When you click on the autocomplete search ‘South Africa is a dump’, the top result offers up an article from online news publication My Broadband. The article was posted on 3 December 2018 and was penned by Staff Writer. The report lists only scandals and negative reporting. The reporting smacks of opinion, but nowhere does the supposed news article say so. The fact is that the article carries a dangerous SEO headline designed to drive website traffic. It’s clickbait. And the result? Google’s top search claims ‘South Africa is a dump’, thus creating a false perception.
Both My Broadband and Google bear the responsibility here. But Google has introduced a tool to report search predictions that violate its policies. And we’ve since reported the ‘South Africa is a dump’ autocomplete search.
South Africa’s algorithmic journey
For the most part, South Africa is switched on. According to the latest stats, 54% of South Africa’s population were using the internet during February 2019. And it’s no surprise that Facebook remains the most popular social media site, followed by LinkedIn and Instagram. While there are no figures to show us exactly how popular Google is, the annual search trends reveal the platform is used heavily. And News24 remains the country’s most visited website. But despite this growing base, South African media have yet to produce automated news.
“South African media generally haven’t even got their hands around using algorithms to present news that’s already written by other journalists,” says veteran tech journalist, Arthur Goldstuck. He tells us automated journalism is a “perilous exercise”, but adds it’s “fine if you simply want to produce basic facts that come out of a financial report”.
So if local media aren’t fully on board yet with algorithms, what about government? AI, Big Data and the Internet of Things appear to be new terms for government. But in January, the state did promise to bolster technology innovation by creating a legal and regulatory framework so that it can finally take on the fourth industrial revolution.
Meanwhile, though, the state-owned Council for Scientific and Industrial Research has been quietly hard at work on algorithms since 2014. The research group says it’s been “working on algorithms that can potentially be used to predict various forms of crime before it occurs by looking at crime statistics” and had “successfully predicted the outcome of the 2014 national and provincial elections to within one percent”.
Back to the future
“We’re not yet ready for the mainstream,” says Goldstuck. “[But] perhaps 10 years from now the technology will be ready to produce more chatty, colloquial and style-based and analytics based reporting through AI.”
Goldstuck believes we’ll get there eventually, but others argue that South Africa in general is simply not ready for AI.
For now, we can just imagine the possibilities for our local media industry.
With the country’s ever-changing multi-party democracy and the ongoing battle to keep the taps running and the lights on, our audiences are starving for up-to-date information.
Imagine a news app that informs us about elections or water shortages in our area, or one that tracks reported crime on any given street.
We have the data. We just need the know-how.
Graeme Raubenheimer is an award-winning journalist and TV news producer for e.tv’s new channel, Open News, in Cape Town. Twitter: @GraemeRauby