The Media Online
Future of TV: we’re putting new personalised features into shows using an ethical version of AI

by Philip Jackson
March 9, 2022
in Television

[Image: LightField Studios/Shutterstock]


“Look away now if you don’t want to know the score”, they say on the news before reporting the football results. But imagine if your television knew which teams you follow, which results to hold back – or knew to bypass football altogether and tell you about something else. With media personalisation, which we’re working on with the BBC, that sort of thing is becoming possible.

Significant challenges remain in adapting live production, but other aspects of media personalisation are closer at hand. Indeed, media personalisation already exists to an extent: think of BBC iPlayer or Netflix suggesting content based on what you’ve watched previously, or Spotify curating playlists you might like.

But what we’re talking about is personalisation within the programme. This could include adjusting the programme duration (you might be offered an abridged or extended version), adding subtitles or graphics, or enhancing the dialogue (to make it more intelligible if, say, you’re in a noisy place or your hearing is starting to go). Or it might include providing extra information related to the programme (a bit like you can access now with BBC’s red button).

The big difference is that these features wouldn’t be generic. They would see shows re-packaged according to your own tastes, and tailored to your needs, depending on where you are, what devices you have connected and what you’re doing.
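To make this concrete, here is a minimal sketch, in Python, of how a programme broken into segments might be re-packaged against a viewer profile. The field and segment names are entirely hypothetical; the BBC’s actual object-based media pipeline is far richer than this.

```python
from dataclasses import dataclass, field

@dataclass
class ViewerProfile:
    """Hypothetical viewer profile; field names are illustrative."""
    followed_teams: set = field(default_factory=set)
    prefers_abridged: bool = False
    noisy_environment: bool = False

def personalise(segments: list[dict], profile: ViewerProfile) -> list[dict]:
    """Re-package a programme's segments for one viewer."""
    output = []
    for seg in segments:
        # Hold back football results for teams the viewer follows.
        if seg["type"] == "football_result" and seg["team"] in profile.followed_teams:
            continue
        # Drop optional segments when an abridged cut is preferred.
        if profile.prefers_abridged and seg.get("optional"):
            continue
        # Flag dialogue enhancement when the viewer is somewhere noisy.
        seg = dict(seg, enhance_dialogue=profile.noisy_environment)
        output.append(seg)
    return output
```

The same segment list yields a different programme for each profile, which is the essence of re-packaging a show per viewer rather than serving one generic cut.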

To deliver new kinds of media personalisation to audiences at scale, these features will be powered by artificial intelligence (AI). AI works via machine learning, in which an algorithm performs tasks by drawing on patterns learned from the vast datasets used to train it.

This is the focus of a partnership between the BBC and the University of Surrey’s Centre for Vision, Speech and Signal Processing. Known as Artificial Intelligence for Personalised Media Experiences, or AI4ME, this partnership is seeking to help the BBC better serve the public, especially new audiences.

Acknowledging AI’s difficulties

The AI principles of the Organisation for Economic Cooperation and Development (OECD) require AI to benefit humankind and the planet, incorporating fairness, safety, transparency and accountability.

Yet AI systems are increasingly accused of automating inequality as a consequence of biases in their training, which can reinforce existing prejudices and disadvantage vulnerable groups. This can take the form of gender bias in recruitment, or racial disparities in facial recognition technologies, for example.

Another potential problem with AI systems is what we refer to as generalisation. The first recognised fatality from a self-driving car is an example of this. The car’s system had been trained on road footage, which likely captured many cyclists and pedestrians separately, and it failed to recognise a woman pushing her bike across a road.

We therefore need to keep retraining AI systems as we learn more about their real-world behaviour and our desired outcomes. It’s impossible to give a machine instructions for all eventualities, and impossible to predict all potential unintended consequences.

We don’t yet fully know what sort of problems our AI could present in the realm of personalised media. This is what we hope to find out through our project. But for example, it could be something like dialogue enhancement working better with male voices than female voices.
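Disparities of this kind can be caught early with a simple per-group audit during evaluation. The sketch below compares mean intelligibility scores of enhanced clips across speaker groups and flags large gaps; the group labels, scores and the 0.05 gap threshold are illustrative assumptions, not our project’s actual metrics.

```python
from statistics import mean

def audit_by_group(results: list[tuple[str, float]], threshold: float = 0.05) -> dict:
    """Compare mean scores per speaker group and flag large gaps.

    `results` pairs a speaker-group label with the intelligibility score
    a listener test (or automated metric) gave the enhanced clip.
    """
    by_group: dict[str, list[float]] = {}
    for group, score in results:
        by_group.setdefault(group, []).append(score)
    means = {g: mean(scores) for g, scores in by_group.items()}
    gap = max(means.values()) - min(means.values())
    return {"means": means, "gap": gap, "flagged": gap > threshold}
```

Run routinely as part of evaluation, a check like this turns a hidden bias into a visible number before the feature ever reaches audiences.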

Ethical concerns don’t always cut through to become a priority in a technology-focused enterprise, unless government regulation or a media storm demand it. But isn’t it better to anticipate and fix these problems before getting to this point?

[Image caption: The earlier we can confront AI engineers with any challenges, the sooner they can get to work. Rawpixel.com/Shutterstock]

The citizen council

Designing our personalisation system well calls for public engagement from the outset. This is vital for bringing a broad perspective into technical teams that may suffer from narrowly defined performance metrics, “group think” within their departments, and a lack of diversity.

Surrey and the BBC are working together to test an approach to bring in people – normal people, rather than experts – to oversee AI’s development in media personalisation. We’re trialling “citizen councils” to create a dialogue, where the insight we gain from the councils will inform the development of the technologies. Our citizen council will have diverse representation and independence from the BBC.

First, we frame the theme for a workshop around a particular technology we’re investigating or a design issue, such as using AI to cut a presenter out of one video so they can be inserted into another. The workshops draw out opinions and facilitate discussion with experts around the theme, such as one of the engineers. The council then consults, deliberates and produces its recommendations.

The themes give the citizen council a way to review specific technologies against each of the OECD AI principles and to debate the acceptable uses of personal data in media personalisation, independent of corporate or political interests.

There are risks. We might fail to adequately reflect diversity; there might be misunderstandings around proposed technologies, or an unwillingness to hear others’ views. What if council members are unable to reach a consensus, or begin to develop a bias?

We cannot measure what disasters are avoided by going through this process, but new insights that influence the engineering design or new issues that allow remedies to be considered earlier will be signs of success.

And one round of councils is not the end of the story. We aim to apply this process throughout this five-year engineering research project. We will share what we learn and encourage other projects to take up this approach to see how it translates.

We believe this approach can bring broad ethical considerations into the purview of engineering developers during the earliest stages of the design of complex AI systems. Our participants are not beholden to the interests of big tech or governments, yet they convey the values and beliefs of society.


Philip Jackson, Reader in Machine Audition, University of Surrey

This article is republished from The Conversation under a Creative Commons license. Read the original article.


Tags: advertising, AI, artificial intelligence, media, media personalisation, Philip Jackson, television

Philip Jackson

Philip Jackson is a Reader in Machine Audition at the University of Surrey’s Centre for Vision, Speech and Signal Processing (CVSSP), a lecturer in electronic engineering and an inaugural Surrey AI Fellow. An expert in machine listening, spatial audio and acoustical array signal processing, he directs research in object-based media, audio AI and audio-visual AI towards personalisation for producers to create exciting, high-quality user experiences.


Copyright © 2015 - 2023 The Media Online. All rights reserved. Part of Arena Holdings (Pty) Ltd
