The Media Online
After the ‘Facebook Files’, the social media giant must be more transparent

by Nicolas Suzor
May 29, 2017
in Digital

Most people on Facebook have probably seen something they wish they hadn’t, whether it be violent pictures or racist comments.

How the social media giant decides what is and isn’t acceptable is often a mystery. Internal content guidelines, recently published in The Guardian, offer new insight into the mechanics of Facebook content moderation.

The slides show the rules can be arbitrary, but that shouldn’t be surprising. Social media platforms like Facebook and Twitter have been around for less than two decades, and there is little regulatory guidance from government regarding how they should police what people post.

In fact, the company faces a significant challenge in trying to keep up with the volume of posted content and often conflicting demands from users, advertisers and civil society organisations.

It’s certainly cathartic to blame Facebook for its decisions, but the true challenge is to work out how we want our online social spaces to be governed.

Before we can have that conversation, we need to know much more about how platforms like Facebook make decisions in practice.

The secret work of policing the internet

Reportedly running to thousands of slides, the newly published guidelines add detail to the vague community standards Facebook shares with its users.

Most of the documents are training material for Facebook’s army of content moderators, who are responsible for deciding which content stays and which is removed.

Some of the distinctions seem odd, and some are downright offensive. According to the documents, direct threats of violence against Donald Trump will be removed (“someone shoot Trump”), but misogynistic instructions for harming women may not be (“to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”).

The Guardian’s Facebook Files explainer video.

The rules appear to reflect the scars of legal and public relations battles Facebook and other social media platforms have fought over the last decade.

The blanket rule against images of nude children had to be changed after Facebook controversially banned the famous image of Kim Phuc fleeing napalm bombing during the Vietnam War. After years of controversy, a specific procedure now exists so people can request the removal of intimate images posted without their consent.

Because these rules develop over time, their complexity is not surprising. But this points to a bigger problem: without good data about how Facebook makes such decisions, we can’t have informed conversations about what type of content we’re comfortable with as a society.

The need for transparency

The core problem is that social media platforms like Facebook make most decisions about what constitutes acceptable speech behind closed doors. This makes it hard to have a genuine public debate about what people believe should be allowable to post online.

As the United Nations’ cultural organisation UNESCO has pointed out, there are real threats to freedom of expression when companies like Facebook have to play this role.

When governments make decisions about what content is allowed in the public domain, there are often court processes and avenues of appeal. When a social media platform makes such decisions, users are often left in the dark about why their content has been removed (or why their complaint has been ignored).

Challenging these decisions is often extremely difficult. Facebook allows users to appeal if their profile or page is removed, but it’s hard to appeal the moderation of a particular post.

OnlineCensorship.org provides guidance to users about how to appeal content moderation decisions.
https://onlinecensorship.org/resources/how-to-appeal

To tackle the issue of offensive and violent content on the platform, Facebook says it will add 3,000 people to its community operations team, on top of its current 4,500.

“Keeping people on Facebook safe is the most important thing we do,” Monika Bickert, head of global policy management at Facebook, said in a statement. “We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”

But without good data, there is no way to understand how well Facebook’s system is working overall – it is impossible to test its error rates or potential biases.

Civil society groups and projects including Ranking Digital Rights, Article 19 and the Electronic Frontier Foundation’s OnlineCensorship.org have been advocating for more transparency in these systems.

Facebook and other social media companies must start listening, and give the public real insight and input into how decisions are made.

Nicolas Suzor, Associate professor, Queensland University of Technology

This article was originally published on The Conversation. Read the original article.

Image: The Guardian’s Facebook Files give a much-needed glimpse into how Facebook moderates content. (Source: www.shutterstock.com)

Tags: Facebook, Facebook Files, Facebook moderators, freedom of speech, hate speech, Nicolas Suzor, social media

Nicolas Suzor

Associate Professor Nicolas Suzor researches the regulation of networked society. He is a Principal Research Fellow in the Law School at Queensland University of Technology in Brisbane, Australia, and a Chief Investigator of QUT’s Digital Media Research Centre, where he leads a program of research on the regulation and governance of the internet and social media. Nic is an ARC DECRA research fellow, studying the regulation of internet intermediaries and social media platforms. His research examines the peer economy, the governance of social networks, digital copyright, and knowledge commons. Nic is also the Legal Lead of the Creative Commons Australia project and the deputy chair of Digital Rights Watch, an Australian non-profit organisation whose mission is to ensure that Australian citizens are equipped, empowered and enabled to uphold their digital rights.


Copyright © 2015 - 2023 The Media Online. All rights reserved. Part of Arena Holdings (Pty) Ltd