More than 200 prominent women signed an open letter calling for big tech companies to “prioritise the safety of women”.
Posted on the World Wide Web Foundation, the letter coincided with the Generation Equality Forum in Paris, which brought together governments, businesses, international organisations and civil society to make progress towards achieving gender equality. During the forum, Facebook, Google, TikTok and Twitter released commitments to improve the safety of women online, including providing easy navigation and access to safety tools and reducing the amount of abuse they see.
The letter emphasised the need for giving users greater control to manage their safety online, including who can interact with them and more choice on what, how and when they see content online. It also called for improving systems for reporting abuse.
The signatories include Julia Gillard, the former prime minister of Australia; Kathryn D. Sullivan, a US astronaut; Graça Machel, Nelson Mandela’s widow; and actors Ashley Judd, Thandiwe Newton and Emma Watson.
“The internet is the town square of the 21st century,” the signatories wrote. “It is where debate takes place, communities are built, products are sold and reputations are made. But the scale of online abuse means that, for too many women, these digital town squares are unsafe. This is a threat to progress on gender equality.”
Why do we need to address safety for women online?
The letter cited a 2020 Economist study that found that “38% of women globally have directly experienced online abuse.” According to the study, 85% had witnessed online violence against women. Younger women were more likely to report experiencing online abuse.
The form of online abuse varies, from cyber harassment to hate speech to impersonation to misinformation and defamation, according to the study.
The Covid-19 pandemic has increased the potential for abuse, as people have spent more time online during lockdowns.
The World Economic Forum’s Advancing Digital Safety: A Framework to Align Global Action report surveyed 50 experts from academia, civil society, government and business and found that 75% agreed or strongly agreed that platforms do not act adequately on harmful content. The challenge is greater where the definition of harm is less clear: 90% said such content is handled somewhat or highly ineffectively.
What can we do to increase online safety?
According to the Forum report, employers, advertisers, regulators and digital platforms all have important roles to play in ensuring a safe online environment. Employers can work to secure devices and train employees, advertisers can ensure safety guidelines and take more nuanced approaches to keyword usage and ad placement, and regulators can step up their efforts around monitoring and response.
Digital platforms can enable users to make more informed choices about content, increase cross-platform collaboration to counter online harassment, make safety a leadership priority, and address contracting and work environment issues with the content moderation workforce.
“When it comes to harmful content, today the metrics reported on by platforms focus largely on the absolute number of pieces of content removed, which do not provide an adequate measure of safety according to a user’s experience,” says Cathy Li, head of media, entertainment and sport industries at the World Economic Forum.
“And it certainly does not reflect the important nuance that certain groups – based on their gender, race, ethnicity and other factors, in this case women particularly – may be more exposed to harmful content,” she continues. “This is why we recommend a user‑centric framework which advocates complementing current measures with metrics that focus on the user experience of safety and work to improve it over time to advance digital safety.”
The Forum’s new Global Coalition for Digital Safety will bring together public and private leaders to share best practices for online safety regulation, take action to reduce the risk of harmful content and collaborate on programs to increase digital media literacy.
As the letter’s signatories wrote: “Imagine what you can achieve if you follow through on commitments to build safer platforms: an online world where a journalist can engage with feedback on her reporting, not assassinations of her character. Where a politician may read complaints about her policies, but not threats of rape and murder. Where a young woman can share what she wants to on her terms, knowing there are systems to keep her safe and hold harassers accountable.”
Republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License.
Kirsten Salyer is opinion editor at the World Economic Forum. She has more than a decade of experience in communications, public engagement strategy, editing, writing, reporting, ghost-writing, project management, and developing and managing digital strategies for websites, news publications, magazines, and books.