As politicians move closer toward passing new laws on digital content, social media platforms are introducing their own independent panels to tackle content moderation. With the conversation around who should decide what we see online still ongoing, TenEighty explores the debate over the right way to moderate online material.
“I’ve increasingly come to believe that Facebook should not make so many important decisions about free expression and safety on our own,” CEO Mark Zuckerberg writes in a note about content issues on the platform.
It was published in 2018, a year after the Cambridge Analytica scandal first surfaced and a report by the New York Times found Russian interference had reached over 126 million Facebook users. “The past two years have shown that without sufficient safeguards, people will misuse these tools to interfere in elections, spread misinformation, and incite violence,” Zuckerberg says.
The blogpost paved the way for the platform’s new Oversight Board. Launched in May, the panel, made up of 20 members from across the world, will “exercise independent judgment over some of the most difficult and significant content decisions” on the social media platform.
“I believe independence is important for a few reasons,” continues Zuckerberg. “First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.”
Facebook says the Oversight Board’s membership will rise to 40 people in time, with the initial 20 members experienced in areas such as press freedom, digital rights, religious freedom and internet censorship.
Those involved in the board have lived in over 27 countries, with the United Kingdom’s representative being Alan Rusbridger, former Editor-in-Chief of The Guardian.
“Facebook is an entity that defies description,” Rusbridger writes in a Medium article titled Why I’m Joining Facebook’s Oversight Board. “It is a friend of the otherwise voiceless — but also an enabler of darkness. It brings harmony to some, discord to many. It promotes order and amplifies anarchy.
“It employs many brilliant engineers but has — too slowly — recognized that the multiple challenges it faces involve the realms of philosophy, ethics, journalism, religion, geography, and human rights. And it makes a whole lot of money, and a whole lot of enemies, while doing this.
“To address this,” he continues, “it needs independent, external oversight.”
“Facebook is an entity that defies description. It is a friend of the otherwise voiceless — but also an enabler of darkness. It brings harmony to some, discord to many.”
While the board won’t start reviewing cases until later in the year, other social media platforms soon followed suit with their own independent panels.
Just over a week after Facebook’s announcement, streaming site Twitch revealed the formation of its Safety Advisory Council. The panel, made up of online safety experts and streamers, “will inform and guide decisions made by Twitch”.
The blogpost goes on to list some of the areas the group will focus on, including drafting new policies, developing features to improve safety and moderation, and spotting emerging trends that could “impact the Twitch experience”.
A week later, CEO Emmett Shear clarified in a follow-up post: “The role of the council is to advise, offer perspective, and participate in discussions with our internal teams pertaining to the work we do to help keep our community safe and healthy.”
“We said in our blog post we were looking for the council to advise on a variety of things, from advising on new policies to promoting healthy streaming habits, but we could have been clearer about tasks the council will not be involved with,” Shear continues. “Council members will not make moderation decisions, nor will they have access to any details on specific moderation cases.
“They are not Twitch employees, and they do not speak on Twitch’s behalf,” he concludes.
Away from the older social media platforms, independent councils have also been explored by TikTok, the mobile app formerly known as Musical.ly. The platform first announced its intention to create a board of experts “to advise on and review content moderation policies” in October last year.
Five months later, in March 2020, the membership of the council was revealed, with Dawn Nunziato, an expert in free speech and content regulation from the George Washington University Law School, chairing the committee.
“A company willing to open its doors to outside experts to help shape upcoming policy shows organizational maturity and humility,” says Nunziato. “I am working with TikTok because they’ve shown that they take content moderation seriously, are open to feedback, and understand the importance of this area both for their community and for the future of healthy public discourse.”
“A company willing to open its doors to outside experts to help shape upcoming policy shows organizational maturity and humility.”
Yet not every social media platform has made the move to a more independent approach to content moderation. While YouTube’s Self-Certification Programme offers creators the opportunity to assess their own videos against the site’s monetisation policies, the decision around what stays up on YouTube remains the job of its algorithm and review teams.
“We have Community Guidelines that set the rules of the road for what we don’t allow on YouTube,” the platform explains in a report on enforcement. “For example, we do not allow pornography, incitement to violence, harassment, or hate speech.
“We rely on a combination of people and technology to flag inappropriate content and enforce these guidelines,” they go on to add. “Flags can come from our automated flagging systems, from members of the Trusted Flagger program (NGOs, government agencies, and individuals) or from users in the broader YouTube community.”
A social media platform making decisions about what content should be visible can, however, raise concerns. In an interview with YouTube’s CEO Susan Wojcicki, touching on content moderation during the current coronavirus pandemic, creator Hank Green explains that it is “so much power for an organisation to have” in terms of “figuring out where the line is”.
“It is just a very big and powerful institution, and it is a lot of responsibility to have sort of collected in one place,” Hank continues. “As much as I agree with the decisions you’re making, I also worry that it sort of feeds into the conspiracy mindset for one organisation to have that level of influence.”
Wojcicki responds by talking about other platforms. “Well, we definitely see that it’s a pretty competitive landscape,” she says. “That’s what we see from our perspective.
“We’ve seen Facebook being very aggressive in video, not just on the Facebook properties, but Instagram, for example,” Wojcicki continues. “We also see emerging players like TikTok, that’s new, that’s just come out of nowhere – well, it came out of Musical.ly.
“We see this as a very competitive space and there are many opportunities to have their content distributed,” she concludes.
While YouTube continues to refine its in-house content moderation systems, the CEO of Twitter has floated the idea of adopting a more decentralised model – essentially, a network which isn’t managed by one individual, business or server.
“Twitter is funding a small independent team of up to five open source architects, engineers, and designers to develop an open and decentralized standard for social media,” Jack Dorsey announces in a thread tweeted in December. “The goal is for Twitter to ultimately be a client of this standard.”
At present, Twitter is controlled by a private company. Its CEO went on to reveal that for reasons which were justifiable at the time, the business “took a different path” and made the platform “increasingly centralised”.
However, he goes on to add that a lot has changed over the years. “First, we’re facing entirely new challenges centralized solutions are struggling to meet,” Dorsey says. “For instance, centralized enforcement of global policy to address abuse and misleading information is unlikely to scale over the long-term without placing far too much burden on people.
“Second,” he continues, “the value of social media is shifting away from content hosting and removal, and towards recommendation algorithms directing one’s attention.
“Third, existing social media incentives frequently lead to attention being focused on content and conversation that sparks controversy and outrage, rather than conversation which informs and promotes health.”
His final point is that technologies have emerged which make a decentralised model “more viable”. The team of open source experts has since been named Bluesky, and its latest update, from March, is that the group of people it has brought together is looking to produce a “decentralization ecosystem review”.
“What remains to be seen is whether Twitter’s decentralisation proposal will encompass the guarantees and safeguards necessary to ensure the protection of users’ rights, especially freedom of expression,” the human rights organisation Article 19 writes in a blogpost. “In order for any proposal to fully address the harms of centralised content moderation, it must be approached in a way that is truly independent from the control of powerful companies, applies sector-wide, addresses the relevant market failure, and observes human rights standards.
“While decentralisation has a number of merits when it comes to addressing the various challenges raised by content moderation, it is not necessarily sufficient to solve all problems,” the blog post continues. “For example, decentralisation may render it more difficult to implement common standards on how to respond to illegal content, or to avoid the spread of disinformation.”
Article 19 say decentralisation is, as a concept, “an exciting potential development”, especially when it comes to reducing the issues caused by the “centralised enforcement of content moderation”. They do, however, think this decentralisation can be secured in a more comprehensive way – one known as unbundling.
“Unbundling refers to the separation of the hosting and content moderation services performed by dominant social media platforms,” they explain. “Through unbundling these two services, dominant social media platforms would still be able to moderate the content on their platforms, but they would be also obliged to allow competitors to provide competing content moderation services on their platforms.”
Article 19 go on to add that ‘unbundling’ relies on regulatory bodies taking action “to ensure consumers benefit from a fair, open, and competitive marketplace”, as opposed to initiatives set up by private companies.
“Regulatory authorities should lower the barriers to market entry and allow various competitors to provide content moderation services,” says Article 19. “New providers would be then able to compete with the existing dominant players for users on the basis of the quality of service they offer, which should include the level of rights protections they offer.”
Yet the regulation of social media and the relationship between lawmakers and large digital platforms has always been tricky. When an international grand committee on fake news called on Mark Zuckerberg to give evidence in 2018, the Facebook CEO was empty-chaired, with the platform’s Vice President of Policy Solutions, Richard Allan, responding to questions from lawmakers instead.
More recently, US President Donald Trump signed an executive order on ‘preventing online censorship’, which aims to make changes to Section 230 of the Communications Decency Act 1996, the provision protecting ‘interactive computer service’ providers from civil liability when they restrict access, in good faith, to content considered “obscene, lewd, lascivious […] or otherwise objectionable”.
“When large, powerful social media companies censor opinions with which they disagree, they exercise a dangerous power,” the document reads. “They cease functioning as passive bulletin boards, and ought to be viewed and treated as content creators.
“Twitter, Facebook, Instagram, and YouTube wield immense, if not unprecedented, power to shape the interpretation of public events,” it continues, “to censor, delete, or disappear information; and to control what people see or do not see.”
The executive order comes after Twitter placed labels on two tweets from the President’s account relating to mail-in ballots, citing its civic integrity policy.
Three days later, Twitter went on to issue a notice on another tweet by the President – also shared on the White House’s account – saying the post violated its rules around glorifying violence, but that “it may be in the public’s interest for the tweet to remain accessible” to users.
The same day, President Trump called for Section 230 to be revoked by Congress.
“Government-led regulation of free speech is nearly always problematic,” writes Rusbridger in his blogpost about joining Facebook’s Oversight Board. “Regulating such an entity, which operates in all but a handful of countries in the world, is extremely complex. Add in the scale of the [Facebook] platform – with more than 2 billion monthly users – and it’s little wonder that there have been few quick fixes.”
“Government-led regulation of free speech is nearly always problematic. Regulating such an entity, which operates in all but a handful of countries in the world, is extremely complex.”
However, Article 19 is urging caution against assuming that Facebook’s alternative, the Oversight Board, can solve the “multitude of human rights issues” presented by the platform. “For a start, its mandate and powers are inherently limited,” they explain. “Broader issues around the impact of social media on free expression, such as Facebook’s business model, content promotion or downranking, and at least for now, political advertising, are out of bounds.”
“Equally, the FOB is designed specifically only to address content moderation issues on Facebook and its other platform Instagram,” Article 19 continues. “Beyond its limited remit, the FOB is inevitably a means for Facebook to acquire the legitimacy it direly needs and to show lawmakers that self-regulation à la Facebook can work.”
Whether or not that is Facebook’s intention with the independent panel, content moderation remains on the political agenda. In the UK, the Government has drafted proposals on tackling what it describes as ‘online harms’, including placing a ‘statutory duty of care’ on companies to keep their users safe.
“While some companies have taken steps to improve safety on their platforms, progress has been too slow and inconsistent overall,” reads the foreword to the white paper. “If we surrender our online spaces to those who spread hate, abuse, fear and vitriolic content, then we will all lose.”
The introduction continues: “This White Paper therefore puts forward ambitious plans for a new system of accountability and oversight for tech companies, moving far beyond self-regulation.
“A new regulatory framework for online safety will make clear companies’ responsibilities to keep UK users, particularly children, safer online with the most robust action to counter illegal content and activity,” it says.
As social media companies launch independent panels to tackle content moderation and the many issues it can present, reassuring politicians that they are taking the right approach is only part of the task. In a data- and algorithm-conscious world, where the UK public’s trust in social media as a source of general news and information continues to fluctuate, these platforms must also convince their users that they’re doing the right thing.
Want More?
Read our full report on Facebook’s new Oversight Board, or find out more about Twitch’s Safety Advisory Council.
- Experiential Potential: How YouTubers And Brands Are Marketing Emotions
- Lemgthbook: Exploring Meme Dialect And The Forbiddem Glyph
- Vloggers And The Machine: Who’s Got The Love?
- Musical.ly Speaking: How TikTok is Keeping On-Track with Music and Copyright
- Disability On YouTube: Growing Representation Within The Digital Sphere
For updates follow @TenEightyUK on Twitter or like TenEighty UK on Facebook.