In these uncertain times, online systems are now becoming more automated – including on YouTube. With creators already sceptical of machines, TenEighty explores what this means for those looking to make a living.
“I know a lot of people blame and throw around this term a lot, but I want to make it explicitly clear in this video what it means,” Eugene Lee Yang of The Try Guys says during his interview with YouTube CEO Susan Wojcicki in December 2019.
He’s referring to the YouTube algorithm.
“The algorithm is a complicated set of machine learning that takes in billions and billions of pieces of data,” Susan replies. “But let me basically say that the algorithm, in many ways, is a proxy for estimating what your audience wants.”
It is a proxy which has faced some criticism in recent years. In 2017, disabled creators such as Jessica Kellgren-Fozard and Connor Ward called on YouTube to revise its ad policy after videos on their disabilities were demonetised by the platform’s systems.
Last year, a group of LGBTQ+ YouTubers launched a lawsuit against the social media company, claiming they had faced discrimination based on their identities and orientations.
As the coronavirus pandemic continues, the YouTube algorithm also affects videos on the topic, with content previously being demonetised for mentioning the disease under the platform’s “sensitive events” policy.
An update soon followed, with Susan writing in a blog post in March that “it’s becoming clear [coronavirus] is now an ongoing and important part of everyday conversation”.
“We want to make sure news organizations and creators can continue producing quality videos in a sustainable way,” she writes. “In the days ahead, we will enable ads for content discussing the coronavirus on a limited number of channels, including creators who accurately self-certify and a range of news partners.
“We’re preparing our policies and enforcement processes to expand monetization to more creators and news organizations in the coming weeks.”
While YouTube is opening up monetisation of coronavirus content, creators are still feeling the pressure when it comes to their own revenue on the platform.
Hank Green, who runs several channels including vlogbrothers and SciShow, asked creators on Twitter to share their CPM (cost per mille) figures. The sum tells YouTubers how much an advertiser pays for 1,000 adverts displayed – or impressions – on their content.
Hank reported that his CPM had fallen by 28% to $4.75, a figure he says is the lowest for his network since January 2013.
YouTubers, please share you CPM! This is our whole network over the last 28 days. Our lowest since January 2013, but I assume we've dropped less than many. pic.twitter.com/8WcVyPxSdY
— Hank Green (@hankgreen) April 15, 2020
Other creators, including James Charles, revealed their CPM rates had dropped by 10-30%, with the American beauty YouTuber tweeting to say his number had decreased by 20%.
Meanwhile, Rosianna Halse Rojas was one of the few creators to report a rise in their CPM – an increase of 26%, which she says is probably because she is uploading videos again.
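To put those CPM figures in context, here is a minimal sketch of how the metric translates into gross ad revenue. The function name and the impression count are purely illustrative, and real YouTube payouts also depend on revenue share and on which views actually carry ads:

```python
def cpm_revenue(impressions, cpm):
    """Estimated gross ad revenue: CPM is the price per 1,000 impressions."""
    return impressions / 1000 * cpm

# Hank Green's reported figure: $4.75, after a 28% fall.
new_cpm = 4.75
old_cpm = new_cpm / (1 - 0.28)  # implied pre-drop CPM, roughly $6.60

# With a hypothetical 2 million monthly ad impressions:
before = cpm_revenue(2_000_000, old_cpm)  # revenue before the drop
after = cpm_revenue(2_000_000, new_cpm)   # revenue after the drop
print(round(before, 2), round(after, 2))
```

On those made-up numbers, a 28% CPM drop takes monthly ad revenue from roughly $13,194 to $9,500 — which is why creators watch the figure so closely.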
While CPM rises and falls, it is not, of course, the only way for today’s YouTubers to generate an income. Alongside merchandising and channel memberships, creators have also benefitted from Patreon, which sees fans pay a regular fee to access exclusive content.
“Social networks track and sell their audiences to third parties, turning the privacy of the creators’ biggest fans into a commodity,” Patreon writes in a blog post. “Opaque and ever-changing algorithms interfere with the artist/fan relationship, leaving creators to shout into the void, or pay to ‘boost’ their content to their own audiences.
“While social networking giants say they value creators — and their platforms would not exist without them — all evidence tells a different story: it’s not so much that they’re working against creators, but that they’re working for someone else.”
The post goes on to add: “Instead of building tools to help creators, platforms innovate for advertisers. From machine learning that lumps sex education with pornography to unintentional censorship in order to keep advertisers happy, companies routinely make decisions at the expense of the very people who fuel their platforms.”
As payments on YouTube fluctuate, the business model of Patreon has appealed to a fair number of creators. In the first three weeks of March, the platform saw 30,000 new sign-ups from creators.
Data released by Patreon also revealed that “average new patron growth across the US, UK, Canada, Germany, Australia, and Italy is up 36.2% compared to February”, something which the service says is “an indication that fans are turning to Patreon to support creators during this tough time”.
However, this isn’t to say that YouTube’s algorithm and machines don’t have their uses. Earlier this month, the Creator Insider channel detailed plans to allow creators to find out when their audience is on YouTube.
Explaining the tool, Tom, from the YouTube Analytics Engineering team, says they’re reluctant to say whether using it to schedule your uploads or plan publish times has any long-term benefits.
“We don’t really have the data confidence to say that,” he admits. “However, this can help you engage with your community, so you can schedule a post at this time, you can try to engage with comments or your community in other ways.”
In amongst the automation and data, there are features from which creators can benefit. Alongside knowing when audiences are active, YouTube has started testing a tool which allows channels to self-certify their videos against their advertiser-friendly guidelines.
“The goal is to reduce errors from our automated reviews and help your videos get monetised more accurately from the start,” says Tim, a YouTube employee, in a video on their channel for creators.
The announcement comes with an admission from YouTube that “sometimes, these automated systems can make mistakes”, and it isn’t the first time that this problem has been acknowledged.
In response to the ongoing pandemic, the platform revealed last month that its approach to enforcement of YouTube’s Community Guidelines – typically involving both machine learning and human reviewers – would temporarily “start relying more on technology to help with some of the work normally done by reviewers”.
“This means automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem, while we have workplace protections in place,” the blog post reads. “As we do this, users and creators may see increased video removals, including some videos that may not violate policies.”
When the news was shared on Twitter, Seán McLoughlin (better known as Jacksepticeye) described the tweet as ominous: “To be at the mercy of a system that you admit does not work.”
“Don’t get me wrong, I’m glad you’re letting staff stay home and isolate to keep them safe, but this will scare a lot of people,” he replies.
In a video released a few days later addressing concerns, YouTube’s Creator Liaison, Matt Koval, explains: “Just like we have engineers who have to physically plug in to special systems to do their jobs, it’s the same with video reviewers, who are located around the world.”
“YouTube’s systems are amazing at detecting patterns across enormous amounts of content, but they have a hard time understanding nuances – you might agree – like judging whether a news piece or a parody video is in violation,” Matt goes on to add, responding to a question around why automated systems make so many mistakes. “That’s why we also built our systems to include expert reviewers, to bring in human judgement for more complicated content.”
However, coronavirus has, understandably, changed things. In a time of Zoom calls, live streams and more, interactions between people – and with online systems – have become increasingly automated and digitised. Processes which may once have been almost perfectly balanced between human and machine have now shifted, temporarily, toward the latter.
Now, concerns from creators about how far machine learning can go in assessing human content remain ongoing, while YouTube concedes that its systems are not completely accurate when it comes to nuance. Questions may be asked about whether that power dynamic between human and algorithm needs to change once more, but they may not be answered for a while.
Want more?
Read more about YouTube’s plans to allow creators to see when their audience is online, or our investigation into the role music plays on TikTok and the challenges the platform faces in the future.
- Love In Lockdown: Online Dating In The Coronavirus Era
- YouTube Testing “Chapters” Feature for Longer Videos
- How Coronavirus Is Changing Social Media
- TikTok Bans Direct Messaging for Under-16s
For updates follow @TenEightyUK on Twitter or like TenEighty UK on Facebook.