Users will be able to select up to 25 comments to delete at a time, as well as pin favourite comments to “amplify and encourage positive interactions”.
The tools were two of the new features unveiled by Instagram, with the platform also rolling out controls that let users choose who can tag or mention their account in a comment, caption or story.
Today we’re sharing new ways to control your Instagram experience. ❤️
Now you can delete multiple comments and block or restrict multiple people at once. You can also manage who tags or mentions you on Instagram.
— Instagram (@instagram) May 12, 2020
In a blog post, Instagram said the tools mark “the continuation of our effort to lead the industry in the fight against online bullying”.
“We know it can feel overwhelming to manage a rush of negative comments, so we’ve been testing the ability to delete comments in bulk, as well as block or restrict multiple accounts that post negative comments.
“Early feedback has been encouraging, and we’ve found that it helps people, especially with larger followings, maintain a positive environment on their account,” they said.
Pinning and bulk-deleting comments, along with blocking multiple users, are the latest tools announced by Instagram, with the social media site having rolled out a ‘Support Small Business’ sticker on Monday.
The company’s report on community standards revealed that Instagram took action on 1.5 million pieces of content relating to bullying and harassment in each of the last quarter of 2019 and the first quarter of 2020.
Meanwhile, 1.3 million pieces of content relating to suicide and self-injury were actioned in the first three months of 2020, an increase of 40% on the 896.8k pieces of content actioned in the final quarter of last year.
The proactive rate for this form of content, which refers to the share of content removed before it is reported by users, also went up by more than 12 percentage points since the publication of the last report.
Facebook went on to add that the increase follows improvements to its “text and image matching technology”.
The data is the latest insight into the platform’s move to take down content promoting self-injury and suicide, following public outcry last year over the death of teenager Molly Russell.
In its report, Facebook said views of this content are “very infrequent”, with material removed before users are able to see it, meaning they do not find “enough violating samples to precisely estimate” its prevalence.
“In these cases, we can estimate an upper limit of how often someone would see content that violates these policies.
“In Q1 2020, this upper limit was 0.05% — meaning that for each of these policies, out of every 10,000 views on Facebook or Instagram in Q1 2020, we estimate that no more than five of those views contained content that violated that policy,” they said.
Read about Twitter’s new warnings for tweets containing misinformation, or take a look at Instagram’s ‘Support Small Business’ sticker launched this week.