Facebook has introduced new comment moderation, reporting and appeal tools to give people more control over unwanted, offensive or hurtful experiences on the platform. Users can now hide or delete multiple comments at once from the ‘options’ menu of their post.
‘This feature is rolling out on desktop and Android and will be available on iOS in the coming months. We are also testing ways to more easily search for and block offensive words from appearing in comments,’ said Antigone Davis, Global Head of Safety, Facebook.
Also, if you see a friend or family member being bullied or harassed, you can report the post on their behalf via the menu above it. Once a post is reported, Facebook’s Community Operations team will review it, keep the report anonymous, and determine whether it violates the platform’s Community Standards.
Users can also request a second review if Facebook declines to take down the reported piece of content.
The company also recently announced a partnership with the National Parent Teacher Association in the US to facilitate 200 community events in cities across every state, addressing tech-related challenges faced by families, including bullying prevention.
In India, Facebook supports a program that has educated tens of thousands of young people about online safety, thoughtful sharing, and privacy and security.
‘We know our job is never done when it comes to keeping people safe, and we’ll continue listening to feedback on how we can build better tools and improve our policies,’ added Davis.
Facebook has also fixed the vulnerability that hackers used last week to gain access to 50 million user accounts. The company reset access tokens for a total of 90 million accounts: the 50 million whose tokens were stolen, plus a further 40 million that had been subject to a ‘View As’ look-up in the past year.