Researchers point out that Facebook's content moderation is 'grossly inadequate': what should be improved to build a sound platform?



Facebook has been strongly criticized both inside and outside the company for neither deleting nor labeling a post by President Donald Trump suggesting the use of force, in which he said, 'If looting begins, shooting will begin,' in response to the protests triggered by the death of George Floyd, a Black man who died in police custody. Meanwhile, researchers at the New York University Stern Center for Business and Human Rights point out that Facebook's content moderation efforts are 'grossly inadequate.'

Tech-Content Moderation June 2020 — NYU Stern Center for Business and Human Rights
https://bhr.stern.nyu.edu/tech-content-moderation-june-2020

NYU study: Facebook's content moderation efforts are 'grossly inadequate' | VentureBeat
https://venturebeat.com/2020/06/07/nyu-study-facebooks-content-moderation-efforts-are-grossly-inadequate/



Regarding Trump's post suggesting the use of force, Twitter displayed a warning label on the grounds that it was 'glorifying violence,' while Facebook applied no label at all. This response drew criticism from Facebook's own employees, and Mark Zuckerberg held a large-scale online meeting with employees to discuss whether the platform should censor content.

Facebook CEO Zuckerberg holds a large-scale meeting with employees to discuss platform regulations and policies - GIGAZINE



Against the backdrop of these issues, the research team at the New York University Stern Center for Business and Human Rights published a research report on Facebook's content moderation. The report points out that many major social networking platforms struggle with content moderation.

For example, about 10,000 people moderate content at YouTube and Google, and about 1,500 at Twitter, while Facebook has roughly 15,000 moderators reviewing content. Facebook thus has a comparatively large number of moderators, and it also works with 60 news organizations to fact-check information. Even so, the report states, 'These numbers may seem quite high, but they are significantly inadequate given the amount of information disseminated on the site.'

As for why Facebook's content moderation is not working well, the research team believes the company's relentless focus on growth lies behind it. Zuckerberg's early motto as CEO was 'company over country,' and Facebook has been intensely focused on expanding its scale. However, the report's lead author, Paul Barrett, says Facebook has lacked a strategy for scaling its content moderation in parallel with that growth.



Facebook's content moderation works by having content flagged first by users and artificial intelligence systems, then reviewed by moderators. Roughly 3 million pieces of content are flagged per day, and with the moderators' error rate reported to be about 10%, Facebook makes erroneous moderation decisions on roughly 300,000 pieces of content every day.
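
As a back-of-the-envelope illustration of that arithmetic (a minimal sketch using only the figures quoted above, not Facebook's actual pipeline), the scale of daily errors follows directly from the flag volume and the error rate:

```python
# Rough estimate of daily moderation errors from the figures in the report.
# These constants are the numbers quoted above, not measured values.
FLAGGED_ITEMS_PER_DAY = 3_000_000  # items flagged by users and AI systems
MODERATOR_ERROR_RATE = 0.10        # reported ~10% error rate per decision

errors_per_day = FLAGGED_ITEMS_PER_DAY * MODERATOR_ERROR_RATE
print(f"Estimated erroneous decisions per day: {errors_per_day:,.0f}")
# Output: Estimated erroneous decisions per day: 300,000
```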

If content on Facebook is not properly moderated, the platform becomes flooded with spam, bullying, racist and violent material, and child sexual abuse content, driving many users away. Yet even though moderation by human moderators underpins Facebook's usability, Barrett points out that the work is essentially outsourced.

The report points out that content moderators generally work for lower pay than regular employees at headquarters, and that their working conditions are poor. In addition, many moderators are hired in developing countries where labor costs are low, but when moderators review another country's content from a physically remote location, there is a risk that genuinely dangerous content will be overlooked.



To solve these problems, Barrett proposes the following eight improvements.

1: Bring content moderation in-house, with adequate pay and benefits for moderators.
2: Double the number of content moderators.
3: Appoint a senior executive to oversee content moderation.
4: Invest in at-risk countries in Asia and Africa where dangerous content is more likely to be posted, building teams that work in the local languages.
5: Provide high-quality on-site medical care to address moderators' health problems.
6: Support academic research into the health risks of content moderation.
7: Cooperate with government regulation addressing the spread of harmful content.
8: Expand fact-checking to curb the spread of fake news.

Although such measures are costly, Facebook has shown a willingness to fight dangerous content, for example by working to stop the spread of fake news about the novel coronavirus, so Barrett is hopeful that its content moderation will improve. Even regarding the ambitious proposal to end moderator outsourcing, Barrett said, 'I think Facebook may be ready to move in this direction.'



In addition, a group of current and former Facebook moderators jointly issued a statement on the protests against anti-Black discrimination spreading across the United States. The moderators, whose work aims to rid Facebook of violence and hate, declared their solidarity with the Black community in America. Supporting the employees who protested Zuckerberg's refusal to act on President Trump's remarks, they argued that Facebook should listen to the voices of its Black employees and users.

This is a message of solidarity from a group of current and former content moderators at Facebook.
https://medium.com/@fbcontentmods/this-is-a-message-of-solidarity-from-a-group-of-current-and-former-content-moderators-at-facebook-6af1b3b2a020

Facebook moderators join criticism of Zuckerberg over Trump stance | Technology | The Guardian
https://www.theguardian.com/technology/2020/jun/08/facebook-moderators-criticism-mark-zuckerberg-donald-trump



in Web Service, Posted by log1h_ik