Facebook puts groups with many posts that violate community rules on 'probation'

Facebook and other social networking companies have been monitoring the spread of fake news. Ahead of the 2020 US presidential election, Facebook rejected 'ads that interfere with voting' and deleted posts that urged poll watching in militarized terms. It is now reported that Facebook has introduced a kind of 'probation' for groups with many posts that violate its content rules.

Facebook is cracking down on problematic groups with new rule - The Washington Post

Facebook says it will put groups on probation for violating its content rules - The Verge

Facebook has taken various steps in connection with the 2020 US presidential election to prevent the spread of misinformation and inciting statements. Even after the vote ended, Trump supporters created groups alleging voter fraud by the Democratic Party, and Facebook has been removing such groups one by one.

Facebook group of Trump supporters urging 'Democrats stop voting fraud' is removed - GIGAZINE

Photo by www.shopcatalog.com

According to a new report by The Washington Post, Facebook now requires groups in which posts violating community standards are frequently found to have their administrators or moderators manually approve every member post before it appears. The requirement applies to groups about political and social topics, whether public or private.

Facebook spokesperson Leonard Lam said, 'We are temporarily requiring admins and moderators of some political and social groups in the United States to manually approve all posts when there have been many community-standards violations by members of the group. The steps we are taking now are meant to protect people during this unprecedented time.'

Manual post approval by administrators and moderators will continue for 60 days; groups cannot appeal the decision to Facebook, nor can they disable the approval system on their own.

Facebook will also closely monitor how group admins and moderators handle posts during the 60 days, and may shut down a group entirely if problematic posts are repeatedly approved even after manual approval has been introduced. In short, the sequence of actions amounts to a kind of 'probation' for the group, requiring admins and moderators to take more responsibility for what is posted within it.

The Washington Post noted, 'Facebook and other social media companies have long relied on unpaid moderators to police much of what is posted within groups. In Facebook's case, a combination of artificial intelligence and professional moderators is used to find problematic content, but the more subtle decisions are left to volunteers within each group,' pointing out that group moderators play a major role in preventing the spread of misinformation.

This measure places even more responsibility on moderators, but the burden of manually approving a large volume of posts is very heavy. A Facebook group for residents of Aberdeen, Washington that was put 'on probation' under this measure has as many as 7,000 members discussing local events, businesses, and issues. Deb Blecha, who has managed the group for 10 years, is considering closing it because the manual approval process is too heavy a burden.

in Web Service, Posted by log1h_ik