Summary of the major problems revealed in the 'Facebook Papers,' internal Facebook documents leaked by a whistleblower



In September 2021, the Wall Street Journal reported, based on internal documents leaked by former Facebook employee Frances Haugen, that Facebook was aware of internal findings that Instagram is harmful to teenage girls but concealed them. These leaked documents have come to be known as the 'Facebook Papers,' and various media outlets have been combing through them for relevant information.

The Facebook Files - WSJ
https://www.wsj.com/articles/the-facebook-files-11631713039

Eight things we learned from the Facebook Papers - The Verge
https://www.theverge.com/22740969/facebook-files-papers-frances-haugen-whistleblower-civic-integrity

The Verge has picked out and summarized eight notable findings from the Facebook Papers.

◆ 1: Facebook was not fully prepared to handle comments about COVID-19 vaccines
According to a document dated March 2021, Facebook was not adequately prepared for misleading vaccine comments posted on COVID-19-related content. By March, Facebook had formed a team to handle pandemic information and had built a system that automatically flags misinformation, but the system was not fully functional: misinformation appeared in roughly one-fifth of the comments on vaccine-related posts.

Based on its reading of the documents in the Facebook Papers, The Verge wrote that Facebook may have underestimated the problem.

◆ 2: Apple warned Facebook to address the buying and selling of workers through its apps
In October 2019, Apple discovered that domestic workers were being sold as labor on Facebook and Instagram, with sellers encouraging buyers to abuse them by, for example, confiscating their passports. Apple warned Facebook to respond appropriately to this.

According to the Facebook Papers, Facebook was aware of the issue but underestimated its scale because the actual number of reports was small. Following Apple's warning, however, Facebook disabled 1,021 accounts, removed about 130,000 pieces of content, and responded quickly in other ways, such as changing its policy on advertising for domestic workers.

Reuters has also reported on the related issue that Facebook failed to address violent content in developing countries even though it was aware of its existence. Facebook introduced an AI-based filtering system to tackle violent content, but the system does not work well outside the major language regions, and violent content there has been increasing. A former Facebook employee commented on this issue that leadership did not understand its importance and did not allocate sufficient resources.

Facebook knew about, failed to police, abusive content globally - documents | Reuters
https://www.reuters.com/technology/facebook-knew-about-failed-police-abusive-content-globally-documents-2021-10-25/



◆ 3: CEO Mark Zuckerberg sometimes personally rejected employees' ideas
The documents state that when employees devised ideas to mitigate Facebook's negative impact on users, Mark Zuckerberg sometimes rejected them directly. The Verge points out that Zuckerberg's unilateral decision-making had a major impact, most notably when Facebook yielded to the Vietnamese government's demand to censor posts by dissidents. In that case, Zuckerberg decided to give the Vietnamese government almost complete control over the platform in order to protect Facebook's business in Vietnam, which generates more than $1 billion in annual revenue.

◆ 4: Facebook used the German anti-vaccine movement as a moderation test case
The documents reveal that Facebook tested whether its new 'harmful topic community' classification, used in moderation work, functioned well by trying it on the German anti-vaccine movement. Querdenken, a German community that spreads conspiracy theories, had the potential for violence but had not engaged in extreme activity that would warrant removal as a rule violation. Facebook employees therefore experimented on Querdenken, with the documents noting that it 'could be a good case study to learn how to tackle these issues in the future' and that 'the experiment was relatively effective.'

◆ 5: Records of how Facebook handled unrest around President Joe Biden's inauguration
The documents record that although Facebook had prepared for the possibility that supporters of former President Donald Trump would incite citizens in an attempt to prevent Joe Biden from becoming president, those preparations did not work well in practice, and technical and bureaucratic bottlenecks caused confusion.



◆ 6: Facebook tried to rebalance the news feed for 'civic health'
The documents state that Facebook recognized its news feed algorithm was not working ideally and was driving polarization among citizens, and that it tried to respond, for example by surveying users in order to surface content that was genuinely good for them. According to The Verge, Facebook implemented the content changes in February 2020 and aimed to have the system built by March of the same year, but what actually happened is unknown.

◆ 7: Why 'likes' were never forcibly hidden on Facebook and Instagram
The documents reveal that 'Project Daisy,' a plan to forcibly hide 'like' counts on Facebook and Instagram, was partially scaled back out of concern over reduced advertising revenue and app usage. The Instagram team that proposed the plan aimed to free users from the pressure of 'likes,' but the Facebook team showed little interest, and executives wanted to minimize the impact on Facebook and maintain trust in the company. In the end, hiding like counts was introduced on both services as an optional feature.

◆ 8: A simple design flaw was found in Facebook's policy on civic groups
In October 2020, Facebook announced that it would stop recommending civic and political groups to users in the United States. Internal documents state, however, that this decision created a major challenge for Facebook.

The documents state that civic and political groups began to reappear as recommendations as early as November 2020, indicating that the policy was not functioning as intended. At the time, observers assumed the hard part was philosophical: clearly delineating which groups count as 'civic,' and enforcing such a definition across a platform as large as Facebook, seemed extremely difficult.

In reality, however, the filtering algorithm was designed to classify a group by referencing only its content from the last 7 days, so an organization could easily avoid the label simply by posting nothing political for 7 days.
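The documents do not spell out the actual implementation, but the weakness is easy to illustrate. The following is a minimal, entirely hypothetical Python sketch (the function, the 0.5 threshold, and the data layout are assumptions, not Facebook's real code) of a classifier that judges a group only by its last 7 days of posts:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: classify a group as "political" using only
# its posts from the last 7 days.
WINDOW = timedelta(days=7)
THRESHOLD = 0.5  # assumed fraction of political posts that triggers the label

def is_political_group(posts, now):
    """posts: list of (timestamp, is_political) tuples for one group."""
    recent = [(ts, pol) for ts, pol in posts if now - ts <= WINDOW]
    if not recent:
        return False
    political = sum(1 for _, pol in recent if pol)
    return political / len(recent) >= THRESHOLD

# Evasion: a heavily political group posts only non-political content
# for 7 days. The old political posts fall outside the window, so the
# classifier clears the group and it can be recommended again.
now = datetime(2020, 11, 15)
old_political = [(now - timedelta(days=10 + i), True) for i in range(20)]
recent_benign = [(now - timedelta(days=i), False) for i in range(1, 8)]
print(is_political_group(old_political + recent_benign, now))  # False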

in Web Service, Posted by log1p_kr