Facebook executives ignored an internal report warning that 'Facebook's algorithms are promoting division among people'



In recent years, political activity and conflict on social media have become increasingly visible, and the resulting polarization between users has at times escalated into a serious problem. The Wall Street Journal reports that an internal report submitted to Facebook executives in 2018 warned that 'Facebook's algorithms are promoting division among people,' but that executives ignored the report and shelved efforts to fix the problem.

Facebook Executives Shut Down Efforts to Make the Site Less Divisive - WSJ

https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499

Internal Facebook report found algorithms drove people apart: report | TheHill
https://thehill.com/policy/technology/499611-internal-facebook-report-found-algorithms-drove-people-apart-report

Facebook reportedly ignored its own research showing algorithms divided users - The Verge
https://www.theverge.com/2020/5/26/21270659/facebook-division-news-feed-algorithms



By tracking users' interests and behavior, Facebook builds algorithms that serve targeted ads to the right audiences and recommend Facebook groups to join. These algorithms are sometimes exploited for political purposes: in 2019 it was reported that ads were being targeted at users Facebook had flagged as interested in 'Nazis' and 'white supremacy.'

Facebook identifies users who are interested in Nazis and white supremacy and uses that data to display targeted ads - GIGAZINE



The internal report submitted to Facebook executives in 2018 warned that 'our algorithms exploit the human brain's attraction to divisiveness.' The team behind the report found that Facebook's recommendation algorithms serve users ever more divisive content in order to capture their attention and keep them on the platform longer, The Wall Street Journal reports.

Even before that, it had been pointed out that Facebook was inflaming users' political ideologies and accelerating division. A 2016 internal report found that more than one-third of the large political Facebook groups that German users frequented were full of extremist content. These radical groups were reportedly awash in racist, conspiracy-theorist, and pro-Russian material, and their members were heavily influenced by a small number of hyperactive users.

The same 2016 report said the growth of these politically radical Facebook groups was driven by Facebook's own algorithms, noting that '64% of all extremist group joins are due to our recommendation tools.' In other words, the very feature that recommends groups suited to each user contributed to swelling the membership of radical Facebook groups. According to Facebook employees, the problem was not unique to Germany.



Joel Kaplan, Facebook's vice president of global public policy, is accused of downplaying these concerns about Facebook's algorithms and of steering the company away from efforts to depolarize the platform. Kaplan served as deputy chief of staff under George W. Bush, the 43rd President of the United States, and is known as a staunch conservative.

Kaplan is said to have taken on the role of placating conservatives and deflecting accusations that Facebook was 'liberal' whenever the company tried to address political problems. For example, Kaplan reportedly played a key part in shaping Facebook's political advertising rules, including the 'policy of continuing to run political ads' announced in 2020.

The Wall Street Journal reports that Kaplan opposed and watered down a proposal to curb the influence of hyperpartisan users and bots known as 'super-sharers.' Kaplan also objected to the 'Common Ground' initiative, which sought to connect people through politically neutral content rather than amplify polarizing material, and the project was ultimately shelved.

A Facebook spokeswoman told The Verge: 'We've learned a lot since 2016 and are not the same company today. We've built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve. Just this past February we announced $2 million (about 215 million yen) in funding to support independent research on polarization.'



in Mobile, Web Service, Posted by log1h_ik