Whistleblower reveals that 'Instagram worsens suicidal thoughts'
Internal documents revealed that Facebook knew, but concealed, findings that Instagram is harmful to teenage girls, and the company has been criticized for it. Until now the person who leaked the documents was anonymous, but she has revealed her identity and explained in an interview what she sees as Facebook's fundamental problem.
Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation - 60 Minutes - CBS News
The Facebook Whistleblower, Frances Haugen, Says She Wants to Fix the Company, Not Harm It - WSJ
On September 14, 2021, the Wall Street Journal reported that Facebook, while internally recognizing that 'Instagram is harmful to teens,' concealed that information and continued to publicly insist that using social media can benefit mental health. The report sparked criticism of Facebook, the Senate Commerce Committee launched an investigation, and development of the planned 'Instagram for children' project was frozen.
The Wall Street Journal's reporting was based on documents leaked by a whistleblower. Until now the whistleblower had remained anonymous, but she revealed her identity on the program '60 Minutes,' which aired in the United States on October 3.
The source of the leak is Frances Haugen, a product manager on Facebook's Civic Misinformation team. Haugen earned a degree in computer engineering and then a master's degree in business from Harvard University, and worked as a product manager at companies including Pinterest, Yelp, and Google before joining Facebook in 2019. Having lost a friend to conspiracy theories spread on the Internet, Haugen reportedly agreed to join the company on the condition that she would work on combating misinformation.
In the interview, Haugen said, 'At Facebook, I saw conflicts of interest again and again between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.'
According to Haugen, the root of Facebook's problems lies in an algorithm change made in 2018. Inside the Facebook app, an algorithm constantly calculates 'what the user wants to see.' With the 2018 change, Facebook's algorithm began prioritizing 'content that draws strong engagement from users.' The problem is that, as past research has shown, content that polarizes or divides people is more likely to provoke strong emotions, and therefore draws more engagement than other content.
The most influential emotion on social networks is 'anger'-GIGAZINE
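The ranking change described above can be illustrated with a minimal sketch. This is a hypothetical example, not Facebook's actual code: the `Post` fields, the weights, and the scoring function are all assumptions made up for illustration. It only shows the general principle the article describes, in which posts predicted to provoke strong reactions are scored higher and shown first.

```python
# Hypothetical sketch of engagement-based feed ranking (illustrative
# weights and fields; NOT Facebook's actual algorithm or code).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_likes: float
    predicted_comments: float
    predicted_shares: float
    predicted_angry_reactions: float  # emotional reactions

def engagement_score(post: Post) -> float:
    # Illustrative weights: active responses (comments, shares,
    # emotional reactions) count far more than a passive like,
    # so provocative content tends to rise to the top.
    return (1.0 * post.predicted_likes
            + 15.0 * post.predicted_comments
            + 30.0 * post.predicted_shares
            + 5.0 * post.predicted_angry_reactions)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Show the highest-scoring (most engaging) posts first.
    return sorted(posts, key=engagement_score, reverse=True)

calm = Post("Nice sunset photo", 100, 2, 1, 0)
divisive = Post("Outrageous political claim", 40, 30, 20, 50)
feed = rank_feed([calm, divisive])
print(feed[0].text)  # the divisive post ranks first despite fewer likes
```

Under weights like these, the divisive post outranks the calm one even though it gets fewer likes, which mirrors the dynamic the research cited in the article points to.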
Conversely, an algorithm designed not to promote such content would be safer, but it would reduce the time people spend on Facebook and the number of ad clicks, cutting into Facebook's revenue. Facebook is aware of this, and it was only for the 2020 presidential election that the company put safety systems in place to temporarily suppress misinformation. Those systems were rolled back as soon as the election was over, leading Haugen to conclude, 'Facebook prioritized growth over safety.'
At the time of the algorithm change, CEO Mark Zuckerberg explained that the new algorithm 'increases the chances of people interacting with people they care about.'
Bringing People Closer Together - About Facebook
'I have a lot of empathy for Mark. He never set out to create a harmful platform, but it has had side effects,' Haugen said, adding that Facebook's conflicts of interest were worse than at any other company she had seen: they allowed choices to be made that spread harmful and polarizing content widely.
Haugen held facts that no one outside Facebook knew. Many others inside the company were also fighting to improve the situation, but after seeing that no one could, Haugen decided to take the internal documents public. While she risks being accused by Facebook of stealing information, her lawyers argue that she is protected as a whistleblower under the Dodd-Frank Wall Street Reform and Consumer Protection Act.
Facebook declined an interview with 60 Minutes, instead issuing a statement: 'Every day we protect the right of billions of people to express themselves openly while keeping the platform a safe and positive place. We continue to make significant improvements in the fight against the spread of harmful content and misinformation, and the suggestion that we encourage bad content and do nothing is simply not true. If there were research showing exact solutions to these complex problems, the technology industry, governments, and society would have solved them long ago.'
in Web Service, Posted by darkhorse_log