What are the problems with the content-monitoring project 'GIFCT' that affect online safety and freedom of speech?



In recent years, radical groups and terrorists have increasingly exploited social media; in the shooting in Christchurch, New Zealand, the perpetrator live-streamed the attack on Facebook. The news outlet SLATE summarizes the efforts and problems of the 'Global Internet Forum to Counter Terrorism (GIFCT)', a project that deals with such terrorism-related content posted on social media.

The GIFCT is the future of content moderation.
https://slate.com/technology/2020/08/gifct-content-moderation-free-speech-online.html?scrolla=5eb6d68b7fedc32c19ef33b4

In October 2019, a synagogue in Halle, Germany, was attacked by an anti-Semitic shooter, and two people were killed. The shooter live-streamed the attack on Twitch, and the video remained on the platform for about an hour.

However, when users who had downloaded this video tried to share it on platforms such as Facebook, Twitter, and YouTube, the uploads were blocked. This is because Twitch had generated digital fingerprints (hashes) of the video and quickly shared them with the other platforms. Twitch shared these hashes via GIFCT, a consortium of several private companies.
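The hash-sharing mechanism described above can be sketched in a few lines. This is a simplified illustration, not GIFCT's actual implementation: the consortium's real system reportedly uses perceptual fingerprints whose algorithms are not public, so a plain SHA-256 digest of the raw bytes stands in for them here, and all names below are hypothetical.

```python
import hashlib

# Hypothetical shared database of hex digests, standing in for the
# GIFCT hash database. Only hashes are exchanged, never the media itself.
shared_hash_db: set[str] = set()

def fingerprint(media: bytes) -> str:
    """Compute a digest of uploaded media (stand-in for a real perceptual hash)."""
    return hashlib.sha256(media).hexdigest()

def is_flagged(media: bytes) -> bool:
    """Check an upload against hashes already contributed to the shared database."""
    return fingerprint(media) in shared_hash_db

# Platform A identifies violating footage and contributes its hash ...
violating_clip = b"example-violating-video-bytes"
shared_hash_db.add(fingerprint(violating_clip))

# ... so Platform B can block a byte-identical re-upload on sight.
assert is_flagged(violating_clip)
assert not is_flagged(b"some unrelated upload")
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; the perceptual hashes actually used for this kind of matching are designed to survive re-encoding and minor edits.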

GIFCT was founded in 2017 by Facebook, Microsoft, Twitter, and YouTube to counter terrorism online. GIFCT is little known to the public, yet it makes important decisions about online speech and content out of public view.



The impetus for GIFCT's establishment in 2017 was the 2015 terrorist attacks in Paris and the 2016 terrorist attacks in Brussels. Following these incidents, governments and EU legislators pressured technology companies to 'take responsibility for countering terrorism online,' and GIFCT was established to coordinate content removal across platforms.

GIFCT maintains a database of 'violent terrorist images and propaganda,' which member companies share to help moderate content on their own platforms. At the time of writing, at least 11 companies, including Facebook, Twitter, YouTube, and Twitch, are members of GIFCT, and another 13 companies can access the GIFCT database.

In September 2019, GIFCT was reorganized into an independent organization with dedicated staff, and an advisory committee was formed of representatives from governments, international organizations, and civil society, including Japan, the United States, the United Kingdom, France, and Canada. However, SLATE points out that GIFCT's internal workings remain opaque: for example, it is unclear how individual platforms use the GIFCT database, which acts as a sort of blacklist, or how final deletion decisions are made.



Mechanisms like GIFCT, in which multiple platforms collaborate, are an effective way for online platforms to combat extremist content, but some researchers argue they carry potential dangers. Evelyn Douek, a researcher at Harvard University, points out that GIFCT's operations are opaque and that there is insufficient scrutiny of its decisions about which content to exclude.

GIFCT applies secret criteria for 'violent extremist' speech and content, and does not explain what does or does not violate its rules. Moreover, because outside researchers cannot access the GIFCT database, it cannot be ruled out that GIFCT has removed mere satire, newsworthy information about serious acts of terrorism, or documentation of human rights abuses.

Platforms routinely monitor and remove content under their own rules, but GIFCT differs in that decisions made by the large core platforms also affect the smaller platforms that join later. Smaller platforms can, of course, decide at their own discretion to leave flagged content up despite a GIFCT flag. In practice, however, smaller platforms generally lack sufficient resources for content moderation, so they are likely to remove content in line with GIFCT's recommendations.

The establishment of an advisory board may make GIFCT's work more democratic, but it also carries the risk that national governments will intervene in GIFCT's content moderation. On July 30, 2020, rights groups including the Center for Democracy and Technology (CDT), a nonprofit focused on technology and individual rights, sent a letter to GIFCT stating that 'counter-terrorism programs and surveillance violate the rights of Muslims, Arabs, and other groups, and have been used by governments to silence civil society.'



Regarding the dangers of GIFCT, researchers consider improving transparency about what content is censored to be the most important issue. GIFCT published a transparency report in July 2020, but Stanford researcher Daphne Keller argues that outside researchers' access to GIFCT's blocklist should be improved, and that a system is needed for pointing out biases and mistakes when they occur.

Keller notes that although GIFCT was founded at the demand of democratic governments, what GIFCT actually is remains unclear. In effect, she argues, its establishment means that 'through non-legal rules, the four platforms have combined to create a completely opaque and powerful system for controlling online speech.'



in Web Service, Posted by log1h_ik