Bluesky decides to increase moderation staff from 25 to 100, strengthening its response to child sexual abuse material and regional cultural differences
In November 2024, Bluesky surpassed 20 million total users and 3.5 million active users. It has also come to light that Bluesky has decided to expand its moderation staff to 100 people.
Inside Bluesky's big growth surge
https://www.platformer.news/bluesky-growth-content-moderation-trust-safety-interview/
Like other social media platforms, Bluesky has a dedicated moderation team that monitors violent content and harassment. When Bluesky's technical advisor, Why, visited Japan in April 2024, he said, 'We have 22 moderators' and 'We have specialized teams for Japanese, English, and Portuguese,' revealing that the company is building a moderation system tailored to the culture of each region.
An event was held where you could ask Bluesky developers anything, such as 'Are there plans to implement private (locked) accounts?' and 'Are there plans to establish a Japanese branch?' When I attended, it was a rewarding event where ideas were added to the developers' to-do list one after another - GIGAZINE
Aaron Rodericks, head of Trust and Safety at Bluesky, told Platformer that the company has decided to increase its moderation staff from 25 to 100, making it clear that Bluesky is working to strengthen its moderation system in response to the sudden increase in users.
Bluesky monitors content through a combination of user reports and automated systems. Automated systems, however, struggle to account for regional differences in culture. For example, Bluesky reported that posts containing the word 'KKK' spiked along with the sudden influx of Brazilian users. The automated system treated 'KKK' as a reference to the white supremacist group the Ku Klux Klan, but in Brazilian Portuguese 'kkk' is actually an expression of laughter. To handle these regional differences, Bluesky is increasing the number of moderators.
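The mismatch described here is easy to reproduce with a trivial keyword filter. The sketch below is purely illustrative and is not Bluesky's actual moderation code: the term list, function names, and language check are invented. It shows how a context-free rule flags Brazilian laughter while a language-aware variant does not.

```python
# Hypothetical illustration of the 'kkk' ambiguity -- not Bluesky's real system.
import re

FLAGGED_TERMS = {"kkk"}  # naive, context-free flag list (invented for this example)

def naive_flag(text: str) -> bool:
    """Flag any post containing a listed term as a substring, regardless of context."""
    lowered = text.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

def language_aware_flag(text: str, lang: str) -> bool:
    """Same check, but treat runs of 'k' in Portuguese posts as laughter first."""
    lowered = text.lower()
    if lang == "pt":
        lowered = re.sub(r"k{2,}", "", lowered)  # strip laughter runs like "kkkk"
    return any(term in lowered for term in FLAGGED_TERMS)

print(naive_flag("kkkk isso é muito engraçado"))                 # True  (false positive)
print(language_aware_flag("kkkk isso é muito engraçado", "pt"))  # False (treated as laughter)
print(language_aware_flag("the KKK held a rally", "en"))         # True  (still flagged)
```

Real systems are far more sophisticated, but the example captures why purely automated rules need regional context, and why Bluesky is pairing them with more human moderators.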
In addition, on November 26, 2024, Bluesky's safety team revealed that harmful content was increasing along with the rapid growth in users, and announced that it had strengthened moderation in high-severity areas such as child safety in the short term. This short-term strengthening trades accuracy for coverage, and the safety team is asking users who are mistakenly actioned to file an appeal.
1/ We're experiencing a huge influx of users, and with that, a predictable uptick in harmful content posted to the network.
— Bluesky Safety (@safety.bsky.app) 2024-11-26T00:07:59.547Z
As a result, for some very high-severity policy areas like child safety, we recently made some short-term moderation choices to prioritize recall over precision.
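In classifier terms, 'prioritizing recall over precision' means tuning the system so that as many genuinely harmful posts as possible are caught (high recall), while accepting that more benign posts will be flagged by mistake (lower precision), which is why the team asks wrongly actioned users to appeal. The numbers below are invented for illustration and are not Bluesky data.

```python
# Toy illustration of the recall-versus-precision trade-off (invented numbers).

def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    """precision = TP / (TP + FP); recall = TP / (TP + FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# Strict threshold: most flags are correct, but many harmful posts slip through.
p, r = precision_recall(tp=80, fp=5, fn=40)
print(f"strict threshold:  precision={p:.2f} recall={r:.2f}")   # 0.94 / 0.67

# Lowered threshold: nearly all harmful posts are caught, at the cost of
# more benign posts being flagged by mistake -- the situation that appeals address.
p, r = precision_recall(tp=115, fp=40, fn=5)
print(f"lowered threshold: precision={p:.2f} recall={r:.2f}")   # 0.74 / 0.96
```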
According to Rodericks, Bluesky uses Safer, a detection tool developed by the child protection organization Thorn, to monitor for child sexual abuse material (CSAM).
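The article does not describe Safer's internals, but tools in this category generally work by hash-matching: new uploads are hashed and compared against a database of hashes of known abusive material maintained by vetted organizations. The sketch below illustrates only that general idea; the function names and hash set are hypothetical and this is not Thorn's Safer API.

```python
# Generic hash-matching sketch -- hypothetical, not Thorn's Safer API.
import hashlib

KNOWN_HASHES: set[str] = set()  # in practice, supplied and maintained by a vetted provider

def sha256_of(data: bytes) -> str:
    """Exact cryptographic hash of an uploaded file."""
    return hashlib.sha256(data).hexdigest()

def should_escalate(upload: bytes) -> bool:
    """Block the upload and route it to human review if its hash matches a known entry."""
    return sha256_of(upload) in KNOWN_HASHES
```

In practice, such systems typically rely on perceptual hashes rather than exact cryptographic hashes, so that re-encoded or slightly altered copies of known material still match.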
Safer is now on Bluesky! Celebrating with a HUGE thanks to @caseynewton.bsky.social for highlighting the critical work happening behind the scenes at @bsky.app to keep its platform safer for everyone. We're proud to play a role in addressing child sexual abuse material (CSAM) as this community grows.
— Safer, Built by Thorn (@saferbythorn.bsky.social) 2024-11-26T01:48:14.061Z
In addition, a European Commission spokesperson said of Bluesky: 'All platforms operating in the EU, even those below the size threshold, must set up a dedicated page stating their number of EU users and where they are legally established. At present, Bluesky is not complying with this rule.' In response to this criticism, Bluesky told Bloomberg on November 26, 2024, 'We are consulting with lawyers to comply with EU regulations,' indicating its intention to comply.
'Bluesky violates EU regulatory law,' says European Commission spokesman, but the platform is so small that it is not subject to regulation - GIGAZINE
in Web Service, Posted by log1o_hf