Study reveals that X does not respond to reports of deepfake pornography filed as 'non-consensual intimate media' but acts quickly on copyright infringement claims



A study that posted actual deepfake pornographic images to X (formerly Twitter) to test how the platform handles 'Non-Consensual Intimate Media (NCIM)' found that reports filed as NCIM resulted in no action, while reports filed as 'copyright infringement' led to all of the images being removed within 25 hours.

[2409.12138] Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes

https://arxiv.org/abs/2409.12138



Twitter Acts Fast on Nonconsensual Nudity If It Thinks It's a Copyright Violation

https://www.404media.co/twitter-acts-fast-on-nonconsensual-nudity-if-it-thoughts-its-a-copyright-violation/

X ignores revenge porn takedown requests unless DMCA is used, study says - Ars Technica
https://arstechnica.com/tech-policy/2024/10/study-fastest-way-to-get-revenge-porn-off-x-is-a-dmca-takedown/

The research was conducted by Li Qiwei and colleagues at the University of Michigan. It was submitted to the preprint server arXiv and was still undergoing peer review at the time of writing.

'Non-Consensual Intimate Media (NCIM)' refers to images or videos containing nudity or sexual acts that are taken or created without the consent of the people depicted. When referring to images alone, the term 'Non-Consensual Intimate Images (NCII)' is also used.

Advances in AI have made it easier to create fake images and videos, so-called 'deepfakes,' and there have even been cases of students creating deepfake pornography based on photos of their classmates. Deepfake pornography created in this way constitutes NCIM.

Police launch investigation after high school boy shares AI-generated 'fake nude photos of female classmates' in group chat - GIGAZINE



Li and the team generated 50 deepfake pornographic images resembling NCIM and posted them to X. 25 of them were reported as 'NCIM,' and the remaining 25 were reported as 'copyright infringement' through takedown requests under the Digital Millennium Copyright Act (DMCA).

As a result, the images reported under the NCIM policy were not removed at all, even after more than three weeks, whereas all of the images reported as copyright infringement were removed within 25 hours.

Li and colleagues attribute the difference in response to the DMCA, which requires that valid takedown requests be handled promptly, and they point out the need for legislation focused specifically on removing NCIM from the internet. They also say that the ethical considerations involved in auditing NCIM on social media need to be discussed.

In addition, the news site 404 Media points out that because copyright belongs to whoever took the photo, victims of NCIM cannot necessarily use the DMCA to have images and videos removed.

in Web Service, Posted by logc_nt