Instagram’s recommendation algorithms have been promoting and connecting accounts that facilitate and sell child sexual abuse content, a study by Stanford University’s Internet Observatory, the University of Massachusetts Amherst and The Wall Street Journal has revealed. The researchers found that Instagram has a “particularly severe problem” with accounts that show self-generated child sexual abuse material (SG-CSAM) and purport to be operated by minors. The study attributed how easily such accounts could be found to the use of hashtags, the relatively long lifespan of seller accounts and the effectiveness of Instagram’s recommendation algorithm.
Meta’s Response to the Accusations
In response to the investigation, a spokesperson for Meta, Instagram’s parent company, said the company has taken a number of steps to address the issues raised in the study and has “set up an internal task force” to investigate the claims. The spokesperson added that child exploitation is a horrific crime and that the company works aggressively to fight it both on and off its platforms.
Twitter’s Issues with Child Exploitation
Alex Stamos, one of the paper’s authors and the former chief security officer at Facebook, said the researchers focused on Instagram because it is the most popular platform for teenagers globally, making it a critical part of this ecosystem. He added, however, that Twitter continues to have serious problems with child exploitation, a situation that has persisted since Elon Musk acquired the company late last year. According to Stamos, Twitter’s basic scanning for known CSAM broke after Musk’s takeover and was not fixed until the researchers notified the company. In a tweet, Stamos wrote, “They then cut off our API access,” referring to the interface that lets researchers pull Twitter data for their studies.
Earlier this year, NBC News reported that multiple Twitter accounts offering or selling CSAM remained available for months, even after Musk pledged to address child exploitation on the platform. Twitter did not comment on the matter.