Is Meta (Facebook, Instagram, WhatsApp, and Messenger) profiting from algorithms that allow pedophiles to find illicit content? A recent letter spearheaded by Representative Kathy Castor (D-FL) seeks to uncover the truth, demanding that Meta answer questions raised by investigations showing that its platforms promote such content after even minimal user interaction.
A June report by the Wall Street Journal showed that Instagram, in particular, frequently fails to remove child sexual abuse material (CSAM) found and reported on the site, and that users are regularly shown CSAM or CSAM-adjacent content after even a passing interaction with sellers and purveyors of such material. More egregiously, posts flagged as potentially containing CSAM give users two options: report the content or view it anyway.
Meta's attempts at removing accounts do little to help, as many offenders simply create backup or alternate accounts to continue their stomach-churning "work."
Hence the question: Is Meta allowing this to happen and making money from it?
In response, Rep. Castor and Representative Lori Trahan (D-MA) led a coalition of lawmakers in demanding that Meta answer for its inability to confront the problem.
"We are deeply concerned by reports indicating that Meta and Instagram steer users toward sexualized videos through associations with children and that the platform runs ads alongside such content without advertisers’ knowledge and in violation of their policies," the letter stated, calling this phenomenon "highly disturbing and suggests that Meta’s algorithms serve the prurient interests of pedophiles and that Meta is aware of the practice and chooses to maximize their profits while turning a blind eye to the harm caused by sexualizing children."
This is not the first time Florida officials have demanded that Meta answer for the proliferation of CSAM on its platforms. In July, Attorney General Ashley Moody demanded that Meta CEO Mark Zuckerberg testify before the Florida Statewide Council on Human Trafficking.
Statistics cited in Moody's letter showed that 163 of the 269 human trafficking victims rescued in 2022 were recruited through Facebook or Instagram, while 85% of self-reported CSAM was found on Meta platforms.
Rep. Castor's letter comes hot on the heels of a bill introduced by her colleague, Representative Debbie Wasserman Schultz (D-FL), that would alert parents if their children were in potential danger online.