New Mexico Attorney General Raúl Torrez has announced a lawsuit against Snap, Inc., accusing the company of endangering children through its platform, Snapchat. The lawsuit alleges that Snapchat’s design features, policies, and recommendation algorithms contribute to harmful activities involving minors.
The lawsuit claims that Snapchat's disappearing content gives users a false sense of security that malicious actors can exploit. It accuses the platform's recommendation algorithm of connecting minors with inappropriate accounts, increasing the risk of harmful interactions, and highlights Snapchat's failure to verify user identities as a factor that can expose minors to unsafe content.
The New Mexico DOJ’s investigation alleges that Snapchat is frequently used by individuals to engage in harmful activities targeting minors. As part of the investigation, the DOJ set up a decoy account for a 14-year-old, which engaged with accounts bearing sexually explicit names, and found around 10,000 records of abusive content linked to Snapchat on dark web sites. These disturbing interactions reinforced the lawsuit's claims.
According to the official website of the New Mexico DOJ, “Our undercover investigation revealed that Snapchat’s harmful design features create an environment where predators can easily target children through sextortion schemes and other forms of sexual abuse,” said Attorney General Torrez. “Snap has misled users into believing that photos and videos sent on their platform will disappear, but predators can permanently capture this content and they have created a virtual yearbook of child sexual images that are traded, sold, and stored indefinitely. Through our litigation against Meta and Snap, the New Mexico Department of Justice will continue to hold these platforms accountable for prioritising profits over children’s safety.”
The lawsuit also describes Snapchat as a primary social media platform for sharing child sexual abuse material (CSAM). Parents report that their children share more CSAM on Snapchat than anywhere else, minors report more online sexual interaction on Snapchat than on any other service, and more sex trafficking victims are recruited on Snapchat than on any other platform.
The lawsuit accuses Snap, Inc. of negligence, alleging that the company prioritised platform engagement over the safety of its users. It also notes Snapchat’s widespread use among U.S. teens, with over 20 million reportedly active on the platform.
Snapchat’s community guidelines explicitly state: "We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion). When we identify such activity, we report all instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself). We prohibit promoting, distributing, or sharing pornographic content. We also don’t allow commercial activities that relate to pornography or sexual interactions (whether online or offline). Breastfeeding and other depictions of nudity in non-sexual contexts are generally permitted."
This lawsuit follows a similar legal action filed against Meta Platforms in December, which likewise accused that company of failing to protect children from sexual abuse and predation.