Court documents in the New Mexico lawsuit state that Meta showed a ‘prevailing resistance in the past’ when it came to safeguarding children on Instagram.

Newly revealed documents from the lawsuit filed against Meta in New Mexico highlight the company’s historical reluctance to prioritize the safety of children on its platforms, according to the complaint.

The recently unredacted passages from the lawsuit, which were made public on Wednesday, include internal employee messages and presentations from 2020 and 2021. These documents demonstrate that Meta was aware of issues such as adult strangers contacting children on Instagram, the sexualization of minors on the platform, and the risks associated with its “people you may know” feature, which suggested connections between adults and children.

However, the passages also reveal that Meta was slow to address these issues. For example, Instagram only began restricting adults’ messaging capabilities with minors in 2021.

One internal document mentioned in the lawsuit shows Meta scrambling in 2020 to respond to an Apple executive whose 12-year-old child was solicited on the platform. The document noted that this kind of incident could anger Apple to the point of threatening to remove Meta’s app from the App Store.

According to the complaint, Meta was well aware that adults soliciting minors was a problem on its platform and only treated it as urgent when absolutely necessary.

The report’s author criticized Meta’s reasoning, describing it as unconvincing and accusing the company of prioritizing growth over children’s safety.

In March 2021, Instagram announced that it would restrict individuals over the age of 19 from messaging minors.

In a July 2020 internal chat, an employee raised concerns about child grooming on TikTok and asked what Meta was specifically doing about it. Another employee responded that their efforts were somewhere between zero and negligible, as child safety was not a priority for that period.

In a statement, Meta asserted its commitment to providing safe online experiences for teenagers and emphasized its decade-long focus on these issues. The company also mentioned its employment of individuals dedicated to keeping young people safe online. Meta criticized the complaint for distorting its work through selective quotes and cherry-picked documents.

The complaint also accused Instagram of failing to address inappropriate comments made by adults under posts made by minors.

This issue was corroborated by former Meta engineering director Arturo Béjar, who recently testified about his own daughter’s troubling experiences with Instagram. Béjar, an expert in combating online harassment, told U.S. senators in November that his daughter and her friends had been subjected to unwanted sexual advances and harassment.

A child safety presentation from March 2021 highlighted Meta’s inadequate focus on sexualization of minors on Instagram, particularly regarding sexualized comments on posts made by minors. This not only resulted in a terrible experience for content creators and bystanders but also provided an avenue for malicious individuals to identify and connect with one another.

Meta claimed that it utilizes advanced technology, hires child safety experts, reports content to the National Center for Missing and Exploited Children, and collaborates with other companies and law enforcement agencies, including state attorneys general, to combat online predators. The company, based in Menlo Park, California, has been updating its safeguards and tools for younger users under increasing pressure from lawmakers, although critics argue that more needs to be done.

New Mexico Attorney General Raúl Torrez stated in a release that Meta employees had been attempting to raise awareness about the dangers faced by children due to decisions made by Meta executives. He added that while the company downplays the illegal and harmful activity on its platforms, internal data and presentations from Meta suggest that the problem is extensive and pervasive.
