When the news broke on Wednesday, December 6th, that Facebook (Meta) and its offshoot Instagram were finally being served with a lawsuit for steering children toward child sexual abuse material (CSAM) and toward predators, those of us who work in #OpChildSafety cheered. It's incredibly reassuring to know that someone is listening and responding aggressively to an epidemic that is out of control.
Meta was served the lawsuit the previous day, and two days later, it dared to announce the decision to introduce end-to-end encryption across its platforms. In essence, such a privacy move will give predators the liberty to solicit and distribute CSAM and other harmful content across those platforms before Meta has an effective way to combat this epidemic.
Think about this for a moment.
I digress. Attorney General Raúl Torrez of Santa Fe, New Mexico, announced the lawsuit that day, releasing a lengthy, detailed account of the scope of child abuse and child sex trafficking on the social media platforms, which included censored screenshots of child abuse and evidence of trafficking.
The most insane thing is that this isn't really a question of how efficient the algorithms are — those of us who work in #OpChildSafety on these platforms already know they are not working. Rather, Meta's response to the accusations raises more unanswered questions, ones that demand immediate attention and an end to cookie-cutter corporate responses.
Even when the content wasn't being searched for, the advertising algorithms were hard at work, pushing the content and its illicit peddlers toward children. Ultimately, this also allowed child predators to find and message kids. The lawsuit was thus a necessary step toward protecting children from grooming, human trafficking, and the online solicitation of CSAM.
According to Torrez, Meta's lack of proactivity — and, by extension, its absence of protections insulating children from this content — stems from the fear that its advertising revenue could take a hit.
His office launched an undercover investigation, setting up fake accounts for fictional underage persons that worked as a honeypot to attract predators. Investigators used AI-generated images of fictional teens and preteens and watched as Meta's algorithms began recommending sexual content, along with a wave of explicit messages and solicitations from adults.
"Meta has allowed Facebook and Instagram to become a marketplace for predators in search of children upon whom to prey," the lawsuit alleges.
This whole episode reminds me of the 2010 case against Craigslist, which was exposed as facilitating child prostitution and the trafficking of women — a scandal that prompted a swift change to the platform's policies.
What is Meta hiding?
According to the filing, the day after the lawsuit was filed, New Mexico investigators on the case were notified that Meta had shut down the test accounts used to investigate the profiles promoting CSAM — meaning those accounts would no longer be collecting or accessing data. The notice included a warning that the offending accounts would be "permanently disabled."
But why?
What about the child predators and traffickers who use them? Disabling these accounts should be the final action, taken once the perpetrators operating behind the profiles have been brought to justice.
Because of this, Torrez asked a judge to order Meta not to destroy any data connected to the test accounts, after Meta claimed it would retain only information relevant to the claims.
Meta aggressively refuted the lawsuit’s claim by the Attorney General. “We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” said spokesperson Nkechi Nneji in a statement.
Meta spokesperson Andy Stone said in a statement responding to the accusations in the lawsuit, “We will, of course, preserve data consistent with our legal obligations.”
In my opinion, taken together, the statements released on Meta's behalf show the company has been caught red-handed — aware this was an epidemic on its platforms, yet not exactly fighting against it.
But why?
The filing states, “While it is unclear whether “permanently disabl[ing]” an account is the functional equivalent to deleting the account, the State believes that is the case. … Indeed, in a California social media litigation, another technology company ‘locked’ the plaintiffs’ accounts following initiation of the action … The company confirmed in a recent court filing there that these ‘locked’ accounts were inadvertently deleted by the company’s automated processes.”
This is exactly what the State hopes to prevent the social media tech giant from doing.
What's more alarming is that on the same day these accounts were disabled, Meta was asked to confirm it would preserve all the information collected from the test accounts and the other accounts detailed in the complaint.
Interestingly enough, Meta’s lawyers seemed to sidestep the order with a carefully worded response by simply stating they would take “reasonable steps” to analyze the referenced accounts and retain “relevant data.”
It gets worse.
Meta didn't reply to a follow-up request, the filing alleges. In other words, Meta provided no clarification on what data it would or would not deem "relevant." The filing also states that Meta refuses to preserve "all data" associated with the referenced accounts. Most importantly, a court order is necessary to preserve this data as key evidence for trial.
Facebook vs. #OpChildSafety
The lawsuit against Meta platforms and CEO Mark Zuckerberg is vital to our fight against CSAM and sex trafficking because, as hunters, many of us feel the platform is stonewalling any effort to inspect, analyze, and resolve our complaints.
#OpChildSafety workers have been reporting accounts of this nature ad nauseam for as long as we have been operating, and Facebook isn’t exactly prioritizing removing the illegal content. On the other hand, if you say something offensive, you’re put in “Facebook Jail” almost immediately.
Moderating speech and censoring words and phrases apparently matter more than protecting children from sexual content. This is self-evident, because the investigation led by Torrez showed Meta's algorithms pairing that content with underage users on its platforms. And if Meta's response to these accusations is to announce end-to-end encryption…
The timing is outrageous and only seems to be digging the tech company a deeper grave due to its gross lack of discretion, and even grosser disregard for the seriousness of the accusations. Its delay in providing transparency is alarming, especially since it involves abused children.
Similarly, I co-run a Million Mask March group on Facebook, which regularly experiences pornographic spam distributed by sock accounts. Reporting the accounts doesn’t prevent it. It’s as though the algorithms that analyze, identify, and flag certain content aren’t working.
But why?
Approximately 500,000 online predators are believed to be active daily. Kids aged 12 to 15 are particularly vulnerable to being groomed or manipulated by adults encountered on the internet. According to the FBI, more than half of the victims of online sexual exploitation fall within the 12 to 15 age range. About 89 percent of inappropriate advances toward children take place in Internet chatrooms or through instant messaging.
In my opinion, what Meta needs is to promote respect for the law and put children first before revenue. It is written somewhere that the love of money is the root of all evil. That is why we should demand that Meta does the right thing.
If not, I hope people everywhere will boycott Meta, even if it requires a personal sacrifice from its users — disconnecting might prove difficult. I do not want to support a platform that insulates child predators and sex traffickers from prosecution.
Enough is enough.