These are perilous times when it comes to protecting children from online predators. The fight is so overwhelming that some advocates, and even some countries, are wading into dangerous waters by striving to normalize pedophilia, diminishing the stigma and, ultimately, the crime itself.
One case early this year involved the St. Rochus daycare center in Kerpen, Germany, which began promoting and encouraging sexual exploration among nursery students. Several other daycares in Germany adopted the same agenda.
Even Apple Inc. tried taking up the fight against child sexual abuse material (CSAM) and lost. Back in August 2021, the tech giant announced a controversial software-driven initiative that could have revolutionized CSAM detection on iPhones by catching the material before it was uploaded to iCloud.
Comparing image hashes against a database of known CSAM hashes would have enabled the company to identify the illegal material without ever viewing a user’s Photo Album. Positive matches would have been vetted by Apple and then reported to the National Center for Missing and Exploited Children (NCMEC). Apple said the detection system was designed with privacy at its core.
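To illustrate the general idea, here is a minimal Python sketch of pre-upload hash matching. This is not Apple’s actual implementation, which relied on a perceptual “NeuralHash” and cryptographic private set intersection; the blocklist entry, directory name, and function names below are hypothetical, and an exact SHA-256 match stands in for the perceptual matching a real system would use:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-CSAM fingerprints, of the kind
# maintained by organizations like NCMEC. The entry below is a
# placeholder digest, not a real database value.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def fingerprint(path: Path) -> str:
    """Reduce a file to a fixed-length digest of its bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_before_upload(path: Path) -> bool:
    """Return True if the image matches the blocklist and should be
    withheld from upload and escalated for human review."""
    return fingerprint(path) in KNOWN_HASHES

if __name__ == "__main__":
    # "camera_roll" is a stand-in for the user's photo library.
    for image in Path("camera_roll").glob("*.jpg"):
        if flag_before_upload(image):
            print(f"Match: {image} queued for human review")
```

A production system would use a perceptual hash (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) so that resized or re-encoded copies still match, and, as Apple proposed, would require a threshold of matches plus human review before anything is reported.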
Big Tech needs to step up
While the tech industry and privacy advocates believed the technology would be too invasive if deployed in the wild, that hesitancy offers little to nothing in the way of a solution, leaving the epidemic to proliferate. This is especially troubling in the wake of the Jeffrey Epstein case, which implicated major global political actors in sex abuse and trafficking.
Concerns about trust and safety also surfaced within Apple itself. Erik Neuenschwander, Apple’s director of user privacy and child safety, asked: “How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution?”
It’s also curious to note that we are all still living in a post-9/11 era where privacy has been exchanged for surveillance by governments across the world. Furthermore, the privacy dynamic changed again during the COVID-19 pandemic.
Most Big Tech companies monopolize and monetize user and customer data with virtually no actionable resistance from the public until egregious privacy invasions come to light. For this reason, I find the industry’s reluctance to support Apple’s initiative hard to understand, especially since Apple has a history of resisting governmental encroachment on its technology.
To drive the final nail into the coffin, Meta (formerly Facebook), the social media giant, is currently battling a lawsuit filed in Santa Fe by New Mexico Attorney General Raúl Torrez for allegedly creating the perfect environment for a thriving child sexual exploitation marketplace, one its algorithms seemingly fail to detect.
The lawsuit also states that Meta “proactively served and directed [children] to egregious, sexually explicit images through recommended users and posts – even where the child has expressed no interest in this content.”
In all fairness, during the second quarter of 2023, Meta and its subsidiary Instagram sent 3.7 million CyberTipline reports of child sexual exploitation to NCMEC. Nevertheless, the question we should all be asking is why predators feel so comfortable sharing such highly illegal, debased content on Meta’s platforms.
It boils down to a lack of consequences, both for the predators and for the companies that refuse to police them. This has everything to do with Meta’s algorithms, where it’s arguably a case of someone being asleep at the wheel.
This has created the perfect environment for predators. Moreover, Meta’s announcement that it would roll out end-to-end encryption on its Messenger platform only two days after being served the lawsuit strikes a defiant note, even as the epidemic festers on its platforms.
Don’t even get me started on the latest trend of using AI image generators to create child sexual imagery. By any definition, this is a war being waged on all fronts, and policymakers and legislators have arguably arrived too late.
It’s my personal opinion that no organization or government, especially the United States, has the infrastructure to stop the spread of this epidemic unless it approaches the problem with the same determination and resources as the war on terror.
FBI CSAM database
Interestingly enough, over a decade ago, after reading an attachment to the search and seizure affidavit in USA v Jason Childs (5th Cir.), I learned that the FBI also operates a CSAM database. Like the NCMEC database that Apple would have used, the FBI’s catalog stores a unique hash fingerprint for each CSAM item it discovers.
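As a rough illustration of how such a catalog might work (a generic sketch, not the FBI’s or NCMEC’s actual systems), each discovered file can be reduced to a fixed-length digest and stored for later lookup; the database schema and helper functions below are hypothetical:

```python
import hashlib
import sqlite3
from pathlib import Path

def catalog(evidence_dir: str, db_path: str = "hashes.db") -> None:
    """Fingerprint every file under evidence_dir and store the digests."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS fingerprints "
                "(sha256 TEXT PRIMARY KEY, source TEXT)")
    for item in Path(evidence_dir).rglob("*"):
        if item.is_file():
            digest = hashlib.sha256(item.read_bytes()).hexdigest()
            con.execute("INSERT OR IGNORE INTO fingerprints VALUES (?, ?)",
                        (digest, str(item)))
    con.commit()
    con.close()

def is_known(digest: str, db_path: str = "hashes.db") -> bool:
    """Check a digest against the catalog without touching the file itself."""
    con = sqlite3.connect(db_path)
    row = con.execute("SELECT 1 FROM fingerprints WHERE sha256 = ?",
                      (digest,)).fetchone()
    con.close()
    return row is not None
```

The point of such a catalog is that investigators can match files by digest alone, without re-examining or redistributing the underlying material.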
However, the attachment also revealed that the FBI itself distributed child pornography over peer-to-peer networks like LimeWire with the goal of catching those who downloaded it. Remember, this was over a decade ago. I can understand why the document isn’t searchable on PACER and was likely placed under seal (hint to all you FOIA hunters out there).
Statistics
The following figures are not fully current, as NCMEC has yet to publish its complete annual report. However, NCMEC reported that over 99.5% of the reports made to its CyberTipline in 2022 concerned incidents of suspected CSAM.
On January 31st, 2024, NCMEC published some current figures on Twitter: “In 2023, NCMEC’s CyberTipline received 36.2 million reports of suspected child sexual exploitation online. Those reports contained more than 105 million images, videos, and other files. We also saw an explosion in reports of online enticement, an increase of more than 300% between 2021-2023.”
That same year, the Internet Watch Foundation (IWF) investigated a total of 392,660 reports of suspected CSAM.
In 2023, the IWF took action against 254,070 websites to remove user-created content where a child was coerced, blackmailed, tricked, or groomed into performing sexually via a webcam.
The full list of statistics is so staggering that sharing them all would warrant a dedicated article of its own.
Leveraging the war
The fact is that children are losing the war against exploitation, and the casualties of innocence lie squarely on the conscience of the tech industry. That’s why it is up to everyday people like you and me to lobby and advocate for online child safety.
Congress is already actively considering five legislative items that could turn the tide of this war. These were proposed by NCMEC and are viewable on its webpage, providing a framework we can adapt for our own participation in this battle.
Tech companies, from social media platforms to email service providers and internet service providers, could implement a CSAM-detection framework similar to Apple’s to identify and flag this material. The industry must create a hostile environment for these bad actors, and legal ramifications must become severe enough to act as a deterrent.
You can help accomplish this by contacting your members of the House of Representatives and asking them to hold the tech industry accountable. Also contact your state’s senators.
Draft your request and send it, then recruit others to do the same in their respective states. Publicize your initiative on social media to draw more people to your cause.
UK citizens can petition Parliament and the government. US citizens can launch a petition campaign through Change.org.
Attend the Crimes Against Children conference and network with tech companies, law enforcement, and non-profit organizations that have joined forces in fighting back. There are a variety of related conferences around the world.
These groups do not work with “cyber vigilantes,” and law enforcement is likewise reluctant to do so. That is why networking as an individual is crucial for building the trust and credibility you will need if you hope to work with them.
Most of all, exercise due diligence in monitoring your children’s online activities. If you see something, report it. Together, we can secure a new era of a safe internet for children.