Will Apple CSAM Scans Open Up More People to Being Framed for Child Porn?

By: Bill Henry
Published Aug 26, 2021
4 minute read

Tech giant Apple recently announced it will soon scan user images for child pornography. The company will also report users who have child sexual abuse material (CSAM) to authorities. Child pornography charges carry severe, lifelong penalties.

In this article, we’ll explore some of the legal implications this scanning technology could have if it is misused by malicious individuals. We’ll also explain how Apple’s scanning technology works.

Talk with a Criminal Defense Lawyer

You should speak with a lawyer if you face child pornography charges. Having sexually exploitative images of children is a felony and carries serious penalties, such as prison time, significant fines, parole, and sex offender registration.

A child pornography conviction will follow you for the rest of your life. Call 303-688-0944 to set up a case assessment or schedule the meeting yourself when you click here.

Scanning for Child Pornography

Apple will report users who have a collection of child pornographic images to the National Center for Missing and Exploited Children (NCMEC).

This announcement garnered the ire of privacy and security experts, researchers, and organizations who penned an open letter to the company. The letter has since been signed by thousands of consumers who oppose the technology that they say will undermine user privacy.

Apple disputes opponents’ opinions, saying the scanning application was developed in a way that protects user privacy – unless, of course, Apple discovers you tried to upload a collection of child pornography to iCloud.

Here’s how Apple says the CSAM scanning program works:
1. Apple receives hashes (unreadable digital fingerprints) of known child pornography images from the National Center for Missing and Exploited Children. That hash database is added to Apple’s operating systems.
2. Before a user’s images are uploaded to iCloud, they are compared against the database of known CSAM hashes. If the scan identifies any matches, the matched images then undergo another assessment, called the “threshold test,” to determine how many matches the account holds.
3. Apple is alerted only if a user has 30 or more matches to known CSAM images stored on their device, as the rough sketch after this list illustrates.
4. The final step in the process is to confirm whether the matches are indeed sexually explicit images of children. When user images reach this stage, they are decrypted and an Apple employee confirms whether they contain CSAM.
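Here is a minimal sketch, in Python, of the “match, then count against a threshold” idea described above. It is not Apple’s actual implementation: Apple’s system (NeuralHash) uses a perceptual hash wrapped in cryptographic protections, and the function names, the SHA-256 stand-in hash, and the fingerprint format here are illustrative assumptions only.

```python
# A minimal sketch of the matching-and-threshold logic described above.
# NOT Apple's implementation: Apple's NeuralHash is a perceptual hash wrapped in
# cryptographic protocols. The fingerprint() helper, the SHA-256 stand-in, and
# all names here are hypothetical, for illustration only.
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # Apple's stated threshold before any human review

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(user_images: Iterable[bytes],
                  known_csam_fingerprints: Set[str]) -> int:
    """Count how many of the user's photos match the known-CSAM fingerprint set."""
    return sum(1 for img in user_images
               if fingerprint(img) in known_csam_fingerprints)

def should_escalate_for_human_review(user_images: Iterable[bytes],
                                     known_csam_fingerprints: Set[str]) -> bool:
    """Only accounts at or above the threshold would be surfaced for review."""
    return count_matches(user_images, known_csam_fingerprints) >= MATCH_THRESHOLD
```

In Apple’s actual design, the company says the comparison is wrapped in cryptography so match results cannot be read until the threshold is crossed; the sketch above only captures the counting logic.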

What Happens if an Apple User Has CSAM

Apple users found to have a collection of pornographic images of children will have their accounts disabled. They will also be reported to the National Center for Missing and Exploited Children, which works with law enforcement to prosecute individuals who possess child pornography.

Can Apple Make a Mistake?

Of course, it can. But Apple says there is a one in one trillion chance someone’s account will be incorrectly flagged for having illegal images.

The tech giant insists this program was created with user privacy in mind. The company has underscored that user pictures remain private unless they have matches that exceed the threshold of known images of child sexual abuse.

Digging into the Legal Implications

Our article so far has provided a simple explanation of Apple’s image-scanning process. Now that you have a basic understanding of it, let’s dig into some of the potential legal implications.

Can Innocent People Be Framed? 

In addition to privacy concerns, others are worried Apple’s scanning creates an opportunity for more people to be framed for child pornography.

If you think this is totally out of the realm of possibility, think again.

The Case of the Retaliatory Wife

In 2019, an Arkansas woman pled guilty to viewing, possessing, and distributing child pornography after she planted CSAM images on her husband’s phone in an effort to frame him.

Cherie R. Bolton turned in her husband to the police, but inconsistencies in the woman’s story led authorities to discover she had actually downloaded the CSAM images herself – not her husband – and made up the whole thing.

Police believe she concocted the scheme to retaliate after her husband kicked her out of the house.

In this particular case, the perpetrator alerted police to the illegal images. But what if someone puts the images on your phone and leaves them there for Apple to discover? If that happens, you could find yourself in a serious legal conundrum.

One Expert’s Warning

A Johns Hopkins University associate professor has been outspoken about this very possibility. Matthew D. Green is a “nationally recognized expert on applied cryptography and cryptographic engineering,” according to his university profile.

Green’s concerns, however, are a little different from someone putting illegal pictures directly onto your phone, as in the Arkansas retaliatory wife case. Instead, Green says it is possible for Apple’s algorithm to be fooled by a benign image that is designed to be detected as child pornography.

“Researchers have been able to do this pretty easily,” Green has told numerous tech industry publications and media outlets.
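To see why that worries researchers, consider a deliberately crude toy example. The toy_perceptual_hash function below is not NeuralHash or any real perceptual hash; it simply rounds grayscale pixels to bright or dark. But it illustrates the underlying point: a hash compresses an image into a small fingerprint, so two very different images can share a fingerprint, and the matching step only ever sees fingerprints.

```python
# Toy collision demo (not Apple's algorithm): a crude "perceptual hash" that
# only keeps whether each pixel is bright or dark. Two unrelated "images"
# produce the same fingerprint, so one would be flagged as matching the other.
def toy_perceptual_hash(pixels):
    """Map a list of 0-255 grayscale pixels to a coarse bright/dark fingerprint."""
    return tuple(1 if p >= 128 else 0 for p in pixels)

flagged = {toy_perceptual_hash([200, 30, 220, 40])}      # fingerprint of a "known" image

different_image = [255, 10, 129, 100]                    # a completely different picture
print(toy_perceptual_hash(different_image) in flagged)   # prints: True
```

Real perceptual hashes are far harder to collide with by accident; Green’s concern is that someone who studies the algorithm could engineer such collisions deliberately, planting benign-looking images that trip the scanner.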

Colorado’s Child Pornography Charges

Individuals convicted of child porn charges face years in prison, sexual offender registration, fines, being labeled a felon, and more. The impact of this charge extends far beyond a prison term.

If you are caught possessing child pornography in Colorado, you can be charged with sexual exploitation of a child. Anyone under the age of 18 qualifies as a child. You can read the full text of the law here.

Possessing child pornography is a class 5 or class 4 felony, depending on the circumstances of the case. For instance, if you possess a video of CSAM, the charge increases to a class 4 felony. Likewise, if this isn’t your first CSAM offense, the charge is a class 4 felony.

In Colorado, people convicted of a class 5 felony face up to three years in prison. Class 4 felonies are punishable by up to six years in prison.

Defenses to this Charge

There are a number of ways to defend a sexual exploitation of a child charge. Which ones your attorney can use to build your defense depends on the unique facts of your case.

Colorado law says that “a person commits sexual exploitation of a child if, for any purpose, he or she knowingly: … possesses or controls any sexually exploitative material for any purpose.” CRS § 18-6-403(3)(b.5)

You and your attorney will discuss whether you were aware the images were on your device, and, if not, how they could have gotten on it.

Malware 

Your defense team may hire a forensic expert to explore whether the pictures and/or videos were placed there by another person or possibly through some type of malware called a Trojan or Trojan horse.

This malicious software or code can get onto your devices in various ways. Courts and attorneys often scoff at the so-called Trojan defense, in part because actual pedophiles have used it in an effort to dodge a conviction.

Phil Malone, a former director of Harvard Law’s Cyberlaw Clinic, once compared the Trojan defense to the “dog ate my homework” excuse.

“The problem is,” Malone said, “sometimes the dog does eat your homework.”

Someone Else Downloaded It

Do you use a shared computer or network? If so, could someone else have downloaded the material and now be pointing the finger at you?

Again, a forensic expert will investigate to determine, for instance, when the images or videos were downloaded.

Get a Lawyer for Child Pornography Charges

If you’re being accused of possessing child pornography, schedule a case assessment with a member of our Criminal Defense Team.

During this meeting, you and a criminal defense attorney will discuss the facts of your case, what your legal options and obligations are, how much defense may cost, and more.

You’ve got a lot on the line right now. Let’s get started building a strong defense for you.

More Than Just Lawyers. Lawyers for Your Life.

Learn more about our law firm’s philosophy and values.