Last week, the Ninth Circuit held in United States v. Wilson, No. 18-50440, 2021 WL 4270847, that a law enforcement officer violated a criminal defendant’s Fourth Amendment rights when he opened images attached to the defendant’s emails without a warrant, even though the images had previously been flagged as child sexual abuse materials (“CSAM”) by Google’s automated CSAM-detection software.  The court based its ruling on the private search exception to the Fourth Amendment, which permits law enforcement to conduct a warrantless search only to the extent the search was previously conducted by a private party.  Because no individual at Google actually opened and viewed the images flagged as CSAM, the court held that law enforcement “exceeded the scope of the antecedent private search,” thereby “exceed[ing] the limits of the private search exception.”  Op. at 20-21.

Because the Fourth Amendment applies only to searches conducted by the government, “a private party may conduct a search that would be unconstitutional if conducted by the government.”  Op. at 13.  If the private party later provides the fruit of that search to the government, the private search doctrine permits the government to repeat the search without a warrant.  In United States v. Jacobsen, 466 U.S. 109 (1984), the Supreme Court explained that a warrantless government search may be permissible under the private search doctrine if the search does not exceed the scope of an antecedent private search, in that the government “learn[s] nothing that had not previously been learned during the private search.”  Id. at 120.

Circuit courts are split on how to apply the private search doctrine in the context of CSAM reporting.  Federal law requires an electronic communication service provider with actual knowledge of a violation of child pornography laws to file a report with the National Center for Missing and Exploited Children (“NCMEC”), which forwards the report to law enforcement.  See 18 U.S.C. § 2258A.  Although providers are not required to proactively “search, screen, or scan” for CSAM on their platforms, id. § 2258A(f)(3), many choose to do so using automated CSAM-detection software.

Here, as the Ninth Circuit explained, Google’s report to NCMEC was “based on an automated assessment that the images [the defendant] uploaded were the same as images other Google employees had earlier viewed and classified as child pornography,” and NCMEC forwarded the report to the San Diego Internet Crimes Against Children Task Force.  Op. at 5.  Neither Google nor NCMEC personnel opened or viewed the reported images.  Id.  But when NCMEC forwarded Google’s report to law enforcement, the investigating officer opened and viewed the images, without a warrant, and relied on descriptions of the images in applying for warrants to search Wilson’s email account and home, which ultimately led to his conviction.  Id. at 5-6.

The Ninth Circuit found a “large gap between the information in [Google’s report] and the information the government obtained and used to support the warrant application and to prosecute Wilson.”  Id. at 24.  The court considered the process by which Google’s automated software scanned for CSAM and concluded that, because “no identified Google employee ‘knew and could say’ what th[e] images showed,” and the report merely communicated that “the four images Wilson uploaded to his email account matched images previously identified by some Google employee at some time in the past as child pornography,” the government exceeded the scope of the private search by opening the email attachments and viewing the images without a warrant.  Id. at 22, 24.

The Ninth Circuit found the “critical fact” to be that no Google employee had viewed the images before the police officer did, relying on the “clear holding of Jacobsen” to conclude that “[w]hen the government views anything other than the specific materials that a private party saw during the course of a private search, the government search exceeds the scope of the private search.”  Id. at 28.

The court acknowledged that its decision “contribute[s] to a growing tension in the circuits about the application of the private search doctrine to the detection of child pornography.”  Id. at 31.  While the Tenth and Eleventh Circuits have adopted a similar interpretation of the private search doctrine, see United States v. Ackerman, 831 F.3d 1292 (10th Cir. 2016); United States v. Sparks, 806 F.3d 1323 (11th Cir. 2015), the Fifth and Sixth Circuits have concluded that a law enforcement officer may open and view images reported as CSAM without violating the Fourth Amendment, even if the reporting provider and NCMEC have not previously viewed those images, see United States v. Miller, 982 F.3d 412 (6th Cir. 2020); United States v. Reddick, 900 F.3d 636 (5th Cir. 2018).  The U.S. Department of Justice has not yet indicated whether it will petition for rehearing or seek Supreme Court review to address this circuit split.

Megan Crowley

Megan Crowley is a litigator who represents clients in high-stakes matters, from case inception through trial and appeal. Her practice focuses on complex commercial disputes and litigation under the Administrative Procedure Act. Megan currently represents several leading technology companies in cutting-edge litigation relating to cybersecurity and data privacy.

Megan rejoined Covington from the U.S. Department of Justice, where she defended executive branch agencies in some of their most high-profile cases. Drawing upon this experience, she has secured a number of landmark victories against the federal government in recent years. Megan was a key member of the Covington team that represented TikTok in its successful challenge to the Trump Administration’s efforts to ban the app, and its defense of the district court’s injunction on appeal. She also represented Xiaomi Corporation in its successful challenge to the Department of Defense designation that would have banned the company from U.S. financial markets, securing a preliminary injunction and, ultimately, a rescission of the ban.

Chloe Goodwin

Chloe Goodwin is a litigator and regulatory attorney focused on privacy and technology issues. She represents several leading technology companies in litigation and compliance matters relating to electronic surveillance, law enforcement access to digital evidence, cybersecurity, and data privacy.