NetChoice, LLC v. Bonta, No. 5:2022cv08861 - Document 74 (N.D. Cal. 2023)

Court Description: ORDER GRANTING 29 MOTION FOR PRELIMINARY INJUNCTION. Signed by Judge Beth Labson Freeman on 9/18/2023. (mdllc, COURT STAFF) (Filed on 9/18/2023)

UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
SAN JOSE DIVISION

NETCHOICE, LLC, d/b/a NetChoice, Plaintiff,

v.

ROB BONTA, Attorney General of the State of California, in his official capacity, Defendant.

Case No. 22-cv-08861-BLF

ORDER GRANTING MOTION FOR PRELIMINARY INJUNCTION

[Re: ECF 29]

This suit challenges the enforceability of the California Age-Appropriate Design Code Act (“the CAADCA” or “the Act”), which was recently enacted for the stated purpose of affording protections to children when they access the internet. See Cal. Civ. Code § 1798.99.29.[1] The Act applies to for-profit businesses that collect consumers’ personal information and satisfy other criteria relating to business size and revenue. See CAADCA § 30; Cal. Civ. Code § 1798.140. Effective July 1, 2024, the Act imposes a number of requirements on any covered business that “provides an online service, product, or feature likely to be accessed by children.” CAADCA § 31.

Plaintiff NetChoice, LLC (“NetChoice”) “is a national trade association of online businesses that share the goal of promoting free speech and free enterprise on the Internet.” Compl. ¶ 5, ECF 1. NetChoice’s members include Google, Amazon, Meta, TikTok and many other companies with strong online presences. NetChoice sues Defendant Rob Bonta, Attorney General of the State of California (“the State”), for declaratory and injunctive relief related to the CAADCA, which it asserts is both facially unconstitutional and preempted by federal statute.

[1] The CAADCA is codified at California Civil Code §§ 1798.99.28–1798.99.40. When citing to the Act, the Court will cite to the statute’s abbreviated title and last two digits. For example, the Court will cite to Cal. Civil Code § 1798.99.31 as “CAADCA § 31.”

NetChoice moves for preliminary injunction based on its claims that the CAADCA violates the First Amendment and the dormant Commerce Clause of the United States Constitution, and is preempted by both the Children’s Online Privacy Protection Act (“COPPA”), 15 U.S.C. §§ 6501–6506, and Section 230 of the Communications Decency Act, 47 U.S.C. § 230. See Mot., ECF 29. The State opposes the motion, arguing that the CAADCA regulates conduct—the collection and use of children’s personal information—that does not implicate the First Amendment. See Opp’n, ECF 51. The State also contends that the CAADCA does not violate the dormant Commerce Clause and is not preempted by either COPPA or Section 230. See id.

Mindful that the CAADCA was enacted with the unanimous support of California’s Legislature and Governor, the Court has given careful consideration to the motion, the State’s opposition, NetChoice’s reply, the supplemental briefs filed by both parties, the briefs filed by seven sets of amici curiae, and the oral arguments presented at the hearing on July 27, 2023. The Court finds that although the stated purpose of the Act—protecting children when they are online—clearly is important, NetChoice has shown that it is likely to succeed on the merits of its argument that the provisions of the CAADCA intended to achieve that purpose do not pass constitutional muster. Specifically, the Court finds that the CAADCA likely violates the First Amendment. The motion for preliminary injunction is GRANTED on that basis.

I. BACKGROUND

The internet has become indispensable to the exchange of information. Many online providers allow users to view content and access services without creating an account, while others require the creation of a free account to access services, and still others require users to pay fees. See Cairella Decl. ¶¶ 4–8, ECF 22; Masnick Decl. ¶¶ 5–6, ECF 29; Roin Decl. ¶¶ 7–9, ECF 25; Paolucci Decl. ¶ 2, ECF 28. Online providers generally rely on advertising to earn revenue that supports the content and services they offer. See Cairella Decl. ¶¶ 4, 21; Roin Decl. ¶ 10. Advertisements are targeted to users based on their interests, which are gleaned from data collected from the users while they are online. See Egelman Decl. ¶¶ 13–14, ECF 51-1. Such data also is used by online providers to tailor content to individual users. See Cairella Decl. ¶ 8; Roin Decl. ¶¶ 2–6. In addition, online providers may sell user data to third parties. See Egelman Decl. ¶ 11.

Users can manage their online privacy by reading privacy policies before engaging with the provider’s services. See Egelman Decl. ¶ 24. Users also may change their privacy settings to block or delete “cookies,” which are data that websites store in consumers’ web browsers, which are then transmitted back to websites when visited again. See id. ¶ 29. However, privacy policies can be difficult to understand and privacy settings are not always user friendly. See id. ¶¶ 24–30.

These privacy concerns have become increasingly relevant to children, because their internet use has grown dramatically in recent years. See Radesky Decl. ¶¶ 21–25, ECF 51-5. During the COVID-19 pandemic, children’s access to digital technology and time online went up significantly. See id. ¶ 26.
Children’s time online increased approximately 52% during the pandemic, and heavier technology use habits have persisted. See id. Children depend on the internet for both educational and entertainment purposes. See id. ¶¶ 26–29. Unplugging is not a viable option. See id. ¶ 29.

A federal child privacy law, COPPA, limits the ability of online providers to collect personal information from children. See 15 U.S.C.A. §§ 6501–06. COPPA makes it “unlawful for an operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting personal information from a child, to collect personal information from a child in a manner that violates the regulations prescribed” under the statute. 15 U.S.C. § 6502(a)(1). “Child” is defined as an individual under the age of 13. 15 U.S.C. § 6501(1). The applicable regulations require the operator to obtain parental consent prior to any collection, use, or disclosure of personal information from children. See 16 C.F.R. § 312.3(b).

The California Consumer Privacy Act (“CCPA”) imposes limits on the collection of personal information from users generally, requiring among other things that online providers inform users of the categories of personal information to be collected and the purposes of such collection. See Cal. Civ. Code § 1798.100(a)(1). The CCPA defines “personal information” to include any information that “relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” Cal. Civ. Code § 1798.140(v).

It is against this backdrop that the CAADCA was enacted. The CAADCA goes far beyond the scope of protections offered by COPPA and the CCPA.
Whereas COPPA limits the collection of user data by operators of websites and services “directed to children,” 15 U.S.C. § 6502(a)(1), the CAADCA “declares that children should be afforded protections not only by online products and services specifically directed at them but by all online products and services they are likely to access,” CAADCA § 29. COPPA protects children under the age of 13, see 15 U.S.C. § 6501(1), while the CAADCA protects children under the age of 18, see CAADCA § 30(b)(1). COPPA gives parents authority to make decisions about use of their children’s personal information, see 16 C.F.R. § 312.3(b), and the CCPA gives users authority to make decisions about their own personal information, see Cal. Civ. Code § 1798.135. In contrast, the CAADCA requires online providers to create a Data Protection Impact Assessment (“DPIA”) report identifying, for each offered online service, product, or feature likely to be accessed by children, any risk of material detriment to children arising from the provider’s data management practices. See CAADCA § 30(a)(1). Providers must create a “timed plan to mitigate or eliminate” the risks identified in the DPIA “before the online service, product, or feature is accessed by children,” id. § 30(a)(2), and must provide the DPIA reports to the California Attorney General upon written request, see id. § 30(a)(2). The CAADCA also requires that online providers comply with a list of enumerated mandates and prohibitions, discussed in detail below. See id. § 31(a)–(b).

Covered businesses must complete the required DPIA reports and satisfy related requirements by July 1, 2024, and continue to do so on an ongoing basis. See CAADCA §§ 31, 33. The CAADCA authorizes the California Attorney General to bring a civil enforcement action against any business that fails to comply with the Act’s requirements. See id. § 35.
Violators are subject to civil penalties of $2,500 per child for each negligent violation and $7,500 for each intentional violation. See id.

NetChoice filed this suit on December 14, 2022, challenging the CAADCA as facially unconstitutional and preempted by federal statute. The complaint asserts the following claims: (1) violation of the First and Fourteenth Amendments to the U.S. Constitution, and Article I, Section 2(a) of the California Constitution; (2) violation of the Fourth Amendment to the U.S. Constitution; (3) void for vagueness under the First Amendment and Due Process Clause of the U.S. Constitution, and Article I, Section 7(a) of the California Constitution; (4) violation of the dormant Commerce Clause of the U.S. Constitution; (5) preemption by COPPA; and (6) preemption by Section 230. Compl. ¶¶ 76–122. The complaint requests declaratory and injunctive relief prohibiting enforcement of the CAADCA.

NetChoice now seeks a preliminary injunction enjoining enforcement of the CAADCA pending disposition of the suit.

II. LEGAL STANDARD

“Courts consider four factors in deciding whether to grant a preliminary injunction: the plaintiff’s likelihood of success on the merits; her likelihood of suffering irreparable harm in the absence of preliminary relief; whether the balance of equities tips in her favor; and whether an injunction is in the public interest.” Garcia v. City of Los Angeles, 11 F.4th 1113, 1118 (9th Cir. 2021) (citing Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 20 (2008)).

In this circuit, “[l]ikelihood of success on the merits is the most important factor.”[2] Apartment Ass’n of L.A. Cnty., Inc. v. City of Los Angeles, 10 F.4th 905, 912 (9th Cir. 2021) (quoting California v. Azar, 911 F.3d 558, 575 (9th Cir. 2018)).
“It is well-established that the first factor is especially important when a plaintiff alleges a constitutional violation and injury.” Baird v. Bonta, --- F.4th ----, 2023 WL 5763345, at *3 (9th Cir. Sept. 7, 2023). “If a plaintiff in such a case shows he is likely to prevail on the merits, that showing usually demonstrates he is suffering irreparable harm no matter how brief the violation.” Id. Finally, “[w]hen, like here, the nonmovant is the government, the last two Winter factors merge.” Id. at *2 (quotation marks and citation omitted).

[2] Where the plaintiff cannot show a likelihood of success on the merits, “‘serious questions going to the merits’ and a hardship balance that tips sharply toward the plaintiff can support issuance of an injunction, assuming the other two elements of the Winter test are also met.” All. for the Wild Rockies v. Cottrell, 632 F.3d 1127, 1132 (9th Cir. 2011). The Court need not apply this alternative formulation of the Winter test here because, as discussed below, NetChoice makes a strong showing on likelihood of success and on the other Winter factors.

III. DISCUSSION

A. Likelihood of Success on the Merits

NetChoice argues that it is likely to succeed on the merits of its claims that the Act violates free speech rights under the First Amendment (Claims 1 and 3), violates the dormant Commerce Clause (Claim 4), and is preempted by both COPPA (Claim 5) and Section 230 (Claim 6). See Mot. 1; Compl. ¶¶ 76–122.

1. First Amendment (Claims 1 and 3)

Claim 1 asserts that the CAADCA violates the First Amendment because it is an unlawful prior restraint on protected speech, is unconstitutionally overbroad, and regulates protected expression but fails strict scrutiny or any lesser standard of scrutiny that may apply. See Compl. ¶¶ 76–88.
Claim 3 asserts that the CAADCA is void for vagueness under the First Amendment. See id. ¶¶ 93–103. NetChoice argues that it is likely to succeed on its First Amendment claims because the CAADCA: (1) is an unlawful prior restraint; (2) is unconstitutionally overbroad; (3) is void for vagueness; and (4) is subject to and fails strict scrutiny. Mot. 7–22.

Before taking up these arguments, the Court notes that both parties appear to have accepted the relaxed standard for standing in a First Amendment facial challenge. That is, although the general rule of standing is that a party may not challenge a statute’s constitutionality “on the ground that it may conceivably be applied unconstitutionally to others,” Broadrick v. Oklahoma, 413 U.S. 601, 610 (1973), a party making a First Amendment claim has standing to challenge the impact of a regulation on both “its own expressive activities, as well as those of others,” S.O.C. Inc. v. County of Clark, 152 F.3d 1136, 1142 (9th Cir. 1998). Accordingly, the parties have made—and the Court will consider—arguments about the CAADCA’s alleged impact on the expressive activities of individuals and entities who are not NetChoice members.

Turning to NetChoice’s four First Amendment arguments on likelihood of success, the Court first addresses the argument that the Act regulates protected expression and fails the applicable level of scrutiny. Because the argument is dispositive, the Court need not address NetChoice’s additional First Amendment arguments based on prior restraint, overbreadth, and vagueness.

a. Legal Framework re Scrutiny for Regulations of Speech

“The First Amendment generally prevents government from proscribing speech, [] or even expressive conduct, [] because of disapproval of the ideas expressed.” R.A.V. v. City of St. Paul, 505 U.S. 377, 382 (1992) (internal citations omitted). A law compelling speech is no less subject to First Amendment scrutiny than a law prohibiting speech. Frudden v. Pilling, 742 F.3d 1199, 1203 (9th Cir. 2014) (citing W. Va. State Bd. of Educ. v. Barnette, 319 U.S. 624, 633–34 (1943)).

The threshold question in a free speech analysis is whether the challenged law invokes the First Amendment at all. See Int’l Franchise Ass’n v. City of Seattle, 803 F.3d 389, 408 (9th Cir. 2015). “All manner of speech—from ‘pictures, films, paintings, drawings, and engravings,’ to ‘oral utterance and the printed word’—qualify for the First Amendment’s protections; no less can hold true when it comes to speech . . . conveyed over the Internet.” 303 Creative LLC v. Elenis, 600 U.S. —, 143 S. Ct. 2298, 2312 (2023) (citations omitted). That is, the First Amendment’s protections apply not only to written or verbal speech, but to any expressive conduct. See, e.g., Ward v. Rock Against Racism, 491 U.S. 781, 790 (1989) (“Music, as a form of expression and communication, is protected under the First Amendment.”). In determining whether a law regulates protected expression, courts evaluate “whether [activity] with a ‘significant expressive element’ drew the legal remedy or the ordinance has the inevitable effect of ‘singling out those engaged in expressive activity.’” Int’l Franchise, 803 F.3d at 408 (quoting Arcara v. Cloud Books, Inc., 478 U.S. 697, 706–07 (1986)). For example, a tax on paper and ink that in effect “single[s] out the press for special treatment” regulates protected expression, although the application of a general sales tax to newspapers does not. See Minneapolis Star & Tribune Co. v. Minn. Comm’r of Revenue, 460 U.S. 575, 581–82 (1983). A regulation that restricts conduct without a “significant expressive element” is not subject to any level of First Amendment scrutiny. See HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 684 (9th Cir. 2019); see also Sorrell v. IMS Health Inc., 564 U.S. 552, 567 (2011) (“[T]he First Amendment does not prevent restrictions directed at commerce or conduct from imposing incidental burdens on speech.”).

If a court finds that a challenged law regulates some manner of protected expression, it must then “determine the scope of the [regulated] speech” in order to apply the appropriate level of scrutiny. Yim v. City of Seattle, 63 F.4th 783, 791 (9th Cir. 2023). There are several levels of scrutiny that may apply, depending on the type of expression at issue.

i. Strict Scrutiny

If the challenged regulation restricts only non-commercial speech, the level of scrutiny depends on whether the law is content based or content neutral. “Government regulation of speech is content based if a law applies to particular speech because of the topic discussed or the idea or message expressed,” that is, if the regulation “draws distinctions based on the message a speaker conveys.” Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015) (citations omitted). A law is also content based if, even though facially neutral, it “cannot be justified without reference to the content of the regulated speech, or . . . were adopted by the government because of disagreement with the message the speech conveys.” Id. at 164 (internal punctuation marks and citation omitted). If the court determines a law is content based, it applies strict scrutiny, “regardless of the government’s benign motive, content-neutral justification, or lack of ‘animus toward the ideas contained’ in the regulated speech.” Porter v. Martinez, 68 F.4th 429, 439 (9th Cir. 2023) (citations omitted).
Strict scrutiny “requires the Government to prove that the restriction furthers a compelling interest and is narrowly tailored to achieve that interest.” Reed, 576 U.S. at 171; see also Berger v. City of Seattle, 569 F.3d 1029, 1050 (9th Cir. 2009) (“Under that standard [of strict scrutiny], the regulation is valid only if it is the least restrictive means available to further a compelling government interest.”) (citing United States v. Playboy Ent. Grp., Inc., 529 U.S. 803, 813 (2000)).

ii. Intermediate Scrutiny

“By contrast, a content-neutral regulation of [non-commercial] expression must meet the less exacting standard of intermediate scrutiny.” Porter, 68 F.4th at 439 (citation omitted). Under this lower standard, “a regulation is constitutional ‘if it furthers an important or substantial governmental interest; if the governmental interest is unrelated to the suppression of free expression; and if the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.’” Id. (quoting United States v. O’Brien, 391 U.S. 367, 377 (1968)).

iii. Commercial Speech Scrutiny

If a statute regulates only commercial speech—i.e., “‘expression related solely to the economic interests of the speaker and its audience’” that “does no more than propose a commercial transaction,” Am. Acad. of Pain Mgmt. v. Joseph, 353 F.3d 1099, 1106 (9th Cir. 2004) (citations omitted)—the court applies commercial speech scrutiny[3] as established by Central Hudson Gas & Electric Corp. v. Public Service Commission of New York, 447 U.S. 557 (1980). First, commercial speech is not entitled to any First Amendment protection if it is misleading or related to illegal activity. Cent. Hudson, 447 U.S. at 563–64; see also, e.g., Thompson v. W. States Med. Ctr., 535 U.S. 357, 367 (2002). For all other commercial speech, the court asks “whether the asserted governmental interest is substantial,” “whether the regulation directly advances the governmental interest,” and “whether [the regulation] is not more extensive than is necessary to serve that interest.” Retail Digital Network, LLC v. Prieto, 861 F.3d 839, 844 (9th Cir. 2017) (quoting Cent. Hudson, 447 U.S. at 566). The regulation is constitutional only if the answer to all three questions is “yes.” See id. This analysis applies to commercial speech regardless of whether the regulation is content based or content neutral. Yim, 63 F.4th at 793 n.14 (citing Valle Del Sol, Inc. v. Whiting, 709 F.3d 808, 820 (9th Cir. 2013)).

iv. Scrutiny where Commercial and Non-Commercial Speech is Inextricably Intertwined

Finally, if a law regulates expression that “inextricably intertwines” commercial and noncommercial components, the court does not “apply[] one test to one phrase and another test to another phrase,” but instead treats the entire expression as non-commercial speech and applies the appropriate level of scrutiny. Riley v. Nat’l Fed’n of the Blind of N.C., Inc., 487 U.S. 781, 796 (1988) (applying strict scrutiny to content-based regulation of solicitation of charitable contributions by professional fundraisers while assuming professional fundraiser’s financial motivation for solicitation intertwined commercial interest with non-commercial advocacy).

[3] The Court will use the phrase “commercial speech scrutiny” in this order to refer to the “intermediate scrutiny standard codified in Central Hudson.” Yim, 63 F.4th at 793.

With these principles in mind, the Court now assesses whether NetChoice has shown that it
is likely to succeed both in establishing that the CAADCA regulates protected expression, and in establishing that the CAADCA fails the applicable level of scrutiny.

b. Protected Expression or Non-Expressive Conduct

NetChoice argues that the CAADCA regulates speech by requiring internet content providers to take various actions to protect minors from harmful messages, such as making content-based assessments about potential harm to minors in order to comply with the DPIA requirement, and necessarily reviewing content to adhere to the Act’s content policy enforcement provision. See Mot. 19–21. The State argues that the Act merely regulates business practices regarding the collection and use of children’s data, so that its restrictions are only of nonexpressive conduct that is not entitled to First Amendment protection. See Opp’n 10–12. The State further contends that the Act does not restrict speech because it does not prevent any particular content from being shown to a minor—even if the content provider knows it would be harmful—as long as the content provider does not use the minor’s personal information to do so. See id. at 12.

In evaluating whether the CAADCA regulates protected expression, the Court first notes that determining whether the statute applies to a business will often require viewing the content of the online service, product, or feature to evaluate whether it is “likely to be accessed by children” because, for example, it contains “advertisements marketed to children.” CAADCA §§ 29(b)(4)(C), 31(a). But having to view content to determine whether the statute applies does not by itself mean that the statute regulates speech. See, e.g., Am. Soc’y of Journalists & Authors, Inc. v. Bonta, 15 F.4th 954, 960–61 (9th Cir. 2021) (finding law classifying workers as employees or independent contractors based on criteria including whether worker’s output was “to be appreciated primarily or solely for its imaginative, aesthetic, or intellectual content” did not regulate speech) (citing Cal. Labor Code § 2778(b)(2)(F)(ii)). The question is whether the law at issue regulates expression “because of its message, its ideas, its subject matter, or its content.” Id. at 960 (quoting Reed, 576 U.S. at 163). The Court will evaluate this question first with respect to those portions of the statute that prohibit certain actions, see CAADCA § 31(b), and then turn to the sections of the statute mandating specific acts, see id. § 31(a).

i. The Act’s Prohibitions (CAADCA § 31(b))

The CAADCA’s prohibitions forbid the for-profit entities covered by the Act from engaging—with some exceptions—in the collection, sale, sharing, or retention of children’s personal information, including precise geolocation information, for profiling or other purposes. See generally id. § 31(b). The State argues that the CAADCA’s regulation of “collection and use of children’s personal information” is akin to laws that courts have upheld as regulating economic activity, business practices, or other conduct without a significant expressive element. Opp’n 11–12 (citations omitted). There are two problems with the State’s argument. First, none of the decisions cited by the State for this proposition involved laws that, like the CAADCA, restricted the collection and sharing of information. See id.; Rumsfeld v. Forum for Acad. & Inst. Rights, Inc., 547 U.S.
47, 66 (2006) (statute denying federal funding to educational institutions restricting military recruiting did not regulate “inherently expressive” conduct because expressive nature of act of preventing military recruitment necessitated explanatory speech); Roulette v. City of Seattle, 97 F.3d 300, 305 (9th Cir. 1996) (ordinance prohibiting sitting or lying on sidewalk did not regulate “forms of conduct integral to, or commonly associated with, expression”); Int’l Franchise, 803 F.3d at 397–98, 408 (minimum wage increase ordinance classifying franchisees as large employers “exhibit[ed] nothing that even the most vivid imagination might deem uniquely expressive”) (citation omitted); HomeAway.com, 918 F.3d at 680, 685 (ordinance regulating forms of short-term rentals was “plainly a housing and rental regulation” that “regulate[d] nonexpressive conduct—namely, booking transactions”); Am. Soc’y of Journalists & Authors, 15 F.4th at 961–62 (law governing classification of workers as employees or independent contractors “regulate[d] economic activity rather than speech”).

Second, in a decision evaluating a Vermont law restricting the sale, disclosure, and use of information about the prescribing practices of individual doctors—which pharmaceutical manufacturers used to better target their drug promotions to doctors—the Supreme Court held the law to be an unconstitutional regulation of speech, rather than conduct. Sorrell, 564 U.S. at 557, 562, 570–71. The Supreme Court noted that it had previously held the “creation and dissemination of information are speech within the meaning of the First Amendment,” 564 U.S. at 570 (citing Bartnicki v. Vopper, 532 U.S. 514, 527 (2001); Rubin v. Coors Brewing Co., 514 U.S. 476, 481 (1995); Dun & Bradstreet, Inc. v. Greenmoss Builders, Inc., 472 U.S. 749, 759 (1985) (plurality opinion)), and further held that even if the prescriber information at issue was a commodity, rather than speech, the law’s “content- and speaker-based restrictions on the availability and use of . . . identifying information” constituted a regulation of speech, id. at 570–71; see also id. at 568 (“An individual’s right to speak is implicated when information he or she possesses is subject to ‘restraints on the way in which the information might be used’ or disseminated.”) (quoting Seattle Times Co. v. Rhinehart, 467 U.S. 20, 32 (1984)).

The State argues that Sorrell does not necessitate the conclusion that the CAADCA’s prohibitions regulate speech because Sorrell (1) does not hold that a business has a right to collect data from individuals, and (2) is generally distinguishable on the facts because the physicians described in Sorrell, whose information was collected, were willing participants in the data generation who had the power to restrict the use of their information. See July 27, 2023 Hr’g Tr. (“Tr.”) 27:16–31:13; Opp’n 11–12; see also id. 1 (“Plaintiff’s members do not have a First Amendment right to children’s personal information.”). As for the first point, the State is correct that Sorrell does not address any general right to collect data from individuals. In fact, the Supreme Court noted that the “capacity of technology to find and publish personal information . . . presents serious and unresolved issues with respect to personal privacy and the dignity it seeks to secure.” Sorrell, 564 U.S. at 579–80. But whether there is a general right to collect data is independent from the question of whether a law restricting the collection and sale of data regulates conduct or speech.
Under Sorrell, the unequivocal answer to the latter question is that a law that—like the CAADCA—restricts the “availability and use” of information by some speakers but not others, and for some purposes but not others, is a regulation of protected expression. Id. at 570–71. The State’s attempt to distinguish Sorrell based on the physicians’ ability to prevent their information from being collected, see Tr. 31:7–10, is not persuasive because the Supreme Court concluded that the law at issue regulated speech based on its restrictions on the use of the information after it was collected, without including any reasoning about the nature of the source of the information. See Sorrell, 564 U.S. at 570–71.

Accordingly, the Court finds that NetChoice is likely to succeed in showing that the Act’s prohibitions—which restrict covered businesses from “[c]ollect[ing], sell[ing], shar[ing], or retain[ing] any personal information” for most purposes, see, e.g., CAADCA § 31(b)(3)—limit the “availability and use” of information by certain speakers and for certain purposes and thus regulate protected speech.

ii. The Act’s Mandates (CAADCA § 31(a))

The Act’s ten statutory mandates are more varied than the prohibitions. See generally CAADCA §§ 31(a)(1)–(10). One of the main requirements of the Act is that companies create DPIA reports identifying, for each offered online service, product, or feature likely to be accessed by children, any risk of material detriment to children arising from the business’s data management practices. Id. §§ 31(a)(1)–(4). For example, a DPIA report must assess whether the “design of the online service, product, or feature could harm children, including by exposing children to harmful, or potentially harmful, content on the online service, product, or feature.” Id. § 31(a)(1)(B).
Each business must then create a “timed plan to mitigate or eliminate” the risks identified in the DPIA “before the online service, product, or feature is accessed by children,” id. § 31(a)(2), and provide a list of all DPIA reports and the reports themselves to the state Attorney General upon written request, id. § 31(a)(3)–(4).

The State contended at oral argument that the DPIA report requirement merely “requires businesses to consider how the product’s use design features, like nudging to keep a child engaged to extend the time the child is using the product” might harm children, and that the consideration of such features “has nothing to do with speech.” Tr. 19:14–20:5; see also id. at 23:5–6 (“[T]his is only assessing how your business models . . . might harm children.”). The Court is not persuaded by the State’s argument because “assessing how [a] business model[] . . . might harm children” facially requires a business to express its ideas and analysis about likely harm. It therefore appears to the Court that NetChoice is likely to succeed in its argument that the DPIA provisions, which require covered businesses to identify and disclose to the government potential risks to minors and to develop a timed plan to mitigate or eliminate the identified risks, regulate the distribution of speech and therefore trigger First Amendment scrutiny. See Reply 2, ECF 60; Sorrell, 564 U.S. at 570 (“This Court has held that the creation and dissemination of information are speech within the meaning of the First Amendment.”) (citations omitted).

Several sections require businesses to affirmatively provide information to users, and by requiring speech necessarily regulate it. See CAADCA § 31(a)(7) (requiring businesses “[p]rovide any privacy information . . .
concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature”); id. § 31(a)(8) (requiring that businesses “provide an obvious signal to [a] child” if the child is being tracked or monitored by a parent or guardian via an online service, product, or feature); id. § 31(a)(10) (“Provide prominent, accessible, and responsive tools to help children . . . exercise their privacy rights and report concerns.”); see also, e.g., Rubin, 514 U.S. at 481 (holding “information on beer labels” constitutes speech). The CAADCA also requires a covered business to enforce its “published terms, policies, and community standards”—i.e., its content moderation policies. CAADCA § 31(a)(9). Although the State argues that the policy enforcement provision does not regulate speech because businesses are free to create their own policies, it appears to the Court that NetChoice’s position that the State has no right to enforce obligations that would essentially press private companies into service as government censors, thus violating the First Amendment by proxy, is better grounded in the relevant binding and persuasive precedent. See Mot. 11; Playboy Ent. Grp., 529 U.S. at 806 (finding statute requiring cable television operators providing channels with content deemed inappropriate for children to take measures to prevent children from viewing content was unconstitutional regulation of speech); NetChoice, LLC v. Att’y Gen., Fla. (“NetChoice v. Fla.”), 34 F.4th 1196, 1213 (11th Cir. 2022) (“When platforms choose to remove users or posts, deprioritize content in viewers’ feeds or search results, or sanction breaches of their community standards, they engage in First-Amendment-protected activity.”); Engdahl v. City of Kenosha, 317 F. Supp. 1133, 1135–36 (E.D. Wis.
1970) (holding ordinance restricting minors from viewing certain movies based on ratings provided by Motion Picture Association of America impermissibly regulated speech).

The remaining two sections of the CAADCA require businesses to estimate the age of child users and provide them with a high default privacy setting, or forgo age estimation and provide the high default privacy setting to all users. CAADCA §§ 31(a)(5)–(6). The State argues that “[r]equiring businesses to protect children’s privacy and data implicates neither protected speech nor expressive conduct,” and notes that the provisions “say[] nothing about content and do[] not require businesses to block any content for users of any age.” Opp’n 15. However, the materials before the Court indicate that the steps a business would need to take to sufficiently estimate the age of child users would likely prevent both children and adults from accessing certain content. See Amicus Curiae Br. of Prof. Eric Goldman (“Goldman Am. Br.”) 4–7 (explaining that age assurance methods create time delays and other barriers to entry that studies show cause users to navigate away from pages), ECF 34-1; Amicus Curiae Br. of New York Times Co. & Student Press Law Ctr. (“NYT Am. Br.”) 6 (stating age-based regulations would “almost certain[ly] [cause] news organizations and others [to] take steps to prevent those under the age of 18 from accessing online news content, features, or services”), ECF 56-1. The age estimation and privacy provisions thus appear likely to impede the “availability and use” of information and accordingly to regulate speech. Sorrell, 564 U.S. at 570–71.
The Court is keenly aware of the myriad harms that may befall children on the internet, and it does not seek to undermine the government’s efforts to resolve internet-based “issues with respect to personal privacy and . . . dignity.” See Sorrell, 564 U.S. at 579; Def.’s Suppl. Br. 1 (“[T]he ‘serious and unresolved issues’ raised by increased data collection capacity due to technological advances remained largely unaddressed [in Sorrell].”). However, the Court is troubled by the CAADCA’s clear targeting of certain speakers—i.e., a segment of for-profit entities, but not governmental or non-profit entities—that the Act would prevent from collecting and using the information at issue. As the Supreme Court noted in Sorrell, the State’s arguments about the broad protections engendered by a challenged law are weakened by the law’s application to a narrow set of speakers. See Sorrell, 564 U.S. at 580 (“Privacy is a concept too integral to the person and a right too essential to freedom to allow its manipulation to support just those ideas the government prefers.”).

For the foregoing reasons, the Court finds that NetChoice is likely to succeed in showing that the CAADCA’s prohibitions and mandates regulate speech, so that the Act triggers First Amendment scrutiny.

c. The Type of Speech Regulated by the CAADCA

Because the Court has found the CAADCA likely regulates protected speech, it must now determine what type of speech is at issue in order to apply the appropriate level of scrutiny. As described above, see Part III(A)(1)(a), strict scrutiny applies to a law regulating non-commercial speech in a content-based manner, meaning the law “target[s] speech based on its communicative content.” Reed, 576 U.S. at 163.
To survive strict scrutiny, “the Government [must] prove that the restriction furthers a compelling interest and is narrowly tailored to achieve that interest.” Id. at 171. A content-neutral regulation of non-commercial speech, on the other hand, “is constitutional as long as it withstands intermediate scrutiny—i.e., if: (1) ‘it furthers an important or substantial government interest’; (2) ‘the governmental interest is unrelated to the suppression of free expression’; and (3) ‘the incidental restriction on alleged First Amendment freedoms is no greater than is essential to the furtherance of that interest.’” Jacobs v. Clark Cnty. Sch. Dist., 526 F.3d 419, 434 (9th Cir. 2008) (quoting Turner Broad. Sys., Inc. v. FCC, 512 U.S. 622, 661–62 (1994)). And if the speech at issue is commercial, courts apply intermediate scrutiny under the four-part test articulated by the Supreme Court in Central Hudson, which the Ninth Circuit has described as follows:

(1) [I]f “the communication is neither misleading nor related to unlawful activity,” then it merits First Amendment scrutiny as a threshold matter; [and] in order for the restriction to withstand such scrutiny, (2) “[t]he State must assert a substantial interest to be achieved by restrictions on commercial speech;” (3) “the restriction must directly advance the state interest involved;” and (4) it must not be “more extensive than is necessary to serve that interest.”

Metro Lights, L.L.C. v. City of Los Angeles, 551 F.3d 898, 903 (9th Cir. 2009) (quoting Cent. Hudson, 447 U.S. at 564–66); see also Junior Sports Mags. Inc. v. Bonta, --- F.4th ----, 2023 WL 5945879, at *4 (9th Cir. Sept. 13, 2023).
NetChoice argues that the CAADCA regulates non-commercial speech because the speech at issue goes beyond proposing a commercial transaction, Reply 10, and that the speech is “content-based in many obvious respects” because its “very premise [is] that providers must prioritize content that promotes the ‘well-being’ of minors,” Mot. 19. Accordingly, NetChoice contends that the Act is subject to strict scrutiny. See Mot. 19–21; Reply 9–10. The State counters that any protected expression regulated by the Act is at most commercial speech, so that the Act is subject to the lower level of scrutiny described in Central Hudson. Opp’n 19. The State argues that the Act affects how businesses persuade consumers to engage with their products—such as by posting policies that aid consumers in deciding whether to engage with certain products—and that consumer engagement in turn drives the regulated businesses’ revenue. Id. Based on this revenue model, the State concludes that “there can be no doubt that regulated businesses have ‘an economic motive for engaging in the [alleged] speech’ with regard to the specific products—services likely to be accessed by children—that the Act regulates.” Id. (quoting Am. Acad. of Pain Mgmt., 353 F.3d at 1106).

Based on the record before it, the Court finds it difficult to determine whether the Act regulates only commercial speech. NetChoice argues in fairly conclusory fashion that the Act “regulates speech that does far more than ‘propose a commercial transaction’” and that the for-profit nature of a website “does not render [its] content commercial speech” because many covered businesses rely on advertisements to support the expressive content and services they provide. Reply 10; see Mot. 2, 19–21. NetChoice provides some support for the latter argument.
See, e.g., Roin Decl. ¶ 10 (stating that the Goodreads application earns the vast majority of its revenue from advertising, including personalized advertisements targeted to registered users). However, the Court notes that some sections of the CAADCA, such as those prohibiting the sale of personal information, see generally CAADCA § 31(b), may well be analyzed as regulating only commercial speech. See, e.g., Hunt v. City of Los Angeles, 638 F.3d 703, 715–16 (9th Cir. 2011) (finding speech commercial because it was “directed to their products and why a consumer should buy them” and not “inextricably intertwined” with non-commercial speech). Ultimately, the Court finds that NetChoice has not provided sufficient material to demonstrate that it is likely to succeed in showing that the Act regulates either purely non-commercial speech or non-commercial speech that is inextricably intertwined with commercial speech. It is NetChoice’s burden to make that showing in order to trigger application of strict scrutiny. See, e.g., Yim, 63 F.4th at 793 (“The parties on appeal dispute whether the Ordinance regulates commercial speech and calls for the application of intermediate scrutiny, or whether the Ordinance regulates [content-based] non-commercial speech and is subject to strict scrutiny review.”).

However, as the Ninth Circuit reasoned in Yim, the Court “need not decide that question, . . . because [it] conclude[s] that the [Act] does not survive the intermediate scrutiny standard of review” for commercial speech. Id.; see also Junior Sports Mags., 2023 WL 5945879, at *4 (“We need not decide this issue because ‘the outcome is the same whether a special commercial speech inquiry or a stricter form of judicial scrutiny is applied.’”) (quoting Sorrell, 564 U.S. at 571).
Accordingly, the Court will assume for the purposes of the present motion that only the lesser standard of intermediate scrutiny for commercial speech applies because, as shown below, the outcome of the analysis here is not affected by the Act’s evaluation under the lower standard of commercial speech scrutiny.

d. Application of Commercial Speech Scrutiny to the CAADCA

Under the standard for commercial speech scrutiny, if the regulation restricts speech that is neither misleading nor related to unlawful activity, it is the State’s burden to show “at least that the statute directly advances a substantial governmental interest and that the measure is drawn to achieve that interest.” Sorrell, 564 U.S. at 572 (citations omitted); Junior Sports Mags., 2023 WL 5945879, at *5 (“Under Central Hudson, a state seeking to justify a restriction on commercial speech bears the burden to prove that its law directly advances that [substantial] interest to a material degree.”). That is, “the restriction must directly advance the state interest involved,” and it must not be “more extensive than is necessary to serve that interest.” Cent. Hudson, 447 U.S. at 566. These “last two steps of the Central Hudson analysis basically involve a consideration of the fit between the legislature’s ends and the means chosen to accomplish those ends.” Hunt, 638 F.3d at 717 (quoting Rubin, 514 U.S. at 486) (internal quotation marks omitted). The government need not employ the least restrictive means to advance its interest, but the means employed may not be “substantially excessive.” Id. (quoting Bd. of Trs. of State Univ. of N.Y. v. Fox, 492 U.S. 469, 479 (1989)).

i. Substantial State Interest

There is no dispute that the CAADCA regulates speech that is neither misleading nor related to unlawful activity.
The Court thus turns directly to the question of whether the State can show a substantial state interest to which the CAADCA is geared. The State asserts a substantial4 interest in “protecting the physical, mental, and emotional health and well-being of minors.” Def.’s Suppl. Br. 1–2; see also Opp’n 20 (describing substantial state interest in “safeguarding the physical and psychological well-being of a minor”); Tr. 71:6–13 (accord); id. at 74:25–75:3 (“[T]he government has a compelling interest [in] the nature of online space for children.”). NetChoice does not dispute that “the well-being of children is a compelling interest in the abstract,” but argues that the CAADCA does not identify a sufficiently concrete harm that the law addresses. Mot. 21–22. However, the State has presented evidence that children are currently harmed by lax data and privacy protections online. See Radesky Decl. ¶¶ 45–47 (privacy settings often allow unwanted contact), ¶¶ 64–68 (profiling leads to children being targeted with ads for monetization and extreme dieting). In light of this evidence, and given that the Supreme Court has repeatedly recognized a compelling interest in “protecting the physical and psychological well-being of minors,” the Court finds that NetChoice is not likely to show that the State has not satisfied its burden of showing a substantial interest under the commercial speech scrutiny standard. Sable Commc’ns of Cal., Inc. v. FCC, 492 U.S. 115, 126 (1989); see also New York v. Ferber, 458 U.S. 747, 756 (1982) (“It is evident beyond the need for elaboration that a State’s interest in ‘safeguarding the physical and psychological well-being of a minor’ is ‘compelling.’”) (quoting Globe Newspaper Co. v. Super. Ct., 457 U.S. 596, 607 (1982)).

4 Because the State argues that the CAADCA satisfies both strict scrutiny and commercial speech scrutiny, it occasionally describes its interest as “compelling,” rather than “substantial.” See, e.g., Opp’n 19–20. The Court treats those arguments as supporting the State’s position that it has a substantial state interest as required by the commercial speech scrutiny standard.

ii. Means-Ends Fit

After the State shows a substantial interest, the Court evaluates the commercial speech regulation under the last two prongs of the Central Hudson analysis, i.e., whether the “restriction . . . directly advance[s] the state interest involved” and whether it is not “more extensive than is necessary to serve that interest.” Metro Lights, L.L.C., 551 F.3d at 903 (quoting Cent. Hudson, 447 U.S. at 564–66). As noted above, the “last two steps of the Central Hudson analysis basically involve a consideration of the fit between the legislature’s ends and the means chosen to accomplish those ends.” Hunt, 638 F.3d at 717 (citation omitted). Once again, it is the State’s burden to show that the statute satisfies the standards set forth by Central Hudson. Junior Sports Mags., 2023 WL 5945879, at *4 (citations omitted); see also Sorrell, 564 U.S. at 572.

NetChoice argues that certain provisions of the CAADCA—namely, CAADCA §§ 31(a)(1)–(7), 31(a)(9), 31(b)(1)–(4), and 31(b)(7)—fail commercial speech scrutiny, and that the entire statute must be enjoined because the invalid provisions are not severable from the otherwise valid remainder.5 See Pl.’s Suppl. Br. in Supp. of Mot. for Prelim. Inj. (“Pl.’s Suppl. Br.”), ECF 71, at 2–7.
The State argues that all of the mandates and prohibitions of the CAADCA satisfy commercial speech scrutiny because each provision is appropriately tailored to the State’s substantial interest in protecting the physical, mental, and emotional health and well-being of minors. See Def.’s Suppl. Br. 2–7. The Court will first address whether the specific provisions of the Act challenged by NetChoice survive commercial speech scrutiny before turning to the issue of severability.

(1) DPIA Report Requirement (CAADCA § 31(a)(1)–(4))

The State contends that the CAADCA’s DPIA report requirement furthers its substantial interest in protecting children’s safety because the provisions will cause covered businesses to proactively assess “how their products use children’s data and whether their data management practices or product designs pose risks to children,” so that “fewer children will be subject to preventable harms.” Def.’s Suppl. Br. 2–3. According to the State’s expert, “[c]hildren’s digital risks and opportunity are shaped by the design of digital products, services, and features,” and businesses currently take a reactive approach by removing problematic features only after harm is discovered. See Radesky Decl. ¶ 40 (emphasis added). For example, the mobile application Snapchat ended the use of a speed filter after the feature was linked to dangerous incidents of reckless driving by adolescents. Id. ¶ 41.

5 The Court refers to those portions of the Act not challenged by NetChoice as a “valid remainder” for the purposes of its decision on the motion for preliminary injunction, but does not intend to suggest it has conducted an analysis and found those unchallenged provisions to be legally valid.

Accepting the State’s statement of the harm it seeks to cure, the Court concludes that the State has not met its burden to demonstrate that the DPIA provisions in fact address the identified harm. For example, the Act does not require covered businesses to assess the potential harm of product designs—which Dr. Radesky asserts cause the harm at issue—but rather of “the risks of material detriment to children that arise from the data management practices of the business.” CAADCA § 31(a)(1)(B) (emphasis added). And more importantly, although the CAADCA requires businesses to “create a timed plan to mitigate or eliminate the risk before the online service, product, or feature is accessed by children,” id. § 31(a)(2), there is no actual requirement to adhere to such a plan. See generally id. § 31(a)(1)–(4); see also Tr. 26:9–10 (“As long as you write the plan, there is no way to be in violation.”), ECF 66.

“A restriction ‘directly and materially advances’ the government’s interests if the government can show ‘the harms it recites are real and that its restriction will in fact alleviate them to a material degree.’” Yim, 63 F.4th at 794 (quoting Fla. Bar v. Went For It, Inc., 515 U.S. 618, 626 (1995)). Because the DPIA report provisions do not require businesses to assess the potential harm of the design of digital products, services, and features, and also do not require actual mitigation of any identified risks, the State has not shown that these provisions will “in fact alleviate [the identified harms] to a material degree.” Id. The Court accordingly finds that NetChoice is likely to succeed in showing that the DPIA report provisions provide “only ineffective or remote support for the government’s purpose” and do not “directly advance” the government’s substantial interest in promoting a proactive approach to the design of digital products, services, and features. Id. (citations omitted).
NetChoice is therefore likely to succeed in showing that the DPIA report requirement does not satisfy commercial speech scrutiny. See Junior Sports Mags., 2023 WL 5945879, at *4 (“Because California fails to satisfy its burden to justify the proposed speech restriction, [Plaintiff] is likely to prevail on the merits of its First Amendment claim.”).

(2) Age Estimation (CAADCA § 31(a)(5))

The CAADCA requires that covered businesses “[e]stimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers.” CAADCA § 31(a)(5). The State argues that CAADCA § 31(a)(5) promotes the well-being of children by requiring covered businesses to “provide data and privacy protections to users based on estimated age or, if the business does not estimate age, apply child-appropriate data and privacy protections to all users.”6 Def.’s Suppl. Br. 3. This argument relies on the state legislature’s finding that greater data privacy “necessarily means greater security and well-being.” Id. (quoting AB 2273 § 1(a)(4)). NetChoice counters that the age estimation provision does not directly advance the State’s substantial interest in children’s well-being because the practical process of such estimation involves further information collection that is itself invasive. See Reply 5–6; Goldman Am. Br. 2–4.

As described above, for the Act to survive commercial speech scrutiny, the State must show that the CAADCA’s challenged provisions directly advance a substantial government interest by materially alleviating real harms. See Yim, 63 F.4th at 794; Junior Sports Mags., 2023 WL 5945879, at *5.
Based on the materials before the Court, the CAADCA’s age estimation provision appears not only unlikely to materially alleviate the harm of insufficient data and privacy protections for children, but actually likely to exacerbate the problem by inducing covered businesses to require consumers, including children, to divulge additional personal information. The State argues that age estimation is distinct from the more onerous exercise of age verification, that the statute requires only a level of estimation that is appropriate to the risk presented by a business’s data management practices, and that there are “minimally invasive” age estimation tools, some of which are already used by NetChoice’s member companies. See Opp’n 15–16. But even the evidence cited by the State about the supposedly minimally invasive tools indicates that consumers might have to permit a face scan, or that businesses might use “locally-analyzed and stored biometric information” to signal whether the user is a child or not. See id. at 16 (citing Radesky Decl. ¶ 96); see also Radesky Decl. ¶ 96(b) & n.92 (noting Google’s use of facial age-estimation software),7 ¶ 96(d) (noting businesses receive signals from hardware devices based on “locally-analyzed and stored biometric information” that indicate whether a user is a child).

6 The Court notes that the age estimation provision does not itself require any specific protections; the required data and privacy protections for either minors (if the business estimates age) or all users (if the business does not estimate age) are set forth in the remainder of the statute, and especially at CAADCA §§ 31(b)(1)–(8).
Further, as noted in Professor Goldman’s amicus brief, age estimation is in practice quite similar to age verification, and—unless a company relies on user self-reporting of age, which provides little reliability—generally requires either documentary evidence of age or automated estimation based on facial recognition. See Goldman Am. Br. 3–4. Such measures would appear to counter the State’s interest in increasing privacy protections for children. For these reasons, the State has not met its burden under Central Hudson and thus NetChoice is likely to succeed in showing that the age estimation clause does not satisfy commercial speech scrutiny. See Yim, 63 F.4th at 794 (“[A] statute cannot meaningfully advance the government’s stated interests if it contains exceptions that ‘undermine and counteract’ those goals.”) (quoting Rubin, 514 U.S. at 489).

If a business does not estimate age, it must “apply the privacy and data protections afforded to children to all consumers.” CAADCA § 31(a)(5). Doing so would clearly advance the government’s interest in increasing data and privacy protections for children. NetChoice argues, however, that the effect of this requirement would be to restrain a great deal of protected speech. See Mot. 13–14, Reply 12. The Court is indeed concerned with the potentially vast chilling effect of the CAADCA generally, and the age estimation provision specifically. The State argues that the CAADCA does not prevent any specific content from being displayed to a consumer, even if the consumer is a minor; it only prohibits a business from profiling a minor and using that information to provide targeted content. See, e.g., Opp’n 16. Yet the State does not deny that the end goal of the CAADCA is to reduce the amount of harmful content displayed to children. See id.
(“[T]he Act prevents businesses from attempting to increase their profits by using children’s data to deliver them things they do not want and have not asked for, such as ads for weight loss supplements and content promoting violence and self-harm.”); Def.’s Suppl. Br. 6 (“Children are unable to avoid harmful unsolicited content—including extreme weight loss content and gambling and sports betting ads—directed at them based on businesses’ data collection and use practices.”).

Putting aside for the moment the issue of whether the government may shield children from such content—and the Court does not question that the content is in fact harmful—the Court here focuses on the logical conclusion that data and privacy protections intended to shield children from harmful content, if applied to adults, will also shield adults from that same content. That is, if a business chooses not to estimate age but instead to apply broad privacy and data protections to all consumers, it appears that the inevitable effect will be to impermissibly “reduce the adult population . . . to reading only what is fit for children.” Butler v. Michigan, 352 U.S. 380, 381, 383 (1957). And because such an effect would likely be, at the very least, a “substantially excessive” means of achieving greater data and privacy protections for children, see Hunt, 638 F.3d at 717 (citation omitted), NetChoice is likely to succeed in showing that the provision’s clause applying the same process to all users fails commercial speech scrutiny.

7 Although Dr. Radesky states that Google’s current system involves facial recognition only by adults who have been placed in “child mode” through a machine-learning analysis, Radesky Decl. ¶ 96(b), there is nothing to suggest that companies would not request all consumers to undergo such a process.
For these reasons, even accepting the increasing of children’s data and privacy protections as a substantial governmental interest, the Court finds that the State has failed to satisfy its burden to justify the age estimation provision as directly advancing the State’s substantial interest in protecting the physical, mental, and emotional health and well-being of minors, so that NetChoice is likely to succeed in arguing that the provision fails commercial speech scrutiny. See Junior Sports Mags., 2023 WL 5945879, at *4.

(3) High Default Privacy Settings (CAADCA § 31(a)(6))

CAADCA § 31(a)(6) requires covered businesses to “[c]onfigure all default privacy settings provided to children . . . to settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children.” The State argues that high privacy settings “demonstrably keep children safe.” Def.’s Suppl. Br. 3–4 (citing Radesky Decl. ¶¶ 57–60). The evidence before the Court indicates that lower default privacy settings may quickly lead to individuals perceived as adolescents “receiv[ing] direct messages from accounts they did not follow, including being added to group chats with strangers and contacts from marketers of detrimental material such as pornography and diet products.” Radesky Decl. ¶ 59. Accordingly, the Court finds that the State is likely to establish a real harm, as required under commercial speech scrutiny. See Yim, 63 F.4th at 794.

The instant provision, however, does not make clear whether it applies only to privacy settings on accounts created by children—which is the harm discussed in the State’s materials, see, e.g., Radesky Decl.
¶ 59—or if it applies, for example, to any child visitor of an online website run by a covered business. NetChoice has provided evidence that uncertainties as to the nature of the compliance required by the CAADCA are likely to cause at least some covered businesses to prohibit children from accessing their services and products altogether. See, e.g., NYT Am. Br. 5–6 (asserting CAADCA requirements that covered businesses consider various potential harms to children would make it “almost certain that news organizations and others will take steps to prevent those under the age of 18 from accessing online news content, features, or services”). Although the State need not show that the Act “employs . . . the least restrictive means” of advancing the substantial interest, the Court finds it likely, based on the evidence provided by NetChoice and the lack of clarity in the provision, that the provision here would serve to chill a “substantially excessive” amount of protected speech to the extent that content providers wish to reach children but choose not to in order to avoid running afoul of the CAADCA. See Hunt, 638 F.3d at 717 (citation omitted). Accordingly, the State has not met its burden under Central Hudson of showing “a reasonable fit between the means and ends of the regulatory scheme,” Junior Sports Mags., 2023 WL 5945879, at *7 (quoting Lorillard Tobacco Co. v. Reilly, 533 U.S. 525, 561 (2001)), so that NetChoice is likely to succeed in showing the restriction fails commercial speech scrutiny.

(4) Age-Appropriate Policy Language (CAADCA § 31(a)(7))

The CAADCA next requires covered businesses to “[p]rovide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature.” CAADCA § 31(a)(7).
The State argues this provision “protects the safety and well-being of minors” by “giving children the tools to make informed decisions about the services with which they interact.” Def.’s Suppl. Br. 4.

The evidence submitted by the State indicates that the harm it seeks to address is a lack of consumer understanding of websites’ privacy policies. See id. (citing Egelman Decl.); see also Egelman Decl. ¶ 52. The State has shown that internet users generally do not read privacy policies, and that the reason may be that such policies are often “written at the college level and therefore may not be understood by a significant proportion of the population (much less children).” Egelman Decl. ¶ 27; see id. ¶ 24. The Court notes that the research-based claims in Dr. Egelman’s declaration do not appear to be based on studies involving minors and the impact of policy language on their use of online services. See id. at, e.g., ¶¶ 18–19, 24–27, 52.

Even accepting that the manner in which websites present “privacy information, terms of service, policies, and community standards,” CAADCA § 31(a)(7), constitutes a real harm to children’s well-being because it deters children from implementing higher privacy settings, the State has not shown that the CAADCA’s policy language provision would directly advance a solution to that harm. The State points only to a sentence in Dr. Egelman’s declaration stating that he “believe[s] the [Act] addresses this issue [of lack of consumer understanding of privacy policies] by requiring the language to be understandable by target audiences (when their online services are likely to be accessed by children).” Egelman Decl. ¶ 52; see Def.’s Suppl. Br. 4 (citing same).
Nothing in the State’s materials indicates that the policy language provision would materially alleviate a harm to minors caused by current privacy policy language, let alone by the terms of service and community standards that the provision also encompasses. NetChoice is therefore likely to succeed in showing that the provision fails commercial speech scrutiny. See Yim, 63 F.4th at 794.

(5) Internal Policy Enforcement (CAADCA § 31(a)(9))

CAADCA § 31(a)(9) requires covered businesses to “[e]nforce published terms, policies, and community standards established by the business, including, but not limited to, privacy policies and those concerning children.” As an initial matter, although the State argues that “businesses have to be accountable for the commitments they make to [] consumers” for “children and parents to make informed decisions about the products children access,” Def.’s Suppl. Br. 5, the State fails to establish a concrete harm. The State points to Dr. Radesky’s declaration, which asserts that “[s]tudies have shown that businesses are not enforcing their privacy policies,” “mak[ing] it challenging for consumers to make informed decisions about whether they want to join different online communities [without] knowing whether stated policies and standards will be followed.” Radesky Decl. ¶ 93; see Def.’s Suppl. Br. 5. The State has not provided anything remotely nearing a causal link between whether a business consistently follows its “published terms, policies, and community standards”—or even children’s difficulty in making better-informed decisions about whether to use online services—and some harm to children’s well-being. On this basis alone, NetChoice is likely to succeed in showing that the policy enforcement provision fails commercial speech scrutiny.
See Yim, 63 F.4th at 794 (noting the government must show that “the harms it recites are real”) (citation omitted).

Further, even if the State is able to show a concrete harm to children’s well-being, the provision on its face goes beyond enforcement of policies related to children, or even privacy policies generally. See CAADCA § 31(a)(9) (requiring enforcement of terms “including, but not limited to, privacy policies and those concerning children”). The lack of any attempt at tailoring the proposed solution to a specific harm suggests that the State here seeks to force covered businesses to exercise their editorial judgment in permitting or prohibiting content that may, for instance, violate a company’s published community standards. The State argues that businesses have complete discretion to set whatever policies they wish, and must merely commit to following them. See Opp’n 14; Def.’s Suppl. Br. 5. It is that required commitment, however, that flies in the face of a platform’s First Amendment right to choose in any given instance to permit one post but prohibit a substantially similar one. See NetChoice v. Fla., 34 F.4th at 1204–05, 1228 (finding content moderation restrictions impinged on business’s protected curation of content).

Lastly, the Court is not persuaded by the State’s argument that the provision is necessary because there is currently “no law holding online businesses accountable for enforcing their own policies,” Def.’s Suppl. Br. 5, as the State itself cites to a Ninth Circuit case permitting a lawsuit to proceed where the plaintiff brought a breach of contract suit against an online platform for failure to adhere to its terms. See id.; Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1108–09 (9th Cir. 2009).
For the multiplicity of reasons described above, the Court finds that the State has not met its burden of justifying the policy enforcement provision, and that NetChoice is therefore likely to succeed in showing that the provision fails commercial speech scrutiny.

(6) Knowingly Harmful Use of Children’s Data (CAADCA § 31(b)(1))

As previously noted, CAADCA § 31(a) contains the Act’s mandates, and CAADCA § 31(b) enumerates its prohibitions. The first of these prohibitions forbids a covered business from “[using] the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child.” CAADCA § 31(b)(1).

The Third Circuit’s decision in ACLU v. Mukasey is instructive here. In Mukasey, which went up to the Supreme Court twice and was finally decided by the Court of Appeals, the court held that a law prohibiting the transmission of “material that is harmful to minors” was not narrowly tailored because it required evaluation of a wide range of material that was not in fact harmful, and because the law’s definition of a “minor” as anyone under 17 years of age would cause “great uncertainty in deciding what minor could be exposed to” the material. ACLU v. Mukasey, 534 F.3d 181, 191, 193 (3d Cir. 2008) (cert. denied). The Third Circuit also rejected the government’s affirmative defense that regulated companies could use age verification techniques to achieve greater certainty as to what material was prohibited to a given user. Id. at 196–97.

The CAADCA does not define what uses of information may be considered “materially detrimental” to a child’s well-being, and it defines a “child” as a consumer under 18 years of age. See CAADCA § 30.
Although there may be some uses of personal information that are objectively detrimental to children of any age, the CAADCA appears generally to contemplate a sliding scale of potential harms to children as they age. See, e.g., Def.’s Suppl. Br. 3, 4 (describing Act’s requirements for “age-appropriate” protections). But as the Third Circuit explained, requiring covered businesses to determine what is materially harmful to an “infant, a five-year old, or a person just shy of age seventeen” is not narrowly tailored. Mukasey, 534 F.3d at 191. Although the law in Mukasey was evaluated under a strict scrutiny standard, the Court finds the same concerns apply here, so that the State has not met its burden of showing the instant provision is reasonably tailored to the State’s substantial interest, and thus NetChoice is likely to succeed in showing that the provision fails commercial speech scrutiny. NetChoice has provided evidence that covered businesses might well bar all children from accessing their online services rather than undergo the burden of determining exactly what can be done with the personal information of each consumer under the age of 18. See, e.g., NYT Am. Br. 5–6 (asserting CAADCA requirements that covered businesses consider various potential harms to children would make it “almost certain that news organizations and others will take steps to prevent those under the age of 18 from accessing online news content, features, or services”). The provision at issue would likely “burden substantially more speech than is necessary to further the government’s legitimate interests,” and therefore NetChoice is likely to succeed in demonstrating that it fails commercial speech scrutiny. See Yim, 63 F.4th at 795–96 (quoting Fox, 492 U.S. at 478).
(7) Profiling Children by Default (CAADCA § 31(b)(2))

CAADCA § 31(b)(2) prevents a covered business from “[p]rofil[ing] a child by default unless” (1) the business “can demonstrate it has appropriate safeguards in place to protect children” and (2) either of the following conditions is met: (a) the profiling is “necessary to provide the online service, product, or feature requested and only with respect to the aspects of the online service, product, or feature with which the child is actively engaged” or (b) the business can “demonstrate a compelling reason that profiling is in the best interests of children.” The State argues this provision protects children’s well-being because businesses commonly profile children by default and place them into target audience categories for products related to harmful content such as smoking, gambling, alcohol, or extreme weight loss. Def.’s Suppl. Br. 5–6; Radesky Decl. ¶ 66. The Court accepts the State’s assertion of a concrete harm to children’s well-being, i.e., the use of profiling to advertise harmful content to children, and turns to the issue of tailoring.

NetChoice has provided evidence indicating that profiling and subsequent targeted content can be beneficial to minors, particularly those in vulnerable populations. For example, LGBTQ+ youth—especially those in more hostile environments who turn to the internet for community and information—may have a more difficult time finding resources regarding their personal health, gender identity, and sexual orientation. See Amicus Curiae Br. of Chamber of Progress, IP Justice, & LGBT Tech Inst. (“LGBT Tech Am. Br.”), ECF 42-1, at 12–13. Pregnant teenagers are another group of children who may benefit greatly from access to reproductive health information. Id. at 14–15.
Even aside from these more vulnerable groups, the internet may provide children—like any other consumer—with information that may lead to fulfilling new interests that the consumer may not have otherwise thought to search out. The provision at issue appears likely to discard these beneficial aspects of targeted information along with harmful content such as smoking, gambling, alcohol, or extreme weight loss.

The State argues that the provision is narrowly tailored to “prohibit[] profiling by default when done solely for the benefit of businesses, but allows it . . . when in the best interest of children.” Def.’s Suppl. Br. 6. But as amici point out, what is “in the best interest of children” is not an objective standard but rather a contentious topic of political debate. See LGBT Tech Am. Br. 11–14. The State further argues that children can still access any content online, such as by “actively telling a business what they want to see in a recommendations profile – e.g., nature, dance videos, LGBTQ+ supportive content, body positivity content, racial justice content, etc.” Radesky Decl. ¶ 89(b). By making this assertion, the State acknowledges that there are wanted or beneficial profile interests, but that the Act, rather than prohibiting only certain targeted information deemed harmful (which would also face First Amendment concerns), seeks to prohibit likely beneficial profiling as well. NetChoice’s evidence, which indicates that the provision would likely prevent the dissemination of a broad array of content beyond that which is targeted by the statute, defeats the State’s showing on tailoring, and the Court accordingly finds that the State has not met its burden of establishing that the profiling provision directly advances the State’s interest in protecting children’s well-being.
NetChoice is therefore likely to succeed in showing that the provision does not satisfy commercial speech scrutiny. See Yim, 63 F.4th at 794 (noting regulation that burdens substantially more speech than is necessary or undermines and counteracts the state’s interest fails commercial speech scrutiny).

(8) Restriction on Collecting, Selling, Sharing, and Retaining Children’s Data (CAADCA § 31(b)(3))

CAADCA § 31(b)(3) states that a covered business shall not “[c]ollect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged . . . unless the business can demonstrate a compelling reason that [such an action] is in the best interests of children likely to access the online service, product, or feature.” The State argues that “[e]xcessive data collection and use undoubtedly harms children” because children are “unable to avoid harmful unsolicited content—including extreme weight loss content and gambling and sports betting ads—directed at them” due to the data collection. Def.’s Suppl. Br. 6. As with the previous provision prohibiting profiling, this restriction throws out the baby with the bathwater. In seeking to prevent children from being exposed to “harmful unsolicited content,” the Act would restrict neutral or beneficial content, rendering the restriction poorly tailored to the State’s goal of protecting children’s well-being. And—in light of the State’s admission that it seeks to prevent children from consuming particular content—the Court emphasizes that the compelling and laudable goal of protecting children does not permit the government to shield children from harmful content by enacting greatly overinclusive or underinclusive legislation. See, e.g., Brown v. Ent.
Merchants Ass’n, 564 U.S. 786, 802–04 (2011) (holding California law prohibiting sale or rental of violent video games to minors failed strict scrutiny). For the same reasons described above, see supra, at Part III(A)(1)(a)(iv)(7), NetChoice is likely to succeed in showing that CAADCA § 31(b)(3) fails commercial speech scrutiny.

(9) Unauthorized Use of Children’s Personal Information (CAADCA § 31(b)(4))

CAADCA § 31(b)(4) prohibits a covered business from using a child’s “personal information for any reason other than a reason for which that personal information was collected, unless the business can demonstrate a compelling reason that use of the personal information is in the best interests of children.” The State clarifies this fairly circular restriction with an example: “a business that uses a child’s IP address solely to provide access to its platform cannot also use the IP address to sell ads.” Def.’s Suppl. Br. 6. However, the State provides no evidence of a harm to children’s well-being from the use of personal information for multiple purposes. See id. To the extent the harm is the same profiling concern discussed in the prior two sections, the State has not met its burden to show that the instant provision is not similarly overbroad. See supra, at Parts III(A)(1)(a)(iv)(7)–(8). Because the State has not established a real harm that the provision materially alleviates, NetChoice will likely succeed in showing that the provision fails commercial speech scrutiny. See Yim, 63 F.4th at 794.
(10) Use of Dark Patterns (CAADCA § 31(b)(7))

The last CAADCA provision challenged by NetChoice prohibits the “[u]se [of] dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature[,] to forego privacy protections, or to take any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or well-being.” CAADCA § 31(b)(7). Dark patterns are design features that “nudge” individuals into making certain decisions, such as spending more time on an application. Def.’s Suppl. Br. 7; see also Opp’n 9 (describing dark patterns as “interfaces designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice”); Radesky Decl. ¶ 54 (“[D]esign features that manipulate or nudge the user in a way that meets the technology developer’s best interests – at the expense of the user’s interests (i.e., time, money, sleep) – have been termed ‘dark patterns.’”). The State argues that businesses use dark patterns to “nudge children into making decisions that are advantageous to businesses,” and that “dark patterns can make it difficult or impossible for children to avoid harmful content.” Def.’s Suppl. Br. 7. NetChoice contends that the term “dark patterns” has also been “construed by scholars to reach commonplace features that simplify and improve user experience, such as standard ‘autoplay’ and ‘newsfeed’ functions that recommend personalized content.” Mot. 6 (citation omitted).

The instant provision can be analytically divided into three parts. It first prohibits the use of dark patterns to encourage children to “provide personal information beyond what is reasonably expected to provide that online service, product, or feature.” CAADCA § 31(b)(7).
This prohibition is similar to the profiling restrictions discussed above in that (1) the State has not shown a harm resulting from the provision of more personal information “beyond what is reasonably expected” for the covered business to provide its online service, product, or feature, and (2) to the extent the harm is the use of profiling information to present harmful content to a child, the State has not shown that the instant provision is sufficiently tailored to survive commercial speech scrutiny. See supra, at Parts III(A)(1)(a)(iv)(7)–(9).

Second, the provision prohibits the use of dark patterns to encourage a child to “forego privacy protections.” CAADCA § 31(b)(7). However, the State has not shown that dark patterns causing children to forego privacy protections constitutes a real harm. See Yim, 63 F.4th at 794. Many of the examples of dark patterns cited by the State’s experts—such as making it easier to sign up for a service than to cancel it or creating artificial scarcity by using a countdown timer, Egelman Decl. ¶ 51, or sending users notifications to reengage with a game or auto-advancing users to the next level in a game, Radesky Decl. ¶ 55—are not causally connected to an identified harm. See Brown, 564 U.S. at 799 (finding lack of “direct causal link between violent video games and harm to minors” showed government had not identified “actual problem in need of solving,” so that law failed strict scrutiny); Yim, 63 F.4th at 794 (noting commercial speech scrutiny requires government to show “the harms it recites are real”).

The most concrete potential harm the Court can find is in Dr.
Radesky’s assertion that “[m]anipulative dark patterns are known to cause monetary harm to children,” based on a March 2023 FTC complaint requiring a game developer to pay $245 million “as a penalty for the use of dark patterns to manipulate users into making purchases.” Radesky Decl. ¶ 56. The State does not, however, suggest that the CAADCA is an attempt to address monetary harms to children. See generally Opp’n; Def.’s Suppl. Br. Similarly, although the State points to an existing federal law limiting the practice of making it inconvenient for users to prevent their data from being sold or shared, see Def.’s Suppl. Br. 7 (citing 16 CFR § 312.7), the State does not show how this law indicates a harm to minors caused by the sale of personal information. See generally id.; Radesky Decl.; Egelman Decl. To the extent the harm is the use of data to profile users, including children, the State has not shown that the provision is appropriately tailored to survive commercial speech scrutiny for the same reasons described above. See supra, at Parts III(A)(1)(a)(iv)(7)–(9). The Court accordingly finds that the State is not likely to show a harm in dark patterns causing children to forego privacy protections, so that NetChoice is likely to succeed in showing that this restriction fails commercial speech scrutiny. See Junior Sports Mags., 2023 WL 5945879, at *7 (reversing denial of preliminary injunction and reasoning that “[i]n the end, California spins a web of speculation—not facts or evidence—to claim that its restriction on speech will significantly curb” an alleged harm).

The last of the three prohibitions of CAADCA § 31(b)(7) concerns the use of dark patterns to “take any action that the business knows, or has reason to know, is materially detrimental” to a child’s well-being.
The State here argues that dark patterns cause harm to children’s well-being, such as when a child recovering from an eating disorder “must both contend with dark patterns that make it difficult to unsubscribe from such content and attempt to reconfigure their data settings in the hope of preventing unsolicited content of the same nature.” Def.’s Suppl. Br. 7; see also Amicus Curiae Br. of Fairplay & Public Health Advocacy Inst. (“Fairplay Am. Br.”) 4 (noting that CAADCA “seeks to shift the paradigm for protecting children online,” including by “ensuring that children are protected from manipulative design (dark patterns), adult content, or other potentially harmful design features.”) (citation omitted), ECF 53-1. The Court is troubled by the “has reason to know” language in the Act, given the lack of an objective standard regarding what content is materially detrimental to a child’s well-being. See supra, at Part III(A)(1)(a)(iv)(7). And some content that might be considered harmful to one child may be neutral at worst to another. NetChoice has provided evidence that in the face of such uncertainties about the statute’s requirements, the statute may cause covered businesses to deny children access to their platforms or content. See NYT Am. Br. 5–6. Given the other infirmities of the provision, the Court declines to wordsmith it and excise various clauses, and accordingly finds that NetChoice is likely to succeed in showing that the provision as a whole fails commercial speech scrutiny.

iii. Conclusion re Commercial Speech Scrutiny

For the foregoing reasons, the Court finds that NetChoice is likely to succeed in showing that the CAADCA’s challenged mandates and prohibitions fail commercial speech scrutiny and therefore are invalid.

e. Severability

NetChoice argues that the CAADCA must be enjoined in its entirety because the challenged provisions of the CAADCA—which are likely invalid—cannot be severed from the Act’s remaining prohibitions and mandates, or from other provisions related to the CAADCA’s application, penalties, and compliance. Pl.’s Suppl. Br. 6–7 (discussing CAADCA § 31(a)(8), 31(a)(10), 31(b)(5)–(6), 31(b)(8), 32, 33, and 35). The State argues that almost every provision is severable, and urges the Court to sustain any provisions not found invalid. Def.’s Suppl. Br. 2.

“Severability is a matter of state law.” Sam Francis Found. v. Christies, Inc., 784 F.3d 1320, 1325 (9th Cir. 2015) (quoting Leavitt v. Jane L., 518 U.S. 137, 139 (1996)) (alterations omitted). Under California law, the severability of the invalid parts of a statute depends on whether such provisions are grammatically, functionally, and volitionally severable from the valid remainder. See Calfarm Ins. Co. v. Deukmejian, 48 Cal. 3d 805, 821–22 (1989) (en banc). Putting aside the CAADCA provisions setting forth the statute’s title, findings, and definitions, CAADCA §§ 28–30, the valid remainder of the statute involves: restrictions on monitoring children’s online behavior and tracking location, CAADCA § 31(a)(8); the provision of responsive tools for children to exercise their privacy rights and report concerns, id. § 31(a)(10); the collection of precise geolocation data, id. §§ 31(b)(5)–(6); the use of age-estimation information, id. § 31(b)(8); the creation of a working group to deliver a report on best practices under the CAADCA, id. § 32; the July 1, 2024 deadline for covered businesses to complete DPIA reports, id. § 33; and the penalties for violations of the CAADCA, id. § 35. See Pl.’s Suppl. Br. 6–7.
The Court first notes that there is no severability clause in the CAADCA that would create a presumption in favor of “sustaining the valid part” of the statute. See Garcia, 11 F.4th at 1120 (citing Cal. Redevelopment Ass’n v. Matosantos, 53 Cal. 4th 231, 270 (2011)). Turning to the question of functional severability, the Court finds dispositive the status of the DPIA provisions. As noted by NetChoice, the CAADCA provides that the State shall not initiate an action for any violation of the statute without providing written notice to a covered business identifying specific provisions of the Act that are alleged to have been violated. CAADCA § 35(c); see Pl.’s Suppl. Br. 7. The Court’s determination that NetChoice is likely to succeed in showing that the DPIA report requirement is invalid, see supra, at Part III(A)(1)(d)(ii)(1), similarly renders likely invalid a condition precedent for enforcement of the remainder of the statute. Because the CAADCA is not capable of “separate enforcement” without the DPIA requirement, the DPIA provisions are not functionally severable from the otherwise valid portions of the statute. People’s Advocate, Inc. v. Super. Ct., 181 Cal. App. 3d 316, 332 (1986) (“The remaining provisions must stand on their own, unaided by the invalid provisions nor rendered vague by their absence nor inextricably connected to them by policy considerations. They must be capable of separate enforcement.”).

Although the Court need not review the severability of any other provision in light of the DPIA report requirement’s impact on the entire CAADCA, it notes that the age estimation provision, CAADCA § 31(a)(5), is the linchpin of most of the CAADCA’s provisions, which specify various data and privacy protections for children. See id. §§ 31(a)(6), (b)(1)–(8).
The State concedes only that CAADCA § 31(b)(8)—which prevents the use of personal information collected to estimate age for any other purpose—is rendered obsolete if the age estimation provision is deemed unconstitutional. Def.’s Suppl. Br. 3. However, compliance with the CAADCA’s requirements would appear to generally require age estimation to determine whether each user is in fact under 18 years old. The age estimation provision is thus also not functionally severable from the remainder of the statute. See People’s Advocate, 181 Cal. App. 3d at 332.

The futility of severance is apparent when one considers the outcome if the Court were to preliminarily enjoin only the challenged provisions that NetChoice has shown are likely violative of the First Amendment. The Act would consist of the provisions setting forth the statute’s title, findings, and definitions; two mandates; three prohibitions; and provisions establishing a working group, DPIA report deadlines, and penalties for violating the Act. See CAADCA §§ 28–30, 31(a)(8), 31(a)(10), 31(b)(5)–(6), 31(b)(8), 32–33, 35. The DPIA report deadline, id. § 33, is meaningless without a DPIA report requirement. Five of the six required recommendations of the working group track provisions of the Act that are likely invalid. See id. § 32(d)(1)–(5). Further, even the State agrees that one of the three remaining prohibitions—that on collecting age estimation data, id. § 31(b)(8)—“would be made obsolete” in the absence of § 31(a)(5), which NetChoice has shown is likely invalid. Def.’s Suppl. Br. 3. Accordingly, the only meat left of the Act would be four unchallenged mandates and prohibitions that together would require covered businesses to provide children with obvious tracking signals and prominent and responsive tools to exercise their privacy rights, and to refrain from collecting children’s precise geolocation data.
See CAADCA §§ 31(a)(8), 31(a)(10), 31(b)(5)–(6). All of these provisions require businesses to know their users’ ages, but the Court has found NetChoice will likely succeed in showing the age estimation provision does not pass commercial speech scrutiny. And none of the provisions can be enforced without the penalty provision, id. § 35, which, as described above, is hamstrung if the State cannot determine whether a covered business is in substantial compliance with the likely-invalid DPIA report requirement. These interdependencies show that the presumably valid remainder of the Act is intertwined with—and thus inseverable from—the challenged provisions.

Given that multiple provisions of the CAADCA will be preliminarily enjoined by this order, and the Court’s determination that these provisions are not functionally severable from the presumably valid remainder of the statute, the Court concludes that it cannot sever the likely invalid portions from the statute and sustain the remainder. See Acosta v. City of Costa Mesa, 718 F.3d 800, 820 (9th Cir. 2013) (refusing to “rewrit[e] the ordinance in order to save it”) (internal alterations and citation omitted).

f. Conclusion re First Amendment Arguments (Claims 1 and 3)

Based on the foregoing, the Court concludes that NetChoice has demonstrated a likelihood of success on Claim 1, which asserts that the CAADCA violates the First Amendment because the Act’s “speech restrictions . . . fail strict scrutiny and also would fail a lesser standard of scrutiny.” Compl. ¶ 82. As noted above, see supra, at Part III(A)(1), the Court need not and does not here address NetChoice’s likelihood of success on its allegations of additional First Amendment violations in Claims 1 and 3.

2. Other Claims

NetChoice has demonstrated a likelihood of success on the merits of Claim 1 brought under the First Amendment and, as discussed below, has satisfied the remaining Winter factors with respect to Claim 1. NetChoice is entitled to preliminary injunctive relief on that basis. Under these circumstances, the Court must determine whether it is necessary or advisable to address the likelihood of success of NetChoice’s other claims for relief at this time: Claim 4, asserting that the CAADCA violates the dormant Commerce Clause; Claim 5, asserting that the CAADCA is preempted by COPPA; and Claim 6, asserting that the CAADCA is preempted by Section 230.

Once a plaintiff demonstrates that a preliminary injunction is warranted based on the likelihood of success on one claim, district courts in this circuit generally do not consider whether the same injunctive relief could be granted based on other claims. See, e.g., Shawarma Stackz LLC v. Jwad, No. 21-CV-01263-BAS-BGS, 2021 WL 5827066, at *19 (S.D. Cal. Dec. 8, 2021) (“The Court need not reach the merits of the remaining state torts claims that SSL raises because the Lanham Act claim and the UCL claim are sufficient to sustain a preliminary injunction.”); Seiko Epson Corp. v. Nelson, No. 5:21-cv-00320-JWH-SPx, 2021 WL 5033486, at *3 (C.D. Cal. Mar. 31, 2021) (“The Court therefore finds that Plaintiffs have demonstrated a likelihood of success on the merits with respect to their first claim for relief. Plaintiffs have thus satisfied the preliminary injunction standard; the Court need not analyze Plaintiffs’ other two claims for relief.”); Faison v. Jones, 440 F. Supp. 3d 1123, 1136 n.3 (E.D. Cal. 2020) (“Because the Court finds Plaintiffs are likely to succeed on the merits of their viewpoint discrimination theory, the Court need not and does not address Plaintiffs’ remaining theories.”); Medina v. Becerra, No. 3:17-CV-03293 CRB, 2017 WL 5495820, at *12 (N.D. Cal. Nov. 16, 2017) (“As Medina has shown a likelihood of success on the merits for his First Amendment claim, this Court need not address Medina’s other claims for relief.”). This Court sees no reason to depart from the approach adopted by other district courts in the Ninth Circuit.

Deferring consideration of NetChoice’s Commerce Clause claim is particularly appropriate here, because the claim presents thorny constitutional issues that the parties briefed prior to receiving the Supreme Court’s latest guidance in Nat’l Pork Producers Council v. Ross, 598 U.S. 356 (2023).8 Ross provides a comprehensive review of case law on the dormant Commerce Clause, emphasizing that “the Commerce Clause prohibits the enforcement of state laws driven by economic protectionism—that is, regulatory measures designed to benefit in-state economic interests by burdening out-of-state competitors,” and clarifying that this “antidiscrimination principle lies at the ‘very core’ of the Court’s dormant Commerce Clause jurisprudence.” Id. (quotation marks, alterations, and citation omitted). The decision may call into question the dormant Commerce Clause’s application where, as here, the state law at issue does not discriminate against out-of-state competitors but does have an extraterritorial effect.

8 The motion and opposition were filed before Ross issued. The reply was filed approximately one week after Ross was decided, and Ross is cited once therein as secondary authority for an assertion made in the brief. See Reply 13.
Ross observes that “[i]n our interconnected national marketplace, many (maybe most) state laws have the ‘practical effect of controlling’ extraterritorial behavior,” and concludes that extraterritorial effects alone are insufficient to implicate the dormant Commerce Clause. See id. at 1156–57. In the Court’s view, it would be imprudent to engage in an analysis of NetChoice’s dormant Commerce Clause claim where such analysis is unnecessary to a ruling on the present motion and the Court does not have the benefit of the parties’ views on the impact of Ross.

With respect to NetChoice’s preemption claims, the Court’s initial view is that neither would support the requested preliminary injunction. Claim 5 asserts that the CAADCA is preempted by COPPA, which contains a preemption clause providing, “No State or local government may impose any liability for commercial activities or actions by operators in interstate or foreign commerce in connection with an activity or action described in this chapter that is inconsistent with the treatment of those activities or actions under this section.” 15 U.S.C.A. § 6502(d) (emphasis added). NetChoice claims that the CAADCA is “inconsistent” with COPPA in the following respects: the CAADCA applies broadly to services “likely to be accessed” by children, whereas COPPA applies only to online services “directed” to children; the CAADCA imposes privacy obligations that are not required by COPPA; and the CAADCA imposes substantive obligations that far exceed those imposed by COPPA. See id. ¶¶ 114–16. NetChoice additionally claims that the statutes are inconsistent because the CAADCA prohibits conduct that is permitted under COPPA, including profiling a child by default and using dark patterns to encourage children to provide personal information. See id. ¶ 117.

The Ninth Circuit recently held in Jones v. Google LLC, 73 F.4th 636, 642 (9th Cir. 2023), that a state law is not “inconsistent” with COPPA for preemption purposes unless the state law contains requirements that contradict those of COPPA or “stand as obstacles to federal objectives” embodied in COPPA. A state law that supplements or requires the same thing as COPPA is not inconsistent with COPPA. See id. In the Court’s view, it is not clear that the cited provisions of the CAADCA contradict, rather than supplement, those of COPPA. Nor is it clear that the cited provisions of the CAADCA would stand as an obstacle to enforcement of COPPA. An online provider might well be able to comply with the provisions of both the CAADCA and COPPA, with the possible exception of the CAADCA provisions identified in paragraph 117 of the complaint. However, a determination whether those are inconsistent with COPPA for preemption purposes would require a careful and nuanced analysis. It would make little sense to engage in such analysis at this stage of the proceedings in light of the fact that NetChoice is entitled to the requested injunctive relief based on its First Amendment claims.

Claim 6 asserts that the CAADCA is preempted by Section 230. Section 230 “protects certain internet-based actors from certain kinds of lawsuits.” Barnes, 570 F.3d at 1099. As relevant here, Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). Section 230(c)(2) provides that “[n]o provider or user of an interactive computer service shall be held liable on account of . . . any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be . . . objectionable[.]” 47 U.S.C. § 230(c)(2)(A). NetChoice contends that the CAADCA’s requirement that online providers enforce their “published terms, policies, and community standards,” CAADCA § 31(a)(9), and restrictions on the use of minors’ personal information, CAADCA § 31(b)(1), (3), (4), (7), are inconsistent with Section 230. NetChoice claims that those inconsistencies result in preemption of the CAADCA under § 230(e), which provides that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3). Section 230 may be implicated by an online provider’s enforcement of its policies and other acts in compliance with the CAADCA, but it is difficult (if not impossible) to make that determination without knowing what policies or acts are at issue. For that reason, it is the Court’s view that a facial challenge to the CAADCA is not the appropriate context in which to consider the applicability of § 230.

Accordingly, the Court need not and does not determine whether NetChoice is likely to succeed on the merits of its claims grounded in the dormant Commerce Clause, COPPA, and Section 230. The Court limits its consideration of the remaining Winter factors to Claim 1 under the First Amendment, namely, irreparable harm, the balance of equities, and the public interest.

B. Irreparable Harm

“The loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury.” Elrod v. Burns, 427 U.S. 347, 373 (1976); see also Baird, 2023 WL 5763345, at *3.
Loss of free speech rights resulting from a threat of enforcement rather than actual enforcement constitutes irreparable harm. See Cuviello v. City of Vallejo, 944 F.3d 816, 833 (9th Cir. 2019). Consequently, “[i]rreparable harm is relatively easy to establish in a First Amendment case.” CTIA - The Wireless Ass’n v. City of Berkeley, 928 F.3d 832, 851 (9th Cir. 2019). “[A] party seeking preliminary injunctive relief in a First Amendment context can establish irreparable injury . . . by demonstrating the existence of a colorable First Amendment claim.” Id. (quotation marks and citation omitted). As discussed above, NetChoice has done more than merely assert a colorable First Amendment claim; it has established a likelihood of success on the merits of its claim that the CAADCA violates the First Amendment.

The Court finds unpersuasive the State’s argument that the threat of enforcement is insufficient to establish irreparable injury because the Act’s challenged provisions do not take effect until July 1, 2024. That date is less than a year away. “One does not have to await the consummation of threatened injury to obtain preventive relief. If the injury is certainly impending, that is enough.” Pac. Gas & Elec. Co. v. State Energy Res. Conservation & Dev. Comm’n, 461 U.S. 190, 201 (1983) (citation omitted). Moreover, NetChoice presents evidence that businesses already are expending time and funds preparing for enforcement of the CAADCA. See Roin Decl. ¶¶ 20, 24–25; Cairella Decl. ¶¶ 14, 19–22; Masnick Decl. ¶¶ 12, 14–19; Paolucci Decl. ¶¶ 16–18; Szabo Decl. ¶¶ 5–7, 12–17. Requiring businesses to proceed with such preparations without knowing whether the CAADCA is valid “would impose a palpable and considerable hardship” on them. See Pac. Gas & Elec., 461 U.S. at 201–02 (“To require the industry to proceed without knowing whether the moratorium is valid would impose a palpable and considerable hardship on the utilities[.]”).

The Court has no difficulty finding that NetChoice has established a likelihood of irreparable harm absent issuance of the requested preliminary injunction.

C. Balance of Equities / Public Interest

“Where the government is a party to a case in which a preliminary injunction is sought, the balance of the equities and public interest factors merge.” Roman v. Wolf, 977 F.3d 935, 940–41 (9th Cir. 2020); see also Baird, 2023 WL 5763345, at *2. As discussed above, NetChoice has demonstrated a likelihood of success in proving that the CAADCA violates the First Amendment. “[I]t is always in the public interest to prevent the violation of a party’s constitutional rights.” Melendres v. Arpaio, 695 F.3d 990, 1002 (9th Cir. 2012) (quotation marks and citation omitted). Moreover, the State “cannot reasonably assert that it is harmed in any legally cognizable sense by being enjoined from constitutional violations.” Zepeda v. U.S. I.N.S., 753 F.2d 719, 727 (9th Cir. 1983).

The State cites Maryland v. King, 567 U.S. 1301, 1303 (2012), for the proposition that “[a]ny time a State is enjoined by a court from effectuating statutes enacted by representatives of its people, it suffers a form of irreparable injury.” King did not involve a motion for preliminary injunction, but rather Maryland’s application for a stay of a state appellate court’s decision overturning King’s rape conviction pending disposition of Maryland’s petition for writ of certiorari. See id. at 1301.
The state appellate court had determined that Maryland’s DNA collection statute, which had authorized law enforcement officers to collect King’s DNA sample, violated the Fourth Amendment. See id. The Supreme Court found that a stay was warranted based on its determination that there was a reasonable probability it would grant certiorari. See id. at 1302. It was in that context that the Supreme Court discussed the harm to the State of Maryland flowing from its inability to effectuate its DNA collection statute. See id. at 1303. The quoted language has no application here, where (unlike the State of Maryland) the State of California has not made a showing that the challenged statute passes constitutional muster.

The Court finds that NetChoice has established that the last two factors, the balance of equities and the public interest, favor issuance of the requested injunction.

D. Conclusion

In conclusion, the Court finds that all of the Winter factors favor granting the requested preliminary injunction. With respect to the first and most important factor, likelihood of success on the merits, NetChoice has demonstrated that it is likely to succeed on at least one of its First Amendment theories set forth in Claim 1 of the complaint. NetChoice also has satisfied the second factor by demonstrating a likelihood that it will suffer irreparable injury if the requested preliminary injunction does not issue. Finally, NetChoice has satisfied the third and fourth factors by showing that the balance of the equities and the public interest favor issuance of the requested preliminary injunction.
“If a movant makes a sufficient demonstration on all four Winter factors (three when as here the third and fourth factors are merged), a court must not shrink from its obligation to enforce his constitutional rights, regardless of the constitutional right at issue.” Baird, 2023 WL 5763345, at *3 (quotation marks, citation, and brackets omitted). “It may not deny a preliminary injunction motion and thereby allow constitutional violations to continue simply because a remedy would involve intrusion into an agency’s administration of state law.” Id. (quotation marks and citation omitted).

NetChoice’s motion for preliminary injunction is GRANTED.

E. Security

Federal Rule of Civil Procedure 65(c) provides that “[t]he court may issue a preliminary injunction or a temporary restraining order only if the movant gives security in an amount that the court considers proper to pay the costs and damages sustained by any party found to have been wrongfully enjoined or restrained.” Fed. R. Civ. P. 65(c). The Ninth Circuit has “recognized that Rule 65(c) invests the district court with discretion as to the amount of security required, if any.” Jorgensen v. Cassiday, 320 F.3d 906, 919 (9th Cir. 2003) (internal quotation marks and citation omitted) (italics in original). Thus, the district court has discretion to dispense with the filing of a bond altogether, or to require only a nominal bond. See id. (“The district court may dispense with the filing of a bond when it concludes there is no realistic likelihood of harm to the defendant from enjoining his or her conduct.”); see also Save Our Sonoran, Inc. v. Flowers, 408 F.3d 1113, 1126 (9th Cir. 2005) (“The district court has discretion to dispense with the security requirement, or to request mere nominal security, where requiring security would effectively deny access to judicial review.”) (citation omitted).

Neither party addresses the issue of security in its briefing. NetChoice’s proposed order, filed with its motion for preliminary injunction, provides that the requested injunctive relief will issue without the requirement of any security bond because NetChoice has shown a likelihood of success and the State will not suffer any harm from maintaining the status quo. See Proposed Order, ECF 29-31. The State argues, as a reason to deny injunctive relief altogether, that issuance of the injunction “would inflict irreparable harm upon California by preventing enforcement of a statute enacted by representatives of the people.” Opp’n at 30. The State’s argument gives no indication, however, whether the State believes a bond should be required in the event a preliminary injunction issues, or the appropriate amount of such bond. See id.

The Court finds it appropriate to issue the preliminary injunction without requiring security based on NetChoice’s showing that it is likely to prevail on its claim that enforcement of the CAADCA violates the First Amendment—and thus could not be lawfully enforced by the State—and the absence of any argument that a security bond should be required.

IV. ORDER

(1) Plaintiff NetChoice’s motion for preliminary injunction is GRANTED as follows:

(a) Rob Bonta, Attorney General of the State of California, and anyone acting in concert with his office are ENJOINED from enforcing the California Age-Appropriate Design Code Act;

(b) This preliminary injunction shall issue without the requirement of a security bond; and

(c) This preliminary injunction shall take effect immediately and shall remain in effect until otherwise ordered by the Court.

(2) This order terminates ECF 29.

Dated: September 18, 2023

BETH LABSON FREEMAN
United States District Judge

