NetChoice, LLC v. Attorney General, State of Florida

34 F.4th 1196 (11th Cir. 2022)

Facts

Social media platforms collect speech created by third parties and then make that speech available to others, who may be either individuals who have chosen to 'follow' the poster or members of the general public. Social media platforms are private enterprises, not governmental (or even quasi-governmental) entities. No one has an obligation to contribute to or consume the content that the platforms make available, and no one has a vested right to force a platform to allow her to contribute to or consume social-media content. The vast majority of the content is created by individual users, not the companies that own and operate the sites, though the platforms do engage in some speech of their own. The platforms generally exercise editorial judgment in two key ways. First, a platform removes posts that violate its terms of service or community standards, such as those containing hate speech, pornography, or violent content. Second, it arranges available content by choosing how to prioritize and display posts, effectively selecting which users' speech the viewer will see, and in what order, during any given visit to the site. The platforms invest significant time and resources into editing and organizing users' posts into collections of content that they then disseminate to others.

D enacted S.B. 7072 to combat the 'biased silencing' of 'our freedom of speech as conservatives . . . by the "big tech" oligarchs in Silicon Valley.' S.B. 7072 compares private platforms to 'public utilities' and declares that they should be 'treated similarly to common carriers.' The Act claims that social media platforms 'have unfairly censored, shadow banned, de-platformed, and applied post-prioritization algorithms to Floridians.' S.B. 7072 imposes (1) content-moderation restrictions, (2) disclosure obligations, and (3) a user-data requirement on all platforms. Key provisions include:


(1) A social-media platform 'may not willfully de-platform a candidate for office.'

(2) 'A social media platform may not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about . . . a candidate for office.'

(3) A social-media platform may not 'censor, de-platform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast.'

(4) A social-media platform must 'apply censorship, de-platforming, and shadow banning standards in a consistent manner among its users on the platform.'

(5) A platform must 'categorize' its post-prioritization and shadow-banning algorithms and allow users to opt out of them; for users who opt out, the platform must display material in 'sequential or chronological' order.

(6) A social-media platform must 'publish the standards, including detailed definitions, it uses or has used for determining how to censor, de-platform, and shadow ban.'

(7) A platform must inform its users 'about any changes to' its 'rules, terms, and agreements before implementing the changes.'

(8) Upon request, a platform must provide a user with the number of others who viewed that user's content or posts. 

(9) Platforms that 'willfully provide free advertising for a candidate must inform the candidate of such in-kind contribution.'

(10) Before a social-media platform de-platforms, censors, or shadow-bans any user, it must provide the user with a detailed notice.

(11) A social-media platform must allow a de-platformed user to 'access or retrieve all of the user's information, content, material, and data for at least 60 days' after the user receives notice of de-platforming.

Ps sued the Florida officials charged with enforcing S.B. 7072 under 42 U.S.C. § 1983 for a violation of their right to free speech under the First Amendment. The district court granted Ps' motion and preliminarily enjoined enforcement. It held that the Act's provisions implicated the First Amendment because they restrict the platforms' constitutionally protected exercise of 'editorial judgment,' and it applied strict First Amendment scrutiny because the Act's provisions were content-based and, more broadly, because it found that the entire bill was motivated by the state's viewpoint-based purpose of defending conservatives' speech from perceived liberal 'big tech' bias. D appealed.

D contends that S.B. 7072 doesn't even implicate, let alone violate, the First Amendment because the platforms aren't engaged in protected speech. In D's view, the Act merely requires platforms to 'host' third parties' speech, which they may constitutionally be compelled to do under two Supreme Court decisions, PruneYard Shopping Center v. Robins and Rumsfeld v. Forum for Academic & Institutional Rights, Inc. D also argues that the Act doesn't trigger First Amendment scrutiny because it reflects D's permissible decision to treat social-media platforms like 'common carriers.' Ps respond that they are making editorial decisions protected by the First Amendment and that strict scrutiny therefore applies to the entire law 'several times over' because it is speaker-, content-, and viewpoint-based.