IN THE UNITED STATES DISTRICT COURT
FOR THE NORTHERN DISTRICT OF FLORIDA
TALLAHASSEE DIVISION
NETCHOICE, LLC d/b/a NETCHOICE,
a 501(c)(6) District of Columbia
organization; and COMPUTER &
COMMUNICATIONS INDUSTRY
ASSOCIATION d/b/a CCIA, a 501(c)(6)
non-stock Virginia corporation,
Plaintiffs,
v.
ASHLEY BROOKE MOODY, in her
official capacity as Attorney General of
the State of Florida; JONI ALEXIS
POITIER, in her official capacity as
Commissioner of the Florida Elections
Commission; JASON TODD ALLEN, in
his official capacity as Commissioner of
the Florida Elections Commission;
JOHN MARTIN HAYES, in his official
capacity as Commissioner of the Florida
Elections Commission; KYMBERLEE
CURRY SMITH, in her official capacity
as Commissioner of the Florida
Elections Commission; and PATRICK
GILLESPIE, in his official capacity as
Deputy Secretary of Business Operations
of the Florida Department of
Management Services,
Defendants.
Civil Action No.:
COMPLAINT FOR DECLARATORY AND INJUNCTIVE RELIEF
Plaintiffs NetChoice, LLC (“NetChoice”) and Computer & Communications
Industry Association (“CCIA”)—trade associations of online businesses that share
the goal of promoting and protecting free speech and free enterprise on the Internet—
jointly bring this Complaint for declaratory and injunctive relief against the
Defendants in their official capacities, to enjoin the enforcement of Florida’s S.B.
7072, 2021 Leg. (Fla. 2021) (hereinafter, the “Act”),1 which infringes on the rights
to freedom of speech, equal protection, and due process protected by the First and
Fourteenth Amendments to the U.S. Constitution. The Act also exceeds the State of
Florida’s authority under the Constitution’s Commerce Clause and is preempted by
Section 230 of the Communications Decency Act. Because the Act violates the
constitutional rights of Plaintiffs’ members and contravenes federal law, it should be
promptly enjoined before it takes effect on July 1, 2021.
Overview
1. The Act, a first-of-its-kind statute, was enacted on May 2, 2021 and
signed into law on May 24, 2021 to restrict the First Amendment rights of a targeted
selection of online businesses by having the State of Florida dictate how those
businesses must exercise their editorial judgment over the content hosted on their
privately owned websites. The Act discriminates against and infringes the First
Amendment rights of these targeted companies, which include Plaintiffs’ members,
by compelling them to host—and punishing them for taking virtually any action to
remove or make less prominent—even highly objectionable or illegal content, no
matter how much that content may conflict with their terms or policies.

1 The Act is codified in scattered sections of the Florida Statutes, including §§ 106.072, 287.137, 501.2041, 501.212. Below, the Act’s specific provisions are identified by Section (e.g., “Act § 2”), as well as the provision of the Florida Statutes where they will be codified (e.g., “§ 106.072”).
2. These unprecedented restrictions are a blatant attack on a wide range of
content-moderation choices that these private companies have to make on a daily
basis to protect their services, users, advertisers, and the public at large from a
variety of harmful, offensive, or unlawful material: pornography, terrorist
incitement, false propaganda created and spread by hostile foreign governments,
calls for genocide or race-based violence, disinformation regarding Covid-19
vaccines, fraudulent schemes, egregious violations of personal privacy, counterfeit
goods and other violations of intellectual property rights, bullying and harassment,
conspiracy theories denying the Holocaust or 9/11, and dangerous computer viruses.
Meanwhile, the Act prohibits only these disfavored companies from deciding how
to arrange or prioritize content—core editorial functions protected by the First
Amendment—based on its relevance and interest to their users. And the Act goes
so far as to bar those companies from adding their own commentary to certain
content that they host on their privately owned services—even labeling such
commentary as “censorship” and subjecting the services to liability simply for
“post[ing] an addendum to any content or material posted by a user.”
3. Under the Act, these highly burdensome restrictions apply only to a
select group of online businesses, leaving countless other entities that offer similar
services wholly untouched by Florida law—including any otherwise-covered online
service that happens to be owned by The Walt Disney Company (“Disney”) or other
large entities that operate a “theme park.”
This undisguised singling out of
disfavored companies reflects the Act’s true purpose, which its sponsors freely
admitted: to target and punish popular online services for their perceived views and
for certain content-moderation decisions that state officials opposed—in other
words, to retaliate against these companies for exercising their First Amendment
rights of “editorial discretion over speech and speakers on their property.”
Manhattan Community Access Corp. v. Halleck, 139 S. Ct. 1921, 1931 (2019).
4. Rather than preventing what it calls “censorship,” the Act does the
exact opposite: it empowers government officials in Florida to police the protected
editorial judgment of online businesses that the State disfavors and whose perceived
political viewpoints it wishes to punish. This is evident from Governor Ron
DeSantis’ own press release that touts the Act as a means to “tak[e] back the virtual
public square” from “the leftist media and big corporations,” who supposedly
“discriminate in favor of the dominant Silicon Valley ideology.”2 The Governor’s
press release also leaves no doubt about the Legislature’s unconstitutional viewpoint
discrimination: quoting a state legislator, it proclaims that “our freedom of speech
as conservatives is under attack by the ‘big tech’ oligarchs in Silicon Valley. But in
Florida, [this] … will not be tolerated.”3
5. Although the Act uses scare terms such as “censoring,” “shadow
banning,” and “deplatforming” to describe the content choices of the targeted
companies, it is in fact the Act that censors and infringes on the companies’ rights
to free speech and expression; the Act that compels them to host speech and speakers
they disagree with; and the Act that engages in unconstitutional speaker-based,
content-based, and viewpoint-based preferences. The legislative record leaves no
doubt that the State of Florida lacks any legitimate interest—much less a compelling
one—in its profound infringement of the targeted companies’ fundamental
constitutional rights.
To the contrary, the Act was animated by a patently
unconstitutional and political motive to target and retaliate against certain companies
based on the State’s disapproval of how the companies decide what content to
display and make available through their services.
2 Press Release, Governor Ron DeSantis Signs Bill to Stop the Censorship of Floridians by Big Tech (May 24, 2021) (“May 24, 2021 Gov. DeSantis Press Release”), www.flgov.com/2021/05/24/governor-ron-desantis-signs-bill-to-stop-the-censorship-of-floridians-by-big-tech (last accessed May 26, 2021).
3 Id.
6. The Act is a frontal assault on the First Amendment and an
extraordinary intervention by the government in the free marketplace of ideas that
would be unthinkable for traditional media, book sellers, lending libraries, or
newsstands. Could Florida require that the Miami Herald publish, or move to the
front page, an op-ed or letter to the editor that the State favored, or demand that the
Herald publish guest editorials in a state-sanctioned sequence? The answer is
obviously no—as the Supreme Court unanimously held five decades ago in Miami
Herald Publishing Co. v. Tornillo, 418 U.S. 241 (1974). Yet the State now seeks to
repeat that history—and to go even further by, for example, compelling the targeted
companies to alter and disclose their editorial standards and to provide “detailed”
information about the algorithms they use to curate content.
7. The Act is so rife with fundamental infirmities that it appears to have
been enacted without any regard for the Constitution. The Act imposes a slew of
hopelessly vague content-based, speaker-based, and viewpoint-based restrictions on
the editorial judgments and affirmative speech of the selected online businesses that
it targets. These include the following unconstitutional provisions (the “Moderation
Restrictions”), all of which facially violate the First Amendment:
a. Through its unprecedented “deplatforming” provision, the Act
prohibits targeted online services from terminating or suspending the accounts of
“candidate[s]” for state or local political office.4 This ban applies no matter how
egregious or illegal the candidate’s conduct on a platform is—and regardless of
whether that conduct violates the online businesses’ terms of use and community
standards. Its prohibition on the use of judgment over the display of content
favored by the Legislature is backed by draconian fines of $250,000 per day.5
b. The Act simultaneously bans the use of algorithms to organize,
prioritize, or otherwise curate “content and material posted by or about” anyone
paying the filing fee necessary to qualify as a political candidate.6 Under this
sweeping moderation restriction, any post that even mentions a candidate is
virtually immune from algorithmic moderation. This provision makes it unlawful
for covered online businesses to use their editorial discretion to curate content
posted by or about candidates in ways that respond to their users’ interests. It
would even prevent them from removing defamatory statements or “deepfake”
falsifications of a candidate’s words or movements. One Florida legislator who
voted for the Act succinctly describes the issue: “My concern is about potential
candidates, about crazy people, Nazis and child molesters and pedophiles who
realize they can say anything they want . . . if all they do is fill out those two
pieces of paper.”7

4 Act § 2 (adding § 106.072(2)). The Act adopts the preexisting definition of “candidate” under Florida’s election laws, id. (adding § 106.072(6)), which includes (among other things) “[a] person who files qualification papers and subscribes to a candidate’s oath as required by law.” F.S. § 106.011(3)(e). To qualify as a candidate for certain offices, the filing fee is only $25. F.S. § 99.061(3); see also Florida Dep’t of State, Elections Div., 2020 State Qualifying Handbook 17 (2020), files.floridados.gov/media/702970/state-qualifying-handbook-2020-20200408.pdf (last accessed May 26, 2021).
5 Act § 2 (adding § 106.072(2)).
6 Act § 4 (adding § 501.2041(2)(h)) (emphasis added).
c. The Act bans covered online businesses from engaging in a broad
range of constitutionally protected moderation activities—not only removing or
taking down content, but also editing content and even “post[ing] an addendum
to any content” (i.e., engaging in their own affirmative speech)—with respect to
the novel and loosely defined concept of a “journalistic enterprise.”8 The term
“journalistic enterprise” reaches far beyond traditional media outlets (sweeping
in online propaganda outlets and conspiracy theorists, among others), without
affording protections to prevent imposters, foreign agents, or insurrectionists
from exploiting these rigid content-based mandates. And these mandates make
no exception for violent, sexually explicit, fraudulent, or otherwise unlawful
content.9
d. The Act establishes a vague and unworkable requirement that
covered online businesses, which moderate billions of posts from billions of users
around the world every day, apply nearly all content decisions “in a consistent
manner”—a term not defined or clarified in any way, but that necessarily requires
reference to the underlying content and thus is content-based.10 Even if this
mandate were sufficiently clear and administrable (which it is not), this is yet
another example of the State dictating how online businesses exercise their
discretion in organizing and displaying content on their private websites. As with
the provisions discussed above, the chilling effect on speech is amplified by a
new private right of action authorizing awards of up to $100,000 in statutory
damages per claim and potential “punitive damages.”11

7 Steven Lemongello & Gary Rohrer, Florida law seeks to rein in large social media companies, S. Fla. Sun Sentinel (May 24, 2021), www.sun-sentinel.com/news/politics/os-ne-desantis-signs-big-tech-bill-20210524-dvycnrscjjbfnnh7vbs3wimv5q-story.html (last accessed May 26, 2021) (statement of Rep. Fine).
8 Act § 4 (adding § 501.2041(2)(j)).
9 Id.
e. The Act compels covered online businesses to allow users to “opt
out” of algorithms governing content moderation altogether12—again without
regard to the egregious, unlawful, or dangerous nature of the content—and
requires targeted businesses to publicly disclose and justify their exercise of
curatorial judgment, including revealing highly confidential and proprietary
methodologies used to moderate content.13 The Act further prohibits covered
online businesses from changing their editorial policies more than once every 30
days, even in response to changed circumstances, newly discovered threats, or
local or national emergencies.14

10 Id. (adding § 501.2041(2)(b)).
11 Id. (adding § 501.2041(6)).
12 The Act requires covered businesses to allow all users to opt out of the presentation of content that the websites normally offer, and to “allow sequential or chronological posts and content.” Id. (adding § 501.2041(2)(f)(2)).
13 Id. (adding § 501.2041(2)(a) & (d), (3), (8)).
8. The Act further violates the First Amendment and Equal Protection
Clause by (i) targeting only larger digital services and social media companies, while
(ii) irrationally exempting Disney and Universal Studios (owned by Comcast
Corporation) from its scope, simply because they own well-attended “theme parks”
in Florida.15 The Act’s legislative sponsors acknowledged that they chose this
protectionist carveout to ensure that companies with especially large economic
footprints in Florida—like Disney—are not “caught up in this.”16 None of the
Moderation Restrictions apply to traditional media or non-digital hosts of third-party
material (such as book publishers or businesses that use traditional bulletin boards).
Nor do they apply to online businesses that offer the same types of services, but do
not meet the arbitrary statutory requirements of having $100 million in annual
revenues or 100 million users anywhere in the world and thus qualifying as covered
“social media platforms.”17 None of these arbitrary distinctions are supported by
any legislative findings, or anything other than the impermissible desire to punish
specific, disfavored online services. This underscores that the Act unconstitutionally
discriminates against only certain speakers, that it is gravely under- and
overinclusive, that it is neither narrowly tailored nor closely drawn, and that it is not
justified by any legitimate (much less compelling) governmental interest.

14 Id. (adding § 501.2041(2)(c)).
15 Id. (adding § 501.2041(1)(g)).
16 Jim Saunders, Florida’s ‘Big Tech’ crackdown bill goes to DeSantis, but with a special exemption for Disney, CL Tampa Bay (Apr. 30, 2021), www.cltampa.com/news-views/florida-news/article/21151908/floridas-big-tech-crackdown-bill-goes-to-desantis-but-with-a-special-exemption-for-disney (last accessed May 26, 2021).
17 Act § 4 (adding § 501.2041(1)(g)(4)).
9. The Act doubles down on its unconstitutional singling out of “social
media platforms” (a misleading term that also covers other digital services) by
allowing the State Attorney General to create a blacklist of companies (and a broad
range of related persons) that may be banned from bidding on or doing business with
the State merely because they are accused of violating state or federal antitrust
laws.18 This blacklist applies only to targeted “social media platforms”—not to any
other kind of business that may have been accused of violating or found to have
actually violated antitrust laws. The legislative and public record of the Act shows
that this punitive provision, like the rest of the Act, was designed to retaliate against
the targeted digital companies precisely because of their exercise of core First
Amendment free speech rights, including their perceived political viewpoints, their
prior exercise of editorial judgment, and their alleged views on particular political
candidates and office holders. The statements about the Act by the Governor of
Florida and the law’s sponsors confirm that its passage was motivated by retaliatory
and discriminatory animus, including their characterizations of Plaintiffs’ members
as part of “leftist media” that are advancing a supposedly “dominant Silicon Valley
ideology.”19

18 Act § 3 (adding § 287.137(2)(a)-(b)).
10. The Act is also unconstitutionally vague and overbroad. It fails to
define with sufficient definiteness what conduct is punishable. It sets nebulous
standards for enforcement that encourage arbitrary and discriminatory enforcement
of the law. And its astronomical fines and punitive damages for violations of these
opaque provisions will inevitably chill constitutionally protected practices and the
availability of protected expression.20
11. The Act exceeds the limitations on state authority under federal law by
seeking to regulate wholly extraterritorial conduct in ways prohibited by the
Constitution’s Commerce and Due Process Clauses. First, the Act impermissibly
engages in protectionist discrimination against online businesses—and at the same
time, discrimination in favor of major Florida-based businesses and Florida
candidates. Second, the Act regulates large swaths of content-moderation decisions
that have no meaningful connection to (and indeed nothing at all to do with) the State
of Florida, based on business operations and transactions conducted outside of
Florida.
19 May 24, 2021 Gov. DeSantis Press Release.
20 Act § 2 (adding § 106.072(2)), § 4 (adding § 501.2041(6)).
12. On top of all these constitutional infirmities, the Act’s restrictions on
content moderation conflict with Section 230 of the Communications Decency Act,
a federal statute enacted with the specific goal of protecting the decisions of online
services from state-based regulation and liability. As Congress intended, Section
230 affords online service providers the freedom to make their own decisions about
whether and how to restrict objectionable content.21 Because the Act purports to
apply “to the extent not inconsistent with federal law,” including Section 230, its
limitations on content moderation are not only preempted by federal law, but also
rendered unenforceable under the Act itself. And given the vague sweep of the Act
and its harsh penalties, its inclusion of a one-line claim that it, in effect, does not do
any of the things it otherwise purports to do will not avoid its chilling effect on the
moderation of content protected by the U.S. Constitution and federal law.
13. For all these reasons, and as described further below, Plaintiffs seek
(1) an order declaring the Act unconstitutional on its face and (2) a preliminary and
permanent injunction enjoining its enforcement.22
21 See 47 U.S.C. § 230(c)(2), (e)(3).
22 Plaintiffs separately reserve all rights to challenge the lawfulness of the Act under the Florida Constitution in the state courts of Florida. This Complaint is limited to claims arising under federal law, and it does not raise issues of state constitutional law.
Jurisdiction
14. This Court has jurisdiction over this federal civil rights action under 28
U.S.C. § 1331 because the claims in this action arise under the U.S. Constitution and
federal law. Plaintiffs’ claims arise under the First and Fourteenth Amendments,
and seek to invalidate certain provisions of the Act based on federal preemption
under the Constitution’s Supremacy Clause.
15. This Court has authority to grant relief under the Declaratory Judgment
Act, 28 U.S.C. §§ 2201, 2202, and the Civil Rights Act, 28 U.S.C. § 1343(a), 42
U.S.C. § 1983.
16. In addition, this Court has authority to issue injunctive relief under the
All Writs Act, 28 U.S.C. § 1651.
17. This Court’s jurisdiction is properly exercised over the Defendants in
their official capacities, Ex parte Young, 209 U.S. 123 (1908), as Plaintiffs are
seeking declaratory and injunctive relief.
18. There is an actual controversy of sufficient immediacy and
concreteness relating to the legal rights and duties of Plaintiffs’ members to warrant
relief under 42 U.S.C. § 1983 and 28 U.S.C. §§ 2201, 2202. The harm to Plaintiffs’
members as a direct result of the actions and threatened actions of Defendants is
sufficiently real and imminent to warrant the issuance of a conclusive declaratory
judgment and prospective injunctive relief.
19. The restrictive and discriminatory provisions of the Act will become
law effective July 1, 2021. Plaintiffs’ members will then become subject to the risk
of liability, as described more fully below.
20. Plaintiffs’ members include online businesses, online social media
platforms, online marketplaces, and e-commerce businesses and range from well-known online businesses to individual users of e-commerce services.23
21. As private businesses, Plaintiffs’ members have the right to decide what
content is appropriate for their sites and platforms.
Those decisions are a
constitutionally protected form of speech.
22. Plaintiffs’ members are the direct targets of the Act, engage in content-
moderation activities that are covered by the Act, and will face serious legal
consequences from failing to comply with its requirements. These members meet
the statutory definition of a covered “social media platform” under the Act, because
they (i) allow users to post or upload content onto their platforms, (ii) are
incorporated legal business entities, (iii) do business in the State of Florida, (iv) meet
the Act’s revenue or user-based thresholds, and (v) are not exempted under the
exception for certain operators of theme parks.
See Act § 4 (adding
§ 501.2041(1)(g)). Accordingly, the members have standing to challenge the Act.

23 Members of one or both Plaintiff organizations include Airbnb, Alibaba.com, Amazon.com, AOL, DJI, DRN, eBay, Etsy, Expedia, Facebook, Fluidtruck, Google, HomeAway, Hotels.com, Lime, Nextdoor, Lyft, Oath, OfferUp, Orbitz, PayPal, Pinterest, StubHub, TikTok, Travelocity, TravelTech, Trivago, Turo, Twitter, Verisign, VRBO, Vigilant Solutions, VSBLTY, Waymo, Wing, and Yahoo!. See NetChoice, www.netchoice.org/about; & CCIA, www.ccianet.org/about/members. Collectively, these members employ tens of thousands of Floridians.
23. In addition, the Act’s Moderation Restrictions compel members to host
content or speakers contrary to their policies and community standards, require that
they fundamentally change the types of content available on their privately owned
platforms, and force them to subject certain of their users and posters to arbitrary
and irrational disfavored treatment because of the content- and speaker-based
restrictions that the State of Florida has imposed. These requirements will have
long-term reputational effects on Plaintiffs’ members, which are enduring and thus
irreparable. Failure to comply would expose members to severe penalties, including
civil and administrative actions by the Attorney General, fines of $250,000 per day
by the Florida Elections Commission, as well as private rights of action that include
up to $100,000 in statutory damages per claim, “[a]ctual damages,” “equitable
relief,” and potential “punitive damages.” Id. (adding § 501.2041(6)). That risk
casts a serious chilling effect on activity protected by the First Amendment,
including both members’ content-moderation practices and their own speech
concerning user-generated content.
24. Given the Act’s inevitable and imminent impact on Plaintiffs’
members’ ability to engage in their moderation practices consistent with their terms
of service and community standards, the Act will harm Plaintiffs’ members in
numerous ways, including by (i) interfering with their content judgments on their
privately owned sites, (ii) exposing them to potential liability at the hands of the
State Attorney General and Florida Elections Commission, (iii) exposing them to
potential liability under the new private right of action discussed above,
(iv) subjecting them to unlawful compelled disclosure of private, competitively
sensitive and proprietary business information, and (v) making it harder for them to
provide high-quality services to their users and customers. Specifically, the Act
would compel Plaintiffs’ members to degrade the services they provide and the
content they host on their private platforms: the Act requires members to display
and prioritize user-generated content that runs counter to their terms, policies, and
business practices; content that will likely offend and repel their users and
advertisers; and even content that is unlawful, dangerous to public health and
national security, and grossly inappropriate for younger audiences.
25. In addition, Plaintiffs’ members will be required to expend time and
substantial resources to change the operations of and redesign their privately owned
services and platforms to comply with numerous arbitrary and State-mandated
requirements. These include obligations to (i) “[c]ategorize algorithms used for
post-prioritization and shadow banning,” Act § 4 (adding § 501.2041(2)(f)(1));
(ii) develop processes and procedures to track and manage user opt-outs, id. (adding
§ 501.2041(2)(f)(2)); (iii) “allow a user who has been deplatformed to access or
retrieve all of the user’s information, content, material, and data for at least 60 days”
after receipt of notice, id. (adding § 501.2041(2)(i)); (iv) “[p]rovide a mechanism
that allows a user to request the number of other individual platform participants
who were provided or shown the user’s content or posts,” id. (adding
§ 501.2041(2)(e)(1)); and (v) “[p]rovide, upon request, a user with the number of
other individual platform participants who were provided or shown content or
posts,” id. (adding § 501.2041(2)(e)(2)). And if Plaintiffs’ members do not comply
with these highly burdensome obligations, they face the imminent threat of massive
penalties under an unconstitutional and federally preempted law.
Plaintiffs’ members will thus suffer an immediate injury or would be threatened by one if the
Act were allowed to stand. Plaintiffs anticipate that their members will face
enforcement actions, brought by the Attorney General or by private litigants,
immediately after the law goes into effect because they are engaging in and intend
to continue engaging in moderation activity that is covered by the Act and that the
Attorney General would likely allege to be a violation of the Act.
26. Because the statute so clearly targets, and was specifically intended
to target, Plaintiffs’ members and their activities, this fear is well-founded and
credible.
The statements of Governor Ron DeSantis and the law’s sponsors
demonstrate that Defendants and the State of Florida plan to target Plaintiffs’
members in state proceedings to enforce the Act’s unconstitutional restraint of their
editorial judgment, content-moderation practices, and First Amendment rights. For
example, Governor DeSantis proclaimed in his May 24 press release that “[i]f Big
Tech censors enforce rules inconsistently, to discriminate in favor of the dominant
Silicon Valley ideology, they will now be held accountable.”24 Similarly, on
February 2, 2021, Governor DeSantis stated that “if a technology company uses their
content- and user-related algorithms to suppress or prioritize the access of any
content related to a political candidate or cause on the ballot, that company will also
face daily fines,” and added that “[t]he message is loud and clear: When it comes to
elections in Florida, Big Tech should stay out of it.”25 Governor DeSantis also
declared that Florida was “going to take aim at those companies,” which include
Plaintiffs’ members.26
27. Plaintiffs have associational standing to bring this suit on behalf of their
members. As described above, Plaintiffs’ members have standing to challenge the
statute. See supra ¶¶ 20-26. Further, the Act is fundamentally at odds with
Plaintiffs’ policy objectives, and challenging the Act is germane to Plaintiffs’
respective missions. See supra ¶¶ 32-34. The claims and relief sought do not require
proof specific to particular members and, in any event, Plaintiffs are able to provide
evidence about the Act’s impact on the companies they represent. The members’
individual participation is thus not required.

24 May 24, 2021 Gov. DeSantis Press Release.
25 Michael Moline, Gov. DeSantis pushing to punish ‘Big Tech’ companies that ‘censor’ political speech, Florida Phoenix (Feb. 2, 2021), www.floridaphoenix.com/2021/02/02/gov-desantis-pushing-to-punish-big-tech-companies-that-censor-political-speech-such-as-trump-speech (last accessed May 26, 2021).
26 Corbin Barthold & Berin Szóka, No, Florida Can’t Regulate Online Speech, Lawfare (March 12, 2021), www.lawfareblog.com/no-florida-cant-regulate-online-speech (last accessed May 26, 2021); see also Gov. Ron DeSantis, Facebook, www.facebook.com/GovRonDeSantis/posts/3849516841773014 (last accessed May 26, 2021).
28. This Court’s immediate review of the Act’s constitutionality is
necessary to prevent an imminent infringement of Plaintiffs’ members’ fundamental
constitutional rights.
29. Under these circumstances, judicial intervention is warranted to resolve
a genuine case or controversy within the meaning of Article III of the U.S.
Constitution regarding the constitutionality and legality of the Act.
30. A declaration that the Act is unconstitutional and preempted by federal
law would definitively resolve that controversy for the parties.
Venue
31. Venue is proper in this Court under 28 U.S.C. § 1391(b)(1)-(2). The
Defendants are considered to reside in the Northern District of Florida because this
is where they perform their official duties. 28 U.S.C. § 1391(b)(1). Additionally,
the Attorney General of Florida, in her official capacity, regularly conducts business
and proceedings in her offices in this District, and the events giving rise to Plaintiffs’
claims occurred in this District.
The Parties
Plaintiffs
32. Plaintiff NetChoice is a national trade association of online businesses
who share the goal of promoting free speech and free enterprise on the Internet.
NetChoice is a 501(c)(6) nonprofit organization headquartered in Washington, D.C.
33. For over two decades, NetChoice has worked to promote online
commerce and speech and to increase consumer access and options through the
Internet, while minimizing burdens on businesses that are making the Internet more
accessible and useful.
34. Plaintiff CCIA is an international, not-for-profit membership
association representing a broad cross-section of companies in the computer,
Internet, information technology, and telecommunications industries. CCIA is a
501(c)(6) trade association headquartered in Washington, D.C. For almost fifty
years, CCIA has promoted open markets, open systems, and open networks.
Defendants
35. Defendant Ashley Brooke Moody is the Attorney General of the State
of Florida. She is the State’s chief law enforcement officer and representative of the
State in “all suits or prosecutions, civil or criminal or in chancery,” brought or
opposed by the State. F.S. §§ 16.01, et seq. In her official capacity, Ms. Moody
oversees the Florida Department of Legal Affairs, which is responsible for enforcing
Section 4 of the Act. Section 4 expressly authorizes the Attorney General to
“investigate” a “suspect[ed] violation” of that section of the Act and “to bring a civil
or administrative action under this part.” Section 3 instructs the Attorney General
to determine whether “there is probable cause that a person has likely violated the
underlying antitrust laws,” and, if so, to initiate procedures for temporarily placing
that person on the Antitrust Violator Vendor List. Defendant Moody is sued for
declaratory and injunctive relief in her official capacity as the Attorney General of
the State of Florida.
36. Defendant Joni Alexis Poitier is a Commissioner of and the Vice Chair
of the Florida Elections Commission, which is the administrative agency charged
with enforcing, among other things, Chapter 106 of Florida’s Election Code and thus
has jurisdiction under Florida law to investigate and determine violations of Section
2 of the Act.27 Section 2 expressly authorizes the Elections Commission to find a
violation of subsection (2) of that Section and to assess fines of up to $250,000 per
day for “deplatforming” a candidate for statewide office, and of $25,000 per day for
“deplatforming” a candidate for any other office. Ms. Poitier is sued for declaratory
and injunctive relief in her official capacity as Florida Elections Commission
Commissioner and Vice Chair.
27 The term of service for each of the Commissioners of the Florida Elections Commission has expired. However, the named individuals are still serving as Commissioners and will continue to do so until Florida’s Governor makes new appointments to their positions.
37. Defendant Jason Todd Allen is a Commissioner of the Florida Elections
Commission, which is the administrative agency charged with enforcing, among
other things, Chapter 106 of Florida’s Election Code and thus has jurisdiction under
Florida law to investigate and determine violations of Section 2 of the Act. Mr.
Allen is sued for declaratory and injunctive relief in his official capacity as Florida
Elections Commission Commissioner.
38. Defendant John Martin Hayes is a Commissioner of the Florida
Elections Commission, which is the administrative agency charged with enforcing,
among other things, Chapter 106 of Florida’s Election Code and thus has jurisdiction
under Florida law to investigate and determine violations of Section 2 of the Act.
Mr. Hayes is sued for declaratory and injunctive relief in his official capacity as
Florida Elections Commission Commissioner.
39. Defendant Kymberlee Curry Smith is a Commissioner of the Florida
Elections Commission, which is the administrative agency charged with enforcing,
among other things, Chapter 106 of Florida’s Election Code and thus has jurisdiction
under Florida law to investigate and determine violations of Section 2 of the Act.
Ms. Smith is sued for declaratory and injunctive relief in her official capacity as
Florida Elections Commission Commissioner.
40. Defendant Patrick Gillespie is the Deputy Secretary of Business
Operations of the Florida Department of Management Services (the “Department”).
Under Florida law, the Department is responsible for developing and overseeing the
procedures under which the State and its agencies purchase commodities and
services. The Act tasks the Department and the Deputy Secretary with enforcing
Section 3 of the Act by, among other things, creating and maintaining the “Antitrust
Violator Vendor List.” In February 2021, the Secretary of the Department resigned,
and Governor DeSantis has not appointed a replacement. Accordingly, Deputy
Secretary Gillespie is currently responsible for enforcing Section 3 of the Act.
41. The above-identified Defendants (collectively, the “Defendants”) are
charged with enforcing the provisions of the Act challenged by this action. The
Defendants have the authority under the Act to investigate, fine, and otherwise
penalize Plaintiffs’ members for exercising their constitutional rights under the First
and Fourteenth Amendments to the U.S. Constitution.
42. The Defendants are charged to act—and would continue to act if not
enjoined—under color of state law.
43. Plaintiffs sue the Defendants here in their official capacities to prevent
imminent violations of the constitutional rights of Plaintiffs’ members.
Plaintiffs’ Members Engage In Beneficial Content Moderation That Is Directly Restricted By The Act
44. Plaintiffs’ members operate online services that host and publish an
enormous amount and variety of user-generated content, including text, videos,
audio clips, and photographs. The material that is uploaded to these services comes
from all over the world and is unfathomably diverse.
These online services
showcase the best of human thought: material that is endlessly creative, humorous,
intellectually stimulating, educational, and politically engaging. Unfortunately,
however, some of the material submitted to these services is none of these things.
The openness of the Internet is a magnet for some of the best and worst aspects of
humanity, and any online service that allows users to easily upload material will find
some of its users attempting to post highly offensive, dangerous, illegal, or simply
unwanted content. This content may be problematic in a variety of ways, including
(among other things) featuring hardcore and illegal “revenge” pornography,
depictions of child sexual abuse, terrorist propaganda, efforts by foreign adversaries
to foment violence and manipulate American elections, efforts to spread white
supremacist and anti-Semitic conspiracy theories, misinformation disseminated by
bot networks, fraudulent schemes, malicious efforts to spread computer viruses or
steal people’s personal information, spam, virulent racist or sexist attacks, death
threats, attempts to encourage suicide and self-harm, efforts to sell illegal weapons
and drugs, pirated material that violates intellectual property rights, and false and
defamatory statements.
45. Without serious and sustained effort by online services to stop, limit,
and control such content—and the people or entities who seek to disseminate it—
these services could be flooded with abusive and objectionable material, drowning
out the good content and making their services far less enjoyable, useful, and safe.
46. That is why Plaintiffs’ online service members—and nearly every
online service that is open to hosting user-generated content—have rules and
policies setting out what content and activities are, and are not, permitted on their
services. And it is why those services devote enormous amounts of time, resources,
personnel, and effort to engaging in content moderation. As is clear from the above
discussion of their moderation practices, Plaintiffs’ members make individualized
decisions and do not serve the public indiscriminately. They are private speech
forums operated by private companies that “exercise editorial control over speech
and speakers on their properties or platforms.” Manhattan Community Access Corp.,
139 S. Ct. at 1932.
47. Content moderation can take many different forms, involving both
human review and algorithmic or other automated moderation tools. Sometimes,
content moderation involves removing objectionable or unlawful content or
terminating the accounts of users who post such material. Sometimes it is more
nuanced, involving decisions about how to arrange and display content, what content
to recommend to users based on their interests, and how easy or difficult it should
be to find or search for certain kinds of content. Content moderation sometimes can
take the form of “zoning” or “age-gating,” whereby certain content is made
accessible to adults but not minors, or to teenagers but not younger children. In other
instances, content moderation involves empowering users with tools so they can
decide for themselves what content to avoid, such as by blocking or muting others,
making certain content inaccessible to their children, or opting into special sections
of an online service that exclude material likely to offend or upset especially
sensitive users. Content moderation can also involve direct speech by service
providers themselves, in the form of warning labels, disclaimers, or commentary
appended to certain user-submitted material.
For example, an online service
provider might inform users that the relevant content was posted by a hostile foreign
government, that it has not been verified by official sources, that the information has
been found to be false, or that it contains sensitive or potentially upsetting imagery
that is not appropriate for everyone. It would then be up to the user to decide whether
to review the content. Content moderation is even necessary for the most basic
online functions that users may take for granted, like searching for local businesses,
movie showtimes, or weather reports.
Without organizing and curating the
unfathomable volume of online content, online services would have no way to
identify and deliver to users the content that they want—or may critically need—to
see.
48. Content moderation, in these myriad forms, serves many significant
functions. First, it is the means by which the online service expresses itself. Just as
a newspaper or magazine has the freedom to choose a cover story, leave out certain
letters to the editor, or ban profanity from its pages, an online service performs the
same curation function according to its terms and policies. At the same time, a
service’s policies concern more than just what it does or does not publish: they
influence the kind of online community, environment, and atmosphere that users
experience. A website aiming to be family-friendly, for example, cannot produce
that experience for its users if it is prevented from limiting or removing graphic or
viscerally offensive posts. Content moderation thus goes to the heart of these services’
editorial judgment, just as it does when a newspaper like the Miami Herald decides
whether to publish a letter to the editor.
49. Second, moderating content on services open to billions of users,
including families and children, is essential to ensure safer communities online. For
instance, restricting access for younger users to adult content is analogous to
applying age-based ratings to movies or scheduling mature programming for later
hours.
To constrain how online services can manage offensive content,
conspiracy theories, incitements to violence, and other egregious forms of content is
to require them, against their will, to offer their virtual tools and space for unintended
uses that endanger the public.
50. Third, aside from public safety, State-mandated controls on how
platforms must permit, organize, and present content also render an online service
less useful and undermines Plaintiffs’ members’ core business models. Imagine if a
search engine or social media company returned its results in a random or purely
chronological order instead of prioritizing what is most helpful or relevant to the
user based on her own activities and demonstrated preferences. As a result, the user
might miss content from her close friends and family and instead see a slew of more
recent, but less relevant content. Or imagine if an e-commerce website presented a
random assortment of products or listings instead of those for which the user is
searching. The main value many online services offer is curating, sorting, and
displaying the vast amount of information available online.
51. Florida’s Act directly targets—and would profoundly disrupt—these
vital, and constitutionally protected, content moderation efforts. As discussed
below, the Act’s expansive restrictions constrain and burden nearly every type of
content moderation activity that is critical to online services’ ability to express their
editorial judgments; protect users from offensive, harmful or dangerous material;
and provide useful online tools on which billions of people rely every day. The Act
applies not merely to decisions removing content or users from a service. It equally
covers—in some instances outright prohibits—more fine-grained approaches, such
as limiting the exposure of younger or more sensitive users to potentially upsetting
content. The Act goes so far as to control how the services can use automated
processes like algorithms to arrange and curate content, and it seeks to limit these
services’ own direct speech by prohibiting them from posting warning labels or
commentary. In short, the Act subjects nearly every content-moderation judgment
a covered service might make to the State’s regulatory control, saddling those
judgments with burdensome new obligations, restrictions, and the ever-present
threat of government or private enforcement action. The Act thus threatens not just
the types of experience and community those services can offer, but also how they
fundamentally operate.
Florida’s Unconstitutional Act
52. The Act was enacted by the Florida Legislature on May 2, 2021, signed
into law by Governor DeSantis on May 24, 2021, and goes into effect on July 1,
2021. Act § 7.
53. The Act’s legislative history, as well as public statements by state
legislators and public officials, make clear that the Act was motivated by animus
toward popular technology companies—animus specifically driven by disapproval
of the companies’ perceived political and other viewpoints. See supra ¶¶ 3-4. One
of the Act’s sponsors declared during the signing ceremony, “[D]o not think a
handful of kids behind a desk in Silicon Valley get to be the arbiter of what free
speech is … it’s about time someone took them head on.”28 Lieutenant Governor
Jeanette Nuñez agreed, condemning what she characterized as “an effort to silence,
intimidate, and wipe out dissenting voices by the leftist media and big
corporations.”29 And Governor DeSantis praised the Act as a way to “tak[e] back
the virtual public square” from “the leftist media and big corporations.”30

28 Governor Ron DeSantis press conference in Miami, YouTube (May 24, 2021), www.youtube.com/watch?v=O67BF-2IWiY, at 18:08 (last accessed May 26, 2021) (statement of Rep. Ingoglia).
54. This animus toward disfavored online businesses is well documented
in the public record. When discussing the proposed legislation in February 2021,
Governor DeSantis described online businesses targeted by the Act as “big brother,”
because of his stated view that these companies are “tougher on those on the political
right than left.”31 Speaker of the Florida House of Representatives, Chris Sprowls,
has expressed similar sentiments.32
i. “Social Media Platforms”

55. The Act targets various online businesses (including operators of social
media platforms, search engines, and online marketplaces) that the Florida
Legislature sweeps under the misleading term, “social media platforms.”
56. Consistent with the legislative history described above, the Act was
drafted to target popular technology companies, while carving out Florida-based
Disney and Universal Studios. To single out these targeted companies, the Act
applies its Moderation Restrictions, onerous affirmative obligations, and antitrust
blacklist only to the defined “social media platforms.” And the Act limits these
covered online businesses to those that host third-party content and have either
(i) “annual gross revenues in excess of $100 million, as adjusted in January of each
odd-numbered year to reflect any increase in the Consumer Price Index” or (ii) “at
least 100 million monthly individual platform participants globally”—subject to an
arbitrary exception (see infra ¶ 57) for powerful and influential Florida-based
businesses. Act § 4 (adding § 501.2041(1)(g)). Nothing in the Act, including the
legislative findings, explains how or why the perceived problems that the statute
supposedly addresses are limited to these entities.

29 May 24, 2021 Gov. DeSantis Press Release.
30 Id.
31 John Kennedy, Gov. DeSantis says ‘big tech’ looks like ‘big brother’, Sarasota Herald-Tribune (Feb. 2, 2021), www.heraldtribune.com/story/news/politics/2021/02/02/ron-desantis-backing-effort-stop-tech-censorship/4352705001 (last accessed May 26, 2021).
32 May 24, 2021 Gov. DeSantis Press Release.
57. For openly protectionist reasons, the Act excludes companies that are
politically influential in Florida from its definition of “social media platform,” even
when those companies operate online services that would otherwise meet the
statutory definition. The Act carves out companies that own and operate well-attended theme parks—an exemption that conveniently covers Disney and Universal
Studios (owned by Comcast Corporation).33 No legitimate government interest
could be advanced by such an exemption, nor was any such interest identified.
Rather, as one of the law’s sponsors remarked, the exemption was added with the
undisguised objective of ensuring that certain companies with big economic
footprints in Florida—like Disney—are not “caught up in this.”34 The decision to
exempt those major companies confirms that the law’s true objective is to control
the private speech of politically disfavored companies who have online platforms,
but not to control the speech of similarly situated but politically favored companies
with power and influence in the State of Florida.

33 Under the law, “social media platform” does not include any “information service, system, Internet search engine, or access software provider operated by a company that owns and operates a theme park or entertainment complex as defined in s. 509.013.” Act § 4 (adding § 501.2041(1)(g)).
58. As explained above (see supra ¶¶ 20-22 & n.23), several of Plaintiffs’
members fall within the statutory definition of “social media platform,” and do not
“own and operate a theme park or entertainment complex.”
59. The Act infringes on the rights of Plaintiffs’ members in numerous
ways. Key provisions of the Act are summarized below.
ii. Ban on Restricting Postings by Candidates (Section 2)

60. Section 2 of the Act prohibits any “social media platform” from
“willfully deplatforming a candidate for office who is known by the social media
platform to be a candidate, beginning on the date of qualification and ending on the
date of the election or the date the candidate ceases to be a candidate.” Act § 2
(adding § 106.072(2)). Section 2 further requires covered online businesses to
“provide each user a method by which the user may be identified as a qualified
candidate and which provides sufficient information to allow the platform to confirm
the user’s qualifications.” Id.

34 Jim Saunders, Florida’s ‘Big Tech’ crackdown bill goes to DeSantis, but with a special exemption for Disney, CL Tampa Bay (Apr. 30, 2021), www.cltampa.com/news-views/florida-news/article/21151908/floridas-big-tech-crackdown-bill-goes-to-desantis-but-with-a-special-exemption-for-disney (last accessed May 26, 2021).
61. Under the Act, “deplatform” is broadly defined to mean the “action or
practice by a social media platform to permanently delete or ban a user or to
temporarily delete or ban a user from the social media platform for more than 14
days.” Act § 4 (adding § 501.2041(1)(c)); cf. Act § 2 (adding § 106.072(1)(b)).
62. The Act inexplicably contains exemptions that allow online businesses
to favor paid content by third parties or candidates over unpaid content—seemingly
in violation of the “post-prioritization” and “shadow banning” prohibitions. Act § 4
(adding § 501.2041(1)(e)-(f), (2)(d)).
63. The Florida Elections Commission is vested with jurisdiction to
determine whether Section 2 has been violated, and to impose fines as high as
$250,000 per day for violations involving candidates for statewide office (and
$25,000 per day for candidates for other offices). Act § 2 (adding § 106.072(3)).
64. The Act provides that Section 2 may not be enforced if it is inconsistent
with federal law or 47 U.S.C. § 230(e)(3). Id. (adding § 106.072(5)). Section 2 is
inconsistent with the First and Fourteenth Amendments of the U.S. Constitution, and
other federal law, for the reasons explained below.
iii. Additional Moderation Restrictions (Section 4)

65. Section 4 of the Act is a frontal attack on the constitutional rights of
Plaintiffs’ members to make editorial judgments about speech hosted on their
property. It directly restricts and burdens the content moderation judgments of
covered online businesses. In particular, Section 4 enacts restrictions that effectively
ban most, if not all, moderation of content posted “by or about” political candidates.
And it severely restricts and burdens moderation practices with respect to postings
or content from a loosely defined category of “journalistic enterprises.” These
provisions compel a disfavored group of private businesses to host—and
dramatically limit their ability to restrict, decide how to display, or even offer their
own commentary on—highly objectionable or even illegal content, such as sexually
explicit material, user posts that incite or glorify violence and acts of terrorism,
online harassment and bullying, anti-Semitic and racist hate speech, defamation, and
misinformation (such as hoaxes involving public health issues).
66. Section 4 also imposes on covered online businesses a broad, but
wholly undefined, mandate to apply any possible editorial judgments they might
make about the virtually unlimited amount of content they host “in a consistent
manner among [their] users”—an obligation that is all but impossible to understand,
much less comply with. And Section 4 imposes onerous notice and other affirmative
requirements regarding the editorial judgments made by these businesses. The
notice requirements are particularly burdensome and problematic because by
prescribing specific disclosures about the reason for the removal of virtually any
category of content, covered online businesses would be providing a host of bad-faith actors (from terrorists to hostile foreign governments and spammers) a roadmap
for how to post unwanted, harmful content by circumventing the protections
currently in place.
67. In sum, Section 4 impermissibly subordinates covered businesses'
judgments about what content to display on their services and in what manner to the
State’s fiat. This is the modern-day equivalent of the unconstitutional attempt to
force the Miami Herald to publish a letter affording a “right of reply,” which the
Supreme Court soundly rejected in Tornillo. And it is eerily reminiscent of efforts
by authoritarian regimes around the world to control private online services and
force them to conform to a state-approved message. As just one example, Human
Rights Watch has noted Russia’s enactment of “increasingly oppressive” laws
targeting social media platforms that “forc[e] them” to alter their moderation
practices concerning “online content deemed illegal by the government.”35 A
Russian bill currently under consideration "proposes fines for social media
companies that 'illegally block users,'" and "aims to prevent the potential blocking
of Russian politicians' social media profiles."36
35 See Russia: Social Media Pressured to Censor Posts, Human Rights Watch (Feb. 5, 2021), www.hrw.org/news/2021/02/05/russia-social-media-pressured-censor-posts (last accessed May 26, 2021). For instance, one recently enacted law "empower[s] the authorities to block websites" that restrict access to "Russian state media content." Id.
68. Section 4 specifically delineates a list of "[u]nlawful acts and practices
by social media platforms,” Act § 4 (adding § 501.2041), all of which seek to deprive
covered online businesses of their editorial discretion and replace it with state-compelled speech by prohibiting numerous activities protected by the First
Amendment. For example:
a. Covered online businesses must not edit the content of a "journalistic
enterprise,” “post an addendum to” any content of such an enterprise, or
“deplatform” the enterprise based on “the content of its publication or broadcast.”
Id. (adding § 501.2041(2)(j), (1)(b)).37 A “journalistic enterprise” is broadly
defined as "an entity doing business in Florida" that (1) publishes more than
100,000 words online and has at least 50,000 paid subscribers or 100,000
monthly active users; (2) publishes online at least 100 hours of audio or video
and has at least 100 million viewers annually; (3) "operates a cable channel that
provides more than 40 hours of content per week to more than 100,000 cable
television subscribers"; or (4) "[o]perates under a broadcast license issued by the
Federal Communications Commission." Id. (adding § 501.2041(1)(d)). This
sweeping definition would shield many outlets that publish foreign propaganda
and conspiracy theories.
36 Id.; see also Adam Satariano & Oleg Matsnev, Russia Raises Heat on Twitter, Google and Facebook in Online Crackdown, N.Y. Times (May 26, 2021), www.nytimes.com/2021/05/26/technology/russia-twitter-google-facebook-censorship.html (last accessed May 26, 2021).
37 While "censorship" is traditionally used to refer to the actions of government officials to limit free expression, the Act uses the misleading scare-terms "censorship" and "shadow banning" to cover routine moderation practices, such as editing objectionable content. See Act § 4 (adding § 501.2041(1)(b)) (defining "censor" as "any action taken by a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform."); see also id. (adding § 501.2041(1)(f)) (defining "shadow ban" as "action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform."). Under these definitions, a decision that sexually explicit or violent content should be restricted to users above the age of 18 would potentially constitute forbidden "shadow banning."
b. Covered online businesses must not use any algorithms to curate and
arrange “content and material posted by or about” a candidate. Id. (adding
§ 501.2041(2)(h)) (characterizing actions as “post-prioritization” and “shadow
banning”).
c. Covered online businesses must not edit a user's content or
“deplatform” the user, unless the social media platform gives the user detailed
written notice, including “a thorough rationale” justifying such actions, a “precise
and thorough explanation of how the social media platform became aware of the
censored content or material, including a thorough explanation of the algorithms
used, if any, to identify or flag the user’s content or material as objectionable.”
Id. (adding § 501.2041(2)(d), (3)) (characterizing actions as “censoring” and
“shadow banning”). This obligation to thoroughly justify content decisions
applies even if the online business takes action to protect its users from highly
objectionable material posted by terrorist groups or hostile foreign governments.
d. Covered online businesses must not use algorithms that arrange content other than in chronological order if the user has opted out of such algorithms under the mandatory opt-out provision. Id. (adding § 501.2041(2)(f)(2)).
e. Covered online businesses must not change editorial policies more
than once every 30 days, even if responding to new and changed circumstances
and threats. Id. (adding § 501.2041(2)(c)).
69. Section 4 also includes the vague mandate that these censorship,
deplatforming, and shadow banning standards be implemented in a “consistent
manner” among users on the platform. Act § 4 (adding § 501.2041(2)(b)). This
subjective standard is not defined in the Act and may serve as the basis for a private
cause of action by users with statutory damages of $100,000 per claim, actual
damages, and “punitive” damages. Id. (adding § 501.2041(6)). The Act also
includes the vague requirement that covered websites must “[c]ategorize algorithms
used for post-prioritization and shadow banning.” Id. (adding § 501.2041(2)(f)(1)).
Similarly vague is the requirement that covered online businesses “inform” a
candidate if they have “willfully provide[d] free advertising for” the candidate, in
which case the Act treats this "free advertising" (an undefined concept) as an "in-kind contribution" for purposes of Florida's election laws. Act § 2 (adding § 106.072(4)).38
70. In addition, the Act places numerous affirmative burdens on covered
online businesses to:
a. "inform each user about any changes to its user rules, terms, and
agreements before implementing the changes” (in addition to the ban on changes
more frequent than once a month). Id. (adding § 501.2041(2)(c)).
b. "provide users with an annual notice on the use of algorithms for
post-prioritization and shadow banning and reoffer annually the opt-out
opportunity in subparagraph (f)2.” Id. (adding § 501.2041(2)(g)).
c. "allow a user who has been deplatformed to access or retrieve all of
the user’s information, content, material, and data for at least 60 days after the
user receives the notice required under subparagraph (d)1." Id. (adding § 501.2041(2)(i)).
38 The Act merely states that certain things will not be deemed free advertising, without specifying
what will be considered to fall within that category. See id. (“Posts, content, material, and
comments by candidates which are shown on the platform in the same or similar way as other
users’ posts, content, material, and comments are not considered free advertising.”).
d. "publish the standards, including detailed definitions, it uses or has
used for determining how to censor, deplatform, and shadow ban.” Id. (adding
§ 501.2041(2)(a)).
e. "provide a mechanism that allows a user to request the number of
other individual platform participants who were provided or shown the user’s
content or posts.” Id. (adding § 501.2041(2)(e)(1)).
f. "[p]rovide, upon request, a user with the number of other individual
platform participants who were provided or shown content or posts.” Id. (adding
§ 501.2041(2)(e)(2)).
71. A covered online business that fails to comply with Section 4 is deemed
to have committed “an unfair or deceptive act or practice as specified in
[§] 501.204,” and is subject to an investigation by the Department of Legal Affairs
and civil or administrative enforcement action. Id. (adding § 501.2041(5)). The Act
also empowers the State to use its subpoena power to intrusively investigate the
highly confidential and competitively sensitive methodologies online companies use
to exercise their content judgment. Id. (adding § 501.2041(8)). Finally, the Act
creates a private right of action against any platform that (i) applies its “censorship,
deplatforming, and shadow banning standards in an [in]consistent way," or that
(ii) “censor[s] or shadow ban[s] a user’s content or material” without giving written
notice of its reasons for doing so. Id. (adding § 501.2041(6)).
iv. Antitrust Blacklist (Section 3)
72. Section 3 of the Act creates a new statutory provision, F.S. § 287.137,
that imposes state contracting restrictions for covered online businesses that are
alleged to have violated antitrust laws and placed on a newly established “Antitrust
Violator Vendor List.” Act § 3 (adding § 287.137(2)(a)-(b)). The targeted “social
media platforms” are the only businesses that may be placed on the antitrust vendor
list. Id. (adding § 287.137(1)(b), (1)(f)). Again, other large businesses—including
the favored theme-park owners—are exempted.
73. Section 3 is another example of the Act's irrational targeting of a select,
disfavored group of online businesses.
Although federal antitrust laws—and
Florida’s counterpart statutes—apply across different industries, Section 3
irrationally singles out only the defined “social media platforms” for disfavored
treatment because of their role in hosting and moderating online content. Id.
Section 3 establishes an “Antitrust Violator Vendor List” of companies and
individuals subject to an absolute contracting bar with the State of Florida. Id.
(adding § 287.137(3)(b)). These persons and affiliates are also prohibited from
receiving “economic incentives” such as “state grants, cash grants, tax exemptions,
tax refunds, tax credits, state funds, and other state incentives” under Florida law.
Id. (adding § 287.137(5)).
74. The Antitrust Violator Vendor List may include those merely "accused
of” violations by the Florida “Attorney General,” “a state attorney,” or federal
authorities (subject to a cumbersome and inadequate process for contesting the
Attorney General’s decision before a state administrative law judge). The Act
empowers the Florida Attorney General to place an accused company “temporarily”
on the Antitrust Violator Vendor List upon a finding of mere “probable cause that a
person has likely violated the underlying antitrust laws." Id. (adding § 287.137(3)(d)(1)). The absolute state contracting bar extends to an ill-defined
group of officers, directors, shareholders, and even employees involved in
“management” of a company placed on the List, as well as a broad group of
“affiliates” of companies that are permanently placed on the List. Id. (adding
§ 287.137(1)(a), (f)-(g)).
* * *
75. The Act is a smorgasbord of constitutional violations. Sections 2, 3,
and 4—specifically, those provisions adding F.S. §§ 106.072, 287.137 and
501.2041(2)(a)-(j)—violate the First Amendment, due process, and equal protection
principles, and run afoul of the Commerce Clause and Supremacy Clause.
COUNT I
(42 U.S.C. § 1983)
Violation of Free Speech and Free Press Rights Under the First and
Fourteenth Amendments to the Constitution of the United States
(Challenge to Sections 2, 3, and 4 of the Act)
(As to All Defendants)
76. Plaintiffs incorporate by reference paragraphs 1 to 75 above as if fully
and separately set forth herein.
77. Sections 2 and 4 of the Act—specifically, those sections adding F.S.
§§ 106.072 and 501.2041(2)(a)-(j)—violate the First and Fourteenth Amendments.
As discussed above, in numerous, interrelated ways, all of the Moderation
Restrictions, as well as the affirmative obligations discussed above,39 impose
content-based, viewpoint-based, and speaker-based restrictions and burdens on
covered online businesses’ speech rights and editorial judgment entitled to full First
39 This includes the requirements to (i) "inform" a candidate if the covered online business
“willfully provide[d] free advertising for” the candidate, Act § 2 (adding § 106.072(4)); (ii) provide
users with a “a thorough rationale explaining the reason” for a covered online business’ moderation
decision, including a “precise and thorough explanation of how the [business] became aware of
the … content or material” and “a thorough explanation of the algorithms used, if any, to identify
or flag the user’s content or material as objectionable,” Act § 4 (adding § 501.2041(3));
(iii) “inform each user about any changes to its user rules, terms, and agreements before
implementing the changes,” id. (adding § 501.2041(2)(c)); (iv) “provide users with an annual
notice on the use of algorithms for post-prioritization and shadow banning and reoffer annually
[an] opt-out opportunity,” id. (adding § 501.2041(2)(g)); (v) “allow a user who has been
deplatformed to access or retrieve all of the user’s information, content, material, and data for at
least 60 days after the user receives the [mandated] notice,” id. (adding § 501.2041(2)(i));
(vi) “publish the standards, including detailed definitions, it uses or has used for determining how
to censor, deplatform, and shadow ban,” id. (adding § 501.2041(2)(a)); (vii) “[p]rovide a
mechanism that allows a user to request the number of other individual platform participants who
were provided or shown the user’s content or posts,” id. (adding § 501.2041(2)(e)(1)); and
(viii) “[p]rovide, upon request, a user with the number of other individual platform participants
who were provided or shown content or posts,” id. (adding § 501.2041(2)(e)(2)).
Amendment protection. These provisions also unconstitutionally compel covered
online businesses to speak in ways that significantly burden and chill their
constitutionally protected content judgments and speech. In addition, the provisions
lack the scienter requirements that the First Amendment demands, effectively
imposing a set of strict-liability speech bans and mandates.
Separately and
collectively, these provisions single out the covered online businesses for disfavored
treatment. Because Sections 2 and 4 restrict speech based on its content and based
on its speaker, they are subject to strict scrutiny and are presumptively
unconstitutional. Further, the Act authorizes the State to engage in highly intrusive
investigations of content moderation processes and judgments, separately burdening
speech. Because the State has no legitimate (much less compelling) governmental
interest that supports these provisions, and because none of the provisions are
narrowly tailored, they do not survive strict scrutiny. Indeed, they would fail under
any standard of review.
78. Plaintiffs' members include online businesses subject to the Act. They
are private companies that have the right to choose what content they host on their
platforms and how to arrange, display, organize, and curate such content,
irrespective of the platforms’ popularity. The operative provisions of Sections 2 and
4 of the Act violate those rights.
79. Government action that compels speech by forcing a private social
media platform to carry content that is against its policies or preferences violates the
First Amendment.
80. The First Amendment is not limited to traditional forms of media and
expression, but applies with equal force to modern media, technology, and
communications. Online businesses that make editorial decisions regarding what
content to publish, including content created or posted by third parties, engage in
speech that is fully protected by the First Amendment. Reno v. Am. Civil Liberties
Union, 521 U.S. 844, 870 (1997).
81. In addition to prohibiting the government from directly restricting
speech, the First Amendment prohibits the government from compelling a person or
business to communicate a message (including hosting a third party's message). In
other words, it “prohibits the government from telling people what they must say.”
Rumsfeld v. Forum for Acad. & Inst. Rights, 547 U.S. 47, 61 (2006). A State may
not require an online or other business to host or promote another’s speech unless it
meets the extraordinarily demanding standard of “strict scrutiny.” Riley v. Nat’l
Fed’n of the Blind of N. Carolina, Inc., 487 U.S. 781, 795 (1988).
82. A compelled-speech edict is presumptively invalid unless the State can
show that its regulation is necessary to advance a “compelling” governmental
interest, is narrowly tailored to serve that interest, and is the least restrictive means
available for establishing that interest. Reed v. Town of Gilbert, 576 U.S. 155, 163
(2015); United States v. Playboy Entm’t Grp., 529 U.S. 803, 813 (2000). Unless a
State can satisfy this extremely demanding standard, it may not interfere with a
private company’s choices about what to say or not to say, and what content to
distribute or not to distribute. See, e.g., Tornillo, 418 U.S. at 258. These settled
principles apply with full force to protect the rights of online businesses, including
“social media platforms” as defined in the Act.
83. Laws that regulate speech (1) based on its content or (2) based on the
identity of the speaker are presumptively unconstitutional under the First
Amendment. Reed, 576 U.S. at 163, 170. Moreover, “[w]hen the government
targets not subject matter, but particular views taken by speakers on a subject, the
violation of the First Amendment is all the more blatant. Viewpoint discrimination
is thus an egregious form of content discrimination. The government must abstain
from regulating speech when the specific motivating ideology or the opinion or
perspective of the speaker is the rationale for the restriction.” Rosenberger v. Rector
and Visitors of Univ. of Va., 515 U.S. 819, 829 (1995).
84. Further, where, as here, a regulation elevates certain speakers over
others and disfavors the latter, it “suggests that the goal of the regulation is not
unrelated to suppression of expression, and such a goal is presumptively
unconstitutional.” Minneapolis Star & Tribune Co., 460 U.S. 575, 585, 592–93
(1983); Arkansas Writers’ Project, Inc. v. Ragland, 481 U.S. 221 (1987).
85. Content-based, viewpoint-based, and speaker-based discrimination can
be discerned from both the text of the statute and evidence of the State’s purposes in
enacting the statute. Sorrell v. IMS Health Inc., 564 U.S. 552, 564–65 (2011). Thus,
where, as here (see supra ¶¶ 3-4, 53), a statute is animated by a desire to target
selected speakers for disfavored treatment, and especially where the motive is to
punish or retaliate against private parties for their perceived political or ideological
viewpoints, evidence of that improper motive can further confirm that the statute
amounts to impermissible speech regulation. Sorrell, 564 U.S. at 564–65.
86. First Amendment rights "'are protected not only against heavy-handed
frontal attack, but also from . . . more subtle governmental interference.’” Gibson v.
Fla. Legislative Investigation Comm., 372 U.S. 539, 544 (1963) (citation omitted).
Thus, a requirement that a company publish and disclose the rationale, processes,
data, or methods concerning its editorial decisions runs afoul of the First
Amendment. United States v. Rumely, 345 U.S. 41, 57 (1953) (Douglas, J., joined
by Black, J., concurring). “It is the presence of compulsion from the state itself that
compromises the First Amendment,” which “extends ‘not only to expressions of
value, opinion, or endorsement, but equally to statements of fact the speaker would
rather avoid.’” Washington Post v. McManus, 944 F.3d 506, 518 (4th Cir. 2019)
(quoting Hurley v. Irish-American Gay, Lesbian & Bisexual Grp. of Boston, 515
U.S. 557, 570 (1995)).
87. These principles—both collectively and individually—establish that
Sections 2 and 4 of the Act violate the First Amendment.
88. Sections 2 and 4 force covered online businesses to host content they
otherwise would not allow under their policies and standards, or do not wish to
feature, organize, display, or prioritize in the way that the Act mandates. No one,
not even someone who has paid a filing fee to run for office, has a First
Amendment right to compel a private actor to carry speech on their private property.
On the contrary, the online businesses subject to the Act (including Plaintiffs’
members) have a First Amendment right to free speech—and may therefore decide
whom they will and will not host and with which speakers and speech they wish to
associate (or not associate).
89. Sections 2 and 4 also limit and burden the exercise of covered online
businesses' judgments about the display of content in myriad ways—including, but
not limited to, by restricting their ability to (i) edit, remove, organize, de-prioritize,
or prioritize certain third-party content or postings, (ii) add commentary on or
advisories or warnings to accompany such content or postings (e.g., flagging
unverified factual claims), or (iii) curate or filter content so it is appropriate for
certain audiences (e.g., restricting access to adult content based on parental settings).
90. Sections 2 and 4 also unconstitutionally restrict, burden, compel, and
otherwise regulate speech based on its content. These sections reflect legislative
preferences for certain types of content (i.e., postings by or about political candidates
and by certain “journalistic enterprises,” as well as paid versus unpaid content). This
triggers strict scrutiny.
91. The Act is also motivated by a viewpoint-based attack on the "social
media platforms” it targets. As the Act’s champions trumpeted when the bill was
signed into law, the core goal of the Act was to punish the targeted companies
specifically because the Legislature and Governor dislike the perceived political and
ideological viewpoints that those private businesses supposedly express through
their content judgments.
This is the essence of impermissible viewpoint discrimination, and it violates the First Amendment.
92. Strict scrutiny also applies on the independent ground that the Act
engages in speaker-based discrimination and targets a discrete category of speakers
for disfavored treatment. The speech restrictions and compelled-speech
requirements under Sections 2 and 4 apply only to covered online businesses that
qualify as “social media platforms,” but do not apply to (a) non-digital hosts of thirdparty content with large audiences (such as certain book publishers or hosts of
traditional bulletin boards); (b) online businesses that provide the same types of
services but do not meet the arbitrary thresholds to qualify as a “social media
platform” under the Act; and (c) any business that would otherwise be subject to the
Act except that it also happens to own and operate a large “theme park or
entertainment complex” (defined to include Disney and Universal Studios). This
speaker-based discrimination is also evidenced by the legislative history and public
record discussed above.
93. By forcing covered online businesses to prioritize postings by or about
candidates and content from the loosely defined category of “journalistic
enterprises,” the Act further exacerbates the speaker-based discrimination, including
in an area (political speech) where the covered online businesses’ First Amendment
protections are strongest.
94. The Act compounds these First Amendment violations by authorizing
the State to conduct highly intrusive investigations into how the targeted companies
organize and select content for inclusion on their private platforms, which separately
burdens First Amendment rights.
95. For each of these independent reasons, Sections 2 and 4 are
presumptively unconstitutional, and the State bears the burden of establishing that
these requirements satisfy strict scrutiny.
96. Section 3—which adds F.S. § 287.137—also violates the First and
Fourteenth Amendments. As discussed above, that section singles out certain
speakers and online media businesses—covered “social media platforms”—for
discriminatory treatment, including prohibiting covered entities from contracting
with the State and from receiving tax breaks, refunds, and other economic incentives.
And the Act’s irrational exceptions for favored entities show that “the State has left
unburdened” other, favored speakers, in violation of the First Amendment. Nat’l
Inst. of Family & Life Advocs. v. Becerra, 138 S. Ct. 2361, 2378 (2018) (quoting
Sorrell, 564 U.S. at 580). For each of those reasons, Section 3 is presumptively
unconstitutional and subject to strict scrutiny.
97. Additionally, as discussed above, Section 3 burdens "affiliates" of
companies placed on an antitrust blacklist, where “affiliates” is defined to include
any entities controlled by agents who are active in the management of the blacklisted
company. If a company is blacklisted, its affiliates are subject to blacklisting as well,
and a showing that the entity is an affiliate “constitutes a prima facie case” that
blacklisting is warranted for the affiliate. An affiliate on the list may not bid on or
be awarded any work under a public contract, or transact business with the State,
and it may be ineligible for economic incentives. Section 3’s use of guilt by
association violates the First Amendment rights of Plaintiffs, as well as their
affiliates. It is not an “appropriate requirement” for the State to require disaffiliation
in order to access public contracting and benefits.
98. The State of Florida's decision to subject "social media platforms" (as
defined in the Act) to “differential treatment, unless justified by some special
characteristic of [their services], suggests that the goal of the regulation is not
unrelated to suppression of expression, and such a goal is presumptively
unconstitutional.” Minneapolis Star, 460 U.S. at 585. Section 3’s imposition of
non-generally applicable burdens on “social media platforms,” including Plaintiffs’
members, is not justified by any special characteristic of their services, and therefore
triggers strict scrutiny.
99. Sections 2, 3, and 4 do not meet the requisite standard of strict scrutiny
(and would fail any standard of constitutional review).
100. First, the State cannot show that there is any real problem in need of
solving and that these statutory provisions further a “compelling” governmental
interest (or even any legitimate governmental interest).
101. Second, the State cannot show that Sections 2, 3, and 4 are narrowly
tailored to meet the State’s asserted interest. To the contrary, these provisions are
both over- and underinclusive in numerous respects. See supra ¶¶ 56-57. Among
other fatal defects, they arbitrarily and punitively target speech by some companies
with larger platforms, but not other companies that the Legislature favors.
102. Additionally, Section 4 of the Act regulates the speech of covered
online businesses without the necessary scienter protections required by the First
Amendment. For example, while the Act broadly prohibits covered businesses from
“deplatforming,” “censoring,” or “shadow banning” a “journalistic enterprise,” there
is no requirement that the business know (or have reason to know) that the content
at issue was posted by such an enterprise. Thus, a covered business that removes or
posts an addendum to a video (even one posted by a propaganda outlet) could be
held strictly liable and subject to severe penalties if it turns out that, unbeknownst to
the provider, the video was posted by an entity deemed to be a “journalistic
enterprise.” The chilling effect of the lack of a scienter requirement is exacerbated
by the breadth and vagueness of the Act’s terms.
103. The same is true of the Act’s notice provisions, which apply only where
actions are taken with respect to a poster or content provider “who resides in or is
domiciled in Florida.” There is no requirement that the covered online business
know, or have reason to know, where that person actually lives. Nor is this residency
information something that many covered online businesses should be expected to
have. As a result, a covered business that takes moderation actions concerning an
account could face strict liability if it turns out that, unbeknownst to the business,
the person happens to live in Florida. The First Amendment forbids such strict-liability speech regulations.
104. Unless they are enjoined, Sections 2, 3, and 4 will operate to unlawfully
deprive Plaintiffs’ members of their fundamental First Amendment rights, including
the chilling of Plaintiffs’, their members’, and their affiliates’ exercise of
associational freedoms.
COUNT II
(42 U.S.C. § 1983)
Violation of Due Process Rights Under the Fifth and Fourteenth
Amendments to the Constitution of the United States
(Challenge to Sections 2 and 4 of the Act)
(As to the Commissioners of the Florida Elections Commission
and the Florida Attorney General)
105. Plaintiffs incorporate by reference paragraphs 1 to 75 above as if fully
and separately set forth herein.
106. The U.S. Constitution guarantees all persons the right to due process.
U.S. Const. amend. V. The Fifth Amendment’s guarantee of due process applies to
state governments through the Fourteenth Amendment. U.S. Const. amend. XIV.
107. The Act violates due process because it fails to provide fair warning of
what conduct is being regulated. FCC v. Fox Television Stations, Inc., 567 U.S.
239 (2012). A law is unconstitutionally vague when people “of common intelligence
must necessarily guess at its meaning,” Connally v. Gen. Constr. Co., 269 U.S. 385,
391 (1926), or where the law lacks definite and explicit standards thereby
encouraging “arbitrary and discriminatory” application, Kolender v. Lawson, 461
U.S. 352 (1983).
108. These concerns are especially acute where, as here, the Act both
regulates the content of speech and permits state enforcement actions. See Reno,
521 U.S. at 871.
109. Various provisions of the Act, including Sections 2 and 4, regulate
speech in vague terms that do not give businesses subject to the Act reasonable and
fair notice of the conduct that is expected of them and the conduct that may be
subject to penalties. The Act is also riddled with such vague terms that it invites
arbitrary and discriminatory enforcement, including the arbitrary imposition of
draconian civil penalties. These infirmities include, but are not limited to, the
following:
a. The Act establishes an undefined requirement that a social media
platform engage in content moderation “in a consistent manner among its users
on the platform.” Act § 4 (adding § 501.2041(2)(b)). In addition to facing “civil
or administrative action” by the Florida Attorney General for an alleged violation
of this provision, the Act provides a private cause of action for violations of this
requirement, with statutory damages of $100,000 per claim and potential punitive
damages. Id. (adding § 501.2041(6)(a)).
b. The Act prohibits "censoring," "deplatforming," or "shadow
banning” of “a journalistic enterprise,” but employs a vague and amorphous
definition to describe what entities qualify as a “journalistic enterprise.” Id.
(adding § 501.2041(2)(j), (1)(d)). This vagueness places covered online
businesses in the impossible position of having to conduct extensive and costly
investigations to determine whether the State might consider a poster to be a
“journalistic enterprise”—all without any clear understanding of what that
definition actually covers.
c. The Act requires covered online businesses to "inform" a candidate
if they have “willfully provide[d] free advertising for” the candidate, in which
case the Act treats this “free advertising” as an “in-kind contribution” for
purposes of Florida’s election laws. Act § 2 (adding § 106.072(4)). But, other
than a confusing definition of what does not count as “free advertising,” id., the
Act provides no guidance as to what will fall within that vague category
triggering election-law compliance requirements.40
d. The Act prohibits applying or using any "post-prioritization or
shadow banning algorithms for content and material posted by or about a user
who is known by the social media platform to be a candidate.” Id. (adding
§ 501.2041(2)(h)). The definition of “post-prioritization” covers any “action by
a social media platform to place, feature, or prioritize certain content or material
ahead of, below, or in a more or less prominent position than others in a newsfeed,
a feed, a view, or in search results.” It is impossible to understand what this
provision allows and does not allow. Read according to its terms, the provision
40 If the Florida Elections Commission construed the Act to govern candidates for federal office,
see Act § 2 (adding § 106.072(6), adopting the definition of "candidate" in F.S. § 106.011(3)(e)),
that would raise additional federal preemption concerns given the comprehensive regulation of in-kind contributions involving such candidates under the Federal Election Campaign Act. See 52
U.S.C. § 30116(a)(7)(B)(i); 11 C.F.R. § 109.20(a); see also Cipollone v. Liggett Grp., Inc., 505
U.S. 504, 516 (1992).
would suggest that a search engine is forbidden from placing content “by or
about” a political candidate (whether or not it is defamatory or otherwise illegal
or objectionable) ahead of—or below—any other content. It forbids placing such
content in a more prominent position—or a less prominent position—than other
content. Due process does not allow the State to enforce such a paradoxical, self-defeating, and incomprehensible prohibition.
110. Because covered businesses lack fair notice about what conduct is
allowed and what is prohibited—subject to exposure to potentially massive
penalties, including fines of $250,000 per day—these provisions of the Act violate
basic principles of due process. Id. (adding § 106.072(3)).
111. Vagueness is also rife in other aspects of the Act, including its key
definitions of concepts such as “shadow banning,” “deplatforming,” and
“censoring.” Because these are the operative provisions under Sections 2 and 4, they
render the entirety of those Sections void for vagueness under due process
protections.
112. Unless it is enjoined, the Act will operate to unlawfully deprive
Plaintiffs’ members of their fundamental due process rights.
COUNT III
(42 U.S.C. § 1983)
Violation of Equal Protection Rights Under the Fourteenth
Amendment to the Constitution of the United States
(Challenge to Sections 2, 3, and 4 of the Act)
(As to All Defendants)
113. Plaintiffs incorporate by reference paragraphs 1 to 75 above as if fully
and separately set forth herein.
114. The Fourteenth Amendment to the United States Constitution
guarantees to all citizens “equal protection of the laws,” and it forbids any state
government from denying that protection “to any person within its jurisdiction[.]”
U.S. Const. amend. XIV. At a minimum, it forbids state governments from engaging
in arbitrary discrimination against its citizens. The Equal Protection Clause “is
essentially a direction that all persons similarly situated should be treated alike.”
City of Cleburne, Tex. v. Cleburne Living Ctr., 473 U.S. 432, 439 (1985).
115. Distinctions “affecting fundamental rights,” including the exercise of
First Amendment rights, trigger strict scrutiny under the Equal Protection Clause,
even if the distinctions do not themselves constitute suspect or invidious
classifications. Clark v. Jeter, 486 U.S. 456, 461 (1988). “The Equal Protection
Clause requires that statutes affecting First Amendment interests be narrowly
tailored to their legitimate objectives.” Police Dep’t of Chicago v. Mosley, 408 U.S.
92, 101 (1972).
116. Sections 2, 3, and 4 of the Act all purport to regulate the conduct of
“social media platforms.”
The Act’s definition of that term is arbitrary and
discriminatory, thereby rendering Sections 2, 3, and 4 in violation of basic equal
protection principles.
117. First, the Act’s carveout for companies that own large theme parks
violates equal protection. Whether or not a company owns a theme park has no
conceivable bearing on whether that company’s social media platform presents the
purported risks against which the Act was designed to protect. The Act would not
apply to a targeted company that, for example, bought a zoo or other “theme park or
entertainment complex” that met the following statutorily defined criteria: “a
complex comprised of at least 25 contiguous acres owned and controlled by the same
business entity and which contains permanent exhibitions and a variety of
recreational activities and has a minimum of 1 million visitors annually.” F.S. §
509.013(9); see Act § 4 (adding § 501.2041(1)(g)). These specific thresholds have
nothing to do with any government interest in free speech or online policy. Nor is
there any reason to believe that the State’s purported interest in protecting against
“unfair” conduct from social media platforms is furthered by protecting theme park
operators (specifically including Disney and Universal Studios).
118. Second, the definition of businesses that are subject to the Act further
irrationally discriminates against larger and more popular websites and social media
companies by targeting them for restrictions and disfavored governmental treatment.
It targets only select companies that have either (i) at least $100 million in annual
gross revenues, or (ii) over 100 million monthly participants, while irrationally
excluding other companies. See supra ¶¶ 8, 56-57. Such arbitrary distinctions
demonstrate that the Act unconstitutionally discriminates against the speech of
certain speakers, that it is gravely under- and over-inclusive, and that it is not
justified by any legitimate (much less compelling) governmental interest.
119. Because the definition of “social media platforms” is both arbitrary and
discriminatory, Sections 2, 3, and 4 will operate to unlawfully deprive Plaintiffs’
members of their fundamental equal protection rights.
120. Additionally, Section 4 establishes multiple new affirmative and
onerous obligations that would impact Plaintiffs’ members, but irrationally exclude
other, favored entities. See supra ¶¶ 56-57, 65-71. This separately violates equal
protection.
121. Similarly, the antitrust provisions in Section 3 suffer from the same
flaws by irrationally targeting the covered online businesses, but not other
companies. See supra ¶¶ 72-74.
122. The State cannot show any rational basis for crafting this statutory
scheme—much less satisfy strict scrutiny—and, accordingly, the statutory
provisions discussed above violate the equal protection rights of Plaintiffs’
members.
COUNT IV
(42 U.S.C. § 1983)
Violation of the Commerce Clause of the Constitution of the United States
and the Due Process Clause of the Fourteenth Amendment
to the Constitution of the United States
(Challenge to Sections 2, 3, and 4 of the Act)
(As to All Defendants)
123. Plaintiffs incorporate by reference paragraphs 1 to 75 above as if fully
and separately set forth herein.
124. The U.S. Constitution entrusts the regulation of commerce “among the
several States” to the federal government. U.S. Const. art. I., § 8, cl. 3. Thus, an
individual State may not usurp this authority by regulating interstate commerce
unilaterally. See, e.g., C & A Carbone, Inc. v. Town of Clarkstown, 511 U.S. 383
(1994).
125. “[T]he Commerce Clause by its own force restricts state
protectionism.” Tennessee Wine & Spirits Retailers Assoc. v. Thomas, 139 S. Ct.
2449, 2460 (2019). “[I]f a state law discriminates against … nonresident economic
actors, the law can be sustained only on a showing that it is narrowly tailored to
‘advance a legitimate local purpose.’” Id. at 2461 (cleaned up).
126. The Commerce Clause also prohibits any “state regulation that
‘discriminates against or unduly burdens interstate commerce and thereby imped[es]
free private trade in the national marketplace.’” Gen. Motors Corp. v. Tracy, 519
U.S. 278, 287 (1997) (citation omitted) (emphasis added).
127. Courts have long recognized that state laws that attempt to regulate the
inherently global communications medium that is the Internet must respect the
constitutional limits on state authority under the Commerce Clause. Am. Libraries
Ass’n v. Pataki, 969 F. Supp. 160, 169, 173-74 (S.D.N.Y. 1997); Am. Booksellers
Found. v. Dean, 342 F.3d 96, 103-104 (2d Cir. 2003).
128. The Act violates the Commerce Clause by imposing uniquely
burdensome operational requirements on businesses headquartered (and with
substantial business operations) outside of Florida, while expressly exempting
favored in-state businesses through a status-based “theme park” ownership
exemption that is based on economic protectionism. Cf. Tennessee Wine & Spirits,
139 S. Ct. at 2472-74. Both on its face and in its practical effects, the Act
impermissibly discriminates against out-of-state businesses, and favors in-state
businesses.
The Act also imposes onerous and undue burdens on interstate
commerce by predominantly targeting online businesses headquartered outside the
State.
Florida has no legitimate reason for discriminating against interstate
commerce—and in favor of companies with in-state theme parks—and the burden
on interstate commerce is clearly excessive in relation to any of the purported
benefits that the State claims will result from the Act. The State cannot show that
its stated goals could not be served by other available nondiscriminatory means.
129. In addition, the Act regulates wholly out-of-state conduct because there
is no requirement that the moderation take place in Florida or that the content being
moderated is posted in Florida. Such extraterritorial regulation is forbidden by the
Commerce Clause and Due Process Clause of the Fourteenth Amendment.
130. Unless enjoined, Sections 2, 3, and 4 of the Act will operate to
unconstitutionally burden interstate commerce and effect extraterritorial regulation
in violation of the Commerce Clause and Due Process Clause.
COUNT V
(Declaratory Judgment Act)
(42 U.S.C. § 1983)
Preemption under the Supremacy Clause of the Constitution
of the United States and 47 U.S.C. 230(e)(3)
(Challenge to Sections 2 and 4 of the Act)
(As to the Commissioners of the Florida Elections Commission
and the Florida Attorney General)
131. Plaintiffs incorporate by reference paragraphs 1 to 75 above as if fully
and separately set forth herein.
132. Section 2 of the Act permits the Florida Elections Commission to
impose fines of up to $250,000 per day against any “social media platform” that
chooses to “permanently delete or ban a user or to temporarily delete or ban a user
from the social media platform for more than 14 days,” if that user is a candidate for
statewide public office. Act § 2 (adding § 106.072(2)). The Commission may
impose fines of up to $25,000 per day against a platform that bans the account of a
candidate for a local public office. Id.
133. Section 4 of the Act permits any private individual to bring a cause of
action against a platform that has applied its “censorship” standards inconsistently,
and/or against a platform that has “censor[ed]” or “shadow ban[ned]” a user without
providing adequate notification. Act § 4 (adding § 501.2041(2)(b), (d)(1)). Such a
civil action could, for instance, be brought against a platform that removed content
posted by one user, but not similar content posted by another user.
134. Section 4 also permits the Department of Legal Affairs to bring “a civil
or administrative action” against any platform suspected of violating any provision
of Section 4. Id. (adding § 501.2041(5)). Such violations include the decision to
“censor, deplatform or shadow ban a journalistic enterprise,” or to “shadow ban[]”
a candidate for elected office. Id. (adding § 501.2041(2)(h), (2)(j)). They also
include violations of the other Moderation Restrictions and affirmative obligations
contained in Section 4.
135. Under 47 U.S.C. § 230, it is federal policy “to promote the continued
development of the Internet and other interactive computer services and other
interactive media” and “preserve the vibrant and competitive free market that
presently exists for the Internet and other interactive computer services, unfettered
by Federal or State regulation.” 47 U.S.C. § 230(b)(1), (2). Among the important
purposes advanced by Section 230, Congress sought “to encourage service providers
to self-regulate the dissemination of offensive material over their services.” Fair
Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157,
1163 (9th Cir. 2008) (en banc). This is its principal purpose. Id. at n.12.
136. Under Section 230, "[n]o provider or user of an interactive computer
service shall be treated as the publisher or speaker of any information provided by
another information content provider." 47 U.S.C. § 230(c)(1). Section 230
“establish[es] broad ‘federal immunity to any cause of action that would make
service providers liable for information originating with a third-party user of the
service.’” Almeida v. Amazon.com, Inc., 456 F.3d 1316, 1321 (11th Cir. 2006)
(quoting Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997)). Under
Section 230, laws or claims that “seek[] to hold a service provider liable for its
exercise of a publisher’s traditional editorial functions—such as deciding whether to
publish, withdraw, postpone or alter content—are barred.” Zeran, 129 F.3d at 330.
137. Moreover, under Section 230, “[n]o provider or user of an interactive
computer service shall be held liable on account of . . . any action voluntarily taken
in good faith to restrict access to or availability of material that the provider or user
considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or
otherwise objectionable, whether or not such material is constitutionally protected.”
47 U.S.C. § 230(c)(2).
138. Section 230 similarly prohibits liability for “any action taken to enable
or make available to information content providers or others the technical means to
restrict access to material” that falls within the Section 230(c)(2) category above. Id.
§ 230(c)(2)(B). This provision applies to tools that online service providers make
available to users to help them avoid or limit their exposure to potentially
objectionable content.
139. For purposes of Section 230, an “interactive computer service” is “any
information service, system, or access software provider that provides or enables
computer access by multiple users to a computer server.” Id. § 230(f)(2). The
“provider” of such a service includes those who own or operate websites, such as
social media platforms, and therefore covers Plaintiffs’ members who are subject to
the Florida Act.
140. Section 230 expressly provides that “[n]o cause of action may be
brought and no liability may be imposed under any State or local law that is
inconsistent with this section.” Id. § 230(e)(3). This provision expressly preempts
inconsistent state laws that seek to hold online service providers liable for engaging
in content moderation covered by Section 230(c). Preemption applies equally to
private causes of action and public enforcement actions.
141. Sections 2 and 4 of the Act are inconsistent with Section 230 and
therefore are expressly preempted because they (i) purport to impose liability on
“social media platforms” covered by Section 230 for taking actions protected by
Sections 230(c)(1) and (c)(2), and (ii) would impermissibly treat the platforms as
publishers of third-party content. Id. § 230(c)(1) & (2)(A).
142. Sections 2 and 4 are also preempted under the principles of implied
preemption and “obstacle preemption.” Sections 2 and 4 frustrate and undermine
the basic purposes and policy goals of Section 230. See Geier v. Am. Honda Motor
Co., 529 U.S. 861, 873 (2000).
143. The Court should issue a declaration confirming that preemption
applies to the Act.
PRAYER FOR RELIEF
Plaintiffs request the following relief:
(1) An order declaring the Act unconstitutional on its face for violating
Plaintiffs’ members’ rights under the First and Fourteenth Amendments to the
Constitution of the United States (including the Fourteenth Amendment’s due
process and equal protection requirements) and for violating the Commerce Clause
and the Fourteenth Amendment’s Due Process Clause;
(2) An order declaring Sections 2 and 4 of the Act preempted by federal
law, including 47 U.S.C. § 230(e)(3) and principles of implied preemption;
(3) An order preliminarily and permanently enjoining the defendants from
enforcing Sections 2, 3, and 4 of the Act;
(4) An order for costs incurred in bringing this action;
(5) An order for reasonable attorneys’ fees under 42 U.S.C. § 1988(b); and
(6) Such other relief as the Court deems appropriate.
Respectfully submitted,
Date: May 27, 2021
Brian M. Willen
(pro hac vice forthcoming)
Steffen N. Johnson
(pro hac vice forthcoming)
WILSON SONSINI GOODRICH &
ROSATI, P.C.
1700 K St NW
Washington, DC 20006
Phone: (202) 973-8800
Email: bwillen@wsgr.com
sjohnson@wsgr.com
Lauren Gallo White
(pro hac vice forthcoming)
Meng Jia Yang
(pro hac vice forthcoming)
WILSON SONSINI GOODRICH &
ROSATI, P.C.
One Market Plaza
Spear Tower, Suite 3300
San Francisco, CA 94105
Phone: (415) 947-2000
Email: lwhite@wsgr.com
mjyang@wsgr.com
/s/ Christopher G. Oprison
Christopher G. Oprison
Florida Bar No. 0122080
J. Trumon Phillips
Florida Bar No. 84568
DLA PIPER LLP (US)
200 South Biscayne Blvd., Suite 2500
Miami, Florida 33131
Phone: 305-423-8500
Fax: 305-675-6366
Email: chris.oprison@dlapiper.com
trumon.phillips@dlapiper.com
sheila.hall@dlapiper.com
Peter Karanjia (pro hac vice
forthcoming)
James J. Halpert (pro hac vice
forthcoming)
DLA PIPER LLP (US)
500 Eighth Street, NW
Washington, DC 20004
Phone: 202-799-4000
Fax: 202-799-5000
Email: peter.karanjia@dlapiper.com
jim.halpert@dlapiper.com
Douglas L. Kilby
Florida Bar No. 0073407
Glenn Burhans, Jr.
Florida Bar No. 0605867
Bridget Smitha
Florida Bar No. 0709581
Christopher R. Clark
Florida Bar No. 1002388
Stearns Weaver Miller Weissler
Alhadeff & Sitterson, P.A.
Highpoint Center
106 East College Avenue, Suite 700
Tallahassee, FL 32301
Phone: (850) 580-7200
Email: dkilby@stearnsweaver.com
gburhans@stearnsweaver.com
bsmitha@stearnsweaver.com
cclark@stearnsweaver.com
Ilana H. Eisenstein (pro hac vice
forthcoming)
Ben C. Fabens-Lassen (pro hac vice
forthcoming)
Danielle T. Morrison (pro hac vice
forthcoming)
Jonathan Green (pro hac vice
forthcoming)
DLA PIPER LLP (US)
One Liberty Place
1650 Market Street, Suite 5000
Philadelphia, PA 19103-7300
Phone: 215-656-3300
Fax: 215-656-3301
Email: ilana.eisenstein@dlapiper.com
ben.fabens-lassen@dlapiper.com
danielle.morrison@dlapiper.com
jonathan.green@dlapiper.com
Attorneys for Plaintiffs NetChoice, LLC
and Computer & Communications
Industry Association