Federal Judge Blocks Arkansas Social Media Law on First Amendment Grounds
If you're tired of censorship and dystopian threats against civil liberties, subscribe to Reclaim The Net.
A federal judge blocked Arkansas Act 900 today, one day before the law was set to take effect, handing the state its second courtroom defeat in the same fight over who gets to decide what people can see and say online.
We obtained a copy of the order for you here.
US District Judge Timothy L. Brooks granted NetChoice’s motion for a preliminary injunction, freezing enforcement of a statute that would have imposed strict liability on social media platforms for a growing list of “addictive practices,” forced default settings on anyone in Arkansas the platform couldn’t verify as an adult, and required platforms to build parental dashboards tracking minors who don’t even have accounts. The ruling came in the Western District of Arkansas, Fayetteville Division.
The First Amendment problem is obvious. The government wrote a law that restricts what platforms can say, who they can say it to, and when. It restricts what minors can see and post. Then it backed those restrictions with $10,000-per-day fines and rules so vague that platforms cannot tell in advance what will trigger liability. Each of those features is a constitutional problem on its own. Act 900 combined all of them.
Act 900 was Arkansas’s second try. The first, Act 689 of 2023, was permanently enjoined by the same court last year on First Amendment and vagueness grounds. That appeal is still pending before the Eighth Circuit. Rather than wait for the appellate ruling, the Arkansas General Assembly passed Act 900 to patch the definitional problems and layer on new obligations. Judge Brooks found the new version suffers from the same constitutional defects, and in some places, worse ones.
“Addictive features” is the new framing for the old project
The language has changed in the last two years. The first wave of state social media laws talked openly about content. Legislatures tried to regulate “harmful” posts, “misinformation,” and categories of speech they wanted gone. Courts kept striking those laws down. The speech was protected, the definitions were vague, and the state’s role in deciding what counted as harmful was obviously the problem.
The new framing is “addictive design.” The theory goes like this: The government is not regulating speech. It is regulating the features that deliver speech. That includes notifications, recommendations, infinite scroll, algorithmic feeds, and the little hit of validation when a post gets likes. The argument is that these are engineering choices, not editorial ones, so the First Amendment is not really in play.
This is a convenient reframing. It lets legislators tell the public they are addressing child safety while avoiding the cases that blocked the old laws. It lets them avoid the word “censorship” while building tools that do the same work. Notifications carry speech. Recommendations are editorial decisions about what content a user sees. Algorithmic feeds are the platform’s curation of protected expression. Regulating those features is regulating how platforms speak and how users receive that speech, and the Supreme Court said so directly in Moody v. NetChoice.
Judge Brooks saw the move for what it is. The opinion identifies Act 900’s addictive practices rule as a content-neutral pretext that collapses on contact with the actual text.
The statute does not define “addictive.” It does not define “compulsive behaviors.” It lists notifications, recommended content, and “artificial sense of accomplishment” as examples, which covers virtually every design choice any modern website makes. A platform is liable if a single minor develops compulsive behavior in response to anything the platform does, on or off the platform, whether the company could have foreseen it or not.
It is a license for the state to decide which design choices, and therefore which speech-delivery mechanisms, it will permit. A regulator who doesn’t like recommendation algorithms can call them addictive. A regulator who doesn’t like notifications about political content can call them addictive. The word does whatever the enforcer needs it to do. Arkansas’s own example provisions make the point. “Artificial sense of accomplishment” is not a legal term. It’s a mood.
The court’s language on this is sharp. The provision “fails to specify a standard of conduct to which platforms must conform[,] and its violation entirely depends upon the sensitivities of some unspecified user.” That is the design. The enforcer decides which user, which sensitivity, and which feature triggered the unacceptable response. The platform learns the rule by being punished for breaking it. In the meantime, every platform has to guess which features to strip out, which speech to throttle, and which audiences to wall off. The safest move is to do less, show less, recommend less, reach fewer people. That is a speech outcome, produced by a statute the state insists is not about speech.
The “addiction” frame also picks winners. Platforms that already serve mostly adults, or already have the infrastructure to age-verify and surveil, can absorb the compliance cost. Smaller platforms cannot. Nextdoor told the court it would block every Arkansan under 16 rather than try to comply, as it already has in Texas, Mississippi, and Tennessee. The result is a narrower internet, with fewer voices, and a regulatory structure that favors the largest incumbents, achieved through a law that claims to be protecting children from those same incumbents.
The “default” provisions fail for a different constitutional reason. One required platforms to silence non-safety notifications for Arkansas minors between 10 p.m. and 6 a.m. The other mandated the most restrictive privacy and safety settings available as the default for minor accounts.
Both provisions burden speech. Platforms communicate through notifications. Privacy settings control who can see whose posts and whose posts a user can see, which the Supreme Court has long recognized as speech within the First Amendment’s protection. The government cannot impose those kinds of restrictions unless the law is narrowly tailored to a significant government interest. Act 900 flunks the tailoring analysis on both provisions.
The nighttime notification rule would, for a third of every day, silence platform notifications to any Arkansas user the platform could not confirm was an adult account holder. Parents are free to override the setting anyway. If parents wanted their children to sleep, Judge Brooks noted, they could take the phones away. The state offered no evidence that parents lack the ability to do so. The law silences speech without advancing the interest the state invoked to justify it.
The privacy default is worse. Anyone can change it, including the minor the law is supposedly protecting. The court called it “wildly underinclusive,” because the statute “in effect, allows children to decide whether they need protection from sexual exploitation online because they are free to depart from the protective default.” The provision burdens platforms’ speech across the board while accomplishing nothing for the children it claims to shield.
The court’s conclusion on the defaults is the line NetChoice’s Paul Taske quoted in his statement. “Imposing small burdens on vast quantities of speech for no appreciable benefit is not consistent with the First Amendment. Arkansas cannot sentence speech on the internet to death by a thousand cuts.”
The compelled speech problem: forced surveillance dressed as disclosure
The dashboard provision produced the strangest result in the opinion. Because Act 900 defines a “user” as someone who views content but isn’t an account holder, the requirement that platforms build a parental monitoring dashboard for “minor users” would force platforms to identify every minor who visits the site, collect identifying data, locate their parents, and track usage across devices.
Compelled speech ordinarily triggers strict scrutiny. Arkansas asked the court to apply the more lenient Zauderer standard for mandated commercial disclosures. The court didn’t need to resolve the dispute because, even under the easier test, the provision fails. Forcing platforms to compile surveillance infrastructure on every minor visitor, identify each one’s parents, and enforce parental restrictions across devices is unduly burdensome by any measure. The court found it “likely to chill platforms’ dissemination of speech to or from anyone who is not an account holder.”
Judge Brooks quoted Packingham on the central constitutional point: “the government ‘may not suppress lawful speech as the means to suppress unlawful speech.'” The Supreme Court struck down a North Carolina law in that case for barring registered sex offenders from accessing commonplace social media sites. Act 900 isn’t as broad, but it operates on the same logic. Protect children from online predators by restricting the speech of everyone the state classifies as a child, and by extension, the platforms’ ability to speak to them.
NetChoice and the state’s response
NetChoice’s lead counsel on the case framed the ruling bluntly. “Once again, the District Court hit the nail on the head. Left to its own devices, Arkansas would ‘sentence speech on the internet to death by a thousand cuts,'” NetChoice Litigation Center Co-Director Paul Taske said in a statement. He added that “Act 900 is deeply flawed. It burdens speech without providing any upside.”
The legal stakes extend beyond Arkansas. The court’s vagueness analysis tracks a recent Ninth Circuit decision, NetChoice v. Bonta, which reached similar conclusions about a California law using similar language. The pattern is the same in both cases: write the rule vaguely enough that platforms cannot know what compliance looks like, impose strict liability with daily penalties, invoke child safety to justify the structure, and let enforcement discretion do the rest. The result is that platforms either surveil their users to comply, restrict access to avoid the risk, or self-censor to stay inside whatever the enforcing authority decides the vague terms mean this month. Each option damages speech. The First Amendment treats laws built this way as constitutionally unserious, regardless of how the sponsoring legislature frames them.
On irreparable harm, the court applied the Eighth Circuit standard and found it met. “The loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury.” The state’s interest in enforcing its statute does not outweigh the public’s interest in not having speech silenced under an unconstitutional law.
Arkansas has already filed interlocutory appeals on both the Act 689 permanent injunction and the Act 901 preliminary injunction. A third appeal is likely. For now, Act 900 does not go into effect tomorrow.