The Supreme Court Just Handed Every Platform the Legal Framework to Ignore What Happens on Their Infrastructure

A unanimous Supreme Court ruling that ISPs can't be liable for user piracy just became the legal precedent every platform has been waiting for — and it extends far beyond copyright.


The Supreme Court ruled 9-0 that Cox Communications cannot be held liable for music piracy committed by its subscribers, even if the company failed to adequately police copyright infringement on its network. The decision, reported by Deadline, ends a lengthy legal battle between the internet service provider and Sony Music Entertainment, which had sought massive damages for Cox's alleged negligence in curbing piracy. The ruling is narrow in its immediate application — it addresses secondary liability for ISPs under contributory infringement doctrine — but its implications stretch far beyond cable modems and torrent files.

What the Court established is a legal framework that treats infrastructure providers as neutral conduits rather than active participants in what happens on their networks. Cox argued it couldn't be held responsible for monitoring every byte of data passing through its pipes, and the justices agreed. That logic — infrastructure as passive utility — is now precedent. And every platform company with a liability problem just added this case to their legal playbook.

The decision arrives at a moment when platform accountability is the defining tension across the internet economy. YouTube faces lawsuits over radicalization content. Meta is being sued by states claiming Instagram harms teen mental health. AI companies are being dragged into court over training data and copyright infringement. TikTok's algorithm is under scrutiny for amplifying dangerous challenges. Every one of these cases hinges on the same question: Is the platform responsible for what happens on it, or is it just the infrastructure?

The Cox ruling doesn't directly answer that question for social platforms — ISPs and user-generated content hosts operate under different legal regimes — but it establishes a rhetorical and structural blueprint. If an ISP can't be held liable for failing to stop piracy despite knowing it's happening, how do you argue that a social platform should be liable for failing to stop misinformation, or harassment, or copyright infringement in user uploads? The legal distinctions exist, but the Cox decision hands platforms a narrative: We're infrastructure. We can't monitor everything. Holding us liable would break the internet.

That narrative has been the tech industry's favorite defense for two decades, and it's never been entirely wrong. The internet's scale makes comprehensive moderation impossible. Platforms do operate, in part, as neutral infrastructure. But the Cox case lets them claim neutrality while still exercising enormous editorial control. ISPs don't algorithmically promote certain types of traffic. YouTube does. Cox doesn't curate what you see when you open your browser. TikTok's For You page is nothing but curation. The infrastructure argument works when you're a dumb pipe. It falls apart when you're also the editor.

The Supreme Court's reasoning in the Cox case rests on the idea that requiring ISPs to actively police infringement would impose an undue burden and chill innovation. That logic has merit in the ISP context — no one wants their internet provider deciding what they can download. But when the same logic gets imported into platform liability cases, it becomes a shield for companies that have built empires on algorithmic amplification. Europe has spent years trying to draw legal lines around this distinction, requiring platforms to take down illegal content while protecting infrastructure providers. The U.S. just made that line harder to draw.

Sony Music, which brought the case against Cox, was asking for damages on the theory that Cox knew piracy was rampant on its network and did nothing meaningful to stop it. The company pointed to internal communications showing Cox employees were aware of repeat infringers but failed to terminate their accounts. The Supreme Court ruled that knowledge alone isn't enough to establish liability — Sony needed to prove Cox materially contributed to the infringement, not just that it failed to prevent it. That's a high bar, and it's a bar that platforms will now cite every time they're accused of enabling harm through inaction.


The business incentives here are straightforward. Platforms make money by keeping users engaged. Aggressive moderation reduces engagement. Algorithmic promotion increases engagement. The Cox ruling gives platforms legal cover to optimize for engagement while claiming they can't be held responsible for what that engagement produces. It's not that they're actively promoting piracy, or misinformation, or harmful content — they're just infrastructure, and infrastructure can't be expected to police every user action. The fact that the infrastructure is also making editorial decisions through algorithmic curation becomes legally irrelevant.

The music industry has been fighting this battle for 25 years, ever since Napster made piracy frictionless and platforms started claiming they were just hosting user content, not distributing it. The Napster case established that platforms could be held liable if they had knowledge of infringement and failed to act. The Cox case narrows that doctrine significantly. Now, knowledge isn't enough. Material contribution is required. That's a much harder standard to meet, especially when platforms can argue that algorithmic ranking isn't the same as active distribution.

What makes this ruling particularly dangerous is its unanimity. A 9-0 decision sends a clear signal that the Court sees infrastructure liability as a bright-line issue. That unanimity will be cited in every subsequent case involving platform responsibility. It doesn't matter that the legal doctrines differ — copyright law, Section 230, product liability, negligence — the Cox precedent establishes a philosophical framework that treats infrastructure as inherently less responsible than active participants. And platforms will use that framework to argue they're infrastructure, even when they're clearly not.

The real test will come when this logic collides with cases that involve human harm rather than economic loss. Copyright infringement is a property issue. Radicalization, mental health crises, and misinformation are public safety issues. Courts have historically treated those categories differently. But the Cox ruling gives platforms a template for arguing that the same principles apply: We're too big to monitor everything. Holding us liable would chill innovation. We're infrastructure, not editors.


The Cox decision also arrives just as AI companies are facing their first wave of serious copyright litigation. OpenAI, Stability AI, and others are being sued for training models on copyrighted material without permission. Their defense is structurally similar to Cox's: We didn't infringe — our users might have, but we're just providing the tool. The Cox ruling doesn't directly apply to AI training, but it establishes a precedent that infrastructure providers can't be held liable for downstream infringement unless they materially contributed to it. That's a useful defense when you're arguing that a generative model isn't infringing, it's just learning from patterns in data.

What the Supreme Court has done, intentionally or not, is create a legal environment where scale becomes its own defense. The bigger the platform, the harder it is to prove material contribution to any individual harm. The more automated the system, the easier it is to claim neutrality. The Cox ruling doesn't say platforms are immune from liability — it just makes the bar for proving liability high enough that most cases won't clear it. And in a legal system where the cost of litigation is prohibitive and the burden of proof is on the plaintiff, a high bar is functionally the same as immunity.

The music industry will adapt, as it always has. Streaming made piracy less attractive by offering convenience, and labels learned to work with platforms rather than fight them. But the precedent the Cox case sets extends far beyond music. Every platform facing accountability pressure — for misinformation, for mental health harms, for labor exploitation, for environmental impact — now has a Supreme Court decision they can cite to argue that infrastructure can't be held responsible for what happens on it. The fact that the infrastructure is also the business model, the editorial engine, and the primary driver of harm becomes a legal footnote. And that might be the most significant shift in platform accountability in a decade.
