Opinion 8 min read

Platform Degradation Is Not a Bug. It Is the Business Model.

Mar 11, 2026


Platform degradation is the defining consumer technology story of the 2020s, and almost nobody is telling it correctly. The standard narrative blames greed, or stupidity, or a specific CEO’s personality disorder. The reality is more interesting and more disturbing: the platforms are not breaking because someone decided to break them. They are breaking because nobody decided they shouldn’t.

YouTube: Selling You What Was Already Yours

YouTube launched in 2005 as a free video platform. No ads on most videos. Background playback worked because, well, why wouldn’t it? You loaded a video, you switched apps, the audio kept playing. That is how media players work. Downloads were not a feature because the web itself was the feature.

Then came monetization. Ads appeared. Background play was quietly disabled on mobile. In 2015, YouTube Red (now YouTube Premium) launched at $9.99 per month, selling back the ad-free, background-play, downloadable experience that had been the default. By 2025, the individual plan costs $13.99, with grandfathered early adopters facing a 75% price increase as YouTube eliminated their legacy rates. In early 2026, Google confirmed it had closed the last remaining loophole allowing free background playback through mobile browsers.

The platform degradation pattern here is textbook: take a feature that exists by default, remove it, then charge for its restoration. YouTube Premium Lite, introduced at $7.99, added background play and downloads in February 2026, creating a three-tier system where users pay escalating fees to claw back functionality that a 2010 web browser provided for nothing. Google did not invent this playbook, but they perfected it.

Spotify: The Platform That Learned to Replace Its Own Product

Spotify’s problem is quieter and, in many ways, worse. According to Deezer (the only major streaming platform transparent enough to publish numbers), over 50,000 fully AI-generated tracks are uploaded daily, accounting for 34% of all new music delivered to Deezer as of late 2025. Spotify refuses to disclose its own figures. When asked directly, CEO Daniel Ek declined to share the percentage of AI-generated uploads. That silence is itself a data point.

But the AI flood is only half the story. In January 2025, music journalist Liz Pelly exposed Spotify’s “Perfect Fit Content” program, running since 2017, in which the company commissions stock music attributed to ghost artists and places it on popular playlists. The Ambient Chill playlist, once populated by Brian Eno, Bibio, and Jon Hopkins, was largely stripped of recognizable artists and replaced with anonymous filler tracks. The economics are straightforward: Spotify pays lower royalties on content it controls. Every playlist slot occupied by a ghost artist (a musician who records for hire with no public credit) is a slot not generating royalties for a real musician.

Spotify denies creating “fake artists.” The distinction it draws is that Perfect Fit Content involves real recordings by real session musicians, just ones commissioned specifically for algorithmic placement. Whether you find that reassuring depends on how you feel about a platform that was built to connect listeners with artists quietly replacing those artists with in-house product. The advertising industry has spent decades perfecting the art of giving people what they think they want while actually serving institutional interests. Spotify learned from the best.

Meanwhile, 97% of listeners in a Deezer-commissioned study could not distinguish AI-generated music from human-made tracks. The platform has no financial incentive to help them learn.

X: Pay to Exist, Get Grok for Free

X, formerly Twitter, represents the most brazen case of platform degradation. The verification checkmark, originally a trust signal assigned by Twitter to confirm identity, now costs $8 per month. Without it, posts receive roughly one-sixth the impressions of Premium accounts and one-fifteenth those of Premium+ accounts. Basic features that were once universal, like meaningful reach and chronological feeds, have been quietly gated behind subscriptions.

Premium+ costs $40 per month (up from $16 before the Grok 3 launch in February 2025), bundling access to xAI’s chatbot whether users want it or not. That chatbot has generated its own controversies: in late 2025, Grok’s image generation capabilities were found to produce non-consensual sexual deepfakes, including of minors, prompting investigations by regulators in the EU, UK, and Australia in early 2026. Users paying $40 per month for better reach on a social network are subsidizing an AI tool that has attracted regulatory action across three continents.

The platform simultaneously fights AI-generated bot content while promoting its own AI product. This is not hypocrisy. It is the logical outcome of a company that needs to justify a $44 billion acquisition by extracting maximum revenue from every possible vector, coherence optional.

The Structural Problem: Nobody Is in Charge

The tempting explanation is that a few greedy executives ruined good products. The accurate explanation is less satisfying: platform degradation is an emergent property of systems optimizing for quarterly returns under conditions of permanent growth expectations.

A product manager at YouTube does not wake up thinking, “Today I will make the internet worse.” They think, “Background play conversion is a key Premium driver, and my team’s OKR is 12% subscriber growth.” A Spotify executive does not scheme to replace real musicians. They observe that commissioned content fills playlist gaps at lower cost and higher margin, and their fiduciary duty runs to shareholders, not to Brian Eno. Each decision is locally rational. The person making it can justify it with data, precedent, and competitive pressure.

This is the systems thinking problem applied to consumer technology. As Stafford Beer put it: the purpose of a system is what it does, not what it claims to do. YouTube’s purpose is not “broadcast yourself.” It is “convert free users to paid subscribers.” Spotify’s purpose is not “music for everyone.” It is “maximize per-stream margin.” These are not conspiracy theories. They are observable outputs.

The structural incentive is perpetual escalation. Public companies do not have the option of “enough.” Share prices reflect expected future growth, not current performance. A company generating $10 billion in profit that announces it expects to generate $10 billion next year will see its stock price fall. The market demands acceleration, and acceleration in a mature platform means either finding new users (increasingly difficult) or extracting more value from existing ones (increasingly aggressive). Platform degradation is not a failure of the model. It is the model working correctly.

Anarchy Wins Because Rules Are Slow

Regulation exists to prevent exactly this kind of consumer harm. It does not work fast enough. The EU’s Digital Markets Act, the most ambitious attempt to constrain platform behavior, took years to draft, negotiate, and implement. In that time, YouTube raised prices twice, Spotify’s Perfect Fit Content program operated undetected for seven years, and X restructured its entire business model around paid verification.

The asymmetry is structural. Platforms can deploy a new feature, a new restriction, or a new pricing tier in a sprint cycle (two weeks). Regulators operate on legislative timescales (two to ten years). By the time a rule addresses a specific harm, the platform has already moved to the next extraction mechanism. This is not because regulators are incompetent. It is because democratic governance is, by design, deliberative, and deliberation is slow. The regulatory map is always several versions behind the territory it tries to describe.

The result is a kind of functional anarchy. Not the philosophical kind with a coherent theory of mutual aid. The other kind: the absence of effective constraint, where the fastest-moving actors set the terms and everyone else adapts to the new reality. Nobody voted for background play to cost money. Nobody held a referendum on ghost artists. Nobody decided that a blue checkmark should be a subscription fee rather than an identity verification. These changes arrived as accomplished facts, and the collective response was a brief period of complaint followed by acceptance.

What Comes Next for Platform Degradation

Platform degradation will continue because the incentives driving it have not changed. As long as publicly traded technology companies are evaluated on growth metrics, they will find new features to remove and sell back, new ways to substitute cheaper inputs for expensive ones, and new services to bundle with subscriptions users did not ask for.

The honest assessment is that consumers have limited options. Switching costs are high (your playlists, your followers, your watch history are hostage data), network effects make alternatives unviable until they reach critical mass, and critical mass requires the kind of capital investment that reproduces the same incentive structures. The next YouTube will eventually need to sell background play too.

Understanding the mechanism does not fix it. But it does clarify one thing: this is not a story about bad people making bad decisions. It is a story about a system that converts good products into revenue extraction tools as reliably as gravity converts potential energy into kinetic energy. The process is not malicious. It is not even intentional in any meaningful sense. It is simply what happens when growth is mandatory and nothing else is.


