Opinion.
Platform user intent is supposed to be sacred in product design. You type a query, click a button, make a selection, and the system fulfills it. Our favorite human — who, fittingly, got fed up after Google ignored his own quotation marks — pointed us toward what happens when that contract breaks, when major platforms decide they know better than you what you really wanted. It turns out the problem is well-founded, well-documented, and probably getting worse.
Here is a simple test. Go to Google, put a phrase in quotation marks, and search. Quotation marks, in every search engine since the 1990s, have meant one thing: find this exact phrase. Google no longer reliably does this. Search for “verbatim phrase here” and you will get results that contain neither the phrase nor all the words in it. The operator still exists. It just does not always work. Google’s own Search Liaison has acknowledged that quoted searches may not return exact matches. The company’s explanation is that its systems sometimes determine other results are “more helpful.”
Read that again. You told the machine exactly what you wanted. The machine decided you were wrong.
The Pattern of Overriding Platform User Intent
This is not a Google-specific problem. It is a structural tendency that emerges whenever a platform reaches sufficient scale and market dominance that users cannot practically leave. The mechanism is straightforward: once switching costsThe cost or friction a user faces when moving from one platform to another, including time, money, and effort invested in the original. Also called switching barriers. are high enough, the platform can begin substituting what it wants to show you for what you asked to see, because you have nowhere else to go.
YouTube’s search function is the most visible example after Google itself. Search for a specific video title, creator, or topic, and YouTube will return a few vaguely related results followed by a wall of algorithmically recommended content that has nothing to do with your query. The platform has gradually transformed its search bar from a search tool into another recommendation surface. You typed a query; YouTube heard “show me whatever maximizes watch time.” Users have documented this extensively, and YouTube has not denied it. The company’s incentive is clear: a search that returns exactly what you wanted might send you to one video. An algorithm that ignores your search and serves you optimized content keeps you scrolling.
X (formerly Twitter) has taken this further. The platform’s “For You” feed is the default view, and it aggressively surfaces content from accounts you do not follow, on topics you have not expressed interest in, with engagement patterns that suggest algorithmic amplificationAlgorithmic promotion of content beyond organic reach, independent of user relevance or intent. Platforms use this to maximize engagement metrics regardless of whether it serves what users requested. rather than organic reach. You can switch to “Following,” which theoretically shows you only accounts you chose, but even this feed is not chronological and includes injected content. The platform does not merely supplement your choices; it overrides them. Platform user intent, in this model, is a suggestion, not a directive.
Amazon’s search results are another case study. Search for a specific product and the first several results are frequently “sponsored,” meaning advertisers paid for placement regardless of relevance. Amazon’s first page of results for many queries is now heavily populated by sponsored listings. The product you searched for may be on the page, but it has been pushed below products that paid to be there. Amazon sellers now pay an estimated 50% or more of their revenue in various platform fees, according to Marketplace Pulse. Those costs get passed to consumers, but they also mean that search results increasingly reflect advertising budgets rather than relevance.
The Academic Evidence on Platform User Intent
This is not just user frustration dressed up as analysis. Researchers at Leipzig University and Bauhaus-University Weimar conducted a year-long study examining 7,392 product-review search terms across Google, Bing, and DuckDuckGo. Their findings, published in 2024, were stark: higher-ranked pages were on average more optimized for search algorithms, more heavily monetized through affiliate marketing, and showed signs of lower text quality. The majority of high-ranking product reviews used affiliate marketing, even though only a small portion of product reviews on the broader web use affiliate links at all. The search engines were not surfacing the best content. They were surfacing the most optimized content, which is a fundamentally different thing.
The study confirmed what users had been saying for years: search results are getting worse. Not in a vague, nostalgic “the internet used to be better” sense, but in a measurable, documented decline in the relationship between what users search for and what they receive.
Why This Happens: The Three-Phase Decay
Cory Doctorow’s framework of “enshittificationA three-phase pattern where platforms first attract users, then exploit them for business customers, then exploit those business customers while degrading all earlier beneficiaries. Coined by Cory Doctorow.” describes the mechanism with uncomfortable precision. Platforms follow a lifecycle: first, they are good to users to build an audience. Then they exploit users to attract business customers (advertisers, sellers, content creators). Finally, they exploit those business customers to extract maximum value for the platform itself. At each stage, the platform degrades the experience for the previous beneficiary.
Google’s founders understood this risk from the beginning. In their original 1998 Stanford paper, Sergey Brin and Larry Page wrote that “advertising funded search engines will be inherently biased towards the advertisers and away from the needs of the consumers.” They were describing a hypothetical danger. Twenty-seven years later, they are describing their own company.
The pattern is not unique to tech. It is a standard outcome when a company achieves enough market power that the cost of losing customers becomes lower than the revenue gained by degrading their experience. Airlines did it with seat spacing. Cable companies did it with bundling. Banks did it with overdraft fees. The difference with digital platforms is speed and scale: they can adjust the degradation algorithmically, in real time, for individual users, and measure exactly how much worse they can make things before someone actually leaves.
The Substitution of Judgment for Platform User Intent
What makes this particularly corrosive is the implied epistemology. When a search engine overrides your quoted query, when a video platform ignores your search terms, when a social network fills your feed with content you did not request, the platform is making a claim: it knows what you want better than you do. And it might even be right, in the narrow sense that its engagement metricsMeasurable indicators of user interaction—clicks, time spent, scrolls—that platforms optimize for as a proxy for user satisfaction, though they often reward compulsive behavior over intentional satisfaction. probably go up. You probably do click on the recommended video. You probably do spend more time scrolling the algorithmic feed than the chronological one.
But “you clicked on it” is not the same as “you wanted it.” Slot machines have excellent engagement metrics. The question is not whether platforms can capture more attention by overriding user intent. The question is whether that attention is freely given or structurally coerced. When every alternative has been degraded or eliminated, when switching costs include years of data, social connections, and muscle memory, “the user chose to stay” is doing a lot of work.
Ted Gioia has described the end state of this trajectory as “dopamine culturePlatform design that shifts from delivering content users seek to delivering stimuli that trigger compulsive engagement—passive and unchosen rather than active and intentional.,” where platforms shift from delivering content users seek to delivering stimuli that trigger compulsive engagement. The distinction matters: seeking is active, intentional, and terminates when satisfied. Compulsive engagement is passive, unchosen, and does not terminate because satisfaction was never the design goal. The algorithm does not want you to find what you are looking for. It wants you to keep looking.
The Quiet Removal of User Controls
Equally telling is what platforms remove. Google has steadily deprecated advanced search operators over the years. The “+” operator, which forced inclusion of a term, was removed in 2011 (officially to free up the symbol for Google+, a product that no longer exists). The “verbatim” tool, which was supposed to replace exact-match functionality, is buried in the interface and does not always produce verbatim results. Boolean operators work inconsistently. The message is clear: you are not supposed to have precise control over your search results. Precision is the enemy of engagement.
YouTube removed the ability to sort search results by date for many queries. Twitter removed chronological sorting as the default. Facebook’s News Feed has never given users meaningful control over its ranking algorithm. Instagram switched from chronological to algorithmic feeds in 2016, over widespread user objection. In each case, the company explained that the change was what users “really” wanted, or that it “improved the experience,” or that it “surfaced more relevant content.” In each case, users had not asked for the change and many actively protested it.
The pattern is consistent enough to constitute a design philosophy: users should not be able to specify what they want with too much precision, because precise fulfillment of platform user intent is commercially suboptimal.
What Respecting User Intent Would Look Like
A platform that respected user intent would do something remarkably simple: when you search for something, show you that thing. When you put a phrase in quotes, find that phrase. When you follow specific accounts, show you those accounts. When you sort by date, sort by date. When you click “search,” search.
None of this is technically difficult. All of it was standard functionality fifteen years ago. The technology has not regressed. The business model has. What users lost was not capability but priority. They went from being the customer to being the product, and when you are the product, your preferences are a variable to be optimized, not a constraint to be respected.
The fix is not nostalgia, and it is not likely to come from the companies themselves. It will come, if it comes, from regulation (the EU’s Digital Markets Act is a start), from competition (smaller tools like Kagi that charge users directly and therefore have aligned incentives), or from users developing enough collective irritation to make the switching costs worth paying. Until then, the platforms will keep doing what they have always done when they reach sufficient scale: deciding they know better than the people they serve.
They usually don’t.



