By Erick McAfee, Director of Growth at Supertab
AI has become the hungriest reader in history. Models devour articles, essays and posts at a scale no human audience ever could. They turn the results into fluent answers, but almost never credit the source or drive traffic back to it. For publishers, the consequences are obvious: less advertising revenue, fewer subscribers and a deep sense that their work is being treated as raw material rather than intellectual property.
For years, the relationship between content publishers and AI developers has been lopsided. Publishers had little recourse as their material was scraped, indexed and reused to train models that generated enormous value elsewhere. The existing system offered only a “robots.txt” file (more a polite request than a legal contract), and the alternative, litigation, was slow, expensive and rarely addressed the root issue.
That imbalance may finally be shifting with the emergence of a new framework called Really Simple Licensing (RSL). Backed by early adopters like Reddit, Yahoo and Medium, RSL gives publishers a straightforward, enforceable way to define how their content can be used and to receive fair compensation when it fuels AI outputs. In other words, it transforms content from a passive input into a licensed asset, marking a significant step toward restoring balance in the digital economy.
A surge in scraping
The numbers show just how fast the problem has grown. In late 2024, TollBit tracked a doubling of AI scrapes per site within a single quarter. Scrapes per page tripled in the same period. Almost half of those hits came from bots that ignored site rules. And because AI queries don’t send users back to the source, the traffic gap is staggering. AI drives almost 96% fewer clicks than Google search.
Why a new approach matters
The old guardrail of robots.txt told crawlers which pages to skip, but compliance was voluntary. Many scrapers ignored it, and even those that respected it offered no compensation to the creators whose work powered their systems. A licensing standard like RSL changes the conversation. It introduces machine-readable terms that spell out conditions for use and a mechanism for payment. Instead of pleading with bots to behave, publishers can post rules and expect those rules to carry weight.
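The advisory nature of robots.txt is visible in its syntax. A site can single out a crawler and ask it to stay away entirely (GPTBot is OpenAI's published crawler user agent), but nothing in the protocol enforces the request:

```
# robots.txt -- a request, not a contract
User-agent: GPTBot
Disallow: /
```

A compliant crawler reads this and skips the site; a non-compliant one simply ignores it. That enforcement gap, and the absence of any payment terms, is exactly what a machine-readable license aims to close.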
It’s a fundamental shift in how digital content is treated. Not ambient noise, not an open commons, but intellectual property with value.
New streams of revenue
The most compelling part of this model isn’t enforcement, but the revenue it could unlock. Publishers can tie compensation to how often their work is actually used, not just whether a page was crawled. That leads to ideas like pay-per-inference, where a small fee accrues each time an article contributes to an AI answer. No single inference is worth much, but scaled across millions of queries, the totals become significant.
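The arithmetic behind pay-per-inference is easy to sketch. The fee and query volume below are purely illustrative assumptions, not rates defined by RSL or any deal:

```python
def inference_revenue(fee_per_inference: float, inferences: int) -> float:
    """Revenue accrued when an article contributes to AI answers."""
    return fee_per_inference * inferences

fee = 0.001                    # hypothetical: a tenth of a cent per inference
monthly_queries = 50_000_000   # hypothetical: queries the article feeds into

# Individually negligible fees compound into a meaningful monthly figure.
print(f"${inference_revenue(fee, monthly_queries):,.2f}")
```

Even at a tenth of a cent, tens of millions of inferences add up to tens of thousands of dollars a month, which is why aggregate volume matters far more than the per-use price.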
Some companies have already started experimenting. Perplexity has committed to sharing revenue from its new Comet Plus product, promising publishers 80% of subscription income from a $42.5 million pool. Amazon struck a long-term deal with The New York Times valued at $20 million to $25 million annually, granting access to sections of its archive for training purposes. These deals prove that content has monetary value in the AI economy. A standardized license such as RSL could allow many more publishers to participate without months of negotiation.
Hurdles to overcome
Of course, creating the framework is only the first step. Publishers and AI platforms still need to agree on what fair compensation looks like, and that won’t be simple. An excerpt from a news brief doesn’t hold the same value as a deep investigative piece, yet both could feed into the same model.
There are also technical concerns. The industry will soon need a way to transact that can meet customers where they are, in their preferred channels. Something instant, fluid and scalable. Today’s payment rails weren’t built for millions of machine-to-machine exchanges priced in cents. If every inference must be billed and processed like a credit card swipe, the economics would collapse. Systems would buckle under the strain of countless microtransactions. What’s needed instead is a new billing layer that charges for results, not friction.
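One common way to keep microtransactions from buckling payment rails is aggregation: accrue cent-sized charges on a running tab and settle only when the balance crosses a threshold, so the rails see a few larger transactions instead of millions of tiny ones. A minimal sketch of that pattern follows; the class, threshold and fees are illustrative assumptions, not a description of any particular billing system:

```python
from collections import defaultdict

class MicropaymentTab:
    """Accrue tiny per-use charges and settle them in batches."""

    def __init__(self, settle_threshold_cents: int = 500):
        self.settle_threshold = settle_threshold_cents
        self.balances = defaultdict(int)  # publisher -> accrued cents
        self.settlements = []             # (publisher, cents) actually billed

    def charge(self, publisher: str, cents: int) -> None:
        """Record a charge; trigger a real transaction only past the threshold."""
        self.balances[publisher] += cents
        if self.balances[publisher] >= self.settle_threshold:
            self.settlements.append((publisher, self.balances[publisher]))
            self.balances[publisher] = 0

# 1,200 one-cent charges produce just two $5.00 settlements,
# with the remaining 200 cents still accruing on the tab.
tab = MicropaymentTab()
for _ in range(1200):
    tab.charge("example-publisher", 1)
print(tab.settlements, tab.balances["example-publisher"])
```

The point of the design is that transaction costs are paid per settlement, not per inference, so the per-use price can fall to fractions of a cent without the fee structure swallowing the revenue.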
Adding license checks and micropayments could also introduce latency into systems optimized for speed unless they are engineered carefully. Engineers will push back if the tradeoff feels too heavy. Verification remains another puzzle: how can a publisher confirm that a model used their work, and who settles disputes when claims conflict?
Finally, adoption must reach critical mass. If only a handful of companies embrace the standard, the scrapers that ignore rules will continue unchecked. True leverage only comes when compliance becomes the norm, either through reputation, regulation or market demand.
What success could look like
If these challenges are addressed, the benefits extend beyond revenue. Publishers would gain a steady income line that doesn’t depend on click-through rates. AI developers would gain legitimacy by showing they honor content rights. Readers would benefit from more transparency and accuracy since licensed sources can be credited and prioritized.
The broader cultural impact may be even more important. For years, publishers have felt like unwilling donors to a system that extracts without acknowledgment. A functioning license like RSL flips that dynamic, turning them into partners with a say in how their work is used. That shift in perception could rebuild trust at a moment when both media and AI are under intense public scrutiny.
Why the timing matters
The internet has always blurred lines between fair use and free use. Generative AI collapses those lines completely. Instead of quoting or linking, it ingests, trains and competes with the original work. Without intervention, that cycle will continue until publishers are left with no incentive to create, undermining the very supply of content that makes AI valuable in the first place.
A licensing standard might not be the sole answer. It won’t resolve every lawsuit or silence every critic. But it does provide the beginnings of a market where content has defined value and AI companies can innovate without endlessly looking over their shoulders.
It offers publishers a place at the table rather than a seat on the sidelines.
That’s why this moment feels pivotal. The balance between creation and consumption, between publishers and platforms, has tilted for too long.
If AI is going to thrive, so must the sources that feed it. And that requires a deal that both sides can live with.