AI & Cookies: A Potential Regulatory Disaster
We don't want another cookie monster mucking up the future of generative AI.
I found myself at a US-EU trade conference recently, discussing the eternal question of transatlantic collaboration. Fresh from a few days in the UK, I offered what seemed like obvious examples: USB-C mandates represent smart regulatory harmony, while cookie banners exemplify bureaucratic overreach gone global.
The senior US trade representative’s response stopped me cold: “Well, you just press enter. It’s not a big deal.”
That moment crystallized everything wrong with how we approach tech regulation. Here was someone charged with understanding international commerce who fundamentally misunderstood both the technological and the business implications of one of the EU’s most visible digital policy failures.
The Hidden Infrastructure of “Just Press Enter”
The cookie banner isn’t just an annoyance you dismiss. It’s the visible tip of a massive compliance iceberg. Behind every “Accept All” button lies an expensive, complex infrastructure that businesses worldwide must build to satisfy EU regulators, regardless of whether they have a single European customer.
Companies need consent management platforms, legal reviews, privacy audits, data mapping exercises, and ongoing compliance monitoring. A typical enterprise might spend hundreds of thousands of dollars annually just to enable users to “press enter” on a system that, by the EU’s own admission, fails to meaningfully protect privacy.
But the deeper problem isn’t cost; it’s effectiveness. Cookie banners represent regulatory theater at its worst. They create the illusion of user control while doing virtually nothing to address actual privacy concerns. Meanwhile, sophisticated tracking continues through device fingerprinting, server-side analytics, first-party data collection, and dozens of other methods that bypass cookie consent entirely.
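To make the bypass concrete, consider device fingerprinting. The sketch below is purely illustrative (the attribute names and helper are hypothetical, not any real tracking library): it hashes a handful of properties a server can observe on every request, producing a stable identifier with no cookie stored and therefore no consent banner triggered.

```python
import hashlib

def device_fingerprint(user_agent: str, screen: str, timezone: str, language: str) -> str:
    """Combine passively observable browser attributes into a stable identifier.

    No cookie is set, so no consent dialog fires -- yet the combination of
    attributes is often distinctive enough to recognize a returning visitor.
    Illustrative sketch only; real fingerprinting uses many more signals.
    """
    raw = "|".join([user_agent, screen, timezone, language])
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

# The same browser configuration yields the same identifier on every visit...
visit_1 = device_fingerprint("Mozilla/5.0 (X11; Linux x86_64)", "1920x1080", "Europe/Berlin", "de-DE")
visit_2 = device_fingerprint("Mozilla/5.0 (X11; Linux x86_64)", "1920x1080", "Europe/Berlin", "de-DE")
# ...while a different configuration yields a different one.
visit_3 = device_fingerprint("Mozilla/5.0 (Macintosh)", "2560x1440", "America/New_York", "en-US")
```

The point is not the hash itself but the regulatory gap: consent rules written around one storage mechanism leave techniques like this entirely untouched.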
The Externality Problem
This regulatory misfiring illustrates a crucial economic concept: externalities. These are the costs or benefits that spill over to parties not directly involved in a transaction.
Negative externalities occur when actions impose costs on others. Think pollution from a factory affecting nearby communities. Cookie banner regulations create negative externalities by forcing global compliance costs and user experience degradation on businesses and consumers worldwide, while delivering minimal privacy benefits.
Positive externalities happen when actions benefit others. The USB-C mandate creates positive externalities by reducing electronic waste, simplifying consumer choices, and enabling economies of scale in manufacturing. They’re the reason we’ll soon have fewer cables in our bags. These benefits extend far beyond the EU’s borders.
Smart regulation creates positive externalities; poor regulation exports negative ones.
The AI Regulatory Stampede
As artificial intelligence matures, we’re witnessing a regulatory feeding frenzy. The EU’s AI Act, various US state initiatives, and federal proposals are all racing to be first to market with AI oversight. The technology’s transformative potential makes it irresistible to regulators who want to claim they’re “ahead of the curve.”
But are we repeating the cookie banner mistake on a massive scale?
Early signs are concerning. Much proposed AI regulation focuses on visible, easily regulated aspects rather than actual risks. Requirements for AI disclosure labels, impact assessments, and algorithmic auditing mirror the cookie banner’s emphasis on process over outcomes. Meanwhile, the most significant AI risks, from algorithmic bias to security vulnerabilities, often happen behind closed doors where regulatory theater offers little protection.
The Cost of Getting It Wrong
Poor tech regulation doesn’t just waste money. It erodes respect for law itself. When businesses spend millions complying with rules that everyone knows are ineffective, it breeds cynicism about regulatory competence. When users mindlessly click through dozens of consent dialogs daily, it trains them to ignore important decisions.
This regulatory credibility crisis matters enormously as we approach genuinely transformative technologies. AI regulation will need broad public trust and business cooperation to succeed. Squandering that trust on cookie banner-style theater leaves us less equipped to handle the real challenges ahead.
Learning from Success Stories
The USB-C mandate shows what good tech regulation looks like. It identified a genuine problem (proprietary charging cables creating waste and consumer friction), chose an outcome-focused solution (standardization), and created benefits that extended globally through market dynamics rather than legal mandates.
Effective AI regulation should follow similar principles: focus on measurable outcomes rather than process compliance, leverage market incentives rather than fighting them, and create positive spillover effects that benefit global innovation rather than just protecting local interests.
The Path Forward
That trade representative’s casual dismissal of cookie banner complexity reveals a broader problem: policymakers often lack the technical understanding necessary to craft effective tech regulation. As AI governance takes shape, we need regulators who understand not just the policy goals but the technological and economic realities of implementation.
The stakes are too high for another cookie banner bust. AI regulation done right could foster innovation, protect consumers, and create positive global spillovers. Done wrong, it could strangle emerging technologies in bureaucratic red tape while failing to address genuine risks.
The choice is ours. We can build regulatory frameworks that promote commerce and wellbeing, or we can export our mistakes (or import others’) while undermining respect for law itself.
Let’s hope we choose wisely and that our trade representatives understand what we’re actually choosing.