Digital Fentanyl: The AI Doom Thesis
A forecast of 2030, and ways to fight our digital demons individually and collectively
Understanding Digital Fentanyl
Every era has a defining drug, an intoxicant that captures the zeitgeist and reshapes public life. For 19th-century China, it was opium. For industrial Britain, gin. In our time, the defining drug isn't a molecule but a medium:
the infinite doomscroll.
Alexander Good’s Doom Thesis is stark in its simplicity: AI development, instead of fueling a new productivity boom, is being channeled into the attention economy of hyper-personalized feeds, AI chatbots as companions, immersive pornography, and infinite-scroll generative worlds. These become what he and others call digital fentanyl: endlessly addictive, minimally productive, engineered to extract engagement rather than create value.
Rather than AI acting as a strong deflationary force in the economy, countering money printing and, in the fullness of time, delivering utopia for pennies, the doom thesis assumes AI makes overall productivity worse. An AI-era internet becomes so addictive and enticing that it consumes large swaths of cognitive and physical bandwidth, stunting the growth of entire economies.
This isn't a conspiracy theory, but a model with the following logic:
AI runs on distraction, not productivity → the business model fueling LLMs is ads, subscriptions, and engagement, not real economic output.
Distraction becomes the most profitable commodity → hyper-personalized loops of content, games, and social interaction optimized for addiction.
LLMs differ from past AI dreams → instead of self-driving cars, we get infinitely adaptive attention traps that mirror us back to ourselves.
Personalized Addictive World Generators (PAWGs) → Good’s shorthand for immersive, tailored digital worlds where people spend nearly all waking hours.
Digital fentanyl, not digital weed → constant iteration and memory make these loops stronger over time, pulling users deeper into synthetic presence.
Economic paradox → AI profits soar while global GDP growth stagnates because distraction doesn’t create productivity, only monetized attention.
The death of employability → beyond job loss, chronic distraction erodes the capacity to work or focus at all; attention becomes the scarcest skill.
The existential doom → society trades liberation for curated captivity, with humans increasingly unfit for meaningful participation in real life.
What trends already point us in that direction? We don’t have to look far:
Teenagers and adults are logging 6–8 hours daily on their phones, a trend that has broadly risen since the 2010s.
Doomscrolling late into the night is eroding sleep quality and mental health en masse, despite our better intentions.
Scaling startups are monetizing AI girlfriends whose primary innovation is emotional manipulation designed to keep users engaged.
The intelligence age accelerates individual capability and autonomy, but also allows for a sinister and complex hijacking of the pleasure and reward centers in the brain. However, Good’s thesis is not necessarily destiny. There is a chance we can contest it, if only to save our wits from ourselves.
Why Civilizational Bandwidth is Being Hijacked Now
Technology is not inherently “good” or “evil”. We’ve always lived with dual-use technologies. The printing press could distribute Luther’s Reformation pamphlets, or the Inquisition’s propaganda. Radio could broadcast Mozart, or Goebbels. Television could air Sesame Street, or 24-hour cable outrage. AI is just the next medium in the sequence, but with two key differences:
Scale. Models generate infinite tailored content for billions simultaneously.
Precision. No two doomscrolls are alike. Each feed is a fingerprint of your vulnerabilities and your unconscious shadow.
Meanwhile, global headwinds intensify the stakes:
USD decline. The dollar’s share of global reserves has dropped from ~70% two decades ago to ~58%. Nations increasingly trade in yuan, euro, or CBDCs. Monetary fragmentation undermines the U.S. leverage that underpinned the last 70 years.
Multipolar geopolitics. Pax Americana is waning. China asserts regional dominance; India ascends; Europe hedges; America polarizes.
Demographics. The youngest generations, those who will inherit this splintered order, are also the most immersed in algorithmically curated worlds.
We roughly have until 2030. I say this not just because it’s a round number, but because by then many AI researchers believe we will be close to, or have hit, the AI singularity. We will be well past Pax Americana and global dollar hegemony. Half a decade is probably a good runway for AI to seep into mainstream app adoption, interfaces, and devices. It’s also more than enough time for AI algorithms to get incredibly good at understanding what captivates your attention, energy, and focus.
We don’t really know what life looks like then, but it is certain that existing trends will accelerate the momentum of change. AI adoption will cause macro shifts in the meaning of money, work, life satisfaction, and technological advancement. Despite sounding doomer, this isn’t fear mongering. It’s about what happens when civilizational bandwidth gets hijacked at the same moment that global coordination is splintering, just when it is most needed.
The Doom Thesis in Three Futures
I frame three scenarios for how the Doom Thesis could play out: the Bear, Bull, and Base cases.
Bear Case: Addiction Supercycle
In this trajectory, Good is vindicated. By 2030:
Average screen time approaches its practical upper limit: 8+ hours per day.
GDP growth stagnates despite trillions invested in AI and technology.
AI companions replace real socialization; generative porn outpaces generative productivity. Birth rates continue to drop off a cliff.
U.S. soft power collapses as domestic politics dissolve into algorithmic psy-ops. Much of the content online is AI-generated and algorithmically driven.
The world’s most valuable companies are not infrastructure builders but casino architects with better slot machines.
Social fabric frays. Students can’t concentrate, workers burn out, mental health crises multiply. Civilizational bandwidth is spent in algorithmic amusement parks. As Good put it: “Everyone will be spending all of their time in highly addictive generative worlds”. This world feels very techno-feudalist, except the modern peasant isn’t trading their labour for land, produce, and protection. Rather, they are renting out their minds to attention-capturing hyper-stimuli that hold them captive. At least AI ads convert more reliably, with personalized sales funnels created just for you!
Bull Case: The Reclamation
In the optimistic path, humanity re-channels a meaningful share of AI effort into productivity, health, and collective flourishing:
Enterprise AI drives global GDP up 7% across the decade.
AI tutors democratize education. AI doctors slash diagnostic error. AI assistants become ubiquitous work multipliers.
Local AI models running on-device act as cognitive firewalls, helping the user avoid hyper-stimuli that would otherwise draw them into addictive loops.
Regulation intervenes: China’s gaming curfews, Utah’s social media curfew law, and the EU’s AI Act proactively ban manipulative loops.
Cultural norms flip. Digital wellness becomes high status, like fitness was in the 2010s. Being offline and less reachable becomes a status symbol.
Here, AI is more humbly integrated as a tool, not a master. The attention economy loses prestige, we find ways to rebuild our own agency, and we reclaim our time. Productivity reflects a more capable working population.
Base Case: The Patchwork of Futures
What I feel is perhaps more probable is something in the messy middle:
Distraction AI coexists with productivity AI. A real bifurcation emerges between people who can manage their executive functioning in the face of hyper-stimuli, and those who cannot and get one-shotted by addiction to the point of anhedonia.
Some geographies enforce sabbaths (China, EU); others, with their staunch individualism, maintain a more laissez-faire approach (U.S.).
Class divides widen: elites firewall their kids from distraction, while working classes rely on AI babysitters (the new iPad kids) and lose cognitive agency to endless interactive consumption.
Intentional communities codify discipline: network states where members constitutionally limit screen time and build real-world social groups and norms. Perhaps most ironically, DAOs that reward offline hours with tokens. I’m not sure if “capitalism solves this”, but what a neat way to incentivize and status-signal your offline-hood via online means 😆
Consider the kinds of physical zones that emerge, each defined by its own digital borders and access rules:
Casino Cities (addiction loops, 24-hour cities).
Discipline Enclaves (network states, crypto-firewalled communities).
Analog Revival Zones (cultural countercurrents, slow media).
Hybrid Commons (AI used for productivity but with guardrails).
Result: not uniform doom, not uniform renaissance, but divergence. A patchwork of distraction dystopias and productivity enclaves. If the internet increases variance, AI exponentially bifurcates the trend. Just as hyper-palatable food has drifted so many people towards metabolic dysfunction, hyper-stimulating content can drift many towards cognitive dysfunction as a default condition of modernity.
Cultural Countercurrents: Laws, Norms, Status
Even in the bear-case drift, counter-movements flicker with signs of corrective hope.
China’s curfews. Since 2021, Chinese minors have been restricted to narrow gaming windows and locked out of games at night. Official reports claimed youth gaming addiction was “basically resolved”.
Screen-free parenting. In the U.S. and Europe, parents now treat devices like dessert, not diet.
Analog revival. Film photography, vinyl records, paper journals. Imperfect, tactile mediums as cultural counterpoints to glossy algorithmic feeds.
Digital hygiene as luxury. Elites already firewall their kids from screens. The Silicon Valley CEO who builds attention traps sends their child to a Waldorf-style school with no devices.
Consider briefly the current state of these three levers.
Law: Most Western countries remain broadly opposed to government-imposed social media or gaming blackout periods.
Norms: Cultural norms are now deeply embedded in online social media, shaped by memetic popularity and engagement.
Status: It’s also much easier to status-boost online, with a reach of thousands, than in person, with a reach of perhaps dozens. Deeper in-person relationships may be more fulfilling and enriching, but it’s hard to convince someone that the pursuit of touching grass is worth stepping away from poasting terminally online. Engaging with a following that’s orders of magnitude larger than any in-person audience makes it hard to revert to in-person status games.
The Crypto-AI Nexus: Governance and Escape Velocity
Crypto is the parallel infrastructure to AI’s addiction loops. Think of two competing operating systems: OS Addictivo (AI attention engines) and OS Autonomia (crypto protocols and local firewalls).
Projects like Post Fiat propose agentic protocols: AI entities that pay for compute in tokens, govern themselves, and resolve disputes on-chain. These could become the autonomous corporations of the AI age.
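Purely as an illustration (and not Post Fiat’s actual protocol or API), here is roughly what “an AI entity paying for compute in tokens” could look like; the wallet, quote, and provider names below are hypothetical stand-ins for on-chain settlement:

```python
from dataclasses import dataclass

@dataclass
class ComputeQuote:
    provider: str
    tokens_per_unit: float  # price quoted in protocol tokens

class AgentWallet:
    """Toy token balance; a real agent would sign on-chain transactions instead."""
    def __init__(self, balance: float):
        self.balance = balance

    def pay(self, amount: float, to: str) -> bool:
        if amount > self.balance:
            return False  # insufficient funds: the agent must earn more or halt
        self.balance -= amount
        print(f"paid {amount:.2f} tokens to {to}")  # stand-in for an on-chain settlement
        return True

def run_task(wallet: AgentWallet, quote: ComputeQuote, units_needed: float) -> bool:
    """The agent buys compute only if the quoted cost fits its remaining budget."""
    cost = quote.tokens_per_unit * units_needed
    return wallet.pay(cost, quote.provider)

if __name__ == "__main__":
    wallet = AgentWallet(balance=100.0)
    quote = ComputeQuote(provider="gpu-cluster-7", tokens_per_unit=0.5)
    ok = run_task(wallet, quote, units_needed=120)  # costs 60 tokens
    print("task funded" if ok else "task rejected", "| remaining:", wallet.balance)
```

The point of the sketch is the budget constraint: an agent that must pay for every unit of compute out of a finite token balance has an economic governor built in, which is part of what makes “autonomous corporations” plausible as a structure.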
But crypto inherits the same fork of possibilities:
If tokens flow into AI casinos, the gamblification of everything, and generative porn, the Doom Thesis compounds.
If tokens flow into firewall AIs (decentralized, locally-run models that filter addictive media), crypto becomes the immune system against digital fentanyl.
Crypto is both the exit hatch and the amplifier. Liquidity flows not just as capital but as culture, and the channels it carves become the new branches of humanity. The choice to adopt and use these tools is not accidental; it must be made consciously, with intention.
Autonomy vs. Belonging
This is the ancient tension between the individual and the group. How much autonomy are we willing to surrender to reclaim sovereignty from addiction engines?
Individuals can delete apps, buy dumb phones, or run local LLMs that strip addictive features.
Communities can enforce sabbaths, token-incentivize and culturally enforce offline life, or build digital-minimalist network states that put offline living at the forefront as a key health consideration and moral innovation.
Nations can legislate curfews, bans on manipulative design, and liability laws, but at the cost of personal freedom.
Freedom without discipline is likely to lead to cognitive capture; pure autonomy can easily default to enslavement under algorithms. Collective sovereignty requires collective rules, with enforcement resting more on the socio-cultural fabric of society than on the individual’s ability to resist temptation alone. Every individual and group will have to wrestle with the balance between digital autonomy and following (or choosing not to follow) collectively enforced rules that guard against hyper-stimuli.
Empowering Resilience
Individual
Audit your digital diet like calories: track screen time, not steps.
Anchor yourself in analog rituals: journaling, martial arts, hiking.
Experiment with emerging mainstream options to run local AIs as firewalls against distraction and mindless consumption; a minimal sketch of what this could look like follows this list. In a podcast, Vitalik lays out his concept of d/acc: a stance that pushes back against risks by favoring defense-oriented tech and decentralization rather than centralized control. Vitalik says: “if you’re going to regulate AI, then explicitly exempt anything that just runs locally on realistic consumer hardware. And so making that explicit separation between smaller-scale stuff and top-of-the-line big corp stuff”. In essence, it’d be nice to use small, local AI models as a continuously learning cognitive firewall that prevents attention capture when we are online!
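To make that concrete, here is a minimal, purely illustrative sketch of a “cognitive firewall” that scores feed items before they render. The keyword heuristic and threshold below are assumptions for demonstration only; a real setup would swap in a small on-device model trained on your own “keep vs. skip” labels:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FeedItem:
    source: str
    text: str

def heuristic_score(item: FeedItem) -> float:
    """Stand-in 'hyper-stimulus' score in [0, 1].
    A real firewall would replace this with a small local model
    running fully on-device, so your data never leaves your hardware."""
    rage_bait = ("you won't believe", "destroyed", "exposed", "😱")
    hits = sum(phrase in item.text.lower() for phrase in rage_bait)
    return min(1.0, hits / 2)

def firewall(items: list[FeedItem],
             score: Callable[[FeedItem], float],
             threshold: float = 0.5) -> list[FeedItem]:
    """Let through only the items the local scorer rates below the threshold."""
    return [item for item in items if score(item) < threshold]

if __name__ == "__main__":
    feed = [
        FeedItem("newsletter", "A long-form essay on urban planning"),
        FeedItem("shorts", "You won't believe how this founder DESTROYED his rival 😱"),
    ]
    for item in firewall(feed, heuristic_score):
        print("allowed:", item.source, "|", item.text)
```

Because everything runs locally, the scorer and its memory of what hooks you stay on your device, which lines up with the local-versus-frontier separation Vitalik describes above.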
Community
Codify digital sabbaths, whether through laws, rules, or guidelines.
Incentivize offline activity. Allow for the proliferation of third spaces: gardens, gyms, local clubs.
Acknowledge how much cognitive bandwidth is captured when we get sucked into our devices, and how that affects the collective mental health of a population: one that is chronically under-slept, overstimulated, and emotionally anhedonic.
Doom can Bloom with Discipline
Good’s Doom Thesis is not prophecy; like many things in life, it acts as a reflective mirror showing multiple possibilities. It reflects the direction we’re generally sliding towards, since by 2030 AI will be very powerful and distributed everywhere. The question isn’t whether AI growth explodes; it will. The real question is whether we thrive with it, or whether it thrives at our expense.
The paradox: doom can be useful. Naming the addiction sharpens our resistance. Shedding light on the ways our reptilian brain is getting hijacked is a big first step. Framing the thesis gives us an explicit choice. The antidote to digital fentanyl isn’t just better regulations, enforcement, cultural norms, or healthier apps. It’s better discipline. And discipline, practiced collectively, might be how individual freedom survives.
Support the Blog:
You can own Parallel Citizen’s creator coin on Zora
You can also collect this post as a Zora token here
Any tips on Paragraph are also paired with the creator coin
Sponsors:
(purchasing through these links helps support the blog)
Trezor - Open-source hardware wallets for sovereign custody
Article Summary:
💊 The Substance → personalized addictive world generators (PAWGs).
🤝 The Dealer → AI platforms optimizing engagement.
🚀 The High → endless novelty, synthetic intimacy.
😵💫 The Withdrawal → collapse of attention, employability, civic bandwidth.
🏡 The Rehab → intentional communities, local AI firewalls, cultural countercurrents, discipline.