You’re not a consumer anymore. You’re being consumed.
I know that sounds dramatic. But stay with me—because the shift is already happening, and most of us haven’t noticed yet.
Recently, I came across two research articles about the attention economy that connected dots I’d been circling for months. One examined how digital platforms use AI to externalize our habits—meaning our behavioral patterns are no longer managed by us, but by algorithms optimizing for engagement. The other explored how attention has evolved from a metaphor into a literal currency that can be accumulated, traded, and monetized.
Put them together, and you get a picture of where we are—and where we’re headed.
But here’s the thing: The infrastructure is already built. The systems are already running. AI agents are already making decisions on your behalf.
The only variable left? How much more of your time they’ll get to extract.
And if recent interviews with people like Elon Musk are any indication—where he’s straight-up saying people won’t have to work in the near future—that extraction is about to go exponential.
So let’s talk about what’s really happening. Not in some dystopian future. Right now.
The Attention Economy: From AI Innovation to Human Extraction
Back in 2017, Google researchers published a now-famous paper titled “Attention Is All You Need.” It was about transformer architecture for AI—the technology that powers ChatGPT and most modern AI systems. The paper’s title was clever, a play on The Beatles’ “All You Need Is Love.” But the underlying message was literal: For AI to work, attention mechanisms are everything.
Fast forward to today, and that phrase has taken on a much darker double meaning.
Because it turns out: Attention is all they need, too.
“They” being the platforms, the algorithms, the companies whose business models depend on capturing and keeping your attention. And increasingly, “they” includes the AI agents that are learning to manage your habits, predict your behaviors, and make decisions on your behalf.
That 2017 paper revolutionized AI development. It showed that by building systems that could dynamically allocate “attention” to different parts of input data, AI could understand context, generate coherent responses, and perform tasks that previously seemed impossible.
The transformer architecture born from that paper is now everywhere: ChatGPT, Claude, Google’s Gemini, translation tools, content recommendation systems—all powered by attention mechanisms.
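For the curious, the core of that mechanism is surprisingly compact. Below is a minimal NumPy sketch of the scaled dot-product attention described in the paper—toy shapes, no training, purely illustrative:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # how strongly each query "attends" to each key
    scores -= scores.max(axis=-1, keepdims=True)    # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: each row of weights sums to 1
    return weights @ V                              # each output is a weighted mix of the values

rng = np.random.default_rng(0)
tokens = rng.standard_normal((3, 4))  # 3 tokens, each a 4-dimensional vector
out = scaled_dot_product_attention(tokens, tokens, tokens)
print(out.shape)
```

The dynamic part is that the weights are computed from the data itself, so the model decides, input by input, which tokens matter for which outputs.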
But here’s the irony: While Google was perfecting AI’s ability to manage attention, tech companies were simultaneously perfecting their ability to capture human attention and convert it into profit.
The attention economy had been building for decades—from newspaper ads to television commercials to website banner ads. But the combination of social media, smartphones, and AI-powered recommendation algorithms created something fundamentally different.
It’s no longer just about showing you ads. It’s about understanding your habits so deeply that the system can predict—and guide—your next action before you consciously decide to take it.
And increasingly, it’s about having AI agents execute those actions on your behalf, removing you from the decision-making loop entirely.
How the Attention Economy Harvests Your Free Time
Most people spend 40+ hours a week at work or school.
The real extraction? That happens in your “free” time.
Let’s do the math:
- 168 hours in a week
- Minus 56 hours sleeping (if you’re lucky) = 112 waking hours
- Minus 40-50 hours working/school = 62-72 hours of “free time”

That’s 62-72 hours per week when platforms are gathering data on what you watch, what you buy, who you follow, when you’re most distracted or most likely to impulse-buy, and which notifications make you stop what you’re doing.
All of this gets analyzed, packaged, and used to predict—and guide—your next behavior.
That’s the current model: Capture attention during leisure time. Convert it into behavioral data. Sell access to that attention to advertisers.
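The back-of-envelope math above fits in a few lines, if you want to check it:

```python
HOURS_PER_WEEK = 24 * 7        # 168 hours in a week
SLEEP = 8 * 7                  # 56 hours asleep (if you're lucky)
WORK_LOW, WORK_HIGH = 40, 50   # hours spent at work or school

waking = HOURS_PER_WEEK - SLEEP      # 112 waking hours
free_low = waking - WORK_HIGH        # least "free" time
free_high = waking - WORK_LOW        # most "free" time
print(f"{free_low}-{free_high} hours of 'free' time per week")
```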
How the Attention Economy Actually Works
Last week, my son was shopping for hair clippers. Different device. Different profile. Different apps.
Within 24 hours, I started seeing ads for men’s grooming supplies on my YouTube feed.
The only thing our devices had in common? The WiFi network.

Our router—that box sitting in the corner we barely think about—gives every device in the house the same public IP address. Ad platforms routinely use that shared address to connect behavioral patterns across devices, building a household profile that transcends individual users.
The system doesn’t care if my son orders the clippers or if I do after thinking “those look nice, Ricky might like those.”
Either way, purchase made. Mission accomplished.
And this is happening across your smartphone, smart TV, smart watch, smart speaker, smart home devices. Each one running multiple apps. Each app tracking different behaviors. All cross-referencing with each other.
Your smartphone alone might have 50+ apps. Each one collecting data.
The data we provide isn’t just increasing—it’s compounding.
That’s how the attention economy operates today. But here’s where it gets unsettling.
The Attention Economy’s Next Frontier: When Work Disappears
Imagine this:
- What if people didn’t have to spend most of their waking hours at work?
- What if—due to automation, economic restructuring, or policy changes—people didn’t have to work at all?
Suddenly, those 40-50 hours previously spent at work become what? More “free time.” More hours scrolling, watching, clicking, shopping.
Imagine all the screen time data companies could gather.
This isn’t some far-fetched scenario. Several countries and even some U.S. cities are experimenting with versions of Universal Basic Income—programs that provide people with regular payments regardless of employment status.
The attention economy is already preparing for this shift. And the preparation includes something most people haven’t considered yet.
AI Agents in the Attention Economy: The Automation of Consumption
What if AI agents start making purchases and managing your habits automatically?
We’re already seeing early versions: Amazon’s “Subscribe & Save” reorders on a schedule so you never run low. Smart fridges “suggest” grocery orders. AI assistants “learn your preferences” and make recommendations. Apps auto-pay bills, auto-renew subscriptions, auto-book appointments.
Now scale that up to a near-future scenario:
- An AI agent notices you’re stressed (heart-rate data from your smart watch).
- It cross-references with past behavior (you ordered comfort food during previous stress episodes).
- It completes the purchase automatically (one-click or zero-click ordering).
- The platform captures everything: emotional state + trigger + action + outcome.
- The system refines its model for next time.
You didn’t “decide” to order pizza. Your AI agent did. Based on data about you.
The platform doesn’t need your conscious decision-making anymore. It just needs your habits—and increasingly, it has AI agents to execute those habits on your behalf.
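To make that loop concrete, here’s a deliberately simplified sketch. Every name, rule, and threshold below is invented for illustration—this is not any real product’s code:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """A hypothetical sensor reading the agent reacts to."""
    heart_rate: int   # bpm, e.g. from a smart watch
    hour: int         # hour of day, 0-23

# Invented "habit model": rules distilled from your past behavior.
HABIT_RULES = [
    (lambda s: s.heart_rate > 100 and s.hour >= 18, "order_comfort_food"),
]

def agent_step(signal, event_log):
    """One tick of the loop: sense -> match a habit -> act -> log everything."""
    for triggered_by, action in HABIT_RULES:
        if triggered_by(signal):
            # state + trigger + action are captured to refine the model next time
            event_log.append({"signal": signal, "action": action})
            return action  # zero-click purchase: no human decision in the loop
    return None

log = []
print(agent_step(Signal(heart_rate=112, hour=20), log))  # order_comfort_food
```

Notice where the human appears in that loop: nowhere. The signal comes from your body, the rules come from your history, and the action spends your money.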
How the Attention Economy Redefined You: From Consumer to Raw Material
What are you in this system?
We’ve heard it before: “If it’s free, you’re the product.”
But I don’t think that’s quite right anymore. You’re not even the product. You’re the raw material being processed.
Here’s what I mean:
The Old Model: You as Consumer
You see an ad. You decide to buy (or not). You use the product. Company makes money.
You had agency. You were the active party—choosing, buying, consuming.
The Current Model: You as Product
Platform provides “free” service… You use it… Platform captures your data and attention… Advertisers pay platform for access to you.
You’re still somewhat active—but you’re also the thing being sold.
The Emerging Model: You as Raw Material
Your habits get extracted… Your behaviors get analyzed… Your patterns get packaged… Your attention gets sold.
Meanwhile, AI agents are becoming the actual economic actors—making purchases, clicking ads, completing transactions. Using your money. Generating revenue for platforms.
And you? You’re just existing. Generating data. Being processed.
The researchers whose work sparked these thoughts describe this as “habit externalization”: your behavioral patterns are no longer managed by you or integrated into your sense of self and autonomy. Instead, they’re managed by algorithms optimizing for platform goals, not your wellbeing.
Not because anyone is evil.
Because that’s how the system works.
Let’s Be Clear: Technology Isn’t the Villain
I’m not anti-technology. Neither are the researchers whose work sparked these thoughts.
Think about the last time you needed directions to a place you’d never been. Or when you video-called family across the country. Or when you found a community of people who actually understood what you were going through.
Technology makes that possible. That’s not trivial.
But here’s the part we’re not talking about enough:
Every convenience is a trade.
The question isn’t “Is this evil?” It’s “What am I trading, and is it worth it?”
And you can’t make that choice if you don’t know what you’re giving up.
The system won’t change on its own. Platform incentives are clear: maximize engagement, extract data, optimize for time-on-app. That’s not evil—it’s just how the business model works.
But you can change how you interact with the system.
Not perfectly. Not completely.
But meaningfully.
Navigating the Attention Economy: What You Can Actually Do
Nobody has this figured out—including me.
But here’s what I’m trying, and why it might actually work.
1. Understand What You’re Trading
Those vague Terms of Service aren’t just boring—they’re deliberately incomprehensible.
Before adopting any new app or device, ask: What data does this collect? Who can access it? Can I delete it later? Is there a low-tech option that serves the same purpose?
Use tools like ToS;DR (Terms of Service; Didn’t Read) for plain-language summaries.
That fitness tracker motivating you to walk 10,000 steps? It’s also collecting movement patterns, heart rate data, sleep cycles, and location history. Is the motivation worth the surveillance? Maybe yes. But make that choice consciously.
You don’t need a WiFi-connected watch to take your blood pressure. A manual cuff works fine and doesn’t send your health data to third parties.
2. Create Off-Grid Time
Your attention is finite. If it’s always being spent on platforms, it’s always being extracted.
Limiting your screen time isn’t about punishment. It’s about restoration.
Morning coffee without your phone. Family dinner with devices in another room. The first and last hour of your day screen-free.
Your bedroom after 9pm. The dinner table. These can be spaces where your habits are your habits—not externally managed patterns optimized for someone else’s profit.
Paper books don’t track you. Handwritten journals don’t sync to the cloud. Face-to-face conversations don’t get mediated by algorithms.
3. Add Friction Where It Matters
Most of our tech use isn’t decided—it’s defaulted.
Auto-play next episode. Accept all cookies. One-click ordering.
When companies remove friction to make things “easier,” they’re removing your decision points.
Which apps send you notifications? Turn off the ones that aren’t essential. Which have auto-play enabled? Disable it. Which track your location “always”? Switch to “while using” or “never.” Which store your payment info? Remove it.
Delete social media apps from your phone—use the browser instead. Turn off one-click ordering. Require a password for app downloads.
Every removed friction point is a removed opportunity for conscious choice. Add it back where your values matter most.
4. Question Your Passive Decisions
Algorithms are designed to feel effortless. You “just happen” to watch the next video. The path of least resistance is engineered.
Why am I opening this app right now? Boredom? Habit? Actual need?
What was I looking for? Did I find it, or get distracted?
How do I feel after 30 minutes of scrolling? Refreshed or depleted?
Would I have wanted this without the suggestion?
You can’t change patterns you don’t notice. The first step is awareness, not perfection.
5. Treat AI as Your Employee, Not Your Boss
As AI agents become more prevalent in making decisions “for” us, we risk outsourcing our agency entirely.
When AI recommends, suggests, or auto-completes: Question why it’s showing you that option. Ask what you’re not seeing because of its choices. Reject defaults that don’t align with your goals.
The moment it feels like you’re working for the AI instead of the AI working for you—it’s time to reset.
If You’re a Parent or Educator
We’re the first generation raising kids who will never know a world without attention extraction.
What matters most isn’t banning technology—it’s building awareness and agency.
Help kids understand that “free” apps are paid for with attention. Show them how platforms profit from habit formation. Ask them to notice when they’re being targeted.
Don’t just set screen time limits—ask questions. What made you want to download that game? Why does that app send so many notifications? What are you looking for when you scroll?
And here’s the hard part: You can’t teach boundaries you don’t practice. Breaking your own cycles helps them avoid forming the same ones.
This Goes Beyond Your Living Room
Everything we’ve talked about so far? That’s personal-level action. Your devices. Your habits. Your family.
Important? Absolutely.
Enough? Not even close.
Because the attention economy isn’t just a series of individual choices—it’s a system. And systems don’t change because people opt out quietly. They change when enough people demand something different.
So here’s what needs to happen at the collective level:
- Data sovereignty. You should own and control your own data. Not tech companies. Not platforms. You. This isn’t a nice idea—it’s a fundamental right that needs legal protection.
- Alternative platforms. We need digital spaces that aren’t built on extraction. Cooperatively owned platforms. Community-funded tools. Systems designed to serve users, not advertisers.
- Regulation that protects people, not profits. Right now, privacy laws favor corporate interests. That has to change. And it won’t change unless we make noise about it.
- Communities of practice. We’re all figuring this out together. Share strategies. Teach each other. Build networks where people can learn what actually works—not what Silicon Valley tells us we need.
The researchers whose work sparked these thoughts emphasize something critical: Technology isn’t deterministic. Humans and technologies shape each other.
We’re not powerless passengers in this system—we’re participants who can push for different outcomes.
But that requires something the attention economy doesn’t want from you: Consciousness.
Passive consumption is profitable. Active awareness is disruptive.
The attention economy wants your habits running on autopilot, managed by algorithms, executed by AI agents.
Autonomy requires you to notice. And once you notice? You can’t un-see it.
So notice. Question. Resist where it matters.
