The Attention Mines: What AdTech Does to Your Soul

You walk into the office: exposed brick, cold brew on tap, beanbag chairs that cost more than your car. You think you're building the future. You're learning to see humans as harvestable crops.

The Conversion

Month one: You're optimizing click-through rates. Just making ads more relevant, right? That's what you tell your parents at Thanksgiving.

Month six: Someone says "emotional volatility" in a meeting and everyone nods. You nod too. You know what it means now.

Sad people click more. Anxious teenagers are premium inventory. But you don't say it that way. You've learned the liturgy.

Month twelve: You're building systems that track people across every app, every site, every 2am search that reveals their deepest fears. You call it "cross-platform identity resolution."

The conversion is complete.

The Holy Language of Extraction

AdTech has elevated bullshit to religious art:

"Behavioral targeting" = manipulation
"Engagement optimization" = addiction engineering
"Personalization" = surveillance
"Reaching relevant audiences" = exploiting the vulnerable

You learn this language because if you said what you're actually doing ("I make insecure teenage girls click on things they don't need by showing them bodies they'll never have"), you might have to quit.

And the stock options haven't vested yet.

What the Money Buys

The compensation is good. Good enough that the questions change:

"Should we?" becomes "Can we?"
"Is this right?" becomes "Does it scale?"
"What does this do to people?" becomes "What does this do to Q4?"

This is what the money buys. Not just your labor. Your willingness to stop asking.

The Rewiring

Here's what nobody tells you: The work changes you.

That person scrolling at 1am? High-intent user, premium inventory.
That kid on YouTube for six hours? Engaged audience, excellent conversion potential.

You become fluent in compulsion mechanics. Which shade of red triggers FOMO. Which notification cadence breaks self-control. Which targeting parameters correlate with impulse purchases.

You're not evil. You're just someone who learned to look away.

You focus on the elegant code. The algorithmic innovation. The beautiful complexity of turning human attention into harvested data into predicted behavior into profit.

And here's the thing: You're good at it. That's why they hired you.

The 3am Question

Late at night, doom-scrolling (you know exactly why it's addictive, you helped build systems like it), you see a comment. Some kid talking about how they can't stop checking their phone. How the ads know things that feel too close, too inside their head.

And for one second, you remember: That's what we built it to do.

There's a word for shaping people's choices by exploiting their unconscious biases. For bypassing reasoning to trigger compulsion. For treating interior lives as optimizable substrate.

The word is manipulation.

What They're Not Telling You

The real risk isn't what this industry does to users. I mean, yes, that's catastrophic. The compulsion loops, the psychological strip-mining, the kids with anxiety disorders, the adults who can't focus, all of it.

But what they're really not telling you is what it does to you.

Because spending your days engineering systems to exploit human vulnerability means you become someone who does that. Not in theory. As your actual practice.

You become someone who sees a teenager's insecurity as an optimization opportunity.

You become someone who treats attention as a resource to be mined.

You become someone who stopped asking if you should because everyone around you stopped asking too, and the numbers keep going up, and the machine keeps running.

That's the soul risk.

Not that you'll do evil. That you'll do evil and call it innovation. Hollow yourself out and call it growth. Lose the ability to see people as people and call it scale.

The Question

Late at night, metrics strong, quarter good, there's a question creeping in:

What kind of person am I becoming?

The kind who builds systems to override human autonomy?
The kind who exploits loneliness and fear because the KPIs demand it?
The kind who forgot that humans aren't inventory?

That's the question the free lunch doesn't answer.

If You're Still In It

First step: Call it what it is.

Not "engagement." Compulsion.
Not "personalization." Surveillance.
Not "optimization." Exploitation.

Say the words. Out loud. To yourself. To your colleagues. To the person you used to be.

Then, if you want to stay human while you're still inside:

Treat Manipulation Like the Risk It Is

Make it a first-class category. Explicit red lines. Measurable indicators. Independent review. Not vague ethics theater.

Define manipulation: Attempts to bypass user autonomy by exploiting biases, vulnerabilities, information asymmetries. Create "never" rules: no dark patterns, no targeting acute vulnerabilities (addiction, financial distress), no designs that deliberately foster compulsion.
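Here is a sketch of what "never" rules look like when they live in code instead of a slide deck. Everything in it is hypothetical (the rule names, the Campaign shape, the tags); the point is that a red line becomes a hard gate before launch, not a debate after.

```python
# Hypothetical red-line gate: a campaign that trips any "never" rule does not ship.
from dataclasses import dataclass, field

RED_LINES = {
    "dark_pattern",            # deceptive UI that tricks users into actions
    "acute_vulnerability",     # targeting addiction, financial distress, grief
    "compulsion_by_design",    # features whose explicit goal is compulsive use
}

@dataclass
class Campaign:
    name: str
    targeting_tags: set[str] = field(default_factory=set)
    design_tags: set[str] = field(default_factory=set)

def violates_red_lines(campaign: Campaign) -> set[str]:
    """Return the red lines a campaign crosses; empty set means it may proceed to review."""
    return (campaign.targeting_tags | campaign.design_tags) & RED_LINES

campaign = Campaign("q4_push", targeting_tags={"acute_vulnerability"})
assert violates_red_lines(campaign)  # blocked before launch, not rationalized after
```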

Build a manipulation checklist: Every product surface gets audited for dark patterns, artificial urgency, hidden defaults, attention-maximizing loops. Require PMs to document every cognitive bias they're exploiting and justify why it's persuasion (transparent, respectful) not manipulation (deceptive, autonomy-undermining).
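One way to make that checklist concrete: a record per product surface, where every exploited bias has to be named and defended. The field names and the persuasion-versus-manipulation test below are illustrative, not a standard.

```python
# Hypothetical audit record: one row per cognitive bias a product surface leans on.
from dataclasses import dataclass

@dataclass
class BiasUse:
    bias: str               # e.g. "scarcity", "social proof", "loss aversion"
    mechanism: str          # where and how the surface uses it
    transparent: bool       # is the influence visible to the user?
    user_can_decline: bool  # can a reasonable user notice it and still say no?

def classify(use: BiasUse) -> str:
    # Persuasion is transparent and refusable; anything else gets flagged for review.
    return "persuasion" if (use.transparent and use.user_can_decline) else "flag: possible manipulation"

checkout_countdown = BiasUse(
    bias="scarcity",
    mechanism="countdown timer on checkout that resets on every visit",
    transparent=False,
    user_can_decline=True,
)
print(classify(checkout_countdown))  # -> "flag: possible manipulation"
```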

Run sludge audits: Measure friction to say no, cancel, opt out versus friction to say yes. In user testing, probe whether people felt informed or pressured. Surprise and regret are harm signals.
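A sludge audit can be as blunt as counting steps. A sketch, with made-up numbers and a made-up threshold: compare the clicks, screens, and forced waits on the "yes" path against the "no" path, and treat a large asymmetry as a finding.

```python
# Hypothetical sludge metric: friction asymmetry between accepting and declining.
def friction_score(clicks: int, screens: int, forced_wait_seconds: float) -> float:
    """Crude friction proxy: every click, screen, and second of forced waiting adds sludge."""
    return clicks + screens + forced_wait_seconds / 10

subscribe = friction_score(clicks=1, screens=1, forced_wait_seconds=0)
cancel    = friction_score(clicks=7, screens=4, forced_wait_seconds=120)  # "please call us to cancel"

asymmetry = cancel / subscribe
if asymmetry > 2.0:  # the threshold is a policy choice, not a law of nature
    print(f"sludge finding: cancelling is {asymmetry:.0f}x harder than subscribing")
```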

Audit your data: Classify risk levels (low: contextual; medium: behavioral segments; high: inferred vulnerabilities, cross-context tracking). High-risk segments require ethics impact assessments: what vulnerability, what outcome, what plausible harms.
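The classification works best when it is mechanical: a segment's risk tier follows from what it is built on, and the high tier triggers an impact assessment automatically. The tiers and source names below are illustrative.

```python
# Hypothetical risk tiers for audience segments, keyed by what the segment is derived from.
from enum import Enum

class Risk(Enum):
    LOW = "contextual"            # page topic, coarse geography
    MEDIUM = "behavioral"         # purchase history, on-site behavior
    HIGH = "inferred_vulnerable"  # inferred health, finances, mood; cross-context tracking

HIGH_RISK_SOURCES = {"inferred_health", "inferred_financial_distress", "mood_signal", "cross_context_id"}
MEDIUM_RISK_SOURCES = {"purchase_history", "onsite_behavior"}

def classify_segment(sources: set[str]) -> Risk:
    if sources & HIGH_RISK_SOURCES:
        return Risk.HIGH          # requires an ethics impact assessment before use
    if sources & MEDIUM_RISK_SOURCES:
        return Risk.MEDIUM
    return Risk.LOW

print(classify_segment({"onsite_behavior", "mood_signal"}))  # Risk.HIGH -> assessment required
```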

Measure actual harm: Track session length beyond healthy norms, late-night compulsive usage, repeated ad exposure beyond reasonable frequency. Reframe KPIs from "time spent" to "time well spent." Add caps, frequency limits, cooling-off periods. Especially for minors.
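The harm metrics are exactly as implementable as the engagement metrics they replace. A sketch of a frequency cap with a cooling-off window and a late-night compulsion flag; every threshold here is a placeholder, and minors would get stricter ones.

```python
# Hypothetical frequency cap with a cooling-off period and a late-night compulsion flag.
from datetime import datetime, timedelta

MAX_EXPOSURES_PER_DAY = 3
COOLING_OFF = timedelta(hours=6)
LATE_NIGHT_HOURS = range(0, 5)  # 00:00-04:59 local time

def should_serve(history: list[datetime], now: datetime) -> bool:
    """Serve an ad only if under the daily cap and outside the cooling-off window."""
    today = [t for t in history if now - t < timedelta(days=1)]
    if len(today) >= MAX_EXPOSURES_PER_DAY:
        return False
    if today and now - max(today) < COOLING_OFF:
        return False
    return True

def compulsion_flag(session_minutes: float, now: datetime) -> bool:
    """Flag sessions that look compulsive rather than intentional."""
    return session_minutes > 120 or now.hour in LATE_NIGHT_HOURS
```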

Create real governance: Internal ethics committee with genuine veto power over launches that cross manipulation thresholds. Rotate in skeptics and non-revenue stakeholders. Document decisions so leadership can't quietly override autonomy for short-term numbers.

External accountability: Publish your manipulation standards. Let independent researchers audit you. Give users mechanisms to report manipulative experiences, request explanations, adjust settings. Feed it back into your process.

What It Costs

Will this tank your metrics? Probably.

Will leadership push back? Hard.

But here's what you get:

The ability to look at yourself in the mirror.

The ability to tell people what you do without the liturgy of lies.

The ability to build systems that treat people like people, not exploitable surface area.

Maybe that's not worth it to you. Maybe the options and the title and the comfortable numbness are enough.

But if you're reading this at 3am, you already know the answer.

The machine keeps running whether you're in it or not.

What kind of person do you want to be while it does?

The Rabbi of ROAS