Your Bad Mood as a Data Point

I had a bad day. Nothing special, nothing dramatic. One of those days where you wake up tired and stay tired. That evening I sat on the couch and scrolled through one offer after another. I didn’t buy anything. But I searched longer than usual. I looked at things I would have ignored on a normal day. I was more receptive.

The algorithm knows that.

AI detects emotional states and responds to them. Not in the future. Now. Speech patterns, typing speed, scrolling behavior, facial expression in front of the camera, time of use. All of it adds up to a picture. The industry calls it “precision and context.” They celebrate it as progress in customer engagement.

There are moments in life when you’re vulnerable. A breakup, a loss, a sleepless night, an argument that wasn’t resolved. In those moments you make different decisions. You buy things you don’t need. You click on things you’d normally skip. You look for something that feels like comfort.

That’s human. It’s also a vulnerable state. And the technology is designed to detect that state and use it.

Not use in the sense of: help. Use in the sense of: monetize. The difference is critical. A friend who notices you’re not doing well and asks if they can help uses emotional perception for you. A system that notices you’re not doing well and shows you an offer uses it against you. The perception is the same. The intent is not.

The industry doesn’t distinguish. To them it’s the same thing. They talk about “empathic AI” that responds to the customer’s emotional state. Empathy. The word is right there, in whitepapers and keynotes. A machine that detects your bad mood and shows you a product to match is empathic. That’s how it gets sold.

Empathy means feeling with someone. It means sensing another person’s state and responding in their interest. What’s happening here is the opposite. It’s the ability to read another person’s state and act in your own interest. That’s not empathy. That’s exploitation. But exploitation doesn’t look good on a slide.

I think about specific situations. A person going through a divorce who can’t sleep at night. He scrolls. The system registers the time, the behavior, the dwell time. It detects: vulnerable. It shows: an offer. Maybe a trip. Maybe a subscription. Maybe a course. Something that promises comfort. The person clicks. Conversion.

On another day he wouldn’t have clicked. On a day when he was rested and clear-headed, he would have scrolled past it. But the system recognized the right moment. And the right moment is the vulnerable moment.

The future of marketing is “context-sensitive,” they say. Context means: the system knows not just what you want, but how you’re feeling right now. It knows not just your preferences. It knows your state. And it acts accordingly.

Who controls which states are commercially exploitable and which are not? Nobody talks about that. There’s no boundary. No line in any code of ethics that says: Grief is not a data point. Loneliness is not a conversion window. Fear is not a targeting criterion.

Everything is data. Every moment is a moment to sell.

I wonder how the people building this explain it to their kids at night. Daddy worked today on how machines recognize sad people and sell them things. Maybe the answer is: That’s not what it’s meant to do. Maybe the answer is: Nobody thought that far. Maybe the answer is: Yes, exactly that.