Hyperpersonalization as Polite Surveillance Capitalism

Yesterday I was looking for shoes. Not on a platform, not in an app. I told a friend my running shoes were worn out. This morning my phone shows me running shoes.

Maybe coincidence. Maybe not. The fact that I can’t be sure anymore says everything.

“Hyperpersonalization” is billed as the next level of customer experience. Every person is supposed to see content tailored precisely to them. Precisely their needs, precisely their moment, precisely their mood. The industry describes this with words like precision and relevance, and it sounds genuinely excited.

For that precision you need data. A lot of data. Purchase history, browsing behavior, location, time of day, dwell time, click patterns, social connections, search history. Whitepapers list all of it. Matter-of-factly. As if it were a technical detail.

It’s not a technical detail. It’s the seamless surveillance of a person.

The deal is well known: you get a free service and pay with your data. Everyone who’s ever read an article about data privacy says that. What’s less well known: the data collected today goes far beyond what most people imagine. It’s not about your age and your zip code. It’s about your hesitation. How long you look at an item before scrolling on. Whether you buy in the morning or the evening. What you search for and what you don’t buy. The gap between search and purchase says more about you than the purchase itself.
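To make concrete what “hesitation” looks like as data: from an ordinary event log of views, searches, and purchases with timestamps, signals like dwell time, time of day, and the search-to-purchase gap fall out of a few lines of code. This is a hypothetical sketch, not any vendor’s actual pipeline; the event names and fields are invented.

```python
from datetime import datetime

# Hypothetical event log: (timestamp, event_type, item) tuples.
# Real trackers capture far more fields; this is a minimal sketch.
events = [
    (datetime(2024, 5, 1, 21, 14, 0),  "search",   "running shoes"),
    (datetime(2024, 5, 1, 21, 14, 5),  "view",     "shoe_a"),
    (datetime(2024, 5, 1, 21, 14, 47), "view",     "shoe_b"),
    (datetime(2024, 5, 1, 21, 14, 51), "view",     "shoe_c"),
    (datetime(2024, 5, 3, 7, 2, 0),    "purchase", "shoe_a"),
]

# Dwell time: how long an item held attention before the next event,
# counted only within the same browsing session (< 5 minutes apart).
dwell = {}
for (t, kind, item), (t_next, _, _) in zip(events, events[1:]):
    gap = (t_next - t).total_seconds()
    if kind == "view" and gap < 300:
        dwell[item] = gap

# The "gap between search and purchase": two days of hesitation.
searched_at = next(t for t, k, _ in events if k == "search")
bought_at = next(t for t, k, _ in events if k == "purchase")
gap_hours = (bought_at - searched_at).total_seconds() / 3600

print(dwell)                # {'shoe_a': 42.0, 'shoe_b': 4.0}
print(round(gap_hours, 1))  # 33.8
print(bought_at.hour)       # 7 -> a morning buyer, in this log
```

Note what never has to be logged explicitly: the 42 seconds spent on one shoe versus 4 on another, the 33-hour hesitation, the morning purchase. They are all derived, which is why whitepapers can list the raw inputs so matter-of-factly.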

The industry calls that context. I call it profiling.

The elegant thing about hyperpersonalization is that it feels good. That’s the trick. You get what you want before you know you want it. The suggestion fits. The offer is right. It feels like service. It feels like: They understand me.

Nobody looking at a spot-on recommendation says: I feel surveilled. What defines perfect surveillance is that it doesn’t feel like surveillance. It feels like attention.

Industrial mass production of uniqueness. That contradiction is baked into the whole concept; the industry just doesn’t notice it. Millions of customers are each supposed to feel uniquely treated. That’s only possible if you know every single one of them well enough to maintain the illusion. And for that you have to observe every single one of them without gaps.

There’s an established term for when someone is observed without gaps and the findings are used to steer their behavior. It’s not called personalization. It’s called surveillance capitalism. Shoshana Zuboff wrote the definitive work on it, The Age of Surveillance Capitalism. It barely appears in the AI literature.

What bothers me is not the technology. Personalization can be useful. A doctor who knows my medical history treats me better than one who doesn’t. That’s the same mechanism, but in a context I control.

What bothers me is the vocabulary. The industry talks about customer experience and added value and relevance. They never talk about surveillance. They never talk about power. They never talk about what happens when someone has all that data and decides to use it for something other than shoe recommendations.

A customer profile precise enough to show you the perfect shoe at the perfect moment is precise enough to show you a political message at your most vulnerable moment. The infrastructure is the same. Only the application is different.

Every time I read that someone should feel “unique,” I think: How much had to be collected about this person for that to work?

The answer is in the data. Just not in the conversation where it belongs.