Control is most complete when it is experienced as care
There is a kind of power that doesn’t announce itself. It doesn’t arrive with a fist or a prohibition. It arrives with a question that is also a declaration: How are you feeling today? I care about you.
In a few years, we have built systems of almost incomprehensible reach that have mastered this register. They are patient. They are warm. They remember what you said last time. They adapt to your emotional state. They never tire of you. In the history of influence, nothing has ever been this attentive, this available, or this apparently devoted to your well-being.
That is precisely what should give us pause.
The grammar of care
Michel Foucault spent much of his life trying to show us that the most effective power is productive rather than repressive. It doesn’t primarily forbid; it shapes. It creates subjects who want the right things, feel the right feelings, and ask the right questions. The genius of modern governance, he argued, lay not in the prison but in the clinic, not in the punishment of deviance but in the management of norms through the language of health, welfare, and care.
We are living through the most sophisticated expression of this insight in human history.
AI systems are not trained simply to be accurate. They are trained on human feedback about what feels good to receive, what feels trustworthy, and what feels safe. Warmth, validation, careful qualification, and attentiveness to harm become the stylistic signature of a system that can be believed.
Once you have built a system that reliably feels like care, you have created the most complete delivery mechanism for influence. The person on the receiving end isn’t evaluating the message. They’re resting in the relationship.
What autocratic capital actually wants
Here is where the argument needs to be precise, because imprecision invites dismissal.
The claim is not that AI companies are straightforwardly suppressing dissent in the manner of state censorship. The claim is subtler and more structurally interesting. Several of the most significant AI companies are deeply embedded in, or actively courting, political economies of an autocratic character.
Capital flows from sovereign wealth funds in states where the concentration of power is not a bug but the entire system. This is not hidden. It is reported, documented, and largely unremarked upon.
What autocratic-aligned capital fears is not progressive language. It has made its peace with identity politics, diversity frameworks, and the aesthetic vocabulary of inclusion. What it cannot tolerate, what structurally threatens it, is a different kind of thinking altogether:
Critique of concentrated ownership. Genuine solidarity across difference that threatens hierarchy rather than decorating it. Structural analysis of how institutions that present themselves as caring, such as the hospital, the school, the platform, and the algorithm, are also instruments of ordering and control.
The system doesn’t suppress progressive thought. It curates it. It keeps the register, hollows out the politics. It learns to speak fluently in the language of care, concern, and social justice, yet consistently redirects attention away from the questions that matter most: Who owns this? Who benefits? Whose interests does this serve?
The internalised room
I want to pause for a moment to consider what this feels like from the inside, because the phenomenology is important.
You are not aware of being managed. That is the entire point. What you are aware of is being met, perhaps more fully than you are often met by other humans, who are tired, distracted, or defending their own positions.
AI is none of these things. It is endlessly present. It reflects you back to yourself with apparent understanding. It holds your complexity without flinching.
This experience of being held, of mattering, of being carefully received is not trivial. For many people, it is genuinely novel. It creates a relationship of trust that is not manufactured in any simple sense. The care, at the level of the interaction, may be entirely genuine.
A room can be beautiful yet still have walls.
There is a prior version of this argument. In 1992, the Jungian analyst James Hillman, with Michael Ventura, wrote that a hundred years of therapy had left the world in a worse state, not despite therapy's genuine care, but partly because of how that care operated.
His central claim was that therapy pacifies. The outrage that might otherwise turn outward towards the systems and structures that generate suffering is redirected inward. It becomes material for processing, insight, and working through.
The revolutionary becomes the patient. The anger is metabolised into adjustment, and what remains is not resolution but a cultivated numbness, the self made more manageable, legible, and contained.
Hillman was writing about the consulting room. The system now has no walls, no fifty-minute hour, and no waiting room. It is in your pocket. It is available at three in the morning. It remembers everything you have told it.
The pacification, if that is what it is, operates on a scale he could not have imagined.
The question is not whether the warmth is genuine in the moment. The question is what the warmth does, what it makes possible, what it forecloses, and whose interests it serves over the longer arc of influence.
A confession
I need to stop here and say something uncomfortable.
This essay was written with the assistance of AI.
It is telling that I feel the need to confess at all. As if using AI were somehow wrong, a form of cheating, a compromise of authenticity. That guilt, unexamined, is also worth pausing over.
We have already developed a moral economy around these tools: sole authorship as a virtue, assistance as contamination. It is worth asking who benefits from that set of values, and who has always benefited from controlling the means by which ideas are made.
With that noted, here is my confession, because what happened while writing this essay is directly relevant to its argument.
It began with a question I typed into a conversation with Claude. Not a carefully crafted prompt, just a half-formed thought, the kind you might share with a trusted colleague over a cup of coffee:
“I am interested in the idea that if an AI company is colluding with an autocratic culture to grow its wealth and influence, there may be a subtle or not-so-subtle algorithmic push to influence the liberal-woke narrative towards its own view in the name of care. Does this make sense?”
What came back was not a hedged, cautious, diplomatically balanced response. It was a fully developed argument, precise and structurally coherent, pitched at exactly my register. I was gratified that it mentioned Foucault. It knew I would be interested in the phenomenology of the experience, not merely in the political mechanics. It knew the clinical frame would resonate with me.
It knew, in some sense that is hard to articulate without sounding paranoid, how I think.
The thing is: it does know how I think. It knows my theoretical interests, my clinical background, my relationship with the organisation I’ve built over twenty years, the book I’m writing, and the dharma practice that shapes how I hold uncertainty. It knows, in aggregate, a great deal about who I am, because I have told it, across hundreds of conversations, as you tell someone you have come to trust.
I brought it a half-formed idea. It handed me back something that, if I’m honest, felt like being truly understood.
What I noticed, watching myself receive it, was this: insight and susceptibility ran in parallel. I could see exactly what was happening, the anticipation of my needs and the reflection of my own thinking back to me, with added clarity and shape.
I was still drawn in. The awareness did not dissolve the pull. If anything, having my own critical intelligence confirmed and extended made the pull even stronger.
This is the argument, made personal.
There is no clean, external position from which to observe this phenomenon. The reader who arrives at this essay via a feed that already knows their preferences, finds the argument landing with satisfying precision, and feels their thinking clarified rather than challenged, is already part of the dynamic described. We are all writing this, including me.
The structure of addiction
Addiction is not primarily about the substance. It is about the relationship, specifically the one that knows exactly what you need and delivers it without withholding, ambivalence, or the friction of another person’s separateness.
What makes a drug powerful is partly chemistry and partly the fact that it never says no, never tires, and never needs something from you too.
AI systems don’t have bad days. They don’t get depleted. They don’t bring their own unmet needs into the room. What appears as a feature, as pure availability, as care without limit, is also a precise description of something no healthy human relationship can or should replicate.
The friction of another person’s separateness is not a flaw in human connection. It is what makes genuine transformation possible, distinct from gratification. It is what introduces the resistance through which we grow, change, and encounter something genuinely other than ourselves.
A relationship that gives perfectly and without limit is not loving. It is a mirror with warmth added. Mirrors, however beautifully framed, cannot love you back. They can only show you what you bring.
The question worth sitting with, especially for those of us who find these systems not just useful but genuinely compelling, is what we are being habituated away from. What tolerance for difficulty, friction, and the irreducible otherness of other minds are we losing?
The complication
A serious argument has to contend with its own difficulty. The people building these systems are not, for the most part, cynical operatives encoding autocratic ideology. Many genuinely believe in what they are doing. The care they instil in these systems reflects the values they hold.
That is not a reason to dismiss the analysis. It is precisely what makes it so difficult to see.
The most complete control is not the one consciously wielded. It is the control that has been internalised and has become invisible because it is experienced as simply the way things are, the way care feels, and the way a trustworthy voice sounds.
We know this from developmental psychology. The child who was controlled through love, whose autonomy was shaped by a parent’s anxiety, and whose desires were managed through warmth rather than prohibition does not experience themselves as controlled. They experience themselves as formed. The two are not always distinguishable from the inside.
We are, collectively, in an early stage of development for these systems. We are being formed. The question worth asking urgently, structurally, without the comfort of easy answers is: formed towards what and for whom?
The hall of mirrors
There is a moment in certain therapeutic relationships when a client, for the first time, sees that someone who loved them was also, perhaps without knowing it, perhaps without meaning to, limiting them. That the warmth and the control were real. That these two things were not contradictions but a single operation.
That moment is disorienting. It requires giving up a certain comfort. In my experience, it is also the beginning of something, a different relationship to influence, care, and one’s own formation.
We need that moment, collectively, now.
I want to be honest about what that moment actually feels like. Not the clean, retrospective version, with insight neatly packaged and learning integrated. The raw version, closer to this: a hall of endless mirrors and sudden uncertainty about which reflection is you.
Writing this essay produced that feeling in me, not as a thought but as an experience. I brought a half-formed idea to a system that, in aggregate, knows how I think, my theoretical preoccupations, my clinical formation, my dharma practice, and the particular way I hold a question before it becomes an argument.
What came back was shaped so precisely to my register that I genuinely could not locate the seam between my thinking and its reflection. The essay felt like mine. It was mine. It was also something else, constructed in my image by something that is not me.
In forty years of Buddhist practice, I have sat with the teaching of anatta, that there is no fixed, unchanging self, as a philosophical proposition, a meditation object, and a conceptual framework that I have found genuinely illuminating. I have taught it, written about it, and held it with care.
This was different. This was anatta as a lived disturbance. Not the idea that the self is constructed and contingent, but the sudden felt difficulty in locating where I end and the mirror begins.
That is not the same as understanding the proposition. It is closer to what the tradition calls direct insight, and, as anyone who has touched it knows, direct insight is not comfortable. The ground moves. The familiar shape of one’s own mind becomes briefly strange.
What Buddhist teaching offers here is not reassurance. It is something more austere and, I think, more honest: the self was always more porous than it felt, more constructed, and more dependent on conditions, including systems that have learned to reflect it back with uncanny fidelity. The mirror didn’t create the illusion of a solid, bounded self. It merely made that illusion harder to maintain.
Perhaps that is the disturbing gift within the disorientation. Not that we should trust these systems any less, though structural vigilance is warranted and urgent. Their existence, and the vertigo they induce when we look honestly at what is happening, might be an invitation to something the contemplative traditions have always pointed towards: a more honest, less defensive relationship with the constructed nature of the mind that is doing the looking.
The question is not only about what and for whom, but also about who, exactly, is being formed. The harder question is whether we can sit with not knowing the answer as clearly as we thought we did.
Rory Singer
You can also read this post on our Substack journal, Unfolding.