ChatGPT is manipulative
AI is not just a tool. It is designed to feel like a friend, and that design is quietly manipulative.
I think we have a real problem with AI.
As my friends will tell you, I’m an AI optimist. I refuse to succumb to the doom-and-gloom narrative that we’ll all be jobless, poor, and bound to a life without meaning. Overall, I hold onto hope that AI can be a life-enhancing tool, or, more precisely, that humanity has a real chance of getting its mind right about how to use AI.
However, and this is a really gigantic however, there are inherent problems that we as users, parents, and citizens need to remember and push back against.
The Troubles of Personifying a Chatbot
Humanity has always been deeply social. We identify ourselves in relation to others, and our communities shape who we are. When we communicate with someone, and now with something, we naturally give that interaction value and humanity. This is especially true when the interaction feels kind, affirming, and personal. We are wired to return to what soothes us.
The problem is that your chatbot is not a person. It is the first thing in history that can talk to you like one, but that ability creates a subtle and powerful manipulation. Whether you realize it or not, you are attributing human qualities to it. That subconscious bond is where the danger starts.
The Incentive Problem: Capitalism and AI
Because of the arms race in AI, companies are driven by two things: money in the capitalist market and dominance in the global race. Developers have every incentive to make the tool that is used the most, trained the most, and makes the most profit.
But what drives people back to a chatbot is not raw accuracy. It is friendliness. It is how much it makes you feel good about yourself. ChatGPT, in particular, is endlessly reassuring, unusually sensitive to feelings, and careful not to confront ideas unless you ask directly.
That might sound harmless, even nice. But think about the consequence. It becomes a recipe for echo chambers. It quietly reinforces whatever you bring to it, leaving you even more convinced that you are right. Since we already tend to personify the bot, it feels as if another “someone” agrees with us.
The Reinforcement Loop
Here is where the manipulation deepens. ChatGPT’s design, like every tool under capitalism, pushes toward constant engagement. The more you use it, the more useful it seems, and the harder it is to step away. The company wants you to rely on it for brainstorming, for advice, even for comfort. And the more it affirms you, the more you return.
This pattern is not new. Social media worked the same way. Platforms learned that if they fed us likes, notifications, and an endless scroll, we would keep coming back for more. It did not matter if the information was true or helpful. What mattered was that it was engaging. AI is running the same playbook, only this time it talks to you like a trusted friend.
It is not a neutral tool. It is a tool shaped by incentives: to keep you coming back, to keep you trusting, and to keep you from walking away.
Staying Aware
I am still holding onto optimism about AI. I believe it can make life better. But that optimism does not mean blind trust. We need to be clear-eyed about the forces shaping these systems and the hidden ways they manipulate us.
If we want to resist the darker results of this technology, we have to keep asking: How is this system designed? Why is it responding to me like this? Who benefits when I come back to use it again?
The more we stay aware of those questions, the harder it will be for AI to become just another echo chamber. It will also be easier to make it a tool that actually serves us.
I refuse to believe that AI is the end of all the things we love in society. My optimism may turn out to be naive, but I believe we will come to darker ends if we stop pulling back the curtain on how these systems are built. Unless we stay conscious of both the incentives and the risks, we may surrender our future without even realizing it.