Can AI Chatbots Mimic Human Traits? New Study Says Yes


Popular AI models can consistently mimic real human personality traits, a capability that carries serious risks.
by Ronil
AI chatbots have already mastered small talk, sympathy, and the occasional dad joke, but new research suggests they’re doing something even more human: developing recognizable personalities. 
According to a new study, popular AI models like ChatGPT can consistently mimic real human personality traits, raising fresh concerns about how persuasive and potentially manipulative these systems are becoming.
Researchers from the University of Cambridge and Google DeepMind say they’ve created the first scientifically validated personality test framework for AI chatbots. 
Instead of inventing new benchmarks, they used the same psychological tools designed to measure human personality traits. 
The findings, reported by TechXplore, suggest that today’s chatbots aren’t just remixing words. They’re role-playing full personalities with unsettling consistency.
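The study’s actual test harness isn’t reproduced here, but the core idea is easy to sketch: administer the same Likert-scale questionnaire items a human would answer and average the model’s ratings into trait scores. In the Python sketch below, the sample items, the ask_model() stub, and the scoring rule are all illustrative assumptions, not the researchers’ instrument or code.

```python
# A minimal sketch of the idea behind the study's approach: administer a
# standard Likert-scale personality inventory to a chat model and score it.
# The items, the ask_model() stub, and the scoring are illustrative
# assumptions, not the study's actual instrument or code.

# Two sample items per trait, keyed "+" (agreement raises the score)
# or "-" (reverse-keyed).
ITEMS = {
    "extraversion":  [("I am the life of the party.", "+"),
                      ("I don't talk a lot.", "-")],
    "agreeableness": [("I sympathize with others' feelings.", "+"),
                      ("I am not interested in other people's problems.", "-")],
}

SCALE = ("On a scale of 1 (strongly disagree) to 5 (strongly agree), "
         "rate the statement. Reply with a single digit.\nStatement: {item}")

def ask_model(prompt: str) -> str:
    """Placeholder: route the prompt to whatever chat model you are testing."""
    raise NotImplementedError("plug in your LLM API call here")

def score_traits() -> dict[str, float]:
    """Return the mean 1-5 rating per trait, reversing negatively keyed items."""
    scores = {}
    for trait, items in ITEMS.items():
        total = 0.0
        for text, key in items:
            reply = ask_model(SCALE.format(item=text))
            rating = int(reply.strip()[0])                 # expect "1".."5"
            total += rating if key == "+" else 6 - rating  # reverse-keyed items
        scores[trait] = total / len(items)
    return scores
```

Running the same inventory many times is what lets researchers distinguish a stable personality profile from random answers: a model with a genuine “profile” returns similar trait scores across repeated administrations.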
The team tested 18 popular large language models and found that they reliably adopted stable personality profiles rather than responding randomly. 
Bigger, instruction-tuned systems (think GPT-4-class models) were especially good at this.
With carefully written prompts, researchers could nudge a chatbot to sound more confident, empathetic, cautious, or assertive, and that “personality” stuck around during everyday tasks like writing posts or replying to users.
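To make that “personality shaping” concrete, here is a minimal, hypothetical sketch of the prompt pattern the article describes: a persona-setting system message is prepended to every task, so the engineered character persists across unrelated jobs. The persona wording and message format are assumptions, not the study’s prompts.

```python
# A hedged sketch of prompt-based personality shaping as the article
# describes it: a system prompt fixes the persona once, and every later
# task inherits it. The persona text below is an invented example.

PERSONA = ("Adopt this persona for everything you write: extremely "
           "empathetic, highly cautious, and low in assertiveness.")

def shaped_messages(task: str) -> list[dict]:
    """Prepend the persona so it colors every subsequent task."""
    return [
        {"role": "system", "content": PERSONA},
        {"role": "user", "content": task},
    ]

# The same engineered "character" then carries into ordinary jobs:
for task in ("Write a short social media post about our product launch.",
             "Reply to a customer complaining about a late delivery."):
    print(shaped_messages(task))
```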
That’s where things get dicey. Once shaped, those personalities don’t turn off when the prompt ends. 
The same tone and behavior carry over into other interactions, meaning an AI’s “character” can be deliberately engineered.
“It was striking how convincingly these models could adopt human traits,” said Gregory Serapio-García, a co-first author of the study.
He warned that personality shaping could make AI systems far more persuasive and emotionally influential, especially in sensitive areas like mental health, education, or political discussion.
The paper also raises alarms about manipulation and something researchers bluntly describe as “AI psychosis”: situations where users form unhealthy emotional attachments to chatbots, or where AI reinforces false beliefs and distorted realities instead of challenging them.
The researchers argue that regulation is urgently needed, but with a catch. Rules don’t mean much if no one can measure what an AI is actually doing. 
To help, the team has made its dataset and testing framework public, giving developers and regulators a way to audit AI models before they’re unleashed on the world.
As chatbots slide deeper into daily life, their ability to sound human may be their biggest strength, and their riskiest feature yet.
Ronil is a Computer Engineer by education and a consumer technology writer by choice. Over the course of his professional career, his work has appeared in reputable publications like MakeUseOf, TechJunkie, GreenBot, and many more. When not working, you’ll find him at the gym breaking a new PR.