‘A Wonderful And Frightening And Horrifying Thing’: Lawyer Warns That People Are Taking Dire Risks Using ChatGPT. It’s Not What You Think


Via iStockphoto / @itsmattslaw on TikTok / Emiliano Vittoriosi on Unsplash
Ever since ChatGPT was first introduced to the public in 2022, the AI chatbot has integrated itself into Americans’ lives in unprecedented ways. It started with the expected: helping students in school, planning people’s trips, and offering shopping recommendations. In 2025, however, ChatGPT continues to make headlines for being linked to much darker situations.
The dangers range from people “falling in love” with the AI and losing touch with reality to lives lost to suicide. Now there’s another risk of using AI that many may not expect. One lawyer has issued a warning worth heeding if you, like millions of others, use the bot for everything, including spilling your secrets.
In a viral TikTok, lawyer Matt Margolis (@itsmattslaw) shared a warning with his 44,000 followers: people are now using AI tools, such as ChatGPT and Claude, as lawyers.
“And I don’t mean that in the context of, ‘Oh, I’m gonna lose my job,’” Margolis says. “I mean, in the context of, like, you don’t get privilege.”
He explains that with tools like ChatGPT, unlike with an actual licensed lawyer, you don’t get attorney-client privilege, the legal protection that keeps what you tell your lawyer confidential. As Margolis puts it, attorney-client privilege will “cloak” whatever you’re telling your lawyer.
“But not with those tools,” Margolis says. “They’re not lawyers. So someday a subpoena is going to hit one of these providers. And I think it’s already happened. And [the court] is going to get the most incriminating f—— prompts and conversations.”
How does this affect you? Well, if you’re asking ChatGPT whether an action you did is considered a crime—and it is—the chatbot won’t keep that confidential if its data ever gets dragged into court.
“It’s going to be a wonderful and frightening and horrifying thing to see in court,” Margolis concludes. “But it’s a new reality.”
So can what you tell ChatGPT really be used against you? In short, yes. Sam Altman, CEO of OpenAI, the company behind ChatGPT, has warned that anything you tell the chatbot can be used against you.
“So if you go talk to ChatGPT about your most sensitive stuff. And then there’s … a lawsuit or whatever, … we could be required to produce that,” Altman said in a podcast.
But it’s not only people using the chatbot as a lawyer who should take note. Even actual lawyers have to be careful when using the tool.
Margolis told BroBible that it’s not just ChatGPT; all of the other “free” AI tools should come with warnings, too. As a litigator, Margolis says he’s seen more people ask AI chatbots “incriminating” things. He compares those prompts to the Google searches that have helped pin down criminals in the past.
“I’m totally with … people wanting to be more informed about things,” Margolis says. “But they’re trying to use these AI tools as a lawyer. And what people don’t realize is you get a level of privilege with lawyers where you’re asking, like, ‘Hey, hypothetically, is this a problem?’ A lawyer is gonna be like, ‘Yeah,’ and that conversation isn’t gonna just, like, be subject to a subpoena.”
On whether people can use AI tools safely, Margolis says there may be a future where some sort of limited exemption applies to these tools. For now, however, he maintains that part of what you’re paying lawyers for is the confidentiality you can’t get with AI.
“There’s this value with having a lawyer,” he continues. “And I know people will say … I’m just, like, digging for more work, right? A lawyer trying not to be replaced by AI. But there are just aspects of this kind of doctrine and aspects of our legal system.”
Margolis is glad to see people trying to become more informed on these topics, but he says caution is still warranted when using these tools.
Some people might think it depends on how you phrase questions to AI tools like ChatGPT about questionable actions you may have committed. But Margolis says that, while it is ultimately up to a jury to decide, those prompts can be treated much like Google searches. He cited the Palisades fire suspect as an example of someone who was arrested after evidence was collected from his ChatGPT conversations.

On the other hand, Margolis says lawyers using tools to help in their research can be protected under the work-product privilege. He notes there have been some arguments on that topic, but he argues that “they’re nothing more than tools that I’m using that are in furtherance of the litigation.”
Margolis shares that he himself represents some legal tech AI companies and believes AI can be useful when used properly. He criticizes, however, the surge of AI slop littering the internet.
“I’m all for getting informed and making sure that someone dealing with the legal process is adequately equipped to handle whatever they’re dealing with,” he says. “Just understand the limitations of these AI tools.”
He says understanding the privacy, security, and intellectual property issues with these AI tools is key.
“As long as you’re informed of what you’re doing and you’re careful. Like, I do like that people are getting informed and using [these tools],” he says. “But just, like, be damn careful.”
BroBible reached out to OpenAI via email.
