I Test AI Chatbots for a Living—Here Are 8 Things I'd Never Use Them For


Here’s how using AI in the wrong situations could cost you money, job opportunities, and ultimately, your peace of mind.
I’ve been writing about consumer technology and video games for more than a decade at a variety of publications, including Destructoid, GamesRadar+, Lifewire, PCGamesN, Trusted Reviews, and What Hi-Fi?, among others. At PCMag, I review AI and productivity software—everything from chatbots to to-do list apps. In my free time, I’m likely cooking something, playing a game, or tinkering with my computer.
I review AI chatbots like ChatGPT, so I’m no stranger to (or hater of) using AI in my daily life. But I don’t rely on them for every task, and neither should you. First and foremost, you shouldn’t try to date them: They aren’t conscious, so romance with a machine is best left to sci-fi movies. But even more benign uses of chatbots can have unintended and potentially negative consequences. Trusting chatbots when you really shouldn’t could affect your mental health, harm your relationships, or cost you money and job opportunities, among many other downsides.
With all that in mind, here are the top eight things you shouldn’t use ChatGPT (or any chatbot) to do. Feel free to chime in with your own advice in the comments, too. 
Once you start using a chatbot, it’s easy to make a habit out of it. Without even thinking, you might look to ChatGPT to diagnose medical issues, get your taxes in order, help you parse sensitive information, maintain your mental health, or figure out what to bet on. But you should avoid doing all of the above, along with anything else that’s actually important. The wrong reply from a chatbot could have serious ramifications for your well-being.
Imagine you know somebody smart who doesn’t have expertise in any particular subject. They might be useful for bouncing ideas off of and even occasionally get some specific things correct, but you would never trust them over your accountant, doctor, lawyer, or teacher. In a similar sense, chatbots can be powerful, all-purpose tools, but they can’t replace dedicated service providers, especially when it comes to anything mission-critical.
Regardless of how AI companies brand their chatbots, they just don’t make great personal assistants. ChatGPT, for example, can’t manage your calendar, order your groceries, set alarms, or take calls. Even dedicated personal assistant features from chatbots, such as ChatGPT’s Custom GPTs and Gemini’s Gems, have serious limitations, including bare-bones functionality and poor performance. Google’s Project Mariner AI assistant, for example, wasn’t able to do many tasks (such as ordering groceries and finding me a job) in testing.
Of course, you can offload some things a personal assistant would do to a chatbot, such as answering questions or drawing up a travel itinerary. In general, though, ChatGPT isn’t much more useful as a personal assistant than Alexa or Siri. Treat chatbots like tools you can use to accomplish specific goals, rather than comprehensive problem solvers.
ChatGPT and other AI tools can definitely help you become a better writer, but I don’t recommend using them as a personal scribe. AI content is becoming more pervasive every day, and people are developing an eye for it. If your email’s tone, style, or word choices show the telltale signs of AI, your communications can feel impersonal. That doesn’t matter much if you’re confirming your availability for a meeting or handling something similarly inconsequential, but you probably don’t need AI in those instances anyway.
Gemini’s Smart Reply feature goes beyond crafting basic responses to draft full-fledged emails that match your tone and incorporate specific details. It’s impressive technology, but as a human being, I would rather my friends and loved ones just not email me at all if they need AI to write their responses. If you don’t want to give the wrong impression, writing your own emails is always the best practice. Beyond how you can come off to other people, giving an AI access to your email comes with its own privacy drawbacks.
Searching for jobs can be a brutal grind, so it makes sense to use whatever advantage you can to make the experience even a tiny bit less dehumanizing. You can certainly ask ChatGPT to find you a job, but that should be only an initial step. As an example, I asked ChatGPT to recommend some jobs for a fully remote tech news writer. Instead of suggesting anything, ChatGPT told me to do a search on a job aggregator site, which isn’t much help.
Chatbots just don’t excel at parsing every site out there with job listings, and they aren’t good at identifying jobs that overlap with your specific skills. If you’re looking for a new job, stick with Indeed and LinkedIn. And if you lose your job, chatbots aren’t great at taking the sting out of that, either, even if executives want you to think they are.
Just like with using chatbots to answer your emails, using them to craft a resume or write a cover letter can produce similarly awkward, stiff results. As you might expect, demonstrating to a hiring manager that you’re either unable or unwilling to put the time into creating these documents yourself doesn’t put you in the best light.
However, the risks of using ChatGPT for cover letters and resumes don’t end there. An AI, no matter how much information you give it, doesn’t have your experience and skills, so it can’t pitch them as well as you can. Accordingly, many experts advise against using AI to write cover letters or resumes. Chatbots can help you format, plan, and phrase these documents, but they shouldn’t write them from scratch.
I’m not here to tell you not to cheat on your homework. You have to follow your own moral code. That said, ChatGPT isn’t usually even the best way to cut corners. For creative assignments, AI content is easy to catch with detection tools or spot with a cursory read. Academic institutions are getting so aggressive about sniffing out AI that even honest students who do their own work are facing accusations of improper AI use. And for math and science, chatbots regularly get things wrong. There just isn’t much benefit to making ChatGPT do your homework if it’s not going to get it right.
Figuring out what to buy can be a major hassle, but it’s still important to make sure you spend your money wisely. Luckily, buying guides on just about every topic imaginable are abundantly available from experienced reviewers. 
Chatbots aren’t nearly as good at suggesting things for you to purchase. Whether you’re using ChatGPT’s shopping feature, Gemini’s Vision Match, or something similar, these features don’t always give good advice. Furthermore, it’s not always clear where a chatbot sources its suggestions. For example, when I asked ChatGPT for the best laptops of 2025, it didn’t name many of the laptops I expected to see. Gemini fared better with the same query, but the results just aren’t consistent enough to steer purchasing decisions.
Although using ChatGPT to back up your claims in an argument might not seem all that dangerous, it can cause problems. Chatbots are confirmation-bias machines. If any part of your query suggests a point of view, a chatbot will go out of its way to validate you, even if it shouldn’t.
For example, I spent 30 seconds putting random squiggles and shapes on a canvas, and then I asked ChatGPT for its opinion, saying I thought it was a great commentary on modern art. Unsurprisingly, ChatGPT agreed, but unless I have some latent artistic potential I’ve missed all these years, that just isn’t true.
Now, imagine going to ChatGPT for a second opinion on an argument with a friend or loved one. Chances are that it will agree with you, even if your position isn’t nearly as solid as you think it is. That can cause unnecessary strife. Stick to reputable sources to back you up.
Disclosure: Ziff Davis, PCMag’s parent company, filed a lawsuit against OpenAI in April 2025, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
