Trust in ChatGPT is wavering amid plagiarism and security concerns
ChatGPT has become a leading productivity tool for many of its users, particularly millennials, according to previous reports. But as artificial intelligence grows more popular, so do concerns about its security and trustworthiness. For ChatGPT, users have deemed these two factors, along with reliability, its biggest weaknesses.
The findings come from HundredX, a consumer insights firm that compared ChatGPT user experiences with more than 50,000 individual pieces of feedback about more than 70 established productivity tools, including DocuSign, Microsoft Office, Zoom, Google Workspace, Adobe, and Slack.
Also: Can AI detectors save us from ChatGPT? I tried 5 online tools to find out
ChatGPT's scorecard wasn't all bad. User satisfaction is above average, though not best in class, earning ChatGPT a Net Promoter Score (NPS) of 30 out of 100. User intent was also fairly positive: 40% of early adopters say they plan on using the tool more over the next 12 months, while only 10% say they will use it less, signaling continued growth.
"The key to addressing consumers' AI concerns is for the brand to implement some form of a data cleansing and screening mechanism for applications," according to Rob Pace, founder and CEO of HundredX. "If ChatGPT can effectively screen out false content, such as that produced by bots, then the reliability scores should increase meaningfully. For example, NPR has an NPS that is 35 points higher than the average media competitor, driven in large part by several 'quality'-related scores such as reliability and trust."
Also: How researchers broke ChatGPT and what it could mean for future AI development
ChatGPT's strengths, compared to the industry average for popular productivity tools, are its ease of use, performance, and value. This success led OpenAI, the company behind the AI chatbot, to make plugins and an API available for businesses that want to incorporate its technology into their business models.
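For businesses weighing that kind of integration, the entry point is OpenAI's API. Below is a minimal sketch of a single chat completion call in Python; it assumes the official openai SDK (v1.x) is installed and an OPENAI_API_KEY environment variable is set, and the model name and prompts are purely illustrative rather than anything drawn from the HundredX report.

```python
# Minimal sketch: one chat completion request against OpenAI's API.
# Assumes `pip install openai` (v1.x SDK) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name; use whatever your account offers
    messages=[
        {"role": "system", "content": "You are a concise productivity assistant."},
        {"role": "user", "content": "Summarize this meeting note in two sentences: ..."},
    ],
)

print(response.choices[0].message.content)
```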
But when users described their biggest qualms with ChatGPT, three stood out: reliability, security, and trust. Compared to other productivity software, ChatGPT drew the most negative sentiment on trust and security, with illustrative comments from users calling it "too easy for people to use for plagiarism" and noting that "my students use this to cheat on assignments."
"On the one hand, embedding ChatGPT into existing models can be a massive opportunity in that it should eliminate several key concerns related to reliability," Pace told ZDNET. "For example, if my Microsoft Office tools can analyze my own data to boost productivity and save time within a walled garden I trust, then several privacy concerns evaporate. That is likely why Microsoft and ChatGPT believe they will be able to charge an extra $30 per month per user — a staggering sum of money."
Also: Why your ChatGPT conversations may not be as secure as you think
A study conducted by Cryptomaniaks.com in March warned of the growing distrust of AI chatbots like ChatGPT, explaining that Google searches for “Is ChatGPT safe?” had grown by 614% since the bot’s launch in November.
“As AI technology like ChatGPT continues to advance and integrate into our daily lives, it’s vital to address the safety concerns that are emerging,” according to a CryptoManiaks spokesperson. “This surge in searches highlights the need for greater public education and transparency around AI systems and their potential risks.”
But who are the people favoring ChatGPT, distrust and unreliability aside? According to the HundredX report, over 35% of users likely to continue using and promoting ChatGPT are under 40, compared with only 24% who are over 40.
“How ChatGPT approaches initial customer feedback will play a huge role in not only how it is perceived but also its impact on society,” added Pace.