The 4 Reasons OpenAI's ChatGPT Is Dying – MUO – MakeUseOf

Being the first to market isn’t always best for your long-term prospects.
New apps come and go at a dizzying pace in the world of technology. In 2022, BeReal was all the rage, causing a frenzy each time a notification popped up telling users to take a photo. And who could forget Vine and its beloved short videos? But these once-popular apps are now fading into digital obscurity.
While we think ChatGPT is too good to fail completely like other tech products before it, the one-time wonder app is losing steam fast. Here are four reasons why we think ChatGPT is going to lose a lot of its current appeal.
Pioneering a product gets you a first-mover advantage, but your window to enjoy this advantage stays open only as long as it takes your competitors to respond. The more motivated your rivals are to replicate your product, the faster your first-mover advantage disappears. As the first true mass-appeal, multipurpose AI chatbot, OpenAI's ChatGPT secured a first-mover advantage.
When ChatGPT arrived without warning, companies like Google and Microsoft were taken off-guard. As a result, consumers eager to try conversational AI were left with only one real option: ChatGPT. This propelled the chatbot’s unprecedented growth, as ChatGPT became one of the fastest-growing web apps in history, gathering more than 100 million users in less than three months.
But as we have seen in the months after its release, ChatGPT is replicable. The likes of Meta, Anthropic, Microsoft, Google, StabilityAI, and several others have all developed models approaching GPT-4's capabilities. OpenAI no longer holds a monopoly on advanced conversational AI. Microsoft’s Bing AI, Google’s Bard, Anthropic’s Claude AI, and a slew of new competitors are viable alternatives to ChatGPT and are rapidly encroaching on ChatGPT’s market share.
Data from Google Trends and internet traffic analysis firm SimilarWeb shows that traffic to the ChatGPT website stalled shortly after the arrival of competitors like Claude AI and Bard. This indicates that consumers are increasingly turning to ChatGPT alternatives. While ChatGPT still receives over a billion monthly visits, that number has been steadily declining. ChatGPT has won hearts and minds, but its time in the sun is fading fast.
While OpenAI has cash to burn thanks to deep-pocketed backers like Microsoft, throwing more money at a "cost problem" doesn't seem like a good long-term strategy. And OpenAI has a serious cost problem. According to multiple reports, including from Business Insider, OpenAI spends around $700,000 per day to keep the pricey tech that powers ChatGPT running. So, while you use ChatGPT for free to overhaul your Tinder profile, OpenAI has to burn thousands of dollars to make that possible. As more people use ChatGPT, this cost continues to swell.
The cost of scaling these models to accommodate more users is simply astonishing. This is why OpenAI has long talked about the need to drive down the inference cost. Despite having perhaps the hottest new product in tech, the company is hemorrhaging cash, with financial losses climbing into the hundreds of millions. Why? The cost of running ChatGPT is so high that OpenAI has to spend a large chunk of its nearly $1 billion in revenue just to keep it going.
As a startup focused on advancing AI capabilities, OpenAI has chosen to prioritize access and product development over short-term profits. However, this approach, especially in a cash-hungry field like AI, requires continuous investment and financing to sustain. The long-term profitability of ChatGPT remains to be seen as the company balances innovation, costs, and revenue. ChatGPT may be all the rage, but OpenAI's bank account is feeling the pain.
Creating AI models as capable as those behind ChatGPT requires training them on massive amounts of data. This alone is not necessarily problematic. The core concern is how OpenAI gathered this data, who truly owns it, and whether OpenAI has the proper rights or permissions to utilize it for commercial AI applications like ChatGPT.
The fact that ChatGPT can answer specific questions about copyrighted materials or generate content mimicking copyrighted books suggests it was likely trained on such content. This idea has not gone down well with people in the creative industries whose income comes from owning exclusive rights over copyrighted materials.
This is why OpenAI is currently facing several lawsuits filed by creators, writers, artists, and comedians for improperly using their copyrighted material for commercial purposes. Authors George R.R. Martin, George Saunders, and Michael Connelly, and comedian Sarah Silverman are just a few of the high-profile creators who have sued OpenAI for violating their copyrights.
Right now, most of these lawsuits are being thrown out since there are currently no clear laws that address the use of copyrighted material in AI training data. However, as more creators file lawsuits against OpenAI, one or two claims may eventually succeed, opening the floodgates for more creators to press theirs. If or when this happens, OpenAI might face an avalanche of problems that could eventually derail its chatbot project. We don't see this happening right now, but if it does, ChatGPT is in serious trouble.
While large datasets are needed to build powerful AI, it matters greatly whether that data is legally and ethically sourced, with respect for ownership and privacy.
In tech, there's competition from peers, as we've discussed earlier, and then there's competition from big tech. A bit of ingenuity can help you keep your peers at bay. Fighting off big tech, on the other hand, is a herculean task, no matter how good your product is. Remember Dropbox? The company piloted, if not single-handedly pioneered, the paradigm-shifting idea of storing files on the cloud. It was the ChatGPT of its time. But when Google, Apple, and Microsoft introduced similar products, it became hard for Dropbox to compete.
Today, there's iCloud for iPhone and Google Drive for Android. For the average user, it's more convenient to use these built-in cloud options than to add a third-party service like Dropbox. Apple and Google have done a good job creating ecosystems that work across their products, so users are incentivized to stick with them. Going outside the ecosystem to add something like Dropbox takes extra effort when iCloud or Google Drive already syncs photos, documents, and more across devices. Convenience wins for most people, and Big Tech's vast ecosystems make it easier for these companies to win over users.
As AI tools become more advanced and indispensable for work, people will likely desire tighter integration of these tools into their existing workflows and platforms. Rather than having to work from a separate AI interface, users will shift towards AI capabilities being seamlessly available where they already spend their time working. It is precisely in scenarios like this where having an ecosystem-based strategy will prove invaluable for Big Tech.
While ChatGPT will likely remain popular for some time, it now faces an uphill battle to out-innovate competitors while maintaining relevance and market share. The AI chatbot space is evolving rapidly, and the next big thing may soon arrive. ChatGPT sparked an AI revolution, but how long it lasts in the spotlight is open to debate.

Maxwell is a technology enthusiast who spends most of his time testing productivity software and exploring the new generation of AI chatbots. His coverage of AI has garnered over 1 million reads and counting. With a quirky sense of humor and a passionate love for all things tech, Maxwell leverages his 8+ years of experience in tech writing to simplify complex concepts for both novices and experts alike. He has a soft spot for cutting-edge AI tools, remains a dedicated Android user, and tinkers with PHP and Python code in his free time.