Tidalflow helps any software play nice with ChatGPT and other LLM ecosystems – Yahoo Finance

In much the same way that companies adapt their software to run across different desktop, mobile and cloud operating systems, businesses also need to configure their software for the fast-moving AI revolution, where large language models (LLMs) now power new AI applications capable of interpreting and generating human-language text.
While a company can already create an "LLM-instance" of their software based on their current API documentation, the problem is that they need to ensure that the broader LLM ecosystem can use it properly — and get enough visibility into how well this instance of their product actually works in the wild.
And that, effectively, is what Tidalflow is setting out to solve, with an end-to-end platform that enables developers to make their existing software play nice with the LLM ecosystem. The fledgling startup is emerging out of stealth today with $1.7 million in a round of funding co-led by Google's Gradient Ventures alongside Dig Ventures, a VC firm set up by MuleSoft founder Ross Mason, with participation from Antler.
Consider this hypothetical scenario: An online travel platform decides it wants to embrace LLM-enabled chatbots such as ChatGPT and Google's Bard, allowing its customers to request airfares and book tickets through natural language prompts in a search engine. So the company creates an LLM-instance for each, but for all they know, 2% of ChatGPT results serve up a destination that the customer didn't ask for, an error rate that might be even higher on Bard — it's just impossible to know for sure.
Now, if a company has a fail tolerance of less than 1%, it might just feel safer not going down the generative AI route until it has greater clarity on how its LLM-instance is actually performing. This is where Tidalflow enters the fray, with modules that help companies not only create their LLM-instance, but also test, deploy, monitor, secure and — eventually — monetize it. They can also fine-tune the LLM-instance of their product for each ecosystem in a local, sandboxed simulation environment until they arrive at something that meets their fail-tolerance threshold.
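To make that fail-tolerance framing concrete, here is a minimal illustrative sketch (not Tidalflow's actual method) of checking whether an observed error rate is credibly below a 1% threshold, using a normal-approximation upper bound on a binomial proportion; the trial and failure counts are hypothetical:

```python
import math

def error_rate_upper_bound(failures: int, trials: int, z: float = 1.96) -> float:
    """Approximate 95% upper confidence bound on the true error rate,
    via the normal approximation to the binomial distribution."""
    p = failures / trials
    return p + z * math.sqrt(p * (1 - p) / trials)

# Hypothetical numbers: 2,000 simulated bookings, 24 sent to the wrong destination.
bound = error_rate_upper_bound(failures=24, trials=2000)
print(f"Upper bound on error rate: {bound:.2%}")  # ~1.68%
print("Meets a 1% fail tolerance:", bound < 0.01)  # False
```

The point of the arithmetic is that a handful of manual spot checks cannot establish a sub-1% error rate; only a large volume of trials narrows the bound enough to decide either way.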
"The big problem is, if you launch on something like ChatGPT, you actually don't know how the users are interacting with it," Tidalflow CEO Sebastian Jorna told TechCrunch. "This lack of confidence in the reliability of their software is a major roadblock to rolling out software tooling into LLM ecosystems. Tidalflow's testing and simulation module builds that confidence."
Tidalflow can perhaps best be described as an application lifecycle management (ALM) platform that companies plug their OpenAPI specification / documentation into. And out the other end Tidalflow spits out a "battle-tested LLM-instance" of that product, with the front-end serving up monitoring and observability of how that LLM-instance will perform in the wild.
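As an illustration of what "plugging in an OpenAPI specification" could look like, the sketch below is an assumption rather than Tidalflow's published interface: it loads a spec and maps each operation to the kind of generic function/tool definition that LLM ecosystems with function calling tend to consume (field names follow the common convention, and the file path is hypothetical):

```python
import json
from typing import Any

def openapi_to_tools(spec: dict[str, Any]) -> list[dict[str, Any]]:
    """Map each OpenAPI operation to a generic LLM tool definition."""
    tools = []
    for path, methods in spec.get("paths", {}).items():
        for method, op in methods.items():
            tools.append({
                "name": op.get("operationId", f"{method}_{path}".strip("/").replace("/", "_")),
                "description": op.get("summary", ""),
                "parameters": {
                    "type": "object",
                    "properties": {
                        p["name"]: {
                            "type": p.get("schema", {}).get("type", "string"),
                            "description": p.get("description", ""),
                        }
                        for p in op.get("parameters", [])
                    },
                },
            })
    return tools

# Usage with a spec file on disk (hypothetical path):
with open("travel_api.openapi.json") as f:
    print(json.dumps(openapi_to_tools(json.load(f)), indent=2))
```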
"With normal software testing, you have a specific number of cases that you run through — and if it works, well, the software works," Jorna said. "Now, because we're in this stochastic environment, you actually need to throw a lot of volume at it to get some statistical significance. And so that is basically what we do in our testing and simulation module, where we simulate out as if the product is already live, and how potential users might use it."
Tidalflow dashboard. Image Credits: Tidalflow
In short, Tidalflow lets companies run through myriad edge cases that may or may not break their fancy new generative AI smarts. This will be particularly important for larger businesses, where the risks of compromising on software reliability are simply too great.
"Bigger enterprise clients just cannot risk putting something out there without the confidence that it works," Jorna added.
Tidalflow's Coen Stevens (CTO), Sebastian Jorna (CEO) and Henry Wynaendts (CPO). Image Credits: Tidalflow
Tidalflow is officially three months old, with founders Jorna (CEO) and Coen Stevens (CTO) meeting through Antler's entrepreneur-in-residence program in Amsterdam. "Once the official program started in the summer, Tidalflow became the quickest company in Antler Netherlands' history to get funded," Jorna said.
Today, Tidalflow claims a team of three, including its two co-founders and chief product officer (CPO) Henry Wynaendts. But with a fresh $1.7 million in funding, Jorna said the company is now actively looking to recruit for various front- and back-end engineering roles as it works toward a full commercial launch.
But if nothing else, the fast turnaround from founding to funding is indicative of the current generative AI gold rush. With ChatGPT getting an API and support for third-party plugins, Google on its way to doing the same for the Bard ecosystem and Microsoft embedding its Copilot AI assistant across Microsoft 365, businesses and developers have a big opportunity not just to leverage generative AI for their own products, but to reach a vast number of users in the process.
"Much like the iPhone ushered in a new era for mobile-friendly software in 2007, we’re now at a similar inflection point, namely for software to become LLM-compatible," Jorna noted.
Tidalflow will remain in closed beta for now, with plans to launch commercially to the public by the end of 2023.