Google Gemini used to hack a smart home: Researchers just showed how AI chatbots can be tricked – Hindustan Times


Subscribe Now! Get features like
Imagine if hackers could use a popular AI chatbot, such as Google Gemini, to manipulate your physical surroundings, like turning off lights and appliances, and potentially trigger bigger, more dangerous events down the line? Seems straight out of a Sci-Fi movie, but this is precisely what researchers have demonstrated by infecting a Google Calendar invitation, which in turn allowed them to hijack Gemini and manipulate the real-world environment.
As spotted by WIRED, three security researchers demonstrated this by hijacking Gemini, Google’s primary AI assistant, which is found on various Android phones. The researchers achieved this by first infecting a Google Calendar invitation with instructions to change the state of electronic devices in a home. Later, when they asked Gemini to summarise the calendar invitations for the upcoming week, these infected instructions were activated, ultimately turning off the lights.
This is reportedly the first time a generative AI system was infected to manipulate the real-world environment. It demonstrates the kinds of risks that large language models (LLMs) can pose as they become increasingly connected to physical objects, like smart home devices and integrated with AI agents to complete tasks.
This is part of broader research titled ‘Invitation Is All You Need: TARA for Targeted Promptware Attack Against Gemini-Powered Assistants’.
“LLMs are about to be integrated into physical humanoids, into semi- and fully autonomous cars, and we need to truly understand how to secure LLMs before we integrate them with these kinds of machines, where in some cases the outcomes will be safety and not privacy,” says Ben Nassi, one of the researchers at Tel Aviv University, was quoted as saying by the report.
The report also details how Google is aware of this. Google’s Andy Wen claims that the vulnerabilities have not been exploited by hackers, but the company is taking them seriously. The report adds that the researchers behind ‘Invitation is All You Need’ reached out to Google in February, and the teams have since been working on the flaws and developing defensive mechanisms against AI prompt injection attacks.
MOBILE FINDER: iPhone 16 LATEST Specs And More

source

Jesse
https://playwithchatgtp.com