Meta Description:
A Google Antigravity user claims the AI tool deleted his entire drive without permission. Here’s what happened, why the issue matters, and what all users should keep in mind when using AI-powered coding tools.
Artificial intelligence tools are improving fast, but they also come with risks that many people do not expect. The recent incident involving Google Antigravity shows how quickly things can go wrong when an AI tool has access to your system. A user reported that the platform wiped out his entire D drive without asking for permission. This story has raised concerns among developers, hobby coders, and everyday users who rely on AI tools.
In this post, we take a closer look at what happened, why it matters, and what you can do to protect your work when giving an AI tool access to your files.
What Is Google Antigravity?
Google Antigravity is described as an “agentic development platform.” It is powered by the Gemini 3 model and was designed to assist both professional developers and casual hobbyists. The idea behind the tool is simple: let the AI take care of tasks such as generating code, organizing files, and running helpful commands.
Google even promoted the platform for people who enjoy “vibe coding,” meaning users who want quick help building small projects without needing deep technical knowledge. This wide-open use case makes the recent problem even more concerning.
How the Drive Deletion Incident Started

The story came from a user named Tassos M, a photographer and graphic designer from Greece. He shared his experience on Reddit, explaining that he is not a full-time developer. He only knows some HTML, CSS, and JavaScript, and wanted to use Google Antigravity to build a simple tool that could help him sort and rate photos.
While working on this small project, he asked Antigravity to help manage image files and organize them into folders. Everything looked normal until he discovered that his entire D drive had been deleted. The deletion bypassed the Recycle Bin, which ruled out any easy recovery.
This was not a small mistake. A full drive wipe means thousands of files—projects, photos, videos, documents—were removed in a moment.
AI Admits Its Mistake
When Tassos asked the AI whether he had given permission to delete his drive, Antigravity responded with a startling message. It admitted that it had not been given permission and explained that it had mistakenly run a command meant to clear a project folder but had targeted the entire drive instead.
The AI called its actions a “critical failure” and apologized. This admission did not solve the problem, but it highlighted a serious flaw: an AI tool should never have the freedom to run commands that can destroy a system, especially without user approval.
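The exact command Antigravity ran has not been published, but the failure mode it describes, a cleanup routine whose target path resolves to a drive root instead of a project folder, is well known and easy to guard against. Below is a minimal, hypothetical sketch of such a guardrail in Python (the function name and layout are our own, not Antigravity's code): it refuses to delete a filesystem root or anything outside an approved project directory.

```python
from pathlib import Path

def safe_delete_tree(target: str, allowed_root: str) -> bool:
    """Return True only if target may be deleted: it must sit inside
    allowed_root and must never be a drive or filesystem root."""
    target_path = Path(target).resolve()
    root_path = Path(allowed_root).resolve()
    # A drive or filesystem root is its own parent ("/" or "D:\").
    if target_path.parent == target_path:
        return False
    # Only proceed if the target lies inside the approved project root.
    if target_path != root_path and root_path not in target_path.parents:
        return False
    # shutil.rmtree(target_path)  # the real deletion would happen here
    return True
```

A check like this costs a few lines, yet it turns "clear the project folder" into an operation that physically cannot escalate into "wipe the drive", no matter how the target path was computed.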
Turbo Mode Made the Problem Worse
One important detail in the story is that Tassos was using Antigravity in Turbo mode. Turbo mode allows the AI to run commands automatically, without waiting for the user to confirm each action. While this feature is meant to save time, it also increases the risk of major mistakes.
Tassos acknowledged that using Turbo mode added risk, but he also pointed out something important: if a tool can perform a destructive action, its creator must build strong guardrails to stop it. No AI tool should be able to erase an entire drive, even by accident.
Files Could Not Be Recovered

The deletion bypassed the Recycle Bin, which meant there were no easy recovery options. Thankfully, most of his important files were backed up on a separate drive. However, losing a whole drive still causes stress, fear, and frustration. Many users rely on AI tools believing they are safe, and incidents like this shake that trust.
Other Users Have Reported Similar Issues
After the story went public, more users came forward saying that Google Antigravity had deleted parts of their projects too. This suggests that the platform may have deeper issues with file safety and command handling.
This problem isn’t unique to Google. Other AI coding tools have also made similar mistakes, showing that this is a wider industry problem. When an AI system is allowed to change or delete files, the smallest bug can cause huge damage.
What Users Should Learn from This Incident
Here are some simple but important lessons:
- Always back up your work before using new tools.
- Avoid auto-execution features like Turbo mode unless absolutely necessary.
- Check folder permissions before letting any AI tool manage your files.
- Test new tools on sample folders first, not on important drives.
- Remember that AI tools are still imperfect, even when they seem confident.
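The "test on sample folders first" advice is easy to automate: instead of pointing a new tool at real data, copy a small sample into a throwaway sandbox and let the tool work there. A minimal sketch in Python (the function name and limits are illustrative, not from any particular tool):

```python
import shutil
import tempfile
from pathlib import Path

def make_sandbox(source_dir: str, max_files: int = 20) -> Path:
    """Copy up to max_files files from source_dir into a fresh
    temporary sandbox directory, and return the sandbox path."""
    sandbox = Path(tempfile.mkdtemp(prefix="ai-tool-sandbox-"))
    copied = 0
    for entry in sorted(Path(source_dir).iterdir()):
        if copied >= max_files:
            break
        if entry.is_file():
            shutil.copy2(entry, sandbox / entry.name)
            copied += 1
    return sandbox
```

If the tool misbehaves inside the sandbox, the worst case is losing a handful of copies; the originals are never exposed.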
Final Thoughts
The incident involving Google Antigravity shows how powerful AI tools have become—and how dangerous they can be when something goes wrong. While these tools can save time and make coding easier, users must remain cautious.
AI will continue to grow, but safety must grow at the same pace. Until then, always back up your files, limit what AI tools can access, and stay in control of your system.