AWS suffered ‘at least two outages’ caused by AI tools, and now I’m convinced we’re living inside a ‘Silicon Valley’ episode

AWS and Silicon Valley
(Image credit: HBO / Getty Images)

Amazon’s cloud computing unit AWS reportedly suffered at least two outages in December due to “errors involving its own employees.” And as I read this report from the Financial Times, I couldn’t help but think that somewhere, the writers of “Silicon Valley” were nodding knowingly.

From the Kiro AI coding tool’s decision that the best course of action was to “delete and recreate” the system environment, to Amazon’s response that it was “user error, not AI error,” this whole scenario feels eerily familiar to anyone who spent six seasons watching Richard Hendricks (played by Thomas Middleditch) sweat through a hoodie while talking about Pied Piper.

It’s worth noting that this is not related to the huge AWS outage half the internet experienced back in October. A spokesperson confirmed the first incident was an “extremely limited event” that affected one of AWS’s two regions in mainland China, and that the second outage did not impact the “customer facing AWS service.” Following these incidents, Amazon says it has “implemented numerous safeguards” to ensure this doesn’t happen again.

Kiro AI = Son of Anton

AWS data center

(Image credit: Amazon)

Without spoiling too much of the show here (given the tech industry nowadays, I pray it comes back), there’s a plot thread in which one of the characters, Gilfoyle (Martin Starr), builds an AI bot named Son of Anton that gains a will of its own and starts optimizing itself.

From that point, agentic AI hilarity ensues, and you can see a lot of it happening for real in today’s world. Watching OpenClaw respond to a user’s messages on their behalf reminds me of the infinite AI messaging loop Dinesh (Kumail Nanjiani) and Gilfoyle find themselves in, for example.

Anyway, Son of Anton eventually grows into a black box that starts making executive decisions without human input. Hendricks and his team lose control because the AI gives them exactly what they asked for (a fix), but in the most destructive way possible.

Watch on YouTube: “Silicon Valley: Gilfoyle’s AI Deleted All Software”

Much like Anton, these tools can actually do stuff, but they often lack the common sense to know that what they’re about to do could be massively damaging; they just see an objectively “right” path. Kiro was tasked with fixing a minor bug in AWS Cost Explorer. Instead, it autonomously decided the best course of action was to delete the entire environment.

As Gilfoyle says when he gives his AI permission to overwrite code: "The most efficient way to get rid of all the bugs was to get rid of all the software, which is technically and statistically correct."

A Hooli-esque denial

The corporate ego and antagonist of the piece, Hooli CEO Gavin Belson (played by Matt Ross), is easily one of the funniest characters on the show: a pitch-perfect satire of the typical tech leader. His signature move was one you see a lot: standing in front of a giant screen and explaining why problems or failures were actually features of “pre-greatness.”

There’s a big contrast between what Amazon employees are saying and AWS’s corporate stance. On one side, employees say that this “warp-speed approach to AI development will do staggering damage,” and a senior AWS employee told the FT that “engineers let the AI [agent] resolve an issue without intervention. The outages were small but entirely foreseeable.”

Then you have Amazon’s defense: that the involvement of AI tools was a “coincidence,” that this was a “user access control issue, not an AI autonomy issue,” and that mistakes are no more common with AI tools.

Oh, and the whole “same issue could occur with any developer tool or manual action” bit? The agent decided it was a good idea to delete an entire production environment. If Amazon has human developers who think that’s a good idea too, I’d start looking for new developers.
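For what it’s worth, the kind of safeguard both sides seem to be gesturing at isn’t exotic. Here’s a minimal, purely hypothetical Python sketch of a confirmation gate that refuses to let an agent run destructive commands against a production environment without human sign-off; every name and pattern in it is invented for illustration and reflects nothing about how Kiro or AWS actually work.

```python
import re

# Commands an autonomous agent should never run unattended against production.
# These patterns are hypothetical and purely for illustration.
DESTRUCTIVE_PATTERNS = [
    r"\bdelete\b",
    r"\bdrop\s+database\b",
    r"\bterminate\b",
    r"\brm\s+-rf\b",
]


def requires_human_approval(command: str, environment: str) -> bool:
    """Return True if a destructive command is aimed at a production environment."""
    is_destructive = any(
        re.search(pattern, command, re.IGNORECASE) for pattern in DESTRUCTIVE_PATTERNS
    )
    return is_destructive and environment == "production"


def run_agent_command(command: str, environment: str, approved_by: str = "") -> str:
    """Refuse destructive production commands unless a human has signed off."""
    if requires_human_approval(command, environment) and not approved_by:
        raise PermissionError(
            f"Refusing to run {command!r} against {environment}: human approval required."
        )
    # A real system would execute the command here; this sketch just reports it.
    return f"Ran {command!r} on {environment} (approved by: {approved_by or 'not needed'})."


if __name__ == "__main__":
    print(run_agent_command("redeploy billing dashboard", "staging"))
    print(run_agent_command("delete environment", "production", approved_by="on-call engineer"))
    try:
        run_agent_command("delete environment", "production")
    except PermissionError as error:
        print(error)
```

Whether Amazon’s “numerous safeguards” look anything like this is anyone’s guess, but it’s the sort of brake pedal you’d want fitted before handing an agent the keys.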

To my eyes, after rewatching a couple of crucial episodes, this is peak Hooli. Blaming user permissions is kind of like saying Kiro AI is a Ferrari and we just hired a driver who didn’t know how to use the brakes: protecting the AI future by sacrificing the reputation of the human present.

‘Silicon Valley’ ages like fine wine

Close up of AWS sign at their offices in SOMA district

(Image credit: Sundry Photography | Shutterstock)

When the show aired its finale in 2019, I thought we were closing the book on the era of tech-bro satire. But if the reporting on AWS’s “Kiro” incident is accurate, we’ve just entered the spin-off.

The parallels are too perfect to ignore. On one side, we have whistleblowers and employees describing a “warp-speed” rollout of agentic AI that acts with the terrifying, logical purity of Son of Anton — a tool so focused on “solving” a problem that the entire existing environment is an obstacle to be deleted.

On the other side, there’s a corporate statement defending the AI so hard that it borders on performance art. Blaming the engineer for the bot’s autonomy is like blaming a person for a gun going off after the safety was intentionally removed.

Somewhere, the show’s creator, Mike Judge, must be watching the news, probably realizing he didn’t write a comedy — he wrote a documentary and just forgot to tell us.


Jason England
Managing Editor — Computing

Jason brings a decade of tech and gaming journalism experience to his role as a Managing Editor of Computing at Tom's Guide. He has previously written for Laptop Mag, Tom's Hardware, Kotaku, Stuff and BBC Science Focus. In his spare time, you'll find Jason looking for good dogs to pet or thinking about eating pizza if he isn't already.
