This is a dead giveaway that you're watching an AI-generated video

Moonvalley AI image
(Image credit: Moonvalley)

AI-generated videos are everywhere, with clips from OpenAI's Sora 2 and Google's Veo 3.1 popping up across social media and even in your news feed. Between the big names and dozens of smaller AI tools cranking out hyper-realistic clips, it’s getting harder to tell what’s real and what’s synthetic.

But after months of testing these tools, I’ve found one consistent giveaway. It’s not the melting hands, blurry backgrounds or too-perfect lighting. It’s the eyes. The eye-line wobble is the new uncanny valley.

No matter how advanced the model, AI still struggles with micro-tracking. Those are the tiny, constant eye movements humans make while focusing, reacting and thinking. Real eyes adjust. They flick. They track motion. They respond to light. But AI-generated eyes often don’t. Instead, you’ll notice:

  • Eyes locked in place, never shifting
  • Blinking that’s too slow or robotic
  • Gaze that doesn’t follow head movement
  • Pupils that don’t react to light
  • A flat, painted-on look that’s hard to explain but easy to spot

Once you notice it, you won’t be able to unsee it; even the best models haven’t solved it.
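If you want to make the eye check concrete, one rough heuristic borrowed from blink-detection research is the eye aspect ratio (EAR): given six landmark points around an eye, the ratio drops sharply during a blink. This is a sketch, not a production detector — the function names and the 0.2 threshold are illustrative, and in practice you’d need a face-landmark library (MediaPipe, dlib) to supply the points.

```python
import math

def eye_aspect_ratio(pts):
    """EAR from six (x, y) eye landmarks ordered p1..p6 around the eye.
    Open eyes typically sit around 0.25-0.35; a blink dips well below 0.2."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    p1, p2, p3, p4, p5, p6 = pts
    # Two vertical eyelid distances over the horizontal eye width
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def count_blinks(ear_series, threshold=0.2):
    """Count dips below the threshold in a per-frame EAR series."""
    blinks, below = 0, False
    for ear in ear_series:
        if ear < threshold and not below:
            blinks += 1
            below = True
        elif ear >= threshold:
            below = False
    return blinks
```

Humans blink roughly 15 to 20 times a minute, irregularly. A clip whose EAR series shows no dips at all, or perfectly periodic, identical dips, is exactly the robotic blinking described above.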

Why AI still can’t get eyes right

Runway generated video of a woman

(Image credit: Runway Gen-3 Alpha/Future AI)

Here's the thing: eyes aren’t just part of the visual aesthetic; they're behavioral signals. They reflect emotion, attention and thought. To get that right, AI would need to understand context-aware attention, realistic physics, emotional depth (although it's getting better), light interaction and, finally, the micro-expressions that make us human.

Most video generators work frame by frame or in short bursts, so they miss the continuous, subtle shifts we expect from real people. The result is a stare that feels vacant, disconnected or just…off.

Don’t forget the audio — that’s another big giveaway

Food reel - YouTube

Even when the visuals look convincing, AI-generated voice and sound design often break the illusion instantly. Listen for voices that sound too perfect. Real speech is messy: humans pause, stutter, trail off and emphasize words unevenly. AI voices often sound overly smooth.

You'll also notice that emotions don’t land. AI can mimic tone, but it rarely gets the rhythm right. That’s why emphasis falls in weird places, jokes miss their beat and even excitement sounds flat. When something feels emotionally off, the tell is usually in the audio or the eyes.

In many AI videos, the ambient noise feels wrong for the environment. Street scenes with no wind or traffic feel like the Twilight Zone, indoor clips have zero echo, and outdoor footage comes with crisp, studio-quality voices. This applies whether the video features humans, animals or just background scenery.
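There's a crude way to quantify that "dead air" feeling. The sketch below (thresholds and names are illustrative, and you'd need to decode the clip's audio into samples first) estimates the noise floor by taking the quietest stretch of the track. Real outdoor recordings almost never hit true silence.

```python
import math

def rms(samples):
    """Root-mean-square level of a list of audio samples (floats in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def noise_floor(samples, window=1024):
    """Minimum RMS over fixed-size windows: an estimate of the ambient noise floor."""
    levels = [rms(samples[i:i + window])
              for i in range(0, len(samples) - window + 1, window)]
    return min(levels)
```

On a real street recording, the floor stays well above zero even between words. If `noise_floor` comes back vanishingly small for an "outdoor" clip, the soundtrack was likely generated or assembled in isolation.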

More signs you're watching AI

Pika Labs lip sync video

(Image credit: Pika Labs)

Eyes and voice are the biggest giveaways, but these visual clues are common too:

  • Clothing and hair that move unnaturally
  • Backgrounds that look overly smooth or empty
  • Perfect facial symmetry
  • Hands that glitch during motion
  • Movement that feels delayed or puppet-like

It’s only getting harder to tell

SUP backflip - YouTube

With every new model, AI clips become nearly indistinguishable from real footage. But for now, the combo of eye behavior and voice imperfections is your best shot at spotting fakes.

As AI-generated video floods social feeds, spotting synthetic content is quickly becoming a basic digital skill. Deepfakes, manipulated footage and misinformation are only going to grow.

To keep yourself from being fooled, consider the following:

  • Watch the eyes — they’re the biggest tell
  • Listen closely — real voices aren’t flawless
  • Check motion physics — especially hands, hair and clothing
  • Reverse search a frame — it’s not perfect, but it can help verify origin

Final thoughts

AI-generated videos are getting scarily realistic, but the cracks are still there. Although the best AI tools can simulate faces and sounds with shocking accuracy, they still can’t fake the subtle cues that make us human.

So next time you’re scrolling and something feels off, scan the eyes. Listen to the rhythm of the voice. Check the background noise. Chances are, you've spotted an AI video.




Amanda Caswell
AI Editor

Amanda Caswell is an award-winning journalist, bestselling YA author, and one of today’s leading voices in AI and technology. A celebrated contributor to various news outlets, her sharp insights and relatable storytelling have earned her a loyal readership. Amanda’s work has been recognized with prestigious honors, including outstanding contribution to media.

Known for her ability to bring clarity to even the most complex topics, Amanda seamlessly blends innovation and creativity, inspiring readers to embrace the power of AI and emerging technologies. As a certified prompt engineer, she continues to push the boundaries of how humans and AI can work together.

Beyond her journalism career, Amanda is a long-distance runner and mom of three. She lives in New Jersey.
