A set of hyper-realistic AI videos featuring actors from Stranger Things has gone viral this week, and not in a way fans are celebrating.
The clips, which show a creator seamlessly replacing his face and body with those of Millie Bobby Brown, Finn Wolfhard, and David Harbour, have unsettled viewers and reignited concerns about how quickly deepfake technology is advancing, and how little protection exists for digital identity.
The videos were reportedly created using Kling AI’s 2.6 Motion Control model, a tool capable of full-body character swaps rather than simple face replacement. Posted by Brazilian creator Eder Xavier, the clips spread rapidly across platforms, surpassing 14 million views on X alone, with reposts continuing to circulate.
Unlike earlier deepfakes that focused mainly on facial overlays, these videos track body movement, posture, clothing, and expression in near-perfect sync.
The result is footage that looks convincingly real at a glance, which is precisely what many viewers found disturbing.
“This is so creepy,” one user wrote on X. Others echoed the sentiment, pointing less to shock value and more to unease over how difficult it has become to tell what’s real.
Why these videos feel different
According to Yu Chen, a professor of electrical and computer engineering at Binghamton University, full-body character swapping represents a step change in synthetic media.
“Earlier deepfake tools focused on faces,” Chen commented.
“Full-body swapping has to solve pose estimation, skeletal tracking, clothing transfer, and natural movement across the entire frame.”
That added realism removes many of the visual tells that detection tools once relied on.
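For readers curious what the first stages of that pipeline involve, the sketch below extracts per-frame body landmarks from a source video. It uses the open-source MediaPipe library purely to illustrate the general technique Chen describes; Kling AI’s actual model is proprietary, and the function here is a hypothetical example, not its implementation.

```python
# Hypothetical illustration of the pose-estimation / skeletal-tracking stage
# of a full-body swap, using the open-source MediaPipe library. This is NOT
# Kling AI's pipeline, which is proprietary and undisclosed.
import cv2
import mediapipe as mp

def extract_skeleton(video_path: str):
    """Yield (x, y, z) body landmarks for each frame of a source video."""
    cap = cv2.VideoCapture(video_path)
    with mp.solutions.pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:
                yield [(lm.x, lm.y, lm.z) for lm in result.pose_landmarks.landmark]
    cap.release()
```

In a full-body swap, a generative model would then condition on these landmarks frame by frame to drive the replacement character’s motion, with clothing transfer and expression handled by later stages.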
The same realism that impresses viewers also lowers the barrier for abuse. Emmanuelle Saliba, chief investigative officer at cybersecurity firm GetReal Security, warned that these tools make impersonation easier than ever.
“For a few dollars, anyone can generate a full-body video of a celebrity, politician, CEO, or private individual from a single image,” she said. “There’s no default protection of a person’s digital likeness.”
Saliba added that misuse is likely to extend well beyond novelty videos, ranging from impersonation scams and corporate fraud to political disinformation and non-consensual explicit content.
Hollywood implications and wider risks
The viral clips have also caught the attention of industry insiders. a16z partner Justine Moore shared one of the videos, noting that AI video models are poised to reshape production pipelines faster than most people expect.
“We’re not prepared for how quickly this will change,” Moore wrote. “Endless character swaps at a negligible cost.”
Beyond Stranger Things, similar body-swap videos have already appeared featuring Leonardo DiCaprio and other well-known actors, suggesting the technique is spreading rapidly as tools like Kling, Google’s Veo, FaceFusion, and OpenAI’s Sora become more accessible.
“AI just redefined deep fakes & character swaps. And it's extremely easy to do. We're not ready. Wild examples. Bookmark this. [🎞️JulianoMass on IG]” pic.twitter.com/fYvrnZTGL3 — Min Choi (@minchoi), January 15, 2026
Users commenting on the incident.
While it remains unclear how Netflix or the actors involved will respond, Chen emphasized that responsibility can’t fall on developers alone. Because many of these models are publicly available, effective safeguards will require cooperation across platforms, regulators, and users.
“Detection systems need to focus on intrinsic signatures of synthetic media, not metadata that can be easily removed,” Chen said. He also called for clearer disclosure rules and liability frameworks as AI-generated content becomes harder to distinguish from reality.
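To illustrate Chen’s point about fragile metadata, here is a minimal Python sketch, assuming the widely used Pillow imaging library; the function is a hypothetical example, not any shipping detector. Provenance labels stored as EXIF or XMP tags sit alongside the pixels, and a trivial re-save discards them entirely:

```python
# Minimal demonstration (hypothetical example, not a real detector) of why
# metadata-based provenance is easy to defeat: re-saving an image copies the
# pixels but silently drops EXIF/XMP tags, including any "AI-generated" label.
from PIL import Image

def strip_metadata(src: str, dst: str) -> None:
    """Re-save an image pixel-for-pixel, discarding all embedded metadata."""
    with Image.open(src) as im:
        clean = Image.new(im.mode, im.size)
        clean.putdata(list(im.getdata()))  # copies pixel values only
        clean.save(dst)

strip_metadata("labeled_deepfake.png", "unlabeled_copy.png")
```

Intrinsic approaches instead search for statistical artifacts in the pixel data itself, which cannot simply be deleted the way a removable tag can.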
A cultural moment at the wrong time
The timing adds another layer of irony. Stranger Things recently concluded its fifth and final season, closing out one of Netflix’s most influential series. As fans say goodbye to Hawkins and its characters, AI replicas are already extending their digital afterlife, without consent, contracts, or creative control.
“You can’t know what’s real and what’s AI generated… just think where it will be in couple of years 🤯” — 𝕄…𝕏 𝔻𝕚𝕧𝕚𝕕𝕖𝕟𝕕 𝕘𝕣𝕠𝕨𝕥𝕙 (@martyyyn_s), January 14, 2026
Users commenting on the incident.
For many viewers, that contrast is exactly what makes the videos unsettling.
What began as a technical demo now sits at the intersection of fandom, identity, and trust, raising a question the internet is still struggling to answer: when anyone can convincingly become anyone else on screen, what does authenticity even mean anymore?
