
H&M's bold move to digitally clone 30 human models promises global scalability without travel. But can fashion genuinely thrive if we sideline the real-world energy, storytelling, and human connection behind it?
H&M recently triggered one of the biggest debates in the fashion and tech industries. The Swedish retail giant announced plans to create AI-generated "digital twins" of 30 real-life models, fully cloned with permission and used in marketing and social media.
These hyper-realistic digital replicas will be available for photo campaigns, e-commerce imagery, and video content. The good news for models is that they will retain ownership of their likenesses and get paid for digital appearances. But are we a firmware update away from AI perfecting Blue Steel and demanding its own agent?
Wow.
H&M is making AI clones of 30 models this year for ads and social posts.
The wild part? The models own their digital twins—and can rent them out to other brands, even H&M’s rivals. pic.twitter.com/JkoqqAJliB
PJ Ace (@PJaccetturo) March 25, 2025
Where collaboration meets cloning
The promise of AI clones in fashion is hard to ignore. Models can effectively scale themselves globally without a plane ticket or makeup call. Brands cut down on travel emissions, reduce logistical headaches, and generate campaign assets in hours, not weeks. That means fewer last-minute reshoots, more flexibility for trend-based content, and cost efficiencies that are undeniably appealing in an industry with razor-thin margins.
Creating fully licensed digital avatars of human models feels like an incremental evolution. Sure, it is more visible, controversial, and layered than the previous tech tweaks, but make no mistake, this is not our first rodeo when it comes to tech and fashion. We’ve been here many times before, from the rise of photo editing to virtual try-ons and filters that blur digital and physical lines.
However, this latest announcement hit differently because H&M isn't cutting models out of the process. They're positioning the models as collaborators, giving them complete control over the use of their digital likeness, allowing licensing to third parties (even competitors), and offering compensation aligned with traditional shoot rates.
That last part is critical. It repositions AI not as a replacement but as an augmentation tool. Something that can allow models to earn while they sleep. Or while working on another project. Or while simply having a life off the runway.

It's a precedent many hope will become the industry standard, especially as generative AI tools become more embedded in everyday creative workflows. But at what cost?
What we lose when AI models take the stage
Fashion is, by its nature, performative. It thrives on spontaneity, chemistry, imperfection, and the often chaotic beauty of a real-world shoot. Behind every glossy ad are human stories. Take a peek behind the curtain, and you will find nervous energy backstage, spilled coffee, and unplanned moments that define a look or campaign. Can an AI avatar truly replicate that?
Critics, including models, photographers, stylists, and educators, worry not necessarily about the tech itself but what might be lost in the shift. AI twins don't need lighting assistants, makeup artists, or location scouts. They don't laugh with the crew between takes. They don't inspire BTS content or form part of an ecosystem that has, for decades, supported thousands of creative workers across the globe.
There's also concern that what's gained in scale and efficiency may come at the cost of storytelling, culture, and emotion. The intangibles that make fashion feel personal, not programmatic.
H&M is working with models and their digital twin: pic.twitter.com/CuuuKE9MJO
Joey (@joeylabs) March 26, 2025
Fast fashion, slower truths
There have been mixed reactions online to H&M's announcement, with many highlighting the irony that the brand is claiming sustainability points by switching to digital avatars while continuing to churn out fast fashion at scale.

Cutting down on flights and physical shoots may well reduce carbon emissions. However, as researchers have pointed out, training deep learning models like those used to create hyper-realistic avatars is far from free in environmental terms. Training a single GPT-scale model can consume as much energy as 126 Danish homes use in a year.
Even if H&M's AI isn't running on that level, the cumulative environmental cost of generative content is rising and rarely transparent.
Add to that the fashion industry's existing challenges with greenwashing, and it's understandable why some observers aren't giving out automatic sustainability points for this move. We may be approaching the time to reassess what sustainability means when digital infrastructure is part of the equation.
👉Did you know that a single training session of #GPT3 uses the equivalent of a year's energy consumption of 126 Danish homes 🏘️?
🟢Green AI refers to #AI research which is more environmentally friendly and inclusive, trying to reduce the energy consumption of large models.
Verónica Bolón (@veronicabolon) January 26, 2023
Another uncomfortable reality is that diversity doesn't automatically scale with technology. Using only 30 models as the base dataset raises valid concerns about representation. Will the digital twins reflect a broad spectrum of body types, abilities, ages, and backgrounds? Or will the push for clean, consistent, "on-brand" content nudge us toward more homogenous and commercialized ideals?
There are also blurred lines around digital identity ownership. Sure, today, the models involved hold rights over their twins. But what happens in five years when new commercial pressures, platform norms, or licensing deals come into play? Are we heading toward a world where an individual's likeness becomes a line item in a brand's asset library?
Why creative education must evolve for an AI-first future
As technology reshapes the rules of content creation, there's also an increasing education gap as universities and advertising schools struggle to keep pace with what's now possible. The creative pipeline is still primarily built on teaching campaign planning, mood boards, and traditional production workflows. But those methods are quickly becoming outdated.
Marketing and creative education must adapt fast. That means bringing tools like Midjourney, Synthesia, and HeyGen into classrooms rather than banning them, and treating them not as gimmicks but as everyday tools essential to industry roles. Students need to learn how to use AI creatively and critique it ethically. The future is already hybrid. Curricula need to catch up.
So, where do we go from here?
This isn't the first time fashion has leaned into emerging tech. H&M's digital twin announcement arrives alongside a broader shift in how clothing intersects with data, identity, and even surveillance.
Fashion apps from H&M and Nike are among the most aggressive in collecting user data, gathering everything from photos to personal information. Elsewhere, Cap_able is experimenting with anti-surveillance garments designed to confuse facial recognition software, offering wearable privacy in an age of mass biometric tracking.
The most pressing question isn't whether AI clones will be part of fashion's future. That ship has sailed. The real question is: how do we ensure it's a future worth having?
Can digital tools amplify creativity without erasing the human quirks that make it memorable? Can we build a future where models are empowered rather than sidelined, carbon footprints are measured, education evolves, and diversity is more than a checkbox? A scalable future, yes, but also an authentic one.
It's not about AI versus authenticity. It's about holding space for both and asking better questions before we let efficiency run the show. We might not need to pick sides between pixels and people, but we do need to protect the humans behind the pixels. If the industry gets that right, this digital twin moment could mark the start of something truly transformative, just not in the way the buzzwords would have us believe.