
Meta’s Nick Clegg says asking artists for permission to use their work in AI training would "kill the industry overnight." That’s not just dramatic – it’s a window into how Silicon Valley deflects responsibility with scale.
Nick Clegg, former president of global affairs at Meta, has dismissed the idea of asking artists for permission to use their content, as he believes it would kill the AI industry.
The ex-deputy prime minister of the UK claimed that overregulation would stifle AI development in the UK, putting it at a disadvantage globally.
His framing treats the matter as a logistical impossibility: there are simply too many creators and too much content to seek permission for each time.
“Quite a lot of voices say, ‘You can only train on my content if you first ask.’ And I have to say that strikes me as somewhat implausible because these systems train on vast amounts of data,” he told the Times.
So Clegg openly acknowledges two things: opposition to having creative work taken without consent is strong, and asking would be inconvenient. But inconvenient for whom?

Is gaining consent that hard?
Large-scale licensing may well be logistically difficult, but that is big tech's problem to solve, not the artists'.
Other industries routinely manage large-scale rights systems – think music sampling itself, pharmaceutical companies handling patents, and stock photo platforms paying creators.
There are some opt-out avenues, such as Have I Been Trained? – a search tool that lets artists see whether their work was scraped to train AI, and opt out if they catch it in the act. But this is hardly full-scale industry protection.
When AI firms are trading in billions, the claim that consent is “too expensive” rings hollow.
Artists should absolutely be notified when their content is going to be used for AI training. You can’t borrow a car without asking, just because you’re in a hurry.
The amendment – it was just about transparency
In 2024, UK lawmakers proposed a small but powerful tweak to the Data Protection and Digital Information (No.2) Bill – a massive piece of legislation – asking AI companies to disclose what copyrighted content they used to train their models.
So, no asking for permission – just a receipt that it was used afterwards.
Hundreds of authors, musicians, and artists backed the amendment, but the House of Commons blocked it from passing at the last moment.
The reason cited was that it might slow AI innovation. But in its push to stay competitive, the UK is once again turning a blind eye to the integrity of the artist.

The bigger picture – who is AI working for?
Nick Clegg’s “consent will kill AI” stance isn’t about technical hurdles – it’s more of a power play to protect big tech's bottom line.
Artists have a basic ethical right not to have their work gobbled up by these systems without forewarning or compensation.
It amounts to free R&D for Meta, enabled by the willful blindness of both government and big tech – but creators are done being the unpaid fuel for someone else’s billion-dollar bonfire.