© 2024 CyberNews - Latest tech news, product reviews, and analyses.

Musk and Wozniak signal concern over rise of AI in open letter


More than a thousand notable signatories, including Twitter, Tesla, and SpaceX boss Elon Musk and Apple co-founder Steve Wozniak, have signed an open letter calling on all artificial intelligence (AI) labs to pause the training of systems more powerful than GPT-4.

“Contemporary AI systems are now becoming human-competitive at general tasks and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones?” asks the letter.

“Should we develop nonhuman minds that might eventually outnumber, outsmart, [render us] obsolete and replace us? Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders.”

The signatories, who also include author Yuval Noah Harari; Rachel Bronson, president of the Bulletin of the Atomic Scientists; and scores of AI researchers, write that AI systems “with human-competitive intelligence” can pose profound risks to society and humanity.

The letter states there isn’t enough planning and management of AI advances. The opposite is supposedly happening – AI labs have been locked in an “out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.”

Even though the signatories say they’re not calling for a complete pause on AI development in general, they add that experts should use the letter’s proposed half-year hiatus to develop and implement shared safety protocols.

“We call on all AI labs to immediately pause for at least six months the training of AI systems more powerful than GPT-4. This pause should be public and verifiable, and include all key actors. If such a pause cannot be enacted quickly, governments should step in and institute a moratorium,” the letter says.

Many of the signatories, such as Musk, could fairly be described as libertarians who are skeptical of state authority. Yet they are jointly calling for more regulation when it comes to developing advanced AI systems.

The letter calls on developers to work closely with policymakers and create AI governance systems. These would track AI and develop watermarking tools to help distinguish real content from synthetic content. Beyond that, the letter says audits will be needed, as well as mechanisms for liability in case of harm caused by the technology.

“Humanity can enjoy a flourishing future with AI. Having succeeded in creating powerful AI systems, we can now enjoy an ‘AI summer’ in which we reap the rewards, engineer these systems for the clear benefit of all, and give society a chance to adapt,” said the letter.

Intriguingly, to date no one from OpenAI, the company behind ChatGPT and the advanced GPT-4 model, has signed the letter, although Emad Mostaque, founder and CEO of Stability AI, is on the list.

Jeff Jarvis, a journalism professor at the City University of New York and advocate for greater public engagement online, called the letter “absolutely f***ing ludicrous and laughable” on Twitter. He said it was signed by “so many usual-suspect moral entrepreneurs and attention addicts.”

Others counter that the open letter has been signed by experts in machine learning, and that it seems wise to establish principles governing AI development.

Cybernews recently reviewed a book by a professor of business psychology who expresses concern over the craze about generative-AI tools, but adds that humans are still in control of what the technology is or is not used for.


