Musk, Wozniak, tech leaders sign open letter calling for pause on “out-of-control” AI development

More than 1,000 artificial intelligence researchers and leaders, including Elon Musk, have signed an open letter calling for an immediate pause of at least six months on the training of ‘giant’ AI systems, allowing time to study the dangers of systems like GPT-4.

Other signatories of the letter include Apple co-founder Steve Wozniak, Getty Images CEO Craig Peters, Pinterest co-founder Evan Sharp, and renowned author Yuval Noah Harari, as well as engineers from Amazon, Google, Meta and Microsoft.

The letter was also signed by numerous Australian experts from some of the nation's leading universities. 

It comes only two weeks after OpenAI, the company behind ChatGPT, launched GPT-4, a large multimodal model that accepts prompts of text and images, including documents with text and photographs, diagrams, or screenshots.

“We call on all AI labs to immediately pause for at least 6 months the training of AI systems more powerful than GPT-4,” the open letter published by the Future of Life Institute, a non-profit backed by Musk, states.

“Advanced AI could represent a profound change in the history of life on Earth, and should be planned for and managed with commensurate care and resources.

“Unfortunately, this level of planning and management is not happening, even though recent months have seen AI labs locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control.”

The letter also states that AI labs and independent experts should use the pause to jointly develop and implement a set of shared safety protocols, ensuring that advanced AI systems are safe beyond a reasonable doubt.

“This does not mean a pause on AI development in general, merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities,” the letter adds.

One argument often raised in conversations surrounding AI is the paperclip maximiser thought experiment: the idea that if you tell a machine to optimise a specific goal, it will pursue that goal at all costs.

A machine instructed to maximise the number of paperclips it produces would eventually acquire the power and resources needed to achieve its goal, at the expense of human life if necessary.

Alluding to this, the letter warns that “AI systems with human-competitive intelligence can pose profound risks to society and humanity.”

If a pause is not collectively agreed on, the letter says governments should step in and create a moratorium.

There are 29 nations, including Australia, Canada, and India, that are part of the Global Partnership on Artificial Intelligence – an international initiative that aims to advance the responsible and human-centric development of AI.

Governing agencies in China, Singapore and the EU have also introduced early versions of AI governance frameworks.

“Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable,” the letter says.

“This confidence must be well justified and increase with the magnitude of a system's potential effects.”
