In joint statement, AI experts issue warning of extinction risk

Today, a mix of AI industry bigwigs, scholars, and even a few celebrities raised their voices. Their urgent call? To dial down the risk of AI sparking a global catastrophe.

They’re putting it out there loud and clear: the threat of an AI-induced extinction event needs to be up there with pandemics and nuclear warfare on the global priority list.

This clarion call came in a statement from the Center for AI Safety. The signatories are a who’s who of the AI world, including OpenAI CEO Sam Altman and AI pioneer Geoffrey Hinton.

Joining them are head honchos and brainiacs from Google DeepMind and Anthropic, along with Microsoft CTO Kevin Scott, internet security wizard Bruce Schneier, climate crusader Bill McKibben, and even musician Grimes.

The Center’s director, Dan Hendrycks, took to Twitter to explain the thinking behind the statement. This initiative, sparked by Cambridge AI professor David Krueger, isn’t about sidelining other AI concerns like bias or misinformation. Hendrycks likens the move to atomic scientists sounding the alarm about their own creations. His message? Tackling multiple dangers at once is doable. It’s not an ‘either/or’ situation but a ‘yes/and’ approach.

Hendrycks emphasizes that focusing solely on current harms would be as reckless as ignoring them entirely. In the grand scheme of risk management, it’s about striking a balance: staying vigilant about present threats without turning a blind eye to future ones.
