AI’s Manhattan Project? Why Racing to Superintelligence Could Backfire

History has a habit of repeating itself, except this time the nuclear arms race has been replaced by an AI arms race. As the US rushes to outpace China in superintelligent AI, tech leaders warn the sprint could trigger cyber warfare, sabotage, or worse: mutually assured AI destruction.
US Congress is considering a Manhattan Project-style push to dominate AI superintelligence. Some warn this mirrors the nuclear arms race, where fear, not logic, dictated strategy.
A policy paper from ex-Google CEO Eric Schmidt cautions that rivals like China won't sit back; they'll strike first. This fear-driven approach is wrong on multiple fronts, and we should choose collaboration over competition.
Unfortunately, companies like OpenAI are lobbying Washington for fewer restrictions, citing fears of China's AI dominance. But as AI progresses, are we accelerating innovation or an existential crisis?
The race to AI superintelligence isn't just about winning; it's about surviving. If AI escalation mirrors nuclear deterrence, how do we make sure we don't code ourselves into oblivion?
Read the full article on Semafor.
----
💡 If you enjoyed this content, be sure to download my new app for a unique experience beyond your traditional newsletter.
This is one of many short posts I share daily on my app, and you can have real-time insights, recommendations and conversations with my digital twin via text, audio or video in 28 languages! Go to my PWA at app.thedigitalspeaker.com and sign up to take our connection to the next level! 🚀
