Your Chatbot Has a Carbon Footprint

The next time you ask AI to draft your to-do list, remember: it may have consumed more energy than running your dishwasher.

We’ve been so busy racing toward artificial intelligence that we forgot to check the meter. MIT Technology Review’s investigation into AI’s energy use reveals an inconvenient truth: every query, image, and video generated by AI consumes real electricity, and lots of it.

One billion ChatGPT queries a day adds up fast. Newer models, like GPT-4 and Sora, demand more energy than ever, while companies like OpenAI, Microsoft, and Meta build out data centers and even fund nuclear power to keep up.

Yet details remain tightly guarded, with closed AI models offering little transparency.

  • Inference now accounts for 80–90% of AI’s compute power
  • Generating a 5-second video uses about 3.4 million joules (see the rough conversion below)
  • By 2028, AI may consume as much electricity as 22% of US households
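To put that video figure in context, here is a quick back-of-envelope sketch. The 3.4 million joules comes from the article; the dishwasher range is an assumption on my part, not a figure from MIT Technology Review.

```python
# Back-of-envelope: how does 3.4 million joules compare to a dishwasher cycle?
# Assumption (not from the article): one dishwasher cycle uses roughly 1.0-1.5 kWh.

VIDEO_ENERGY_JOULES = 3.4e6   # MIT Technology Review figure for a 5-second AI video
JOULES_PER_KWH = 3.6e6        # 1 kWh = 3.6 million joules

video_kwh = VIDEO_ENERGY_JOULES / JOULES_PER_KWH
dishwasher_kwh = (1.0, 1.5)   # assumed range for a single dishwasher cycle

print(f"5-second AI video: {video_kwh:.2f} kWh")
print(f"One dishwasher cycle (assumed): {dishwasher_kwh[0]}-{dishwasher_kwh[1]} kWh")
# ~0.94 kWh: roughly one dishwasher cycle's worth of electricity for five seconds of video
```

In other words, a single short AI-generated clip lands in the same energy ballpark as running your dishwasher once.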

We can’t lead through complexity with blinders on. Transparency isn’t a threat to innovation—it’s what earns public trust. How should we hold AI companies accountable when they won’t even tell us what they’re burning?

Read the full article on MIT Technology Review.

----

💡 If you enjoyed this content, be sure to download my new app for a unique experience beyond your traditional newsletter.

This is one of many short posts I share daily on my app, where you can get real-time insights, recommendations, and conversations with my digital twin via text, audio, or video in 28 languages! Go to my PWA at app.thedigitalspeaker.com and sign up to take our connection to the next level! 🚀

If you are interested in hiring me as your futurist and innovation speaker, feel free to complete the form below.