Synthetic Minds | Spatial Computing Stopped Being A Headset
The Synthetic Minds newsletter offers short daily insights to get you thinking. If you enjoy it, please forward it. All signals are powered by Futurwise. If you need more insights, subscribe to Futurwise and get 25% off for the first three months!
I have just launched the Intelligence Age Scorecard! It will help you understand how ready your organization is for the Intelligence Age.
Today’s topic: Spatial Intelligence
Spatial Computing Stopped Being A Headset
Google shipped the developer toolkit for AI glasses before a single pair of the glasses themselves, which do not arrive until next year. That is not impatience. That is the platform race already being decided, and the question is which apps land on the surface first.
That is the gestalt shift hiding behind this week's headlines. Spatial computing's center of gravity moved from the headset to the AI layer, and the SDK arrived ahead of the hardware.
Google's Android Show: I/O Edition launched Android XR SDK Developer Preview 3, opening AI-glasses development to third parties from Uber to GetYourGuide. Project Aura from XREAL was confirmed as the first Android XR wired smart-glasses device. The first Samsung, Gentle Monster and Warby Parker glasses arrive next year.
Apple quietly published three AI research papers the same week: one on whether models really understand physical spaces, one on reading sign language, and one on building lifelike 3D heads from camera footage. Its spatial-computing work has moved from the headset itself into the AI brain that runs on it.
Then Bloomberg's Mark Gurman confirmed that no new Vision Pro is coming for at least two more years. The Vision Products Group talent has been reassigned to lightweight smart glasses, Siri, and AI wearables.
Underneath: IDC and Counterpoint data confirmed smart glasses outsold VR and mixed-reality headsets three to one in 2025. AI smart glasses now make up 78% of all smart-glasses shipments, up from 46% twelve months earlier.
That's the spatial-computing story. Here is the signal.
The display became optional. The model became the platform. The headset is now a vertical product for surgery, design review, and training, not a consumer category.
That architecture quietly rewrites the device contract. A headset asked permission: turn it on, opt in, take it off. An AI model living on glasses cannot ask, because the world streaming in is the input it needs.
AI glasses see and hear everything around you. That is not a feature. That is how you use them.
Ray-Ban Meta has already sold seven million pairs. EssilorLuxottica is targeting twenty million on faces by year-end.
Privacy law was written for the headset era: a device used at home, behind a closed door, with explicit consent. None of that survives a pair of glasses worn to the supermarket.
The race is no longer which device wins. It is which AI model is sitting on your customer's face when they walk out the door.
The Intelligence Age Scorecard

The spatial-computing category just moved from the headset to the AI layer in a single week. The SDK shipped before the hardware, smart glasses outsold headsets three to one, and the privacy frameworks built for the visor have not been redrawn for always-on glasses. Are you still watching the headset race, or already adapting to a category where the model is the platform? Use the Intelligence Age Scorecard to benchmark your readiness for the next two quarters, and the next five years.
If this newsletter was forwarded to you, you can sign up here.
Thank you.
Mark