Synthetic Minds | The Tokenization of Human Perception
The Synthetic Minds newsletter is evolving. Short daily insights to get you thinking. If you enjoy it, please forward. If you need more insights, subscribe to Futurwise and get 25% off for the first three months!
When Video Becomes Evidence, Reality Gets Rewritten
The real danger in immersive tech isn’t deepfakes. It’s reconstructions that feel more reliable than reality.
Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have pushed us straight into that territory with an AI system that reconstructs what a person likely saw from third-person video footage alone.
KAIST’s EgoX turns third-person footage into a plausible first-person viewpoint by inferring posture, attention, and spatial geometry: no wearables, no LiDAR, no multi-camera rigs.
That’s not a media trick. It’s the pivot from recording video to reconstructing contextual space. Every old clip becomes potential training data for spatial twins, and cheap “synthetic experience” for training humanoids that learn best from first-person perspective.
This is how the metaverse actually scales: not as a separate world we build, but as a layer stitched onto the world we already filmed. The new value chain shifts from capture hardware to reconstruction software, and the hard currency becomes provenance.
Now the catch: hallucinations. A smooth first-person view can overwrite uncertainty so cleanly it becomes courtroom-grade persuasion.
One rule has to hold: reconstructed perspective may inform decisions, but it cannot settle them. EgoX-style outputs must be treated like probabilistic simulations, not replayable truth.
That means hard provenance (source video, model version, parameters), measurable confidence bounds, and independent corroboration before anyone is allowed to act on what “the subject experienced.” Otherwise we’ll normalize persuasive fiction as admissible reality: the cleanest story wins, not the most accurate one.
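As a thought experiment, that gate can be sketched in code: a provenance record plus a check that refuses to let a reconstruction inform a decision unless the conditions above all hold. Everything here is illustrative; the names (`ReconstructionRecord`, `may_inform_decision`), the fields, and the 0.9 confidence threshold are assumptions, not part of any real EgoX interface.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ReconstructionRecord:
    """Provenance for one reconstructed-perspective clip (hypothetical schema)."""
    source_video_sha256: str   # hash of the original third-person footage
    model_version: str         # exact model build that produced the view
    parameters: dict = field(default_factory=dict)  # inference settings used
    confidence: float = 0.0    # model-reported confidence in [0.0, 1.0]
    corroborated: bool = False # independently confirmed by other evidence?

def may_inform_decision(rec: ReconstructionRecord,
                        min_confidence: float = 0.9) -> bool:
    """The rule from the text: a reconstruction may *inform* a decision only
    when provenance is complete, confidence clears a bound, and independent
    corroboration exists. It never settles the decision on its own."""
    has_provenance = bool(rec.source_video_sha256) and bool(rec.model_version)
    return has_provenance and rec.confidence >= min_confidence and rec.corroborated
```

Under this sketch, a smooth, high-confidence reconstruction with no independent corroboration is still rejected, which is exactly the failure mode the courtroom scenario worries about.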

'Synthetic Minds' continues to reflect the synthetic forces reshaping our world. Quick, curated insights to feed your quest for a better understanding of our evolving synthetic future, powered by Futurwise:
1. The increasing use of internet-connected robots and smart home devices has raised significant security concerns. A recent incident involving a DJI robot vacuum highlights the potential risks associated with these devices. (Popular Science)
2. Citrini Research published a possible future scenario in which AI's growing capabilities drive significant job displacement, particularly in white-collar sectors. But what does that mean for our economy and society? (Citrini Research)
3. As AI-generated content becomes increasingly prevalent, we must consider the potential dangers and moral implications of these tools, such as ads where kids promote cigarettes. (LessWrong)
4. The global supply chain is a complex web of activities that stretch from raw materials to manufacturing, transportation, warehousing, and product returns. Greening this chain means reducing emissions, waste, and resources at every step. (Happy Eco News)
5. DeepSeek's V4 model release is reportedly imminent, and it could have major implications for US tech companies and the firms backing them. (Wired)
If you are interested in more insights, grab my latest award-winning book, Now What? How to Ride the Tsunami of Change, and learn how to embrace a mindset that can deal with exponential change.
If this newsletter was forwarded to you, you can sign up here.
Thank you.
Mark
