When Distortion Competes With Truth
On synthetic reality, contested evidence, and new systems for trust
While a snowstorm moves across the country this weekend, I am at home like many people, watching the news and trying to understand what is happening in Minneapolis. Videos from a deadly shooting there on Saturday began circulating almost immediately after it happened. Footage on news sites and social media alike claimed to show what had occurred, even as the same images were quickly reinterpreted, reframed, or dismissed as incomplete or misleading. In one account, federal officials said the person approached agents brandishing a firearm; in the footage itself, no weapon is clearly visible, and the individual appears to be holding a phone as agents move in. Official statements diverge from multiple bystander videos.
As if the violence itself were not disturbing enough, a second rupture appears. People are no longer only arguing about the event. They are arguing about whether what they are seeing can function as evidence at all. The dispute shifts from “What happened?” to “Which version of reality are you prepared to accept?”
In this sense, the conflict is no longer only political. It becomes evidentiary. The same images support radically different narratives, and what begins to collapse is not opinion but the basis on which belief itself is formed.
This pattern of contrasting accounts and broken belief is not unique to a single incident; it reflects how real-world events are now experienced through media systems that shape what counts as evidence in the first place.
When Synthetic Reality Was Just Fiction
For centuries, artists and storytellers have created imaginary worlds as a form of escape, critique, or possibility. Synthetic realities existed clearly as fiction. Fantasy was something we chose to enter. It was a space of invention, not a substitute for evidence.
I cannot help but draw parallels to Avatar 3, which has been on my mind since I saw it recently. Stories have always wrestled with power and truth, but what struck me is how directly perspective itself has become the main battleground in worlds we still recognize as fictional. The struggle is over who controls meaning, who is seen as a threat, and whose account of reality is allowed to stand. In a world that is more immersive, more luminous, and more technologically advanced than our own, this conflict is rendered with extraordinary clarity. Power is abused, and groups are cast as heroes or enemies according to who controls the story. It is injustice and manipulation staged inside a more beautiful environment. The difference is that we understand it is fiction. We know these are worlds of make-believe.
By synthetic reality, I mean not just AI-generated images or virtual worlds, but the broader digitally constructed layer through which events are now seen and interpreted: images, video, feeds, and algorithmically shaped representations that stand in for direct experience. For most of its history, that constructed layer was meant to create imagined worlds we could step into and out of.
What feels different now is how close that logic has moved to everyday life. In Minneapolis, people are not debating a fictional universe. They are debating what actually happened. They are trying to decide what counts as evidence at all.
This is where synthetic reality stops being entertainment and starts becoming infrastructure. It is no longer confined to films or virtual worlds. It is woven into the ways people decide what is true enough to act on. A narrative can be reinforced with images and clips that feel real, even when they are incomplete or misleading.
It is tempting to blame technology for this. But that is too simple. We have always used tools to influence one another. Cameras changed how the truth was recorded. Television changed how it was shown. Social platforms then changed how it was amplified. What is different now is that realism itself can be produced and reshaped with unprecedented ease, and the boundary between imagination and evidence has begun to disappear.
The harder question is what this does to trust when perception itself can no longer carry authority.
The Generative AI Paradox
I have been looking for research that explains this shift in how reality is mediated, not just how it is misused. That search led me to the work of Emilio Ferrara, a researcher at USC who studies how AI reshapes information systems.
For years, digital trust has been treated primarily as a problem of content. Is this image real? Is this article factual? Has this video been manipulated? The dominant assumption has been that if fake or altered material could be detected more reliably, the problem would remain manageable.
In The Generative AI Paradox: GenAI and the Erosion of Trust, the Corrosion of Information Verification, and the Demise of Truth, Ferrara challenges that assumption at a deeper level. He is not describing synthetic reality as art or storytelling. He is describing it as a social condition.
For most of history, imagined worlds were clearly marked as imagined. Ferrara’s argument is that something fundamentally different is now taking shape. Synthetic environments are no longer confined to fiction. They are increasingly used to construct identities, conversations, and events that people encounter as part of everyday life. Media is paired with believable personas, reinforced through interaction, and delivered through systems that were built for a world in which authenticity could usually be inferred.
In other words, synthetic reality has moved from the realm of narrative into the realm of proof.
Ferrara is not writing about events like Minneapolis; he is describing the digital environment through which events like Minneapolis are now seen, interpreted, and debated. His argument is not about a collapse of human judgment so much as a change in the conditions under which belief is formed.
When anything can be made to look real and any story can be reframed, images and statements no longer function as self-evident proof. They are encountered through digital systems that organize, contextualize, and move meaning in ways that shape how events are understood. In this sense, Minneapolis is not an example of synthetic reality as fiction, but of a real-world event being interpreted through the kind of synthetic information environment Ferrara describes.
Contested Evidence
Seen this way, what is unfolding in Minneapolis is not just political conflict. It is a contested reality. Neither videos and images nor official statements settle the story on their own. Each side can assemble a version of events that feels complete using fragments of evidence, interpretation, and amplification.
What makes this moment destabilizing is not the existence of lies, which have always been with us, but the disappearance of reliable ways to validate where information comes from, how it has been shaped, and how much confidence it deserves. A video can no longer be assumed to be true. A statement no longer arrives in a shared context. The same fragment can function as proof, propaganda, or fabrication depending on perspective.
Designing for Trust
Trust, in this sense, stops being something we assume and becomes something that must be built into the way information is presented. It depends on whether the structures behind what we see are made legible: where something came from, how it has been changed, what is known, what is uncertain, and how alternative interpretations can be examined rather than hidden. This does not fix contested reality, but it restores something essential. It gives people a way to interrogate what they are seeing rather than simply accept or reject it.
Synthetic reality, thankfully, is not going away. It is one of the most powerful creative tools we have developed, capable of expanding imagination, experience, and expression through technologies like VR and AR, just as literature, film, and photography still do. But when synthetic media becomes part of the infrastructure through which public life is interpreted, it cannot remain only aesthetic. It must also support mutual understanding rather than simply generate believable stories.
Designing for trust is no longer a philosophical concern. It is now a crucial capability of the systems that shape how people understand and act in the world.
By Rori DuBoff
Note: This line of thinking informs the Trust Stack, a system developed at All Things Trust to evaluate whether digital systems create the conditions for trust by making origin, structure, and claims legible rather than opaque. https://www.allthingstrust.com/trust-stack

