At this year’s Doxumentale, we had the pleasure of speaking with filmmaker Friedrich Moser, whose documentary 'How to Build a Truth Engine' was part of the festival’s selection.
We talked about why facts alone can’t fight disinformation, how our brains are wired for shortcuts — and why media literacy, not just technology, is key to defending democracy.
Friedrich Moser is an Austrian documentary filmmaker. In How to Build a Truth Engine, shown at this year’s Doxumentale, he explores why facts alone don’t change minds — and how disinformation thrives in the space between how our brains work and how the digital world is designed. Moser’s film opens a deeply relevant conversation about truth, manipulation, and democratic resilience.
Friedrich Moser: My film originally started out as a project about automating fact-checking, and then investigative journalists came on board. Some of the software developers who were building tools to automate fact-checking told me:
‘Hey, you have to talk to the neuroscientists that we are working with on other projects, because all the facts in the world will not convince somebody who is into conspiracy theories to leave those conspiracy theories.’ It really is about how the brain works.
Then I had my first interview, actually a Zoom call, with one of the neuroscientists in my film, Zahra Aghajan from UCLA.
And it totally changed my perception of the problem and also the course of the film.
The reason we are so susceptible to disinformation, fake news and conspiracy theories has to do with how our brain is wired, with how we digest information. Through our senses we take in information about the world around us and build a model of that world in our brain. The brain then checks this model back against reality to see whether it is true or false.
If you narrow this flow of information, and with it the ability to check back against reality, down to very few channels, maybe only online channels, the model of the world you end up with in your brain is heavily distorted. And that is what makes us susceptible.
The other factor has to do with something that has been an evolutionary advantage of our brain. We are very fast at detecting patterns, completing them, and turning them into predictions; for neuroscientists, the brain is a pattern detection, completion and prediction machine.
Because the brain takes shortcuts to reach predictions faster, we fall prey to false interpretations of the patterns we see. And this is where disinformation and conspiracy theories come into play.
One of the problems is: how can you prevent people from being exposed to harmful content and disinformation without censoring our information ecosystem?
There are several approaches. One has to do with creating awareness of why we are so vulnerable to disinformation, and with the right way to tackle that. There are actually two ways of tackling it.
The first is being aware that we are vulnerable at all. If you know that, you behave differently.
I always give this example: if you are walking or driving through a city in the evening and you know it is a safe place, you move through it very differently than if you know it is unsafe. We need to get trained and used to the fact that the internet is an unsafe place. There are many people out there who do not have our best interests at heart. That is one thing.
The second has to do with knowing the mechanics of the media, and of social media in particular. One thing the technologists developing software against disinformation told us was: please don't count on the software alone; work on media literacy and social media literacy as well.
So, together with a group of other people, I founded a nonprofit called Open Circle Lab. We develop workshops and materials for social media literacy, which means we have the participants of these workshops undergo an experience, a physical experience, of how algorithms work.
We are translating something that is behind the scenes — but that everybody is exposed to — into a physical experience.
And I think this will lead people to understand social media and to look at it differently. At least, I hope that is the case.
Journalism is in a real existential crisis today, because basically all the advertising money has gone to Google and Facebook.
And it's an economic problem that obviously translates into a problem of how well journalism can do its job.
Journalism, if it's executed well, is a very important part of a democratic society because it operates as a spam filter in our information ecosystem.
That is the actual role of the press, and of traditional media more broadly: media that has to obey rules and has a code of ethics. Social media has none of that. It is not bound to any rules, not bound to be truthful, not bound to be fact-based; it is all about creating emotions. The reason this is happening is that the only way social media can finance itself is through advertising.
So through this very emotional content, the platforms try to keep us on our screens so that they can push advertising in front of us.
We also have to look into ways of preserving social media, which is actually a great experience of the online world in a globalized society, where your friends no longer live next door but are spread all over the place. Social media per se is a very good way to keep up with friends you have all over the world.
So I'm not against social media, but I think there are rules that social media needs to obey. And I think it needs to be treated like traditional media.
That's one thing. The second thing is the financing. We have to think about something like public value broadcasting, public broadcasting translated into the digital world. I know there are studies ongoing, but I think we need to move fast to create systems like that, systems that do not depend on advertising.
If you look across society, something I have seen is that falling for disinformation, conspiracy theories and fake news has nothing to do with your age. It has nothing to do with your gender. It has nothing to do with your education. As soon as you limit the influx of information, the variety of information, and make it very, very narrow, you fall prey to echo chambers and you fall prey to disinformation. So what we need to do is keep as many sources of information open as we can. And this, of course, is also an economic and social question.
The second thing is the awareness that we are vulnerable.
The third thing is: we need to use technology that helps us deal with the amount of information and with the speed of information. The rhythm at which social media operates is nanoseconds, while our brain operates in seconds, or a third of a second. That is a totally different scale: the algorithms driving social media are on the order of a billion times faster than the speed at which our brain operates.
So that’s something we need to take into consideration.
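As a rough sanity check of that factor, take one nanosecond per algorithmic operation and about one second per human perceptual judgment, the orders of magnitude Moser cites:

$$\frac{1\ \mathrm{s}}{1\ \mathrm{ns}} = \frac{10^{0}\ \mathrm{s}}{10^{-9}\ \mathrm{s}} = 10^{9} \approx \text{one billion}$$

Even with the brain at a third of a second, the gap stays within the same order of magnitude.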
And then I think: it's a technology, like many other technologies that have entered the media system since the Middle Ages, and we will learn to deal with it. So I'm positive. But it needs proper training, and there needs to be this awareness.
My film is not a typical documentary; it is more of an essay film.
It asks a lot of questions and opens many doors in many different directions.
What I wanted my film to be is a conversation starter. And I think that’s something that I have achieved.
And if you watch my film, hopefully this fall, also on a larger scale when we roll it out internationally, it's really about having a healthy debate about where we are and where we want to go.