Global Research Bytes Episode 5: Mis & Disinformation with Hudson Golino
Emily Mellen 0:08
Welcome to the fifth episode of our podcast series Global Research Bytes. I'm Emily Mellen. And I'm here with Hudson Golino from UVA’s psychology department. Hi Hudson.
Hudson Golino 0:18
Hello, Emily. Thanks for inviting me for the podcast.
Emily Mellen 0:20
You're very welcome. Your current project, which you have called "New Tools to Study and Prevent Online Mass Manipulation," studies the cognitive processes behind susceptibility to mis- and disinformation and proposes and tests strategies to counter them. Can you define mis- and disinformation for us, and then tell us what you have discovered about this susceptibility?
Hudson Golino 0:42
Sure. So, the difference between misinformation and disinformation is the intention to manipulate. Every time we engage with or disseminate false or misleading information, that is classified as misinformation. When you produce false or misleading information with the intent to manipulate people, that's what we call disinformation. Misinformation can be shared, or you can simply believe it, and it can be harmless. For example, maybe you believe in something that isn't true or scientifically valid, and you share it with your peers because you find the information interesting.
Sometimes we believe in information that was actually designed to manipulate or impact our behavior or our political attitudes, with the goal of making a group, a company, or a country gain something. The gain can be simply political instability. For example, if you say that an election was stolen when it wasn't… I'll give you a very clear example: in Brazil right now, Lula was elected, and some supporters of Bolsonaro, the former president, say that the election was fraudulent, although there is no evidence of that. So they started fabricating this kind of information and disseminating it for political gain.
Russia, for example, is one of the most advanced countries in terms of producing misleading and false information for political or geopolitical gain. They started doing this during the Russian Revolution, when they discovered that by controlling the flow of information in society, they could more easily manipulate people.
Emily Mellen 2:51
How do you plan to counter this mis- and disinformation?
Hudson Golino 2:54
So, there are several ways to counter it. This is a very new scientific field, and several approaches are being tested in different parts of the world. The more traditional one is called prebunking. Basically, you give people enough information and knowledge so they can be critical when they're facing false or misleading information. For example, if you want to help people become more resistant to disinformation related to vaccines, you can start by teaching people how vaccines work. The next time they encounter some kind of dis- or misinformation about vaccines, they can deal with that information in a very specific way, decreasing their chances of believing it. Prebunking is very useful. It works better than trying to correct misleading or false information after people have already engaged with it.
Fact checking, for example, is a way to counter false information after the fact. After you encounter the misleading information, people try to help others by showing them the veracity, or the level of veracity, of a specific statement. The problem is that correction often doesn't help much once you've engaged with misleading or false information. Because you're likely to see that information again and again in an information ecosystem, repetition creates a psychological tendency to believe the information might be true, irrespective of its content, just because you have encountered it more often. This is called the illusory truth effect: the more you see a piece of information repeated over time, the more it seems to be true, simply because you can retrieve it from memory more easily.
So, countering misinformation after people have engaged with it is not very effective, and it's costly. The way we decided to deal with that, Dr. Mariana Teles and I, is to develop large-scale online interventions using video animations. Basically, we're helping people realize that our rationality is limited and that we don't operate rationally all the time. We do that by showing, for example, visual illusions. Even when you know a visual illusion is there, it's very difficult not to see it. So, that's one way to do it.
Another way is to show the different types of cognitive biases, or cognitive heuristics, that we all have. One example is anchoring. If I ask you to guess how tall Mount Everest is, and before you think of your answer I anchor you with a number, say ten thousand feet, your answer will cluster around that number. You probably won't guess one hundred thousand feet; you'll guess something close to ten thousand feet. So, we show people that anchoring exists and that we are all susceptible to this kind of cognitive bias or heuristic. And then we start to promote what we call cognitive humility.
It's basically saying that we are all rational, but we don't operate rationally all the time. We are very limited in how we can process information. There are many cognitive biases and many mistakes we make when we try to think rationally. So, we try to make people aware of these limitations and help them develop a deeper critical-thinking style, and we have some evidence that this can work, at least in the short term.
Emily Mellen 7:17
How did you become interested in this topic?
Hudson Golino 7:20
So, I am a methodologist, I'm a quantitative psychologist. What that means is that I develop statistical methods and software to analyze large volumes of human behavior data. Mariana is a cognitive psychologist.
And we became very interested in the field of misinformation during the first wave of COVID-19. We saw many people we know who are highly educated starting to believe wild misleading or false information about the virus. As psychologists, we started to think, "Okay, why are these people believing information that's clearly false?" We discovered this very interesting field of the psychology of misinformation, and then we started to think, "Is there a way for us to improve the field? Can we help fight disinformation in some way?" Mariana's field is cognitive psychology, so she develops cognitive interventions, especially for older adults. We started thinking of ways to develop large-scale interventions to help decrease people's vulnerability to disinformation. And I also started to develop new techniques and methods to analyze this kind of data, and to reverse engineer Russian propaganda and online manipulation.
So, it's a very fun area of study from a scientific perspective, but the real-world effects and consequences of all these disinformation campaigns are very severe, ranging from public health issues to actual war crimes.
Emily Mellen 9:10
And following that, what are the next steps for this project?
Hudson Golino 9:13
So, we have developed some pilot studies, and we want to test our online intervention in a larger sample in the United States. We also want to collect data in other countries over the next few years, especially in Eastern Europe, South America, and maybe Asia, so we can see whether the online intervention works across different cultures. We also want to see how the psychological predictors of vulnerability to misinformation and disinformation differ across cultures. We're hoping to find psychological characteristics that are universal.
And we also want to expand the work specifically to Estonia, because Estonia is a very interesting country: there is a very large Russian-speaking population that is strongly aligned with the Kremlin regime. We can't collect data inside Russia because it wouldn't be valid. You can't go to Russia and ask people, "Okay, do you believe this specific piece of misinformation about the war?" because even if they don't believe it, saying so could mean ten years in jail. So, we can't go there and collect data, but we can go to Estonia and ask Estonians who speak Russian and are more politically aligned with the Kremlin regime about all sorts of disinformation pieces produced by the Kremlin. This can help us understand what characteristics make people of Russian heritage who are aligned with the Kremlin regime believe information that was designed to manipulate them.
Emily Mellen 11:10
I look forward to seeing the next steps as this comes to fruition. And thank you for talking with me today.
Hudson Golino 11:16
Thank you very much.