Have you ever wondered why debates become more and more polarized? Why is it that when two people argue it is almost impossible for them to reach an agreement? How is it possible that, even when presenting solid evidence to the contrary, people so aggressively defend their opinions?
No matter how rational we consider ourselves, human beings seem to have a natural tendency to seek out, interpret, favor and remember information that supports our prior beliefs and values, regardless of the facts that contradict them.
This natural tendency has a name: myside bias. Below we delve into this widespread and potentially harmful psychological phenomenon, and into the research that has shed some light on how it works.
What is myside bias?
Often, when we discuss a topic with someone, we lay out what we think and what the “facts” are, citing all the evidence we have gathered from all kinds of “reliable” sources. We know the other person holds the opposite opinion, and we trust that, once presented with this evidence, they will change their mind; but that simply does not happen. They are not deaf, nor have they ignored us. What has happened is that, because what we said contradicts what they believe, they have dismissed our “facts”, concluding that we are the ones who are uninformed.
Myside bias is a psychological phenomenon that gives us a tendency to seek, interpret, favor and remember information that supports or confirms our prior beliefs and values, while ignoring or downplaying evidence that contradicts them. In essence, this bias is an inherent flaw in the way our brain processes information, one that leads us to make biased decisions and adopt mistaken points of view and opinions.
Although all human beings fall victim to this bias, it is considered potentially dangerous because it makes us practically blind to any information that runs contrary to what we think: no matter how true it may be, we will dismiss it as false or unrigorous. In fact, some theorists of this pattern of thought, such as Keith E. Stanovich, consider it largely responsible for the idea of post-truth: we only see what we want to see.
Implications of this cognitive bias
Over the past few decades, Stanovich, along with other cognitive researchers such as Richard F. West and Maggie E. Toplak, has studied this bias experimentally. One of its main implications is that human beings tend to look for information that strengthens our opinions, omitting or discarding any data that contradicts them, no matter how true and demonstrable it may be. We search for information that supports our hypotheses instead of seeking out all the evidence, both the evidence that confirms them and the evidence that refutes them.
In fact, this is easy to see in how people behave on practically any topic they want to read up on. For example, a person who is pro-life, that is, opposed to abortion, will be more likely to look for information that proves her right, and may well end up even more opposed to abortion. She will rarely seek out information explaining why abortion should be a universal right, or whether a fetus of a few weeks can feel; and if she does, she will read that content from a very skeptical and superficial perspective.
Curiously, the habit of seeking out information on both sides of a debate, that is, looking for data both favorable and unfavorable to the opinion one formed at the outset, seems to be related to personality traits rather than intelligence. In fact, some research suggests that the most self-confident people tend to look for data that supports and data that undermines both sides of a debate, while the most insecure people look only for what reinforces their own beliefs.
Another clear implication of this bias is that the same information is interpreted differently depending on our underlying beliefs. If exactly the same information about a topic is given to two individuals, they will most likely end up with different, totally or partially opposed, points of view: although the message is identical, their interpretations of it will not be, and each person's reading of it will be personally skewed.
The death penalty experiment
We have a good example of this in an experiment carried out at Stanford University, in which researchers recruited participants who already held strongly divided opinions on the same topic: being for or against the death penalty. Each participant was given brief descriptions of two studies: one comparing US states with and without capital punishment, and another comparing the murder rate in a single state before and after it introduced the death penalty.
After this description, they were given more detailed information about both studies and asked to rate how reliable they believed the research methods of each investigation were. Both groups, those in favor of the death penalty and those against it, reported that their attitudes had shifted somewhat when given the brief descriptions at the start of the study, but when given more details, most returned to their previous beliefs, despite having evidence supporting both studies. They were simply more critical of the sources that contradicted their opinion.
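The pattern the experiment found, two readers pulling apart on identical evidence, can be sketched as a toy simulation. Everything here (the update rule, the `discount` weight, the numbers) is an illustrative assumption of mine, not the actual study's model:

```python
# Toy model of biased assimilation: two readers with opposite priors see
# the SAME balanced evidence, but each discounts studies that contradict
# their prior. All constants are illustrative assumptions.
def update(belief, evidence, discount=0.3):
    """Nudge `belief` toward each piece of evidence, weighting
    contradicting evidence (opposite sign) by `discount`."""
    for e in evidence:
        weight = 1.0 if e * belief >= 0 else discount
        belief += 0.1 * weight * (e - belief)
    return belief

# Perfectly balanced evidence: +1 supports a deterrent effect of the
# death penalty, -1 undermines it.
evidence = [+1, -1] * 10

pro = update(+0.5, evidence)   # reader who starts mildly in favor
con = update(-0.5, evidence)   # reader who starts mildly against

print(f"pro reader ends at {pro:+.2f}, con reader at {con:+.2f}")
```

Despite identical, balanced input, the two beliefs never converge: each reader's discounting of unwelcome evidence keeps them on their original side.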
German cars and American cars
Another study showed that intelligence does not protect us from myside bias. In this case, the participants' intelligence was measured before giving them information about an issue on which they had to express an opinion. The issue concerned cars that could pose safety problems. The participants, all of them American, were asked whether they would allow German cars with safety problems to be driven on US streets. They were also asked the reverse question: whether they thought American cars with the same defects should be allowed on German roads.
Participants who were told about German cars with safety problems said they should be banned in the US because they posed a danger to US road safety. By contrast, those told about their American counterparts said those cars should be allowed to drive in Germany. In other words, they were more critical of the safety of German cars because they were German and driven in their own country, and more lenient with American cars because they were American and driven abroad. Intelligence did not reduce the likelihood of myside bias.
Memory and myside bias
Even when people try to interpret a piece of information as neutrally as possible, our memory, itself biased by our own beliefs, will favor the recall of whatever supports our point of view; in other words, we have selective memory. Psychologists have theorized that information that fits our existing expectations is more easily stored and remembered than information that clashes with them. That is to say, we memorize and recall better whatever proves us right, and more easily forget whatever goes against us.
How does this relate to social networks?
Given all this, we can appreciate how serious the implications of myside bias are for how we receive and interpret information. This bias leaves us unable to evaluate arguments and evidence effectively and logically, no matter how solid they may be. We can believe more strongly in something doubtful simply because it is on “our side”, and be harshly critical of something that, however well demonstrated, we refuse to see as rigorous and reliable because it is “against us”.
But of all the implications this entails, one is directly related to social networks, and especially to their algorithms. Through cookies and our search history, these platforms keep presenting us with content related to things we have already viewed. For example, if we search for pictures of kittens on Instagram, more photos of these animals will begin to appear in the Explore (magnifying glass) section.
What do these algorithms have to do with myside bias? A lot, since on social networks we do not only look for images of animals or food, but also for opinions and “facts” that confirm our pre-established views. So, if we look up a vegetarianism blog, the search section will start showing many related resources: politically neutral vegetarian recipes, but also blog posts, images and other content that denounces animal cruelty and demonizes meat eaters.
Given that we are hardly going to seek out information contrary to our point of view, it is only a matter of time before our opinions become more radical. As the networks show us content favoring our point of view, we will dig progressively deeper into the topic; taking the vegetarianism example, we may well end up in vegan circles that support more aggressive action against the meat industry.
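This feedback loop between a recommender and a biased user can be sketched in a few lines. It is a toy model under made-up assumptions (the feed offsets, the engagement rule, the 0.2 learning rate are all mine) and bears no resemblance to any real platform's algorithm:

```python
# Toy feedback loop: the feed mirrors the user's past engagement, the
# user engages only with same-side items, and that engagement in turn
# pulls their stance further out. All constants are illustrative.
def simulate(stance, rounds=30):
    for _ in range(rounds):
        # The recommender serves content spread around the current stance.
        feed = [stance + offset for offset in (-0.4, -0.2, 0.0, 0.2, 0.4)]
        # Myside bias: only same-side items get engagement.
        engaged = [item for item in feed if item * stance > 0]
        # Engagement nudges the stance toward the consumed content.
        mean = sum(engaged) / len(engaged)
        stance += 0.2 * (mean - stance)
    return stance

print(f"mild opinion +0.10 drifts to {simulate(+0.10):+.2f}")
print(f"mild opinion -0.10 drifts to {simulate(-0.10):+.2f}")
```

Starting from a barely-held opinion, the loop pushes the stance several times further from neutral, in whichever direction it happened to start, without any contrary content ever being engaged with.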
Based on this, and especially as applied to political ideologies, many people argue that these algorithms are undermining democracy. The reason is that the algorithm does not present us with all the available points of view on a topic; it presents us with what favors our opinion, making us less likely to weigh the alternatives. Kept away from competing “truths” and settled into the comfort of our own point of view by social media, we are, in effect, being manipulated.
For this reason, as an attempt to escape the trap of our own mind, and the way social networks help lock us ever deeper into what we already think, it never hurts to seek out opinions contrary to our own. Yes, it is true that myside bias will make us tend to read them more critically and superficially, but the attempt can at least give us a little freedom of thought and opinion. Or, at the very least, we can delete our search history and deny the social network the chance to trap us in our own beliefs.