30.12.2020

Scientists urge ‘Socratic method of debate’ to expose ignorance

Credit: Adobe Stock

Want to convince a layman that the experts are right? Without judging, ask them to explain the topic in detail. If they realize how little they know, there is a chance they will soften their position and find it easier to agree with the experts, psychologists confirmed in a clever experiment inspired by the Socratic method.

“The problem that we tried to solve is that people do not distinguish between the opinion of experts and the opinion of laymen,” says Professor Michał Białek, a psychologist from the University of Wrocław.
 
“Celebrities express their opinions on topics such as the coronavirus or international politics, and some people trust these opinions. This shows that for some it does not matter where an opinion comes from,” he says.
 
Consequently, the psychologist, together with colleagues from an international team, wanted to demonstrate a method that would make people aware that some opinions are supported by facts, reliable knowledge and years of analysis, while others rest only on beliefs, intuitions and a superficial view of the topic.
 
Białek says: “The idea behind our research is simple: we went back to the Socratic method. Socrates used dialogue to make people aware that they knew nothing. When people realize the limits of their knowledge, they open up to new knowledge and are ready to revise their views. They begin to appreciate experts, sensing the knowledge gap between the experts and themselves.”
 
The researchers asked several thousand people from the US, recruited via the Mechanical Turk platform, for their opinions on economic issues and their confidence in those opinions. The questions concerned, for example, whether it makes sense to restore the gold standard (the convertibility of the dollar into gold) as a fixed exchange rate, or whether trade with China positively affects the lives of American citizens. The respondents were then asked to explain the phenomenon mentioned in the question. Next, they were shown the position of ordinary people on these issues as well as, separately, the consensus worked out by world experts, and were asked to express their opinion on the same subject again.

It turned out that after being asked for an explanation, the respondents' confidence in their own opinion dropped. And when the position of the experts was shown, the respondents were more inclined to agree with it than people from the control group.
 
There were also people in the control group who changed their opinion during the experiment, but their change of mind was as often related to the responses of the majority of laymen as to the responses of experts.
 
“People surprisingly often have an opinion on a topic but no knowledge of it,” says Professor Michał Białek, adding that people are often unaware of this lack of knowledge (this is known as the illusion of explanatory depth).
 
"If we ask someone if they know how the pen works, they will usually say yes. But if we ask them to explain, there is a good chance that they will get lost in the details,” says Białek. And if someone understands that they hardly have expert knowledge in some field, it is easier for them to rely on the opinion of real experts.
 
“It's just a hypothesis, but I think our experiment worked because we didn't give the subjects any feedback, we didn't comment on their answers. We didn't point out their ignorance,” Białek continues. “We didn't say which answer was correct. If we were to tell the subjects that they answered incorrectly, they could start to defend themselves and try to convince themselves that their first opinion was right. 
 
“Meanwhile, they realized by themselves that they knew little. Thanks to this, changing their minds was not accompanied by a loss of self-esteem; they did not feel defeated or ridiculed.
 
“Sometimes in a discussion we try to bombard the other party with arguments and force them to analyse and accept those arguments. And when they talk, we only look for an opening to undermine their arguments. We propose a much better strategy: we let the other person talk, we ask them to explain their arguments, we ask questions, and at the same time we do not criticize them in any way. While trying to explain the topic, the other party may begin to notice that they understand it only superficially. They may start to have doubts, and approach expert knowledge with more humility.
 
“The Socratic method of discussion has long been known. Yet people do the opposite in discussions. They do not ask, they speak. They do not listen, they argue. They do not let opponents draw their own conclusions, but instead bombard them with their own views, thus blocking the way to a change of opinion. This is not an effective method.”
 
In their study, the psychologists selected economic topics on which a large number of experts agree. Meanwhile, the scientist notes, on many topics the public does not always know what the actual expert consensus is.
 
He gives the example of the media sometimes creating the impression that an expert consensus does not exist when in fact it does, for instance by inviting one vaccinologist and one anti-vaxxer to a discussion. “The audience's impression after such a discussion is that opinions are evenly distributed. Meanwhile, in order to accurately reflect the proportions, it would be necessary to invite many experts representing one position and one representing the opposite view,” says Białek.
 
He adds that some discussions taking place in the public space are by no means real discussions among experts, saying: “It is rather an exchange of opinions between the vast majority of experts and a small group of people with different views. This concerns, for example, the anthropogenic causes of global warming, or evolution.” 
 
The research, led by Ethan Meyers, appeared in the journal Judgment and Decision Making: http://journal.sjdm.org/20/200615a/jdm200615a.pdf
 
The co-authors of the study include Professor Jonathan A. Fugelsang and Professor Derek J. Koehler. The scientists already have experience in studying how to detect statements and speakers who think they are smarter than they really are. 
 
In 2016, they were awarded the Ig Nobel Peace Prize for research on pseudo-profound bullshit in statements related to the New Age movement (the Ig Nobel Prize is awarded for research on unusual topics that makes people think). (PAP)
 
Author: Ludwika Tomala
 
lt/ ekr/ kap/
 
tr. RL

