People Leave As They Entered-- How do you change hearts and minds?

Costanza

Rising Star
Registered
The truth. James Baldwin spent years trying to take a calm, intellectual point of view, and the funny part about this society is that we don't teach for intellectual dexterity. People leave debates with the same thoughts they entered with, regardless of whether they are schooled or not.

This comment struck a chord with me... I think it's worth isolating and asking "What can be done about it?"

Because it doesn't happen in every argument.

I suppose it all boils down to humility? Openness, as opposed to entrenched defensiveness?

For Rob Portman, it took his son telling him he was gay for him to reverse his stance on gay marriage. Can anyone recall a debate where it happened with words? Somebody being given new information and/or corrected-- someone being schooled-- and admitting "Maybe you're right," "I was wrong," "I need to rethink my position," "I never thought of it that way," etc? Something?

It can and does happen... Doesn't it?
 

Art Vandelay

Importer/exporter
Registered
How to debunk false beliefs without having it backfire
Susannah Locke
Vox.com
December 22, 2014


There's nothing worse than arguing with someone who simply refuses to listen to reason. You can throw all the facts at them you want, and they'll simply dig in their heels deeper.

Over the past decade, psychologists have been studying why so many people do this. As it turns out, our brains have glitches that can make it difficult to remember that wrong facts are wrong. And trying to debunk misinformation can often backfire and entrench that misinformation even more deeply. The problem is even worse for emotionally charged political topics — like vaccines and global warming.


So how can you actually change someone's mind? I spoke to Stephan Lewandowsky, a psychologist at the University of Bristol and co-author of The Debunking Handbook, to find out:


Susannah Locke: There’s evidence that when people stick with wrong facts, it isn't just stubbornness — but actually some sort of brain glitch. Why is it so difficult to change people’s minds?

Stephan Lewandowsky: It’s not an easy task to update people’s memories. That’s a very clear result that even happens with completely innocuous items. It's a fundamental problem for our cognitive apparatus to update what’s in our head.

What people have suggested — and what I think is going on — is that what people remember is the information, and then they attach a tag, "Oh no it’s not." And the problem is that often this tag can be forgotten. So you remember the misinformation, but not the fact that it’s false.

Now, one of the ways to get around that is to tell people not just that something is false, but tell them what’s true. Alternative information makes it much easier to update your memory.

There's a classic study where people are told there's a fire in a warehouse, and that oil paints or flammable materials were found in the wiring cabinet. Then, later on, they're told that, by the way, the wiring cabinet was empty. Now, if that's all you do, people will still think there was oil paint in the wiring cabinet. Simply saying something isn't true doesn't do the trick.

But instead, if you say the wiring cabinet was empty, and we found some petrol-soaked rags [elsewhere] at the scene, then people forget about the wiring cabinet because they have an alternative explanation for the fire. You need an alternative to let people let go of the initial information.

Locke: What’s the biggest thing people do wrong when trying to change other people’s minds?

Lewandowsky: The moment you get into situations that are emotionally charged, that are political, that are things that affect people’s fundamental beliefs — then you've got a serious problem. Because what might happen is that they’re going to dig in their heels and become more convinced of the information that is actually false. There are so-called backfire effects that can occur, and then the initial belief becomes more entrenched.

Locke: How can people prevent these backfire effects on political issues?

Lewandowsky: It’s very difficult. A lot of this stuff is about cultural identity and people's worldviews. And you've got to take that into account and gently nudge people out of their beliefs. But it’s a difficult process.

One [solution] is to give people an opportunity to self-affirm their beliefs ahead of time. Let's talk about weapons of mass destruction in Iraq. They didn't exist, right? After Iraq was invaded, they didn't show up. And yet I think to this date about 30 percent of the public believes in the existence of weapons of mass destruction, and that belief falls sharply along partisan lines. If you get Republicans into the laboratory and you say, hey, there weren't any weapons of mass destruction, that may strengthen their incorrect belief. We've done exactly that study.

There's some evidence that you can avoid that if you ask people to tell you [about] an occasion when they felt really good about their fundamental beliefs in free enterprise (or whatever is important to the person in question). Then they become more receptive to a corrective message. And the reason is that the correction is less threatening in that context. Basically, I make myself feel good about the way I view the world, and then I can handle the correction because it's not threatening my basic worldview.

The other is that you can have a messenger who is consonant with your beliefs: you get a liberal to talk to liberals and a conservative to talk to conservatives.

Locke: Have psychologists completely thrown out the information-deficit model — the idea that you can change people's understanding by giving them the correct information?

Lewandowsky: It’s a nuanced issue. A couple of years ago, people basically said the information-deficit model is dead — it’s all basically about culture. Now I think that’s an oversimplification. It’s a combination of two factors. Culture is extremely important. But it’s also true that in some circumstances providing people with information is beneficial. That is, more information does enable people to sort out what's going on.

Now, the trick appears to be that you've got to give people the opportunity to deal with the information in great depth. If you have a situation like a classroom, where people are forced to sit down and pay attention, that's when more information is helpful. There's a lot of evidence for this in educational psychology.

Now, the problem is that in a casual situation — people listening to the radio or having a superficial conversation — that's where the information-deficit model doesn't apply. Just superficially throwing information at people will probably make them tune out. So you've got to be careful when you're talking about public discourse, TV, radio, media.

Locke: Let’s say I’m going home for the holidays and have an uncle who doesn’t believe in climate change. How can I change his mind?

Lewandowsky: It's difficult. There are a couple of things I can suggest. The first is to have people affirm their beliefs. Affirm that they're not idiots, that they're not dumb, that they're not crazy — so that they don't feel attacked. And then try to present the information in a way that conflicts less with [their] worldview.

One of the problems I've been working on is people's attitudes toward climate change. For a lot of people, the moment they hear the words "climate change," they just shut down. But there are ways to get around that. For example, it's been shown that if you emphasize the health consequences of climate change, or if you offer market-based solutions to the problem, that doesn't challenge their worldview too much.

If you tell people that there is an overwhelming scientific consensus that 97 out of 100 climate scientists agree on the basic notion of global warming, it seems that is a gateway belief that enables people to recognize the importance of the issue.

More often than not, that is effective with people who are ideologically disposed to reject global warming as a fact. In general, people are very sensitive to what they perceive to be the majority opinion around them.

Locke: If you throw too much information at people, are they more likely to reject your stance?

Lewandowsky: That’s quite nuanced, and it depends on how much time people are willing to invest in processing the information. If people sit down with the intention of listening and trying to undo the problem, then we have no evidence for an overkill backfire effect.

However, there's plenty of evidence that in a casual context — turning on the TV or whatever — you can dilute the message by putting too much information in it. This whole information-overload issue is more critical in a more casual context. And that's always important.

Most of the research on misinformation has mimicked casual situations. People just sit there and read something like a newspaper article, and that’s when you get backfire effects and people are very susceptible to misinformation.

Locke: What about the "familiarity effect," in which just mentioning the wrong information could make it stick even harder?

Lewandowsky: As recently as two or three years ago, I would have assumed that it exists. Now, it’s beginning to look like that’s not terribly robust. We’ve had a hard time trying to reproduce it. It sometimes occurs and sometimes doesn’t. I’m inclined to think it will turn out to be quite infrequent.

Locke: What’s your favorite experiment that shows the difficulty of debunking?

Lewandowsky: The one study I like a lot is the one I did about the Iraq war that was published in 2005. And what we did there was to look at people’s processing of information related to the Iraq war and the weapons of mass destruction. We ran the study in three different countries: in the US, in Germany and in Australia — at the same time.

And what we found is that Americans who knew something was false continued to believe in it, which makes no sense. We said, here’s this piece of information and asked them if they knew it was retracted. And a minute later, we asked them whether they believed the information. And they continued to believe it. The Germans and Australians did not.

Now, at first glance, that makes it sound as though there's something weird about Americans compared to the other two nationalities. But what’s really interesting is that’s not the case at all. What drove this effect was the skepticism [of the reasons why the war was being fought in the first place]. It turns out that when we asked people if they thought the war at the time was fought over weapons of mass destruction, that item could predict whether people would continue to believe things that are false.

When you control for skepticism, all those differences between Americans and Germans and Australians disappear. There was an underlying cognitive variable that explains it. It just so happened that there were far more skeptics in Germany and Australia at the time.

Locke: How has psychology’s understanding of debunking shifted since you first started studying it?

Lewandowsky: Over the past 10 years or so that I've been doing this, the role of cultural worldviews and people's identification with their own culture has been recognized more and more. And equally, we know that skepticism is extremely important: whether people are skeptical about the motives of whoever is telling them something matters a great deal. That's fairly new.

Another thing that's emerged more and more over time is the existence of backfire effects: if you tell people one thing, they'll end up believing the opposite. That finding seems to be pretty strong.

Locke: Have you seen people changing their messages in response to this new research?

Lewandowsky: The Debunking Handbook — that’s been downloaded at least half a million times. So that message is getting out, I think. I’ve seen a lot of reference to that handbook, and I think some people in the media are now aware of how difficult it is to remove information from public discourse.

I’m vaguely optimistic that this research is having an impact. And certainly when it comes to government and large organizations, I think they’re beginning to be fairly savvy in what they say and how they do it, in part because of the research.

Locke: Is there anything else important that people should know?

Lewandowsky: One thing I would point out is that it's very important for people to be skeptical and to anticipate that others will try to mislead the public. Some of the misinformation out there is not accidental. I think quite a bit of it is put into public discourse in order to have a political effect. It's supposed to be wrong, but effective.

What our research shows is that if people are aware of the possibility that they might be misled ahead of time, then they’re much better at recognizing corrections later on.
 

Mrfreddygoodbud

Rising Star
BGOL Investor
Costanza said:
This comment struck a chord with me... I think it's worth isolating and asking "What can be done about it?"

Because it doesn't happen in every argument.

I suppose it all boils down to humility? Openness, as opposed to entrenched defensiveness?

For Rob Portman, it took his son telling him he was gay for him to reverse his stance on gay marriage. Can anyone recall a debate where it happened with words? Somebody being given new information and/or corrected-- someone being schooled-- and admitting "Maybe you're right," "I was wrong," "I need to rethink my position," "I never thought of it that way," etc? Something?

It can and does happen... Doesn't it?

It happens, just not often... folks who respect each other in a relationship can easily do that..

It's just that most folks are in relationships for selfish reasons.. and not to grow as one.

I love when people can prove me wrong, it helps me grow..

It's just rare when it occurs.. lol
 

Mrfreddygoodbud

Rising Star
BGOL Investor
In general, people ain't shit; this is especially true for cacs.

It seems that way, doesn't it? But that's only because the soulless assholes are normally the loudest, most obnoxious attention whores... they do shit like run for political office, or become kkkops to bully people.. etc... those types...

but in reality..

It's really fifty-fifty.. for every obnoxious drip of fuckin scum, there are some really Good Souls out here!!
 

0utsyder

Rising Star
BGOL Investor
Mrfreddygoodbud said:
It seems that way, doesn't it? But that's only because the soulless assholes are normally the loudest, most obnoxious attention whores... they do shit like run for political office, or become kkkops to bully people.. etc... those types...

but in reality..

It's really fifty-fifty.. for every obnoxious drip of fuckin scum, there are some really Good Souls out here!!

And that is what I was gonna say: the assholes make the most noise, BUT the assholes are listening to the likes of Trump, Alex Jones, and Joe Rogan. Instead of being humble or apologizing when wrong, folks just dig in more or explain away their thoughts and baseless claims. CNN and FAUX are never going to get it 100%! Take what they give and combine it with actual people's lived experiences and you'll get a more robust outlook. I get on BGOL and generally crack jokes, so when I AM going back and forth, I am ALL for being proved wrong and humbled when it does happen. Unfortunately, in most online back-and-forths, it's always about "owning" the other person.
 

Soul On Ice

Democrat 1st!
Certified Pussy Poster
I used to be one of those angry Black people asking for representation and tangibles.

Now, with guidance from Dray, Cam, Bert, and Tex on BGOL, I've been shown the light and realize that our best bet to get things as the Black people of America is to vote strictly blue and not let evil republicans into office.

While we won't get things directly for Black people (trickle-down is good enuff) through the Dems, the evil republicans want us back in chains!

#awakened
 

Costanza

Rising Star
Registered
^^^ Putting you on ignore for a while. If the only way you can advocate for your position is by misrepresenting someone else's, that says something about you. Your strawman routine, which I've seen at least a dozen times now, isn't funny or smart; it just shows you can't actually argue for what *YOU* believe.
 