
Reasoning is not designed to pursue the truth. Reasoning is designed to win arguments.


doughishere

But if you take the point of view of the argumentative theory, having a confirmation bias makes complete sense. When you're trying to convince someone, you don't want to find arguments for the other side, you want to find arguments for your side. And that's what the confirmation bias helps you do.

The idea here is that the confirmation bias is not a flaw of reasoning, it's actually a feature. It is something that is built into reasoning; not because reasoning is flawed or because people are stupid, but because actually people are very good at reasoning — but they're very good at reasoning for arguing. Not only does the argumentative theory explain the bias, it can also give us ideas about how to escape the bad consequences of the confirmation bias.

People mostly have a problem with the confirmation bias when they reason on their own, when no one is there to argue against their point of view. What has been observed is that often, when people reason on their own, they're unable to arrive at a good solution, at a good belief, or to make a good decision, because they will only confirm their initial intuition.

On the other hand, when people are able to discuss their ideas with other people who disagree with them, the confirmation biases of the different participants will balance each other out, and the group will be able to focus on the best solution. Thus, reasoning works much better in groups. When people reason on their own, it's very likely that they are going to go down a wrong path. But when they're actually able to reason together, they are much more likely to reach a correct solution.


What is really striking is the failure of attempts to get participants to reason in order to correct their ineffective approach. It has been shown that, even when instructed to try to falsify the hypotheses they generate, fewer than one participant in ten is able to do so.... Thus, falsification is accessible provided that the situation encourages participants to argue against a hypothesis that is not their own.

 


But if you take the point of view of the argumentative theory, having a confirmation bias makes complete sense. When you're trying to convince someone, you don't want to find arguments for the other side, you want to find arguments for your side. And that's what the confirmation bias helps you do.

<snip>

 

People mostly have a problem with the confirmation bias when they reason on their own, when no one is there to argue against their point of view. What has been observed is that often, when people reason on their own, they're unable to arrive at a good solution, at a good belief, or to make a good decision, because they will only confirm their initial intuition.

On the other hand, when people are able to discuss their ideas with other people who disagree with them, the confirmation biases of the different participants will balance each other out, and the group will be able to focus on the best solution. Thus, reasoning works much better in groups. When people reason on their own, it's very likely that they are going to go down a wrong path. But when they're actually able to reason together, they are much more likely to reach a correct solution.

 

The claim that reasoning works much better in groups sounds nice, but doesn't the earlier point — that you want to find arguments for your side — contradict it?

 

I.e. if you have a group "discussion" (read: argument), the fundamental premise of argumentative theory starts playing a role: your goal is to win the argument, not to reach the correct solution.

 

The group as a whole may or may not reach the correct solution. But it seems very likely that the participants in the argument will dig in and try to persuade others of their POV rather than change it because of the discussion.

 

(There are numerous examples of that in this forum too ;)).

 

I wonder if responses to this post will be a meta example of the point above.



What I questioned about the article is that people defer their own judgment to an expert.

 

I didn't see the article addressing this. We don't argue enough with experts to fit the theory.

 

A group needs to make a decision and move along, not spend its resources quibbling forever -- that's how I'd explain the deference to expert testimony.

 

I still found it interesting -- their theory fits an argument between relative equals. That is likely what they meant.

 

I think a lot of harm is done when an expert is consulted and his strong opinion is followed unquestioned -- that leaves us victim to his confirmation bias. We usually don't get enough experts together to duke it out for this theory to work -- maybe because it's too expensive in time and resources. And of course there is danger in not consulting an expert -- usually it's best to consult one.

 

People make their way through life with different abilities and develop different specializations -- we can't all be masters of everything, so deferring to the area expert is perhaps best.
