Method from the programme „Man sakė“ (EN)

Bias and Me

Objectives:

To understand what bias is and how it works;
To understand the connection between bias and manipulation opportunities;

Materials:

Pre-printed bias cards (20 per group), flip chart page, markers.

Process:

First of all, the peer educator must explain the basics of cognitive psychology theory: the relationship between human thoughts, emotions and behaviour, and how these influence people's decisions.

Divide the participants into smaller groups (up to 5 people per group). Hand out the pre-printed cards and ask the participants to familiarise themselves with their content. All 20 statements have a certain influence on decision-making, understanding and cognition. If the number of participants is small, you can discuss them as a single group. Ask the participants to discuss the following:

  • In what situations/topics do these statements work? Make a list of topics and assign statements.
  • What would help get rid of bias and make independent decisions?
  • Is bias always negative? In what situations could it be positive and help?
  • How can bias help manipulate our decisions, available information, attitudes?
  • Think of situations where bias could be used in a negative or positive way.

Bias effects:

  1. Anchoring bias. People rely too heavily on the first information they receive and discount information received later. E.g. people at auctions tend to perceive the first person making a bid as the worthiest of winning the auction.
  2. Availability heuristic. People tend to overvalue the information that is most readily available to them. E.g. someone could keep arguing that smoking is not bad for you because they knew somebody who smoked and lived to 100.
  3. Bandwagon effect. The probability of adopting an opinion rises with the number of people supporting it: the more people join a certain position, the easier it is to convince others. This is a very powerful form of groupthink and is often the reason for the low productivity of larger gatherings.
  4. Blind-spot effect. The inability or refusal to recognise one’s own biases. People are inclined to notice and recognise others’ biases before their own.
  5. Choice-supportive bias. Once you have made a choice, you tend to feel comfortable and positive about it, even if the choice also has negative aspects. E.g. someone has a good opinion of their dog even if the dog bites everyone.
  6. Clustering illusion. A tendency to see patterns in various activities/events/cases and group them according to one’s experience, picking out individual elements. E.g. if you’re playing roulette and several of the previous results were red, you may be inclined to think that the next one will be red too.
  7. Confirmation bias. We are inclined to listen to information that supports our own opinion. That is why people find it very difficult to have a discussion with someone who holds the opposite opinion.
  8. Conservatism bias. A tendency to prioritise evidence received earlier over new evidence. E.g. it took people a long time to accept the idea that the Earth is, in fact, round, because they found it difficult to reject the previously prevailing opinion that it was flat.
  9. Information bias. A tendency to seek information even when it cannot affect the decision. More information does not always mean better or higher quality; sometimes less information helps people make more accurate conclusions or guesses.

10. Ostrich effect. The decision to ignore negative or dangerous information by ‘hiding one’s head in the sand’ like an ostrich.

11. Outcome bias. Judging a decision by its result, without considering how the decision was made.

12. Overconfidence. Some people have too much confidence in their abilities, and this makes them take ever greater risks.

13. Placebo effect. Believing that mere faith in any factor – an object, an action or a person – will produce the expected effect. E.g. in medicine, a doctor prescribes simple vitamins instead of antibiotics and the patient feels better, because that is what the patient expected.

14. Pro-innovation bias. When advocates of an innovation tend to overestimate its benefits and overlook its limitations.

15. Recency. A tendency to weigh the latest information more heavily than earlier information.

16. Salience. A tendency to focus on the features or characteristics that are the most prominent and easiest to notice.

17. Selective perception. Allowing our expectations to influence our attitude to the world.

18. Stereotyping. Expecting a group, a person or a certain factor to match the picture you have formed earlier. This allows us to quickly identify unfamiliar things as friendly or hostile. Unfortunately, people tend to abuse this.

19. Survivorship bias. A mistake caused by focusing only on the examples that survived, which creates a distorted picture of the situation. E.g. we may think that being a businessman is very easy, but only because we never hear the stories of negative experiences or failure.

20. Zero-risk bias. Sociologists have found that people like to be reassured, even when seeking reassurance is completely unproductive. Completely eliminating one risk feels like safety and no damage, which is why people prefer it even when reducing another risk would do more good.

Summary:

All people use biases they have formed or acquired in advance. It is important to recognise when a bias crosses certain boundaries and becomes an obstacle instead of a help. Manipulation of information and the spreading of propaganda often exploit precisely the experience consumers have already acquired. That is why it is very important to examine what is happening instead of immediately rejecting it. Try to answer the following questions together with the participants:

How should we counter each of the above-mentioned biases in order to avoid their negative outcomes? (Write this part down on a large flip chart page and use it as recommendations.)