Data-based coaching to counteract cognitive biases

Aug 8, 2017

The effect of cognitive biases in Agile teams is as subtle and profound as in any other aspect of life. Perhaps the stresses and organisational context make them even more prevalent.

“A cognitive bias refers to the systematic pattern of deviation from norm or rationality in judgement, whereby inferences about other people and situations may be drawn in an illogical fashion. Individuals create their own “subjective social reality” from their perception of the input.”

It takes some pretty expert coaching, and heightened self-awareness within the team, to realise when a bias is affecting decisions and what action to take.

Most team members, including myself, do not have that level of self-awareness. These biases are too ingrained for us to recognise when they are influencing our response to a situation.

A good article on the Business Insider website lists “58 cognitive biases that screw up everything we do”.

Business Insider also has a great cheat sheet of the top 20 biases, although this lovely image by John Manoogian, from the Wikipedia entry, lists many more.

How a data-based approach helps

The key benefit of a data-based approach is that it moves the conversation from a largely subjective one to one based on evidence, which can stimulate new questions and fresh perspectives on a situation.

Agile teams can spend much of their time under some kind of duress, which probably increases the likelihood of biases, given that many of them originate in our psychological need to protect ourselves.

As a ScrumMaster or coach, you will have experienced, or will discover, that using simple visualisations of the process data is one of the most effective ways to surface and work around (or perhaps work with) these biases.

For example:

  • A stakeholder who is ‘anchored’ to a specific date with fixed scope is only going to be shifted with something simple and evidence-based.
  • A PMO with confirmation bias is going to prioritise or seek information that confirms their preconceptions – until they are nudged into a new perception of the situation.
  • A team that lives with the negativity bias may need evidence to show them that they can achieve a stretch goal. Or perhaps a coach needs to save them from themselves and help them overcome the planning fallacy.

In these cases a data-based approach, with simple visualisations, may be the only way to nudge people towards accepting and then tackling a difficult truth.
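As a minimal sketch of what such a simple, evidence-based view might look like (the throughput numbers, story count, and dates here are entirely hypothetical), the idea is to replace a debate about a single anchored date with a forecast range derived from the team’s own observed data:

```python
from datetime import date, timedelta

def finish_range(throughput, remaining_stories, start,
                 sprint_length=timedelta(weeks=2)):
    """Project pessimistic/optimistic finish dates from observed sprint throughput."""
    dates = {}
    for label, rate in (("pessimistic", min(throughput)),
                        ("optimistic", max(throughput))):
        sprints_needed = -(-remaining_stories // rate)  # ceiling division
        dates[label] = start + sprints_needed * sprint_length
    return dates

# Hypothetical numbers: stories completed per two-week sprint, 40 stories left.
print(finish_range([6, 4, 7, 5, 6], remaining_stories=40, start=date(2017, 8, 8)))
```

Showing a range rather than a point estimate is itself part of the nudge: it makes the uncertainty visible instead of inviting an anchor.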

Agile – made for those pervasive biases in every organisation

Agile can be seen as an attempt to overcome cognitive bias at an organisational level.

The frequent failure of the simple, if counter-intuitive*, practices of Agile must be due, in part, to these common, powerful biases.

Telling a traditionally schooled Project Manager that the team will self-organise towards delivering a key project, or that the scope will change, is very difficult to reconcile with the mental model he has of the world. Suggesting that risk is managed through transparency and trust, rather than control, is equally difficult.

Agile’s emphasis on empiricism and evidence-based decision making is one of the key mechanisms by which it tackles biases.

* I say ‘counter-intuitive’ if you see them in the context of the ‘Status quo bias’ or ‘Semmelweis reflex’ (below).


Some of the Cognitive Biases you see in Agile teams

The definitions below are from Wikipedia, which has a list of more than 100 cognitive biases.

An experienced coach would have seen many of these exhibited in Agile teams – as well as amongst the stakeholders and throughout the organisation.

Anchoring bias

  • The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject)

Availability cascade

  • A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”)
  • ‘The project is green’, ‘we will make the August deadline’

Confirmation bias

  • The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions
  • ‘The team is struggling’, ‘Scope is clear and unchanging’

Clustering illusion

  • The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).
  • ‘Recent velocity shows that the team has turned the corner’, ‘recent cycle time increases show that the team can’t get on top of the challenge’
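The clustering illusion is easy to demonstrate with a quick simulation (a sketch with made-up velocity figures): even when a team’s ‘true’ velocity is flat, a short history of sprints routinely contains runs of three consecutive rises or falls that look like trends.

```python
import random

random.seed(42)

def phantom_trend_rate(trials=10_000, sprints=12):
    """Fraction of simulated flat-velocity histories that still contain a
    run of three consecutive rises or three consecutive falls."""
    hits = 0
    for _ in range(trials):
        # Velocity drawn from the same distribution every sprint: no real trend.
        v = [random.gauss(20, 4) for _ in range(sprints)]
        diffs = [b - a for a, b in zip(v, v[1:])]
        has_run = any(
            all(d > 0 for d in diffs[i:i + 3]) or all(d < 0 for d in diffs[i:i + 3])
            for i in range(len(diffs) - 2)
        )
        hits += has_run
    return hits / trials

print(phantom_trend_rate())  # a substantial fraction show a 'trend' by chance alone
```

The point for a coach: before declaring that the team has ‘turned a corner’, ask how likely that pattern is under pure noise.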

Curse of knowledge

  • When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.
  • ‘Why can’t they understand the vision and backlog?’

Experimenter’s or expectation bias

  • The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations
  • A challenge in retrospectives and finding evidence for improvement

Focusing effect

  • The tendency to place too much importance on one aspect of an event.
  • ‘Yes, the last sprint was difficult and we failed to complete any stories, but the overall trend is still strong.’

Framing effect

  • Drawing different conclusions from the same information, depending on how that information is presented
  • ‘Well, the PM has assured us that the project is still green’ (even though we each have slightly different understanding of what being green actually means)

Illusory correlation

  • Inaccurately perceiving a relationship between two unrelated events.
  • ‘we can deliver because of the environments and route to live’ – whilst missing the point that the backlog is not of good enough quality, there is no real Definition of Done and we’re racking up technical debt.

Information bias

  • The tendency to seek information even when it cannot affect action.
  • Organisations are great at collecting lagging information; by the time it’s presented to the governance board, precious weeks to intervene have been lost, and the data is no longer actionable anyway.

Negativity bias or Negativity effect

  • Psychological phenomenon by which humans have a greater recall of unpleasant memories compared with positive memories.
  • ‘This organisation is so political, doesn’t really matter what we do as a team, the last time we failed to hit the deadline…. ‘

Optimism bias

  • The tendency to be over-optimistic, overestimating favourable and pleasing outcomes (see also wishful thinking, valence effect, positive outcome bias).
  • Might be considered a strength by some project managers, however can lead to dangerously misaligned expectations.

Ostrich effect

  • Ignoring an obvious (negative) situation.
  • ‘We all know this can’t be done in time’ – but the PM dare not tell the governance board; they just don’t want to hear that.

Overconfidence effect

  • Excessive confidence in one’s own answers to questions. For example, for certain types of questions, answers that people rate as “99% certain” turn out to be wrong 40% of the time.
  • ‘Are we going to finish on time?’

Planning fallacy

  • The tendency to underestimate task-completion times.
  • A frequent challenge within teams as well as beyond the team.
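One data-based counter to the planning fallacy is to forecast from what actually happened rather than from fresh estimates. A minimal Monte Carlo sketch (hypothetical cycle times, plus the simplifying assumption that stories complete one after another in a single stream):

```python
import random

random.seed(7)

# Hypothetical cycle times (days) for the last ten completed stories.
history = [2, 3, 3, 5, 8, 2, 4, 13, 3, 6]

def monte_carlo_days(history, remaining, trials=10_000):
    """Resample past cycle times to simulate finishing `remaining` stories;
    return the 50th and 85th percentile of total elapsed days."""
    totals = sorted(
        sum(random.choice(history) for _ in range(remaining))
        for _ in range(trials)
    )
    return totals[len(totals) // 2], totals[int(len(totals) * 0.85)]

p50, p85 = monte_carlo_days(history, remaining=20)
print(f"50% chance within {p50} days, 85% chance within {p85} days")
```

Presenting a forecast as percentiles (‘an 85% chance within N days’) is much harder to argue away than a single optimistic estimate.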

Selective perception

  • The tendency for expectations to affect perception.
  • “I think we’re doing pretty well… ” – what do you mean you haven’t checked the SonarQube code quality charts for 3 months?…

Semmelweis reflex

  • The tendency to reject new evidence that contradicts a paradigm.
  • ‘We have always reported project progress like this… your real-time burn-up chart with forecast just isn’t what the stakeholders expect. They want RAG status reports – once a month.’
  • ‘We reduce risk with tighter control and a risk log’ – when, instead, real-time data can give earlier sight of issues, and control can be exerted through transparency and the self-organisation of peers.

Status quo bias

  • The tendency to like things to stay relatively the same.
  • ‘We’re too busy to make those changes – they’ll take time.’

Subjective validation

  • The perception that something is true if a subject’s belief demands it to be true; also the perception of connections between coincidences.

Parkinson’s law of triviality

  • The tendency to give disproportionate weight to trivial issues. Also known as bikeshedding, this bias explains why an organisation may avoid specialised or complex subjects, such as the design of a nuclear reactor, and instead focus on something easy to grasp or rewarding to the average participant, such as the design of an adjacent bike shed.
  • ‘Let’s refresh the UI’ – rather than tackle the gnarly technical debt and have that awkward conversation with the architects.