How data-based coaching enhances psychological safety

Jun 2, 2017

According to a study of 180 Google teams, psychological safety “was far and away the most important” of the five dynamics that characterised the most effective teams in the company. It underpins the other four: Dependability, Structure & Clarity, Meaning and Impact.

‘Data-based coaching’ offers you ways to improve the sense of safety in your team, and so stimulate learning and improvement.

The initial work on psychological safety was done by Dr Amy Edmondson of Harvard Business School in her 1999 paper. She defines a psychologically safe team as one where there is “a belief that one will not be punished or humiliated for speaking up with ideas, questions, concerns, or mistakes”. Her TED talk summarises these insights.

Psychological safety encourages and deepens the learning process, which in turn makes teams more effective, as evidenced by various qualitative and quantitative measures.

Safety is mostly seen in the interpersonal behaviours within the team. However, ‘data-based coaching’ can help to alter the dynamics around the team, as well as facilitating some of those key internal behaviours.

Data-based coaching also honours the essence of Agile: the use of empiricism to provide frequent feedback, enabling teams to ‘inspect and adapt’.


Coaching to improve psychological safety

Dr Edmondson recommends three ways in which a coach can improve the team’s sense of safety.

  1. Frame work as learning problems, as opposed to execution problems.
  2. Acknowledge your own fallibility.
  3. Model curiosity by asking a lot of questions.

Visualising data is useful in addressing points 1 and 3, although it’s probably best to leave you to decide on the best way to discuss your own fallibility… a useful question for me too!

“Frame work as learning problems, as opposed to execution problems.”

How data-based coaching stimulates learning

Experimentation drives the learning process. Done properly, using the scientific method, it starts with a hypothesis, moves on to the experiment, and is followed by analysis of what happened and the capture of new insights.

Some of that analysis may be qualitative; however, most of it should be evidenced quantitatively.

Changes in system behaviours are likely to be small, and our measurements need to be sensitive enough to pick up these subtle changes. The learning will be more useful if the analysis is close to real-time.
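As a concrete illustration, here is a minimal sketch of how that quantitative evidence might look: comparing cycle times before and after a change the team has made. The file name, column names and experiment date are assumptions for illustration only; the point is to compare distributions rather than argue from anecdote.

```python
# Illustrative sketch only: one way a team might evidence an experiment
# quantitatively. Assumes a hypothetical CSV export of completed work items
# with 'completed_date' and 'cycle_time_days' columns; adjust to your tooling.
import pandas as pd
from scipy.stats import mannwhitneyu

EXPERIMENT_START = "2017-04-01"  # hypothetical date the team changed its way of working

items = pd.read_csv("completed_items.csv", parse_dates=["completed_date"])

before = items.loc[items["completed_date"] < EXPERIMENT_START, "cycle_time_days"]
after = items.loc[items["completed_date"] >= EXPERIMENT_START, "cycle_time_days"]

# Cycle times are typically skewed, so compare medians with a
# non-parametric test rather than relying on means.
stat, p_value = mannwhitneyu(before, after, alternative="two-sided")

print(f"Median cycle time before: {before.median():.1f} days")
print(f"Median cycle time after:  {after.median():.1f} days")
print(f"Mann-Whitney U p-value:   {p_value:.3f}")
```

A p-value alone won’t tell the team what to do next, but it keeps the retrospective conversation anchored in evidence rather than anecdote.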

Visualising the data will often identify surprising patterns in the system, prompting the team to learn more about its dynamics. Retrospectives can be more engaging and focused with visual charts; see the examples below.

Most teams using a tool like JIRA or TFS will already have much of that data, although sadly they rarely use it to its full potential.

Whilst the physical board humanises the work and enables self-organisation in the moment, it fails, without significant manual effort, to show us the subtle shifts in trends over time.
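As a hedged sketch of what the electronic data makes possible, the example below plots a rolling cycle-time trend from a hypothetical CSV export; most tools, including JIRA and TFS, can produce something similar via export or API. The file and column names are assumptions.

```python
# Illustrative sketch: plotting a rolling cycle-time trend from a hypothetical
# CSV export. Column names are assumptions, not any tool's actual schema.
import pandas as pd
import matplotlib.pyplot as plt

items = pd.read_csv("completed_items.csv", parse_dates=["completed_date"])
items = items.sort_values("completed_date").set_index("completed_date")

# A 28-day rolling median smooths day-to-day noise so that subtle shifts
# in trend become visible, which a physical board cannot show.
trend = items["cycle_time_days"].rolling("28D").median()

plt.plot(trend.index, trend.values)
plt.xlabel("Completion date")
plt.ylabel("Rolling 28-day median cycle time (days)")
plt.title("Cycle-time trend over time")
plt.tight_layout()
plt.show()
```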

Caution: data can also focus too much on the ‘execution problems’ mentioned by Edmondson, with negative consequences. A damaging example is the simplistic overuse of ‘velocity’, leading to stakeholders and the team fixating on the output rate, with limited opportunity for questions or learning.


Stimulate curiosity with intriguing visualisations

Model curiosity by asking a lot of questions

Open, nonthreatening questions are at the heart of good coaching. It follows that anything that generates more and better questions is going to help.

If those questions are mostly about factors that are external to the team, then we are giving the team a collective purpose and focus – as well as making it psychologically safer.

Whilst the raw data can be impenetrably dull, the visualisations can often pique the curiosity of team members. They also provide the coach with insightful and safe starting points for a discussion.

Such visualisations rarely provide ‘answers’; however, they do get people asking better questions.

Examples of visualisations that stimulate

This 4-dimensional backlog map clearly illustrates where the focus of the team has been and will be.

The four dimensions are: time (down the left), features (along the top), colour (the item’s state on the board) and size (the relative size of the story).

The correlation of story size and cycle time clearly shows outliers and is a great question-generator for retrospectives. Should we be trying to improve the correlation, or perhaps decide that it’s not worth estimating?
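A minimal sketch of such a chart, assuming a hypothetical CSV export with story_points and cycle_time_days columns, might look like this:

```python
# Illustrative sketch: a story size vs cycle time scatter as a retrospective
# question-generator. Assumes hypothetical column names in the export.
import pandas as pd
import matplotlib.pyplot as plt

items = pd.read_csv("completed_items.csv")

# Spearman (rank) correlation is less sensitive to the outliers we are
# deliberately trying to surface.
corr = items["story_points"].corr(items["cycle_time_days"], method="spearman")

plt.scatter(items["story_points"], items["cycle_time_days"], alpha=0.6)
plt.xlabel("Story size (points)")
plt.ylabel("Cycle time (days)")
plt.title(f"Story size vs cycle time (Spearman correlation: {corr:.2f})")
plt.tight_layout()
plt.show()

# Points sitting far above the general pattern are usually the most
# productive conversation starters: what happened to those stories?
```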

See www.SenseAdapt.com for more on 14 real-time charts designed specifically for Agile teams. These are available for JIRA and TFS/VSTS, in both cloud and on-premises versions.


And… visualisations could even help with fallibility

Acknowledge your own fallibility

Whilst fallibility is mostly about the coach’s presence and behaviours, there is a way that visualisations can help here too.

A coach can use charts to identify things they have noticed but do not have an ‘answer’ for. This helps clarify to the team that the coach is not the expert who will sort out the situation, but rather someone who is also curious about what is happening and doesn’t have the answers.

“The system that people work in and the interaction with people may account for 90 or 95 percent of performance”

Focus on the system – not the team

Teams work within a system that is largely invisible. That dynamic system behaves in surprising ways, which have a direct impact on the team. Hopefully their work flows smoothly across the board; often it does not.

As Deming said, “the system that people work in and the interaction with people may account for 90 or 95 percent of performance”.

By heightening awareness of the system around the team, rather than on the individual actions of the team, we create a safer, common focus for everyone.

If we can’t see and talk about the system, then our attention naturally focuses on what we can see and measure, i.e. the individuals and their actions, thereby adding psychological risk.

The team needs to be coached to understand and improve at the system level.

The stakeholders should also clearly understand the significance of the system and not inappropriately focus on the team. Data can help to show the stakeholders just how much the system is affecting outcomes.


Resisting the political forces

‘Internal politics’ can be useful shorthand for the swirl of opinions and the jostling for influence and status within an organisation.

Software development teams are less inclined and less able to play this game. It feels risky to be buffeted by these forces and to fall victim to the resulting unrealistic promises or macho deadlines.

The most effective way that a coach and team can protect themselves is to move the discussion from an opinion-based subjective ‘analysis’ to a more data-based, objective picture.

Deming is spot on, again, when he writes:

“Without data you’re just another person with an opinion.”

In most cases, if we find the team having to respond to others’ judgements with their opinion, they are going to ‘lose’. As a coach, it’s better to help create an evidence-based position.

This chart clearly shows how it is both the scope and the throughput rate that determine the likely range of end dates. Talking only about velocity fails to show stakeholders the influence of scope.

There is no clearer way to show stakeholders what is possible. It also illustrates a critical fact: that scope is a greater determinant of end dates than throughput.
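The same point can be illustrated with a simple Monte Carlo sketch: sampling historical weekly throughput against the remaining scope yields a range of likely end dates rather than a single promise. All figures below are invented for illustration.

```python
# Illustrative sketch: a simple Monte Carlo forecast showing how both
# remaining scope and throughput determine a *range* of likely end dates.
# The scope and throughput figures are invented for illustration.
import numpy as np

remaining_items = 120                       # assumed remaining scope
weekly_throughput = [6, 9, 4, 8, 7, 5, 10]  # assumed recent weekly completions

rng = np.random.default_rng(seed=1)
simulations = 10_000
weeks_needed = []

for _ in range(simulations):
    done, weeks = 0, 0
    while done < remaining_items:
        done += rng.choice(weekly_throughput)  # sample a historical week at random
        weeks += 1
    weeks_needed.append(weeks)

p50, p85 = np.percentile(weeks_needed, [50, 85])
print(f"50% likely to finish within {p50:.0f} weeks, 85% within {p85:.0f} weeks")

# Re-running with a different 'remaining_items' shows how strongly scope,
# and not just throughput, moves the forecast range.
```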

In conclusion

Data-based coaching enhances psychological safety by both facilitating the learning process and altering the context the team works within.

As a coach you are able to stimulate the team’s curiosity and develop their skills and attitude towards purposeful experimentation and learning.

The evidence-based approach also protects the team from external political pressures by countering opinion with fact, whilst also drawing attention to the system-level dynamics that affect the team’s effectiveness more than most people are aware of.