
Change Management: It’s All in the Mind

Since the mid-1990s the Standish Group have regularly published their Chaos Report [1] looking at the effectiveness of change projects, and if we take their findings at face value the global track record of change in organisations is not very impressive: on average over the last decade they report 34 per cent of projects as successful, 43 per cent as semi-successful and a constant rump of 19 per cent as failed, that is, abandoned with no return for the investment made.
But even putting this data aside, most organisation development (OD) practitioners would agree with the observation that the management of change in organisations is generally neither well planned nor well executed. When one public sector manager described a recent ‘agile working’ change project as ‘not as bad as the last time’ it was intended as high praise and reflected his repeated experiences of poorly managed change. This lack of effective change management is obviously a problem, as change is here to stay – as Heraclitus observed in 500 BC – and the pace of change seems constantly to increase, as Figure 1 suggests.
Figure 1 Increasing Speed of Change
Some are nonplussed by the paradox of this well-reported, rapid and constant change and the seeming ‘sameness’ of the products and solutions on display at, for example, the major OD or learning and development (L&D) conferences and exhibitions. However, of late there does seem to be a ‘new kid on the block’, with increasing reference to neuroscience and how it will transform the efficacy of our workplace interventions – much as e-learning and now m-learning have been acclaimed before it.
However, there is no panacea for organisational ills and we would be foolish to believe anyone who tells us otherwise. Yet even allowing for the hype and the profusion of neuro-nonsense, neuroscience does seem to be emerging as a new tool to help us understand and work with thinking and behaviour. As Abigail Baird of Vassar College suggests, it is sensible to be wary of any posited neuroscience about learning that doesn’t seem to make sense or support established theories.
Although neuroscience is very much ‘du jour’ it is far from new, as we have always been fascinated by the brain. Trepanation (drilling holes in the skull) is the second oldest recorded surgical procedure, with evidence of it going back to the Neolithic age. The 19th-century fascination with phrenology (feeling the shape and unevenness of the skull and using this to deduce intellectual and character traits) was, in the end, as much a social phenomenon as a scientific one, and although rapidly discredited, phrenology’s lasting legacy – the concept of localisation of function in the brain – has, to an extent, been validated by modern science.
The door allowing modern science to make major breakthroughs in our understanding of the brain burst open in 1977, when the world’s first magnetic resonance image (MRI) was taken. This heralded a rapid advance in imaging technology that continues apace. From MRI came fMRI: the ability to image the brain as it performs various functions, which has given us an unprecedented understanding of how the brain works.
As this technology continues to develop, the impact of neuroscience on society as a whole can only become even more rapid and widespread than it already looks set to be. A small indication of this is the fact that credible neurofeedback headsets now cost just a couple of hundred pounds. Should you choose to do so, you can now sit at home with your games console and see accurate images of your brain’s activity, in real time, on the screen in front of you. The development of the software that will make this really usable is gaining rapid momentum. Who knows what will happen if and when deep brain stimulation becomes a DIY process?
Perhaps of more relevance – for now – to the L&D world is the emerging area of nootropics or ‘smart drugs’ and supplements that help optimise the overall performance of the brain in terms of memory, focus, concentration and motivation. The implications of the increasing prevalence of these proven effective, if not yet proven safe, substances for recruitment and selection processes, assessment and development centres, and performance management as a whole are a practical and ethical minefield for which we have hardly even started to prepare.
Neuroscience and Change Management
But putting these concerns aside, how can we use neuroscience to improve change management? From the vast amount of new knowledge that has emerged in the recent past, let us focus on just three points that experience has shown to be useful.
Self-Directed Neuroplasticity
The first of these is ‘neuroplasticity’ – a term that still draws a surprisingly low level of positive responses when I run a ‘show of hands if you are familiar with the term’ dip test during speaking engagements.
Neuroplasticity is one of the things that can give us hope for leading change more effectively – we now know definitively that the ‘old dog can learn new tricks’. Contrary to the long-accepted science, it is now clear that the adult brain can change – is plastic – and that we can take deliberate control of this process to make lasting changes to habituated perceptions, thinking and behaviour.
This can be a tremendously liberating fact, and many of the people I have shared this evidence with also seem to find it very empowering to know that if they have a healthy heart, lungs and brain then they can fundamentally and permanently change their thinking. The capability is there; all that is needed is the motivation. The caveat to this good news from an OD perspective is that to be really effective neuroplasticity needs to be self-generated, that is, based on our own moment or moments of insight and embedded by our own deliberate focus.
Evidence for this comes from fMRI studies of undergraduates attempting to solve increasingly difficult problems. When they were given the answer to a problem they were unable to solve, the neurological impact was seen to be negligible. This is in contrast to when they were coaxed and nudged into finding the answer themselves, where the ‘eureka’ moment was clearly visible as a surge of neuronal activity – the initial creation of a new pathway in the brain. Revisiting this pathway with the right ‘attention density’ – a mix of frequency, duration and quality of focus – allows it to develop from an ‘unpaved track’ into a ‘superhighway’ that becomes our default ‘go to’ response to a particular situation or circumstance.
Tapping into this ability for self-directed neuroplasticity (SDNP) has significant implications for how we lead and manage change in organisations. For example, the poor track record of the traditional ‘top telling the middle what to do to the bottom’ change approach can be explained, in part, by the absence of SDNP, while the success of organic, bottom-up change can, again in part, be attributed to its SDNP roots. Figure 2 seeks to represent this graphically by showing that in typical change scenarios the time people get to engage, reflect, consider, explore and internalise change often seems to be inversely proportional to how directly they will be affected by it.
Figure 2 Who Has the Time to Lead Change?
An example of this is some recent work with a university which, when devising a new strategy, gave the senior leadership team extensive off-sites and away days to refine and hone the new strategic plan, whereas those who would be most affected by it, and who were expected to play the greatest part in delivering its agile, entrepreneurial, outward-looking remit, were by and large limited to a ‘town hall’ broadcast session and a soft copy of the slide deck.
Little chance for SDNP to take root there, and therefore little chance of people owning and embracing the proposed changes. This anomaly is obviously a recipe for future difficulties, as those who have been given sufficient time and/or involvement to ‘get it’ and those who haven’t view each other with mutual perplexity. To address this we need to give each brain in the organisation sufficient time, space and structure to genuinely engage with the change as it will relate to them and, most importantly of all, the opportunity to help design the change so that it becomes – to as large an extent as is practical – self-mediated rather than externally mediated change. In the OD context, making an explicit distinction between ‘destination’ and ‘journey’ can be helpful here, as there may be no room for negotiation about the destination but the detail of the journey can be very much up for discussion.
Threat and Reward Mechanisms
The second point from neuroscience that can be used in change management practice is the fact that the brain is fundamentally change averse. There are sound structural reasons why this is so, which I will discuss shortly, but there are also very significant emotional reasons. And let there be no doubt that the brain is an emotion-centric organ, with our every sensation, thought, experience and so on passing through the emotion centre – and being badged appropriately – before (possibly) finding its way into the cortex region where higher-order logical, rational thinking may take place.
We now know that the brain’s default emotional response to external change is to be wary of it, very wary – unsurprising perhaps when our ‘if-in-doubt’ emotional label is ‘fear’. The brain’s rule of thumb – to assume that external change is likely to be bad for us and that we should therefore move away from it to somewhere we can maintain the certainty and security of the known status quo – is obviously very unhelpful from a change point of view. But this rule of thumb has served us well through the millennia, and the genes of those forefathers who worked on the premise that all change was good and to be embraced have not been passed on in the same way as those of our more cautious ancestors – possibly because our irrationally optimistic relatives ended up being dinner for a sabre-toothed tiger they thought it would be interesting to have a closer look at.
The primitive threat sensors that served our predecessors so well remain alive and well in the 21st-century brain, and equally our responses to threat have not particularly evolved through the ages, even though the impact of the ‘threats’ being responded to is often significantly reduced compared to those faced by our ancestors. Our sub-conscious brain’s reaction to threat still comprises fight, flight, freeze and flock responses, and not much more besides. OD practitioners will recognise all of these as common reactions to organisational change. “Nod enthusiastically, wait awhile, and then carry on as before” was the advice I was given early in my career by a change veteran who had found this to be an effective and reliable way of seeing off unwanted change. Neurological research has shown that the brain’s threat response is easily triggered, long lasting and cognitively intensive – by contrast the reward response is less easily triggered and decays more rapidly. We also know that the brain in threat response mode will have very different – and generally speaking poorer – social, creative and decision-making capabilities compared to the relaxed and sated ‘rewarded’ brain.
Recent discussions with a UK supermarket that is under pressure to maintain its previous levels of success show that it needs innovative and creative approaches to its current challenges, as well as much better day-to-day integration and co-operation across functions. However, the chances of the organisation delivering these are slim, as the pressure to succeed and the fear of failure have created a near pervasive ‘threat response’ mind-set that is inhibiting the very creative and social skills that are needed.
We can apply this understanding of threat/reward mechanisms by using, for example, Maslow’s Hierarchy of Needs or Rock’s SCARF model (Figure 3) as a lens for looking at change and trying to determine where the threats and rewards of proposed changes may reside for individuals and/or groups of stakeholders. This can help us to anticipate the most likely sources of anxiety for participants in the change process and plan to mitigate these. By addressing these sources and ‘calming’ the mind we can then tap into the creative, social, problem-solving and decision-making skills that enhance our effectiveness.
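As a purely illustrative sketch of how such a SCARF-based lens might be recorded in practice – the stakeholder groups, scoring scale and ratings below are invented for the example, not drawn from any tool described in this article – a change team could tabulate anticipated threat/reward scores per SCARF dimension and flag the likely hotspots:

```python
# Hypothetical sketch: using Rock's SCARF dimensions as a lens to flag
# where a proposed change is most likely to be experienced as a threat.
# Scores are illustrative: -2 (strong threat) to +2 (strong reward).

SCARF = ["Status", "Certainty", "Autonomy", "Relatedness", "Fairness"]

# Invented stakeholder groups and ratings for a hypothetical restructure.
stakeholders = {
    "Branch managers":  {"Status": -2, "Certainty": -1, "Autonomy": -2, "Relatedness": 0, "Fairness": -1},
    "Head office team": {"Status": 1, "Certainty": 0, "Autonomy": 1, "Relatedness": 0, "Fairness": 0},
    "Frontline staff":  {"Status": 0, "Certainty": -2, "Autonomy": -1, "Relatedness": -1, "Fairness": -2},
}

def threat_hotspots(ratings, threshold=-1):
    """Return the SCARF dimensions rated at or below the threat threshold."""
    return [dim for dim in SCARF if ratings[dim] <= threshold]

for group, ratings in stakeholders.items():
    hotspots = threat_hotspots(ratings)
    if hotspots:
        print(f"{group}: plan mitigations for {', '.join(hotspots)}")
    else:
        print(f"{group}: no strong threat signals anticipated")
```

Even in this toy form, the value lies in the conversation the scores prompt: where a dimension is flagged, the change plan can name the anxiety explicitly and address it rather than leave it to fester.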
We can also make sure that we explicitly surface the likely benefits (rewards) of a change – we focus too much on the drivers of change without sufficient emphasis on the benefits – and create a communication strategy that will allow people to find their own truth about the possible rewards the change may bring.
For communication to be effective in times of change it needs to be visual, personal, relevant, emotional and repeated. Organisations that genuinely want to support their employees to have positive experiences of change will invest the time and effort to allow this – difficult to do if the CEO is marching exclusively to the drumbeat of the next quarter’s results. To prepare people to engage with change in a positive way we may first need to give them a structured and controlled opportunity to give vent to their previous experiences. These sessions are best facilitated by external resources, as for internal people there is too much risk of a perfectly understandable defensiveness and too much expectation that they will have all the answers to future concerns.
Defensiveness and the ‘failure’ to provide answers often lead to greater anxiety about the change process, and the sessions become unproductive. One thing learnt from running these sessions is that, in spite of the prevalence of the term in the change management literature, there is little genuine ‘resistance’ to change in organisations. There is plenty of indifference, but most common is anxiety; and recognising this as anxiety is in itself a very useful change management insight. We are likely to address ‘anxiety’ in a different and more constructive way compared to how we might address ‘resistance’.
Similarly, for the individual, understanding that they are feeling anxious, and being able to use SCARF or similar to put their finger on the cause of their anxiety, can be a first step to restoring the control and certainty that the brain craves. Given the brain’s established change aversion, it is no exaggeration to say that we are playing with loaded dice if we do not have the time to deliberately and authentically find our own positives in proposed changes.
And if we are somewhat cynical, pessimistic, weary and naturally change averse then we are not just playing with loaded dice in terms of how change is likely to play out for us, but with a loaded gun. With the best of intentions, our brain is minded to assume that change is a threat to our wellbeing, and to help us deal with the threat it will release stress hormones such as adrenalin and cortisol. It is somewhat ironic that these hormones, intended to protect us, are now well known to have unhelpful side-effects – especially if at work we are in a near constant state of low-level anxiety, as seems to be a common current phenomenon. The understanding that science gives us of the physiological impact of badly managed change has, I believe, important implications for meeting our ‘duty of care’ to employees.
Use Established Habits
The third and final point I would like to make is that change hurts! We may know from personal experience that organisational change can be painful, but thanks to neuroscience we now know that this hurt extends beyond a ‘boo hoo hoo, there is too much change here’ to a real and physical hurt that, in another of the brain’s ironic ‘double whammies’, further debilitates our capacity for engaging with change.
This can be explained by the very limited capacity (think ‘change in your pocket’) of the prefrontal cortex (PFC) – a part of the brain responsible for executive functions and which is called upon extensively when we are undertaking new, complex, demanding tasks. Although it accounts for only around two per cent of body weight, the brain can account for up to 20 per cent of our calorific consumption, and never more so than when the PFC is ‘running hot’ from continuous engagement in new activities. At the end of, say, a week of coming to terms with a new IT system it is perfectly understandable if we feel exhausted and have a raging headache. From the brain’s point of view this often seems like unnecessary suffering, as rather than using the limited-capacity PFC we can often perform the task in hand using well-established (habituated) routines that don’t call on the PFC but are managed by long-term memory – a much less demanding and far larger (think ‘U.S. economy’) resource than the easily depleted PFC.
From the brain’s perspective it really does make sense to let established habits run the show. Figure 4 shows a model for establishing momentum in a change process that is useful when working with change leaders, especially when change projects are struggling. They are able to focus quickly on a particular row and recognise it as applicable to their situation, which makes it easier to start identifying appropriate corrective actions.
Figure 4 Criteria for Mobilising Change
When the issue is ‘capacity for change’ it is useful to understand that capacity for change is more about the ‘head’ than the ‘hands’, and that the factors that will affect an individual’s, team’s or organisation’s capacity for change are things such as: previous experiences of change, workload, belief in the change, personal energy, volume of change, traditionalist or radical bias, pace of change and so on.
Using these criteria a subjective, but nonetheless useful, ‘Red, Amber, Green’ (RAG) measure of change capacity can be established and monitored. Similarly, by knowing that the PFC can be considered like a battery – fortunately a rechargeable one – and by being aware of and managing its ‘charge’ state, we can make more effective use of its executive functioning by scheduling meetings, activities, decisions and so on appropriately.
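As a minimal sketch of how such a RAG measure might be captured – the scoring scale, thresholds and team ratings below are invented for illustration and are not taken from any instrument described in this article – the capacity factors listed above could be rated and rolled up into a status that is revisited at each check-in:

```python
# Hypothetical sketch: turning subjective capacity-for-change ratings
# into a simple Red/Amber/Green (RAG) status that can be monitored over time.

# Factors as listed in the article; each rated from 1 (severely limits
# capacity for change) to 5 (strongly supports capacity for change).
FACTORS = [
    "previous experiences of change", "workload", "belief in the change",
    "personal energy", "volume of change", "pace of change",
]

def rag_status(ratings, red_below=2.5, amber_below=3.5):
    """Average the factor ratings and map them onto invented RAG thresholds."""
    average = sum(ratings[f] for f in FACTORS) / len(FACTORS)
    if average < red_below:
        return "Red", average
    if average < amber_below:
        return "Amber", average
    return "Green", average

# Illustrative ratings for one team at a monthly check-in.
team_ratings = {
    "previous experiences of change": 2, "workload": 2, "belief in the change": 4,
    "personal energy": 3, "volume of change": 2, "pace of change": 3,
}

status, score = rag_status(team_ratings)
print(f"Change capacity: {status} (average rating {score:.1f}/5)")
```

The precise numbers matter far less than the trend: a score that slides towards Red between check-ins is the prompt to slow the pace, reduce the volume of concurrent change or rebuild belief before pressing on.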
The limited capacity for executive functioning has been demonstrated starkly by Danziger’s analysis of parole hearings [2], which showed that the likelihood of being granted parole diminished as the judge heard more cases. The probability of parole increased, but not sustainably, after the judge took a meal or refreshment break.
One of the interpretations of the findings is that they show the PFC running out of the resources needed to make a genuinely considered decision and opting instead for the default choice of no parole. All of this suggests that implementing change at a pace at which it is likely to succeed should be a ‘no brainer’, but it is consistently surprising to see the hopelessly overladen change agendas that organisations are pursuing.
Combined, these three points prove useful for bringing science to the change process and giving organisations ‘ammunition’ to make the case for approaching change in a more person- or brain-centric way. To paraphrase George E. P. Box, ‘essentially, all models are wrong, but some are useful’, and this applies to neuroscience: it will help us, but the search for a ‘silver bullet’ continues.
References
1. The Standish Group International, CHAOS Report.
2. Danziger, S., Levav, J. and Avnaim-Pesso, L. (2011) ‘Extraneous factors in judicial decisions’, Proceedings of the National Academy of Sciences, 108(17), pp. 6889–6892.