Hailed as silver-bullet interventions that will change things at a stroke, big technology solutions often end up disappointing. So what is it that’s going wrong? Do we overpromise or underdeliver – or both?
There is a strong case that leaders overpromise when launching big system changes – often reinforced by technology. Projects appeal to leaders who want to make ‘a big move’ and avoid the messiness of persuading lots of unruly human beings to change their behaviour. Instead, new systems promise to channel, incentivise, monitor and evaluate human responses; they provide an illusion of control.
Management language can be deceptive. Words like ‘project’, ‘system’ and ‘technology’ connote a neat engineered approach that appears to minimise uncertainty and human error. The danger, as many IT-based programme leaders know all too well, is that overpromising technical change can obscure the unavoidable risks and underestimate the required investment for behaviour change.
Blinkered by experience

Apparently perfect technology-based initiatives are beset by a series of traps. The first is overreliance on past experience. Leaders are time-poor and desperate to make their mark quickly before shareholders or critics lose patience. They rely on what has successfully taken them to the top, but this narrow experience deceives.
Daniel Kahneman – the only psychologist to win the Nobel Prize for Economics – points out in Thinking, Fast and Slow that we are all prone to optimism bias. ‘Most of us view the world as more benign than it really is, our own attributes as more favourable than they truly are, and the goals we adopt as more actionable than they are likely to be,’ he writes. During the pandemic, many of the ‘solutions’ for schools and care homes fell foul of optimism bias.
Another universal trait, says Kahneman, is the planning fallacy. Focusing on our own goals while excluding others’ experience creates an illusion of control and overconfidence. Planners favour actions over results, neglect known unknowns, reject data that is inconsistent with the goal, and persevere irrationally with failing plans.
Through the years, plenty of major technology projects have fallen foul of the planning fallacy. TSB’s Sabadell operating platform, the Canadian government’s Phoenix payroll system and new rail projects are just some from a very long list.
Too crude
The second trap is to oversimplify the role of the leader. Organisations are complex systems with three levels of leadership role: senior leaders set direction, context and resourcing; management teams allocate resources and time to competing needs; and frontline leaders make the daily trade-offs that keep the operation stable. Central interventions in complex systems, however well intentioned, risk destabilising the organisation.
This is especially true when technical changes increase opacity in the system, making it harder for frontline leaders to make the necessary adjustments that keep the operation stable. Artificial intelligence will, perhaps, be the most challenging new technology for organisations to adopt without disruption because it is opaque to understanding and correction at the frontline.
Fixing the wrong problem
The third trap is matching a potentially great solution to the wrong category of problem – applying a technical solution when an adaptive one is needed.
The pandemic is an example of a problem – behaviour change – that requires an adaptive solution. Behaviour change requires consistent messaging, disseminating change at a pace people can absorb and encouraging everyone to do the messy real work of change. Of course, technical help gives people confidence and support to get going – as with virus testing and vaccination. But there is a danger of overstretch at the centre: inconsistency, overclaiming for technical fixes and panic when people do not respond as expected.
Three lessons
To design technical projects that avoid overpromising and underdelivering, there are three main lessons to learn.
The first is to learn from past changes and bring the external world into the organisation to avoid being a prisoner of what you think you know. This is not done by importing consultants, but by tapping into diverse internal and external experiences.
The second is to stop looking to ‘the leader’ for answers, and instead ask powerful questions to tap into leadership at all levels.
Finally, senior leaders and project leaders need to balance technical leadership with adaptive leadership. They should, for example, give frontline management space to ‘discover’ the solution, starting with clear early deliverables to give confidence, but inviting engagement in the uncertain, messy, essential work of behaviour change to realise benefits.
The reason that tidy technical solutions become so messy is simple: they have to become messy to succeed.
About the author
Keith Leslie is the author of A Question of Leadership – leading organisational change in times of crisis