Last week, as I was checking in at the airport, I ran into a problem. The man behind the counter politely informed me that my ESTA visa waiver had expired, and that they would not be able to let me fly. While he was only doing his job, for a brief moment I wanted to tell him he was wrong, to check again, that there had been a mistake. When my brain realised that was not fair on him, it recalibrated and instead told me to explain to him how this was not my fault, that the State Department had not done their job by failing to notify me.

The reality is that this was plainly an oversight on my part. Yet acknowledging it sucked, particularly for someone who travels to the USA fairly often; it was a rookie error. Reflecting on that moment, it strikes me that while travelling involves hundreds of decisions, the correct ones rarely register in my conscious mind. When I make a wrong one, however – like not checking my ESTA status because I was under the impression I had renewed it last year – I notice it, and feel bad.

In the old days everything was better – except humans, who still made faulty assumptions. Photo by G B_NZ.

I am not alone in this. We like to think that we are always set up to make the right decisions, even though our decision-making processes are coloured by biases. For example, we consistently overestimate our ability to have anticipated an event after it occurs, a tendency known as hindsight bias. We do not like to consider that we could have held an incorrect belief, whether a major conviction or a trivial assumption – like the validity of a visa waiver. Without the right degree of mindfulness, this can lead to denial or anger, as I experienced myself!

Russian author Leo Tolstoy expressed this far more eloquently when he wrote:

The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.

When it comes to an organisation’s journey to becoming more data-driven, this effect should not be underestimated. These transformations are not just a case of “running more analyses”; they involve questioning the fundamental decisions being made within an organisation. At the benign end of the spectrum, that means someone realising they can do their job better using data. At the other extreme, it might mean someone finding out that their faulty decisions have been costing millions of dollars.

"Let us pretend everything is OK as I think that would make me feel better." Photo by Kathryn Powell.

We hear how our data scientists need to learn storytelling, that we need to recruit analytics translators, and why many teams in this domain need to move away from purely technical discussions. However, this focuses on the teams doing the work, not on the fundamental discomfort their audiences might experience – and what that means for whether the message not only lands but is acted on. Given what we know about our unwillingness to face uncomfortable truths, a strong change management skill set can help the business through this type of transition.

So how do we manage this? Firstly, we can acknowledge that these reactions are human, and that there is a big difference between feeling these emotions and acting on them. Secondly, we can recognise that this is an inevitable consequence of driving a data transformation program, and ensure that accommodations are made for it. At the end of the day, we might simply need to learn not to focus so much on loss and instead appreciate that, in the words of the poet Alexander Pope, "To err is human."

— Ryan