
To See What Is In Front of One’s Nose

George Orwell is one of my favorite writers. He combines keen insight into human psychology and political tendencies with tight and well-written prose.

A quote from his essay “In Front of Your Nose”: “To see what is in front of one’s nose needs a constant struggle… If one recognizes this, one cannot, of course, get rid of one’s subjective feelings, but one can to some extent insulate them from one’s thinking and make predictions cold-bloodedly, by the book of arithmetic.”

A summary of Orwell’s points from the essay:

  1. Human beings are poor observers of reality1
  2. Our observations of reality are systematically distorted by our wishes, fears, and emotions.
  3. Even if we internally have some intuitive grasp of what reality is, we are mostly unaware of the ways our intuition actually diverges from and distorts reality (although it is usually easier to see how others are distorting theirs).
  4. Even when we are aware that our vision of reality is distorted, that we are doing the distorting, and that we do not wish to continue doing so – we are still most likely to continue doing so, because of how ingrained the habit is in human nature.
  5. Even if at a particular instant/minute/day we are able to get closer to what “reality” is and reduce distortion, the likelihood of us as individuals being able to do this consistently over sustained periods of time (like months or years) is close to zero.

Orwell was writing more about political matters 2, but this human tendency creeps into every aspect of daily human life. Some examples:

  1. Any police officer will tell you that eyewitnesses to crimes are terribly unreliable: they report things that didn’t happen and omit things that did. And this is even when they are not lying; people can say these untrue things and believe them. The more interesting part of this is not that people lie sometimes. We all understand the various motivations to intentionally lie. The interesting part is that even when people want to tell the truth, where “the truth” is the narrative of the actual, factual events that occurred, they are a) often incapable of doing so, or even of knowing what “the truth” of those events is, and b) unaware that they are incapable of doing so and that they are not telling “the truth.”3
  2. Our own observations about what we eat and how often we exercise. We systematically believe that we eat better than we actually do, that we exercise more than we do, that we get more sleep than we really do, etc., etc. If you have ever kept a food or exercise journal then you will have been shocked by how far off your estimates were.
  3. Our observations about how we spend our money and how much we save. Anyone who has ever been consistent about making and keeping a budget knows that one always spends significantly more than one thinks and saves significantly less than one thinks. Even if one already knows that one does this – it still happens.
  4. If you have ever been paid by the hour for work, which most of us have been at some point in our lives and potentially still are, you know that if you don’t actually have to punch a clock, you likely start a few minutes late, end a few minutes early, take a break now and then, and even when you are “working” there’s probably a lot of chatting and other distractions going on. And in many cases you don’t even notice!
  5. Anything involving us as humans being systematic, consistent, and reliable without some sort of external system or monitoring making us aware of ourselves. Yes, our behavior changes when we know that we are being watched; there is enough material there for a few books. Bentham’s Panopticon touches on aspects of this. 4

One of the biggest challenges that I encounter in my daily life of working for OnPrem Solution Partners 5 and running our Data & Analytics practice is getting people to embrace data-based decision making. This can definitely be a struggle on our client-facing projects when clients are thinking of the project as primarily a technology exercise. And, to be frank, it is sometimes a struggle even within our firm. We are subject to the same human tendencies as any organization, and most of these struggles stem from the tendencies I will discuss below.

Why Do Analytics Projects Fail?

Earlier in my career, I used to think that the problems with doing analytics projects revolved around technology. After a while I realized that while technology problems do happen, they are not the most common real blockers to analytics projects.

Until recently, I then thought that the real problems with analytics projects were organizations, organizational politics, and/or incentivization. That is, problems around getting everybody in the organization on board to agree to do an analytics project and making it in everyone’s individual interest. These are more fundamental and profound problems than technology issues per se, but I no longer think that they are the true roots of failure for many analytics projects.

I now think that the true underlying blockers to analytics projects are usually related to members of organizations believing that:

  1. They understand what is actually going on accurately and in detail in their organization
  2. Their hypotheses about how to address these issues are correct and do not need experimentation/verification
    1. Note: Many of us do put a lot of effort into doing analytics for the purposes of justifying our ideas or persuading others of them. I’m not speaking of that here. Here, I mean actually doing measurement and analytics for the purpose of truly doubting our own ideas and hypotheses and verifying them.

In short, the problem is us incorrectly believing that we have a good grasp of what reality is. Or, once we realize that we do not, believing that the only thing needed to fix the issue is a small, one-time adjustment to understand what reality is, with minimal further effort.

This is almost a spiritual point. Embracing true critical thinking and analysis is a never-ending effort. We are constantly mostly blind, constantly mostly blind to being blind, constantly mostly blind in new ways to new things, and we often continue to be blind about the fact that we are blind even when we know that we are blind and when we know that we are blind to being blind. Critical thinking is hard work, and given the human temptation to laziness, one is always prone to lapse back out of the effort even when one knows the whole problem. Lest you think that I am exempting myself from this, I am just as subject to this paradox as any other member of the human race.

Practical Examples

Let me give two examples here of how blind we all are – “we” in this case being particularly the professionals who are supposed to be experts on these topics.

Example One – Monitoring Analytics Technical Operations

One of the more common challenges of modern analytics projects is the complexity of building, running, and maintaining the stack of technologies required for data gathering, storage, cleansing, transformation, validation, analysis/visualization, and documentation. Even a simple data stack could easily involve 3 or 4 different technologies and tens or hundreds of scheduled data processing jobs. We have to build these data stacks in order to do analytics projects, and the design and decision-making around building them form one of the most important series of choices in an analytics project.
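
To make the shape of such a stack concrete, here is a minimal sketch (in Python) of what a single scheduled job might look like. Every name in it – the orders.csv file, the functions, the SQLite table standing in for a warehouse – is an invented placeholder; a real stack would spread these steps across several tools and a scheduler.

```python
# A minimal, illustrative sketch of one job in a tiny data stack:
# gather, cleanse/transform, and load. All names are hypothetical.
import csv
import sqlite3
from datetime import date


def extract_orders(path):
    """Gather raw rows from a source file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def clean(rows):
    """Cleanse/transform: drop rows missing an amount, normalize types."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in rows
        if r.get("amount")
    ]


def load_to_warehouse(rows, db_path="warehouse.db"):
    """Load cleaned rows into a local SQLite table standing in for a warehouse."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL, load_date TEXT)"
    )
    con.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(r["order_id"], r["amount"], date.today().isoformat()) for r in rows],
    )
    con.commit()
    con.close()
    return len(rows)


if __name__ == "__main__":
    # Create a tiny sample source so the sketch runs end to end.
    with open("orders.csv", "w", newline="") as f:
        csv.writer(f).writerows(
            [["order_id", "amount"], ["A-1", "19.99"], ["A-2", ""], ["A-3", "5.00"]]
        )
    loaded = load_to_warehouse(clean(extract_orders("orders.csv")))
    print(f"loaded {loaded} rows")  # one of tens or hundreds of scheduled jobs
```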

A blog post that has recently received a large amount of attention is titled “Observability for Data Engineering”. From the article: “Observability allows engineers to understand if a system works like it is supposed to work, based on a deep understanding of its internal state and context of where it operates.” The article describes how data processing systems/data stacks often badly lack observability, which makes them difficult or even impossible to run and maintain, and then goes on to suggest ways to make the systems composed of these technology stacks observable and maintainable.
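
In that spirit, the sketch below shows one possible way (my own illustration, not taken from the article) to instrument steps like the ones above so that their behavior is measured rather than assumed: each step emits its duration, output row count, and success or failure as a structured log line.

```python
# A rough sketch of instrumenting pipeline steps so that what they actually
# did is measured rather than assumed. The decorator and metric names are
# illustrative placeholders.
import functools
import json
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")


def observed(step):
    """Record duration, output size, and success/failure for a pipeline step."""
    @functools.wraps(step)
    def wrapper(*args, **kwargs):
        start = time.monotonic()
        status, rows_out = "failed", None
        try:
            result = step(*args, **kwargs)
            status = "ok"
            rows_out = len(result) if hasattr(result, "__len__") else None
            return result
        finally:
            log.info(json.dumps({
                "step": step.__name__,
                "status": status,
                "rows_out": rows_out,
                "seconds": round(time.monotonic() - start, 3),
            }))
    return wrapper


@observed
def clean(rows):
    """The same kind of cleansing step as above, now emitting measurements."""
    return [r for r in rows if r.get("amount")]


if __name__ == "__main__":
    clean([{"amount": "10.00"}, {"amount": ""}, {"amount": "3.50"}])
```

Even this much – a timestamp, a row count, a status – is often enough to expose the gap between what we assume a job did and what it actually did.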

How does this example prove my point?

How in the world did we, the professionals who are supposed to endorse rigorous measurement and analytics while moving away from “gut-driven” decision making, ever think that we actually knew what was going on, without rigorous measurement, in extraordinarily complex systems that it is our business and responsibility to design and build? And that we could make good decisions about how to manage these systems without having solid data and processes to base those decisions on?

This has been a massive blind spot. “Physician, heal thyself.” We have not been doing the very thing it is our professional job to tell and assist others in doing: not making assumptions, and practicing evidence-based decision making.

Example Two – The Challenger Disaster

The Space Shuttle Challenger disaster was a fatal incident in the United States space program that occurred on Tuesday, January 28, 1986, when the Space Shuttle Challenger (OV-099) broke apart 73 seconds into its flight, killing all seven crew members aboard.6

If you read through the history of the disaster on Wikipedia, it is clear that at multiple points during the space program there were individuals at the various third-party firms involved in construction and parts manufacturing who were very concerned about the manufacturing and usage of the O-rings, which were the parts that ultimately failed and caused the disaster. Also, “NASA managers also disregarded warnings from engineers about the dangers of launching posed by the low temperatures of the morning of launch, and failed to adequately report these technical concerns to their superiors.”

This all occurred because of pressure to launch and because the culture of the organization had evolved to a point where the pressure to get “the thing”, that is, the launch, done significantly exceeded the pressure to do “the thing” correctly. This pressure percolated down into the middle management ranks, and a series of catastrophic decisions continued to be made over a period of time because the various individuals and executives involved were not concerned with consistently answering the fundamental question of “Can the shuttle perform safely, and can we be sure that it will perform safely?”

In Summary

The antidote to all of this, as Orwell suggested, is “keeping a diary”, returning to reflect on the diary, trying to critically separate one’s thinking from one’s emotions, and relying on the “book of arithmetic.” Frankly, Orwell would probably have made a great data analyst.

In practical terms, this means ruthlessly examining our beliefs about how our organizations work, making consistent and meticulous measurements of the processes that we want to observe 7, returning to these measurements regularly to test our beliefs, and making (hopefully) better decisions based on this process. Then eternally repeating the cycle.
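
As a toy illustration of that cycle, the sketch below keeps a literal measurement diary and then tests a belief against it. The metric name and the “believed” value are invented placeholders, not real figures.

```python
# A toy sketch of Orwell's diary applied to a process we think we understand:
# append a measurement every time the process runs, then periodically compare
# what we believed to what we actually recorded.
import csv
import statistics
from datetime import datetime
from pathlib import Path

DIARY = Path("measurement_diary.csv")


def record(metric, value):
    """Append one observation to the diary."""
    is_new = not DIARY.exists()
    with DIARY.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(["timestamp", "metric", "value"])
        writer.writerow([datetime.now().isoformat(), metric, value])


def compare_to_belief(metric, believed_value):
    """Re-read the diary and test a belief against the recorded reality."""
    with DIARY.open(newline="") as f:
        values = [float(r["value"]) for r in csv.DictReader(f) if r["metric"] == metric]
    actual = statistics.mean(values)
    print(
        f"{metric}: believed about {believed_value}, "
        f"measured {actual:.1f} across {len(values)} observations"
    )


if __name__ == "__main__":
    # e.g. minutes taken by a nightly load that we "know" takes about 45 minutes
    for observed_minutes in (52, 61, 48, 75):
        record("nightly_load_minutes", observed_minutes)
    compare_to_belief("nightly_load_minutes", believed_value=45)
```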

If the organizations and individuals involved in the project do not agree with this line of thinking about problems with human perception, reliability, and judgement in the first place, then the analytics project is not likely to succeed. It may not fail outright – but it is unlikely that it will result in real, successful, transformative organizational change.

Or, as Blaise Pascal said:

People almost invariably arrive at their beliefs not on the basis of proof, but on the basis of what they find attractive.

I have written a follow-up postscript responding to some reasonable and foreseeable concerns and objections here.