Perhaps one of the coolest things about growing up in the household of a cognitive psychologist is exposure to the social sciences while actively being experimented on. In all seriousness, my love of technology and data stems from helping my dad build his computer labs for testing and collecting data on the research he was doing. Today industries worry about big data collection, archival, and analytics, but the concepts I saw in that lab remain the same. There are biases that can influence the experiment, the tools, and the process. Researchers do everything they can to isolate the information for the test and eliminate lurking variables.
There are a lot of cognitive biases out there. I recently had the opportunity to give a talk about how leaders are impacted by bias and ways to debias decisions. You can view that talk below or see some of the slides posted at the end of the article. As it relates directly to the data strategies involved in transformation efforts, there are a number of effects that I've observed can impact strategy development and execution. Strategies tend to fall into offensive and defensive capabilities. There are times when leaders think the offensive or defensive strategy they have is good enough, and it might be, but that could also be a status quo bias or blind spot bias affecting their ability to see what's needed for transformation. In a status quo bias, leaders are resistant to change that alters the current state. While the current state might make sense, it's best to gather alternatives when assessing strategy and reframe options to make sure that an offensive strategy is what's needed. If regulation or the industry is shifting, a defensive or balanced strategy may come into play. Blind spot bias is exactly what it sounds like: the belief that you are less biased than those around you. Leaders, when formulating strategy, will think their approach is better because those around them may be more biased. It's important that in leadership roles we resist that, and especially in data strategy roles, because that can influence the data used for making decisions.
One of the most common areas I've seen experience challenges is the analysis of data. One person complains that the website is hard to use, and a large initiative is spun up from that one piece of data. That's an availability bias, in which an estimate of population frequency is driven by how easily the data came to mind or to hand. There's not really much data there; some UX research should be done, but reacting to one piece of data and assuming it is a good estimate of the population is a fallacy. Good guardrails for data collection and diverse perspectives can help to debias this problem.
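To make that fallacy concrete, here's a minimal simulation sketch. The numbers are hypothetical (I'm assuming a population where 20% of users genuinely struggle with the site); the point is simply that estimates built from a single data point swing wildly, while larger samples settle near the true rate.

```python
import random
import statistics

random.seed(42)

# Hypothetical: 20% of the real user population finds the site hard to use.
TRUE_RATE = 0.20

def one_user_complains() -> bool:
    """Simulate asking one random user whether the site is hard to use."""
    return random.random() < TRUE_RATE

def estimate(sample_size: int, trials: int = 1000):
    """Repeatedly estimate the complaint rate from samples of a given size,
    returning the average estimate and how much the estimates spread."""
    estimates = [
        sum(one_user_complains() for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]
    return statistics.mean(estimates), statistics.stdev(estimates)

for n in (1, 10, 100):
    mean, spread = estimate(n)
    print(f"sample size {n:>3}: average estimate {mean:.2f}, spread {spread:.2f}")
```

With a sample of one, every estimate is either 0% or 100%, so the spread is enormous; by a hundred users the estimates cluster tightly around the true rate. That's the gap between "one person complained" and actual UX research.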
Perhaps one of the most challenging biases found in data strategy work is confirmation bias. Platforms like LinkedIn, Facebook, Twitter, and others take advantage of our confirmation biases by only showing us things that help confirm our beliefs. The corporate office is the same. I've been asked in my career to do an analysis so that the result came out a certain way, so one leader could go show another leader that they were more correct. It's one thing to work hard to drive a strategy and collect the information you need, but it's quite another to ignore some data because it doesn't tell the story that you want to tell. Data strategies should be open to alternatives and unbiased perspectives on the results.
These are just a few examples, but the effects of bias can be fought through training and awareness. I have to constantly remind myself about biases like these so that I'm not creating a problem with our strategies. Hopefully this was helpful! Just remember, when you're thinking about your data strategy, think about how to debias your approaches as well to maximize your success!