January 25, 2017
Chicken or Egg? A Data Paradox
Last year I had the opportunity to talk to the upper echelons of the finance organization of one of the UK’s leading supermarket chains…about data quality. I’m bored already: thankfully they only had to endure (a strict) 4 minutes. If you are still reading, this is how it went.
Data Quality: Data Governance
Do you first need to actively govern the data that feed your reporting, planning and analysis platforms, or should you fix issues reactively, as and when they reach you and you spot them?
Chicken or Egg
When you are eating at a restaurant, you implicitly trust that the food you are being served is not just tasty and satisfying, but uncontaminated. The farms on which the chickens were reared are governed by a strict set of government regulations, and the kitchen in which the egg is poached has a 5-star hygiene rating. But what if the waiter serving your food forgot to wash his hands after handling raw meat? Completely accidentally, of course: he was under massive pressure from management to get your food to you promptly. We tend to have faith that this is not the case, and most of the time it would go unnoticed: we would only realize if we had an unfortunate reaction to this bacterial transfer.
All of the expensive checks and measures have gone to waste and you are left with an unhappy customer, because of one sloppy waiter, or maybe because of one performance indicator that has unintentionally promoted bad practice. To be fully confident you need to know “Where does my egg come from, who touches it on its journey to me?”
Where does my data come from, who touches it on the journey to me?
Parts of this journey will be documented, and there will be pockets of excellent governance and management practice ensuring adequate data quality for particular areas; the supply chain components of product onboarding are usually good examples. But are these pockets ensuring that the data is adequate for finance's needs?
Mapping and understanding the data journey is the first part of the challenge. It means asking the person or system providing you with data, "Where did you get this data from, and what, exactly, have you done with it whilst it has been in your custody?", then getting them to ask the same of their "data suppliers", and repeating until you reach the true sources of all the data. That is not a straightforward task.
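For the technically minded, that recursive questioning can be sketched as a simple upstream walk. This is purely illustrative: the supplier map, system names and `trace_sources` function are assumptions, standing in for whatever lineage records an organisation actually holds.

```python
# Illustrative map of "who supplies whom": each system names its
# upstream data suppliers; true sources have no suppliers of their own.
suppliers = {
    "finance_report": ["planning_tool", "erp"],
    "planning_tool": ["erp", "sales_spreadsheet"],
    "erp": [],                 # a true source
    "sales_spreadsheet": ["pos_system"],
    "pos_system": [],          # a true source
}

def trace_sources(system, seen=None):
    """Ask 'where did you get this from?' repeatedly, upstream,
    until only true sources remain."""
    if seen is None:
        seen = set()
    if system in seen:         # guard against circular data feeds
        return set()
    seen.add(system)
    upstream = suppliers.get(system, [])
    if not upstream:           # nobody supplies it: a true source
        return {system}
    sources = set()
    for s in upstream:
        sources |= trace_sources(s, seen)
    return sources

print(sorted(trace_sources("finance_report")))
# ['erp', 'pos_system']
```

The hard part in practice is not the traversal, of course, but getting each "waiter" along the way to answer the question honestly and completely.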
Mapping the journey out can be very revealing. Clients often know that they have issues with the accuracy of data but cannot pinpoint where the data is open to being touched by dirty hands and is in danger of material deviations from the truth. Where is the invisible copying and pasting between spreadsheets? How open is the process to "Could you just rework that spreadsheet for the board meeting that is in 5 minutes?"
Once this data journey is made visible, you can start justifying investment in bolstering governance and putting in data quality checks at points of high risk. A portfolio of pragmatic interventions to measure and reduce the material impact of data issues could include: spot fixes; process monitoring; activity controlling; process automation; system implementation; system retirement; spreadsheet eradication; refocusing existing initiatives; or extending the reach of existing good practice.
You can call this Data Governance or Data Quality Management; you can call the practitioners Data Stewards, Chickens, or Eggs. The important things are that we:
- Identify and utilize the good practices already within the business (maybe outside of finance);
- Incrementally deliver improved trust in data by executing a portfolio of initiatives.
All while moving closer to being able to stamp a “Lion Quality Mark” on your financial reports, planning and analytics.
At least this held their attention for the 4 minutes, and they started relating it to their specific challenges and worries about the unknowns in their data journeys. It got them thinking about the importance of looking after their data through a combination of proactive controls and reactive monitoring – not just in the systems & processes that they are accountable for, but all the way along the data supply chain. It got them thinking of ways of simplifying the journey by consolidating steps and reducing the opportunities for contamination.
Whatever your journey, however good your map, however often you wash your hands: good cluck!
Stuart Squires is the Managing Director of Comma Group, a Data Management Consultancy specialising in business, technical & change aspects of PIM and MDM initiatives. He actually doesn’t get bored when talking about data: weird.
(image credit @unsplash @AnnieSpratt)