Data foundation project

WHAT IT MEANS TO BE DATA DRIVEN. Most companies strive to be data-driven in their decision making and operations, but not all have the data foundation needed to make it work. Unfortunately, there are no shortcuts. Becoming a data-driven company will take investment, management buy-in, continuous maintenance, meticulous follow-up and, in many cases, a cultural shift that can be perceived as painful. But there are huge benefits that make the effort worthwhile:

  • Reliable data makes it possible to build structural capital into your operations – to operate according to clear processes and roles instead of relying on specific individuals. This adds efficiency and transparency to operations, allows for learning loops and helps the organisation to retain knowledge when people leave.
  • A solid data structure provides a common baseline to measure performance and efficiency against. It is surprisingly common for organisations to lack clear definitions for even their most basic KPIs. Often, KPIs contradict each other or are changed arbitrarily, which makes it impossible to know whether you are improving operational efficiency as planned.
  • Reliable operational data allows you to automate internal as well as customer-facing processes and tasks. Even the simplest RPA or ML/AI applications need reliable data and repeatable processes to work, and the better your data foundation is, the more you can leverage automation to save time and improve quality.
  • Lastly, a solid data foundation is a prerequisite for successfully driving change and improvement projects. If you don’t have your data in order, a lot of time and money will be spent trying to extract the truth from inconsistent and inconclusive data.

CASE STUDY. A SaaS company in the financial software industry had a large installed base of customers on a wide variety of product versions, installations and contracts. The production systems were not designed for extracting and analysing data, and the CFO function did not set requirements for extractable data for financial analysis. Since the data was not used consistently in finance or operations, data quality was poor – no one was responsible for it. The company used Power BI to analyse and visualise data, but on an ad hoc basis and often on one-off batches of data, making it cumbersome to repeat analyses or extract longer time series for decision making.

The way forward for this company was a set of parallel initiatives to improve data quality and accessibility, with the goal of becoming a truly data-driven operation.

  1. Agree on where to start. A project was initiated to identify the most important data points. This was done both top-down, based on industry benchmarks and best practice in reporting, and bottom-up, based on each function’s operational needs for better data quality. It quickly became clear that product and customer profitability were areas where the company was almost totally blind. Hence, defining COGS and cost-to-serve metrics became the focus.
  2. Set up Master Data Management. A project was started to assign a data owner and data quality responsibility to each agreed data point. Automated data quality reports were developed to allow for continuous follow-up, and a periodic data forum was set up to prioritise improvement requests, discuss challenges and share knowledge across functions.
  3. Clean and improve the data. The actual data clean-up was automated as far as possible with imports. In many cases, however, the data did not exist in any structured format and had to be added manually. Trade-off decisions were sometimes made where cleaning was not deemed worth the effort; in those cases, shorter data series were accepted.
  4. Invest in DW/data lake. It was also decided to invest in a data lake to connect the various operational and financial systems for easier access to the data, improved analytic capabilities, simpler governance and improved security and access handling.
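To make the second step above concrete, here is a minimal sketch of what an automated data quality report could look like. It assumes records arrive as plain dictionaries; the field names (customer_id, product_version, contract_value) are illustrative and not taken from the case.

```python
# Minimal data quality report: counts missing required fields and duplicate
# customer IDs, so a data owner can act on them in the periodic data forum.
# Field names are hypothetical examples, not the company's actual schema.
from collections import Counter

REQUIRED_FIELDS = ["customer_id", "product_version", "contract_value"]

def data_quality_report(records):
    """Summarise missing required fields and duplicate customer IDs."""
    missing = Counter()
    seen, duplicates = set(), 0
    for rec in records:
        for field in REQUIRED_FIELDS:
            if rec.get(field) in (None, ""):
                missing[field] += 1
        cid = rec.get("customer_id")
        if cid in seen:
            duplicates += 1
        seen.add(cid)
    return {"total": len(records), "missing": dict(missing), "duplicates": duplicates}

sample = [
    {"customer_id": "C1", "product_version": "4.2", "contract_value": 1200},
    {"customer_id": "C1", "product_version": "4.2", "contract_value": 1200},  # duplicate
    {"customer_id": "C2", "product_version": "", "contract_value": None},     # incomplete
]
report = data_quality_report(sample)
```

A report like this, run on a schedule and routed to the responsible data owner, is what turns "data quality responsibility" from a policy statement into a continuous follow-up loop.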

TANGIBLE BENEFITS. As a result of these efforts, the company for the first time knew the true profitability of different customers, products and markets. The insights were used for internal benchmarking, with best practice shared across segments and markets. Packages and prices were adjusted to meet profitability targets, and unprofitable products were discontinued or right-priced.
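Once COGS and cost-to-serve are defined and trustworthy, the profitability insight itself is simple arithmetic. A sketch with invented figures:

```python
# Illustrative customer profitability calculation using the metrics the
# project defined (revenue, COGS, cost-to-serve). All figures are made up.

def customer_profitability(revenue, cogs, cost_to_serve):
    """Return profit and margin after COGS and allocated service costs."""
    profit = revenue - cogs - cost_to_serve
    margin = profit / revenue if revenue else 0.0
    return profit, margin

# A customer paying 100k with 35k COGS and 25k cost-to-serve:
profit, margin = customer_profitability(100_000, 35_000, 25_000)
```

The hard part is not the formula but agreeing on, and maintaining, the inputs – which is exactly what the preceding initiatives delivered.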

Product roadmaps were revisited based on solid business cases reflecting both revenue potential and profitability, leading to improved R&D ROI. Likewise, marketing spend became more efficient as marketing activities were retargeted to the areas with the best revenue/profitability mix.

Another benefit, perhaps more difficult to measure, was that the organisation started to trust the data enough to base decisions on it. Data gradually replaced anecdotal evidence and gut feeling as the basis for decision making.

DON’T FORGET. Management buy-in and internal champions are needed to keep the data in shape over time. Each change you make will affect a data point somewhere, and those responsible need to act on it. Like any cleaning, it takes effort and needs to be incentivised to happen!