Good Master Data Can Save Your Analytics Project
Early in my career I learned the hard way that successful business intelligence and analytics solutions must have good quality master data.
Master data comprises the attributes that give meaning to numeric facts. For example, sales volumes and sales values might be associated with a product portfolio, a geographical location, a retail channel or outlet, perhaps even a customer.
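To make this concrete, here is a minimal sketch (the table and column names are invented for illustration) showing how raw sales facts only become insight once joined to product master data:

```python
import sqlite3

# Illustrative star-schema fragment; all names are invented for this sketch.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE product_dim (product_id INTEGER PRIMARY KEY,
                              product_name TEXT, category TEXT);
    CREATE TABLE sales_fact (product_id INTEGER, sales_value REAL);
    INSERT INTO product_dim VALUES (1, 'Espresso Blend', 'Coffee'),
                                   (2, 'Green Tea', 'Tea');
    INSERT INTO sales_fact VALUES (1, 120.0), (2, 80.0), (1, 30.0);
""")

# Without master data, sales_fact is just numbers against opaque IDs.
# Joining to the product dimension turns them into categories a business
# can actually reason about.
rows = con.execute("""
    SELECT d.category, SUM(f.sales_value)
    FROM sales_fact f JOIN product_dim d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('Coffee', 150.0), ('Tea', 80.0)]
```

If the product dimension were missing or wrong, the same fact rows would aggregate into misleading totals, which is exactly the failure mode described above.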
When you are implementing an analytics solution that will provide insights to help you and your team make data-driven decisions, there are some key elements for success: good architecture and design, a performant and scalable technical platform, and a clear view of where your data will come from and how it will integrate.
When we design a solution from scratch, master data is high on our list of priorities. And when customers ask us to help them resolve issues with existing solutions, for example if the insights are not as they expected, our investigation often follows a trail from calculations through to fact data and on to the master data.
For any system that aims to give insights to its users, therefore, master data is fundamental. A thorough understanding of your master data in an analytics project saves time and expensive remediation effort.
Why is Master Data so important?
Master data is the digital footprint of any business. It changes and aligns with how a business grows and evolves. Over the years, companies develop many applications that depend on master data. Since it resides at the foundational level, any change to master data reverberates through all the applications using it. If the impact of changes to master data is not considered, the disruption to the business can be severe, requiring onerous restatement exercises to realign datasets across systems.
So how do you ensure that you have high quality master data? And how should ongoing changes to this important data be handled by applications?
If your organization maintains a large set of master data, a few simple considerations will help in the long run:
1. Understand your data
Make understanding your master data a key part of the solution architecting process. Have a clear knowledge of your datasets, the master data they need and how they interact (e.g. aggregations, parent-child relationships, geographical diversity, etc.). The greater your understanding, the better your chances of designing a solution that correctly caters to the data structures and delivers the desired outputs.
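One of the structures mentioned above, a parent-child hierarchy, is worth understanding precisely, because it determines how figures aggregate. A minimal sketch (the hierarchy and sales figures are invented for illustration):

```python
# Hypothetical parent-child product hierarchy: child -> parent
# (None marks the top level). All names are invented for this sketch.
hierarchy = {
    "Espresso Blend": "Coffee",
    "Filter Roast": "Coffee",
    "Coffee": "Beverages",
    "Green Tea": "Tea",
    "Tea": "Beverages",
    "Beverages": None,
}
sales = {"Espresso Blend": 150.0, "Filter Roast": 40.0, "Green Tea": 80.0}

def rollup(node):
    """Sum leaf-level sales for a node and all of its descendants."""
    children = [c for c, p in hierarchy.items() if p == node]
    return sales.get(node, 0.0) + sum(rollup(c) for c in children)

print(rollup("Coffee"))     # 190.0
print(rollup("Beverages"))  # 270.0
```

If the parent-child links in the master data are wrong, every rolled-up number above the leaf level is wrong with them, which is why this understanding belongs in the architecting phase rather than in testing.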
2. Design your solution to cater for changes
Even with a thorough knowledge of your master data, a design that captures only a point-in-time snapshot can leave your system unable to adapt to future changes. Ensure the design includes an approach for handling future master data changes and that the data structures support change at every level. A well-designed system minimizes disruption to its users.
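One common way to cater for change is a Type 2 slowly changing dimension, which preserves history by closing the old record and opening a new one. This is a sketch of that general technique under invented names, not a design prescribed by any particular product:

```python
from datetime import date

# Type 2 slowly changing dimension: each version of a master-data record
# carries a validity range. Names are invented for this sketch.
OPEN_END = date(9999, 12, 31)

dim = [
    {"product_id": 1, "category": "Coffee",
     "valid_from": date(2023, 1, 1), "valid_to": OPEN_END},
]

def apply_change(dim, product_id, new_category, change_date):
    """Close the current version of the record and append a new one."""
    for row in dim:
        if row["product_id"] == product_id and row["valid_to"] == OPEN_END:
            row["valid_to"] = change_date  # close the old version
    dim.append({"product_id": product_id, "category": new_category,
                "valid_from": change_date, "valid_to": OPEN_END})

apply_change(dim, 1, "Hot Drinks", date(2024, 6, 1))

def category_as_of(dim, product_id, on):
    """Resolve the category that was valid on a given date."""
    for row in dim:
        if (row["product_id"] == product_id
                and row["valid_from"] <= on < row["valid_to"]):
            return row["category"]

# Facts dated before the change still resolve to the old category,
# so historical reports are not silently restated.
print(category_as_of(dim, 1, date(2024, 1, 1)))  # Coffee
print(category_as_of(dim, 1, date(2024, 7, 1)))  # Hot Drinks
```

The design choice here is that history is never overwritten: old facts keep joining to the version of the master data that was true when they occurred.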
3. Have a plan for regular alignment
Change is constant, and every organization will go through reforms that impact the applications it relies on for insights. Don’t wait until an application completely fails to realign the datasets and master data. Undertake regular restatements to save time and effort by keeping your applications healthy.
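A regular alignment exercise can start with something as simple as an orphan check: flag fact records whose master-data keys no longer exist, so they can be restated before users notice figures dropping out of reports. A minimal sketch with invented names:

```python
# Simple alignment check (all names invented for this sketch): find fact
# rows whose product_id has no matching master-data record.
master_product_ids = {1, 2, 3}
fact_rows = [
    {"product_id": 1, "sales_value": 120.0},
    {"product_id": 4, "sales_value": 55.0},   # orphan: no master record
    {"product_id": 2, "sales_value": 80.0},
]

orphans = [r for r in fact_rows if r["product_id"] not in master_product_ids]
print(orphans)  # [{'product_id': 4, 'sales_value': 55.0}]
```

Run routinely, a check like this turns realignment from an emergency restatement into routine maintenance.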
4. Invest in a good support system
When a new change (for example a new product or location) is introduced, it is often tempting to implement a quick fix to solve the issue. This might cause misalignment and make future issues harder to trace. Invest in a support service that knows the system logic, understands the data and related business processes, and can work with you to devise and execute an approach that ensures changes are made in all the right places.
Investing a little extra time during the early stages of an engagement to study and understand your master data can reap benefits throughout the application’s life. By reducing the number of issues faced during development, and keeping the application healthy and able to adapt to change, you save time and effort in the long run.