In the implementation of software systems, attention to master data is often neglected in favour of the transactional data and the processes surrounding it. Almost understandably so, as the question to be answered arises from the transactional data, for example: “We need a system to help with planning and scheduling the work”. For that system to be effective, however, you need accurate master data: which staff members are available, what skills do they have, on which assets must the work be performed, where are those assets, and so on.
The result of this neglect is that master data is treated as static data, which it certainly is not. Staff members, financial codes, assets: they all change, just not as often. And often the master data is not looked at again after implementation until a specific item cannot be found. I believe that effective management of master data is achieved by creating ownership of the data, formalising the processes that manage changes to it, and regularly evaluating it.
1. Single source of truth
Master data should be owned in only one system, with specific responsibility assigned to maintain it. Let us look at financial codes in a CMMS. The financial codes should be maintained only in the financial system or ERP and then used in the CMMS. The ERP becomes the master of the data and the CMMS the slave. Updates to financial codes in the CMMS can only happen as a result of changes in the ERP. Ideally, you want a date stamp on the slave data to indicate when it was last verified against the master data.
During a recent implementation, I was presented with two spreadsheets containing data on the same items, the two sheets coming from separate management systems. It was impossible to reconcile the information: because there was no master system, each system had created its own identifiers, which were not cross-referenced to the other. The cost of this? Someone had to go to each item physically to identify it and cross-reference it back to both information systems.
2. Formalised change processes
Once the responsibility for and origin of master data are identified, formalised change processes should be in place to manage changes, which typically fall into three categories: add (or insert), change and delete. It is important that the impact of a change is understood by the person making it and that the change is communicated effectively. A mere change to a staff code in one system can have a dramatic effect on other systems, for example the payroll system!
Quite often we do not want to spend too much time managing these small changes by engaging with all stakeholders. And whilst every effort must be made not to paralyse systems with change management, effective communication of changes will limit data inaccuracies. A practical solution is to keep a list of impacted systems on the master record for reference.
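The two ideas above, the three change categories and the impacted-systems list on the master record, can be sketched together. This is an illustrative Python model, not an implementation from any particular system; the record identifiers and system names are made up.

```python
from dataclasses import dataclass, field
from enum import Enum

class ChangeType(Enum):
    ADD = "add"
    CHANGE = "change"
    DELETE = "delete"

@dataclass
class MasterDataChange:
    change_type: ChangeType
    record_id: str
    requested_by: str
    impacted_systems: list[str]           # kept on the master record for reference
    communicated_to: list[str] = field(default_factory=list)

    def outstanding_notifications(self) -> list[str]:
        """Impacted systems whose owners have not yet been told about the change."""
        return [s for s in self.impacted_systems if s not in self.communicated_to]

# A staff-code change that also affects the payroll system.
change = MasterDataChange(ChangeType.CHANGE, "STAFF-042", "hr.admin",
                          impacted_systems=["Payroll", "CMMS"])
change.communicated_to.append("Payroll")
print(change.outstanding_notifications())  # ['CMMS']
```

Keeping the impacted-systems list on the record itself means the person making the change sees, at the moment of change, exactly who still needs to be informed.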
3. Scheduled assessments
When electronic interfaces are used to effect changes to master data between systems, care must be taken not to rely only on the error notifications of the interfaces. These notifications are typically reviewed by an IT specialist responsible for the interface, who is not directly impacted by the change itself. Here a monthly master data change log report can be a valuable tool to verify what the changes were and that they have been effected.
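A change log report of this kind can be simple. The sketch below assumes the interface records each change with a date, an action and whether it was applied; the entries and field names are illustrative only.

```python
from collections import Counter
from datetime import date

# Hypothetical change-log entries as an interface might capture them.
change_log = [
    {"date": date(2024, 3, 4),  "record": "STAFF-017", "action": "change", "applied": True},
    {"date": date(2024, 3, 12), "record": "ASSET-220", "action": "add",    "applied": True},
    {"date": date(2024, 3, 20), "record": "FIN-5100",  "action": "delete", "applied": False},
]

def monthly_report(log: list[dict], year: int, month: int) -> dict:
    """Summarise one month's master data changes and flag any not yet effected."""
    entries = [e for e in log if e["date"].year == year and e["date"].month == month]
    by_action = Counter(e["action"] for e in entries)
    not_applied = [e["record"] for e in entries if not e["applied"]]
    return {"total": len(entries), "by_action": dict(by_action), "not_applied": not_applied}

print(monthly_report(change_log, 2024, 3))
```

The `not_applied` list is the part worth circulating beyond IT: it tells the data owners, not just the interface specialist, which changes still need attention.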
Beyond managing changes to the master data, it is also important to schedule master data assessments, much as you would assess the condition of the assets or of the tools used by artisans. Master data must be assessed for relevance (do we still require it?), quality (is it usable?) and ownership (who is maintaining it?). Your master data is the foundation on which transactions are performed, and just as you would assess your assets in order to create products, you need to assess your master data to facilitate transactions.
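The three assessment questions, relevance, quality and ownership, lend themselves to automated screening before a human review. A minimal sketch, assuming each master record carries a usage flag, a description and an owner (all illustrative field names):

```python
def assess_record(record: dict) -> list[str]:
    """Flag a master data record against the three assessment questions."""
    findings = []
    if not record.get("used_in_last_24_months"):
        findings.append("relevance: not used recently, consider retiring")
    if not record.get("description"):
        findings.append("quality: missing description")
    if not record.get("owner"):
        findings.append("ownership: no maintaining owner assigned")
    return findings

# Illustrative records only.
records = [
    {"id": "ASSET-001", "description": "Feed pump", "owner": "maintenance",
     "used_in_last_24_months": True},
    {"id": "ASSET-099", "description": "", "owner": None,
     "used_in_last_24_months": False},
]
for r in records:
    print(r["id"], assess_record(r))
```

Such a screen does not replace the scheduled assessment; it simply puts the records most likely to fail it at the top of the reviewer's list.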
In closing, control of master data is crucial to the success of any information system. This is why we spend such a concentrated effort on master data during an implementation of On Key. Consider the cost of losing control of your master data: the time and effort to reconcile data between systems and the truth, and the time wasted completing transactions because of inaccurate master data. Consider also the cost of missed opportunity when analysis of information is based on inaccurate master data. The effort to reconcile master data is always greater than the incremental effort to maintain it.