
DATA QUALITY

Data Quality has always been a significant concern in Data Management. At Data Millennium we see Data Quality as being composed of four components:

DATA QUALITY LITERACY

Data can simply be wrong, but it can also be correct yet not useful for a particular use case. Data Quality is therefore sometimes objective and sometimes subjective, which can be very confusing for organizations. We provide training in Data Quality Literacy, which helps data users understand basic Data Quality terminology and concepts.

This training is also important for staff engaged in creating data, as many Data Quality issues can be prevented at input time. However, these staff are often pushed to meet quantity and timeliness goals rather than quality goals. Improved Data Quality Literacy also makes it easier to eliminate root causes that lie with people and processes.

Fig. 1: “Data Quality” can mean different things in different situations. We need to clearly distinguish these two aspects of it (objective correctness and fitness for a particular use case), especially when “Data Quality” gets blamed for business problems.

DATA QUALITY MONITORING TECHNOLOGY IMPLEMENTATION

Our philosophy is that data environments are too complex for simple profiling alone; Data Quality Business Rules Engines are needed for continuous monitoring.

Simple profiling infers Data Quality issues by examining data values and looking for outliers. This definitely has value, and with the rise of AI there is promise that it could become much more accurate.
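
To make this concrete, here is a minimal Python sketch of the kind of check simple profiling performs, flagging values that fall far outside a column's usual range; the column name, data, and outlier rule are illustrative assumptions, not part of any particular profiling tool.

import pandas as pd

def profile_outliers(df: pd.DataFrame, column: str) -> pd.DataFrame:
    # Flag rows whose value lies outside 1.5 x the interquartile range,
    # a common rule of thumb for spotting outliers.
    q1, q3 = df[column].quantile([0.25, 0.75])
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return df[(df[column] < lower) | (df[column] > upper)]

# Illustrative order data: the last amount would be flagged for review.
orders = pd.DataFrame({"order_amount": [100, 120, 95, 110, 98000]})
print(profile_outliers(orders, "order_amount"))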

There are several Data Quality Business Rules Engines on the market today, and we are familiar with most of them. All of them require business users to specify the rules that test for Data Quality. These rules are then converted into an executable form and run. Any exceptions detected are sent to stewards and enter the Data Issue Management Process.
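
As a rough sketch of that flow, the Python example below captures one business rule declaratively, runs it as an executable check, and collects the failing rows as exceptions for a steward; the rule, column names, and data are invented for illustration and do not reflect any specific product.

import pandas as pd

# A business rule captured in a declarative, business-readable form.
rule = {
    "name": "Birth date must not be in the future",
    "column": "birth_date",
    "check": lambda values: values <= pd.Timestamp.today(),  # executable form
}

customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "birth_date": pd.to_datetime(["1980-05-01", "2030-01-01", "1999-12-31"]),
})

# Run the rule; failing rows become exceptions for the Data Issue Management Process.
passed = rule["check"](customers[rule["column"]])
exceptions = customers[~passed].assign(failed_rule=rule["name"])
print(exceptions)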

Implementing a Data Quality Business Rules Engine involves some technical activities, but much of the work is the methodology that surrounds the tool, and a big part of that is formulating and governing Data Quality Business Rules. Data Millennium has a proven approach to governing Data Quality Business Rules that prevents the rules themselves from becoming unknown quantities that nobody trusts and nobody is sure are needed. We ensure rules have a well-defined, business-understandable form, carry sufficient metadata to manage them, and are kept up to date.
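
As one illustration of the kind of metadata that keeps rules manageable, the sketch below attaches an identifier, a plain-language definition, an owner, a status, and a review date to a rule; these particular fields are assumptions chosen for the example rather than a prescribed standard.

from dataclasses import dataclass
from datetime import date

@dataclass
class GovernedRule:
    # Illustrative governance metadata for a Data Quality Business Rule.
    rule_id: str
    business_definition: str   # well-defined, business-understandable form
    owner: str                 # steward accountable for the rule
    status: str                # e.g. "active", "under review", "retired"
    last_reviewed: date        # when the rule was last confirmed as still needed

rule = GovernedRule(
    rule_id="DQ-0042",
    business_definition="Every invoice must reference an existing customer.",
    owner="Finance Data Steward",
    status="active",
    last_reviewed=date(2024, 6, 1),
)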

DATA ISSUE MANAGEMENT PROCESSES

Once a data exception is found, there must be a standard process that takes it to the point where a resolution is agreed. Data Millennium has a number of patterns for such processes, and we can select and customize one for your enterprise. Tooling is less of a concern in this area, but some enterprises do want to use workflow tools for Data Issue Management, and our patterns can be adapted to them.
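
To illustrate what such a process can look like, the sketch below models a data issue moving through a small set of states until a resolution is agreed; the state names and transitions are hypothetical and do not represent a specific Data Millennium pattern.

from enum import Enum

class IssueState(Enum):
    RAISED = "raised"                        # exception reported by monitoring
    TRIAGED = "triaged"                      # steward confirms it is a real issue
    ROOT_CAUSE_FOUND = "root cause found"
    RESOLUTION_AGREED = "resolution agreed"  # hand-off to Data Change Management

# Allowed transitions in this illustrative workflow.
TRANSITIONS = {
    IssueState.RAISED: {IssueState.TRIAGED},
    IssueState.TRIAGED: {IssueState.ROOT_CAUSE_FOUND},
    IssueState.ROOT_CAUSE_FOUND: {IssueState.RESOLUTION_AGREED},
    IssueState.RESOLUTION_AGREED: set(),
}

def advance(current: IssueState, target: IssueState) -> IssueState:
    # Move an issue to the next state, rejecting invalid jumps.
    if target not in TRANSITIONS[current]:
        raise ValueError(f"Cannot move from {current.value} to {target.value}")
    return target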

Fig. 2: Data Quality consists of these four parts. An overall Data Quality Program must ensure they are all implemented.

DATA CHANGE MANAGEMENT

Once a resolution for a Data Quality issue has been decided, it needs to be implemented. This can involve many different kinds of planning and coordination, many of them not technical. Regulatory and compliance needs must also be considered at this point. A robust Data Change Management capability is important so that the change itself does not create new issues. Again, we can help enterprises establish this capability.
