Continuous Process Improvement, Lean Improvement, and Six Sigma efforts can only be as successful and as reliable as the quality of the data they use!
Data Quality is a pernicious, persistent, and widespread problem in every organization. On the surface, reports look neat, polished, and reliable, but the data they rely on can often be of widely varying quality. Even if the backend enterprise information systems are reliable, well established, and have been running for some time, much of the Information people actually use may come from Data produced by manually maintained, Excel-spreadsheet-based skunkworks reporting systems!
There are some simple ways to apply the same techniques you use for Process Improvement to improving Data Quality itself.
The first of these is to apply the Six Sigma techniques you use for Process Improvement to the data itself. A good starting point is to apply Pareto's Law (the 80/20 rule) to narrow down the key pieces of data that matter most to the process improvement task at hand. For example, in a Business Process, Productivity may be the Key Performance Indicator (KPI) that matters most, rather than another KPI like Absenteeism or Employee Turnover. Focusing on the Productivity KPI alone, and on the Data that goes into its calculation, may be the way to go.
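As a rough illustration, a Pareto cut like this can be sketched in a few lines of Python. The KPI field names and error counts below are hypothetical, made up purely to show the mechanics of finding the "vital few":

```python
# Minimal Pareto (80/20) sketch: which data fields account for most of the
# data-quality issues? All counts here are hypothetical, for illustration.
error_counts = {
    "productivity": 120,
    "absenteeism": 30,
    "turnover": 25,
    "headcount": 15,
    "overtime": 10,
}

total = sum(error_counts.values())
ranked = sorted(error_counts.items(), key=lambda kv: kv[1], reverse=True)

# Walk down the ranked list until roughly 80% of all errors are covered.
vital_few, cumulative = [], 0
for field, count in ranked:
    vital_few.append(field)
    cumulative += count
    if cumulative / total >= 0.8:
        break

print(vital_few)  # the handful of fields worth fixing first
```

With these made-up counts, three of the five fields cover over 80% of the errors, so cleanup effort would concentrate there first.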
Not all data are equal. Some data may be more equal than others for your process improvement purposes! Concentrating on only those may be the most pragmatic way to go!
Tracking errors in the Data of interest over a period of time, reducing them to a minimum, and, more importantly, reducing the variation in data quality from period to period may be important. Wild variations in data quality make the data that much more unreliable.
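One simple way to track this variation, borrowing the control-chart idea from Six Sigma, is to compute 3-sigma limits on the period-to-period error rate and flag any period that falls outside them. The weekly error rates below are invented for illustration:

```python
import statistics

# Hypothetical weekly data-quality error rates (% of records with an error).
weekly_error_rate = [2.1, 1.8, 2.4, 2.0, 2.2, 1.9, 2.3, 2.0, 12.0, 2.1, 1.8, 2.2]

mean = statistics.mean(weekly_error_rate)
sigma = statistics.stdev(weekly_error_rate)
ucl = mean + 3 * sigma              # upper control limit
lcl = max(mean - 3 * sigma, 0.0)    # lower control limit (rates can't go below 0)

# Weeks outside the control limits signal a special cause worth investigating.
out_of_control = [
    (week, rate)
    for week, rate in enumerate(weekly_error_rate, start=1)
    if rate > ucl or rate < lcl
]
print(out_of_control)
```

Here the spike in week 9 lands outside the limits, which is exactly the kind of wild variation that should trigger a closer look before the data is trusted.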
Once the problematic areas are identified, it makes sense to do Root Cause Analysis on the sources of the data and the methods by which it is created. The root cause could be a people issue or a technology/software issue. Figuring out where it lies goes a long way toward fixing the disease rather than the symptoms!
Monitoring Data Quality is important in making sure that your own observations before and after process improvement have validity and reliability, and that you are not deluding yourself with faulty data in the first place!
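A minimal sketch of such monitoring is a set of validity rules run over each batch of records before the numbers are trusted. The record fields and rules below are hypothetical examples, not a prescribed schema:

```python
# Rule-based data-quality check: flag records that fail basic validity rules.
# Fields and thresholds are hypothetical, for illustration only.
records = [
    {"employee_id": "E01", "hours_worked": 38, "units_produced": 410},
    {"employee_id": "E02", "hours_worked": -5, "units_produced": 395},  # bad hours
    {"employee_id": "",    "hours_worked": 40, "units_produced": 402},  # missing id
]

rules = {
    "employee_id": lambda r: bool(r["employee_id"]),
    "hours_worked": lambda r: 0 <= r["hours_worked"] <= 80,
    "units_produced": lambda r: r["units_produced"] >= 0,
}

# Collect (record index, failing field) pairs.
failures = []
for i, record in enumerate(records):
    for field, rule in rules.items():
        if not rule(record):
            failures.append((i, field))

# A simple quality metric: share of field-level checks that failed.
error_rate = len(failures) / (len(records) * len(rules))
print(failures, round(error_rate, 3))
```

Running checks like these on every reporting period, and charting the resulting error rate over time, ties the monitoring step back to the error-tracking idea above.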
Two men were examining the output of the new computer in their department. After an hour or so of analyzing the data, one of them remarked: "Do you realize it would take 400 men at least 250 years to make a mistake this big?" – Anonymous