

Information Management and Data Quality: The Siamese Twins.

According to a Gartner survey of over 140 participants, organizations lose on average £5.1 million every year to poor data quality. That is a phenomenal amount, and it underlines the importance of data within an organization. Although organizations are aware of the importance of quality data, most are oblivious to the impact defective data has on their business. In business terms, the price organizations pay for using poor-quality data is both tangible and severe.
Empower Your Business with Quality Data.
 
A lot has been written and said about data quality. But what is data quality? Data quality is not simply the absence of errors. It is about how relevant the data is to the business, and about the suitability of that data to meet business requirements. Exactly what counts as suitable will vary from organization to organization. However, regardless of the organization, seven key components can be identified as constituents of quality data.
To be useful to an organization, data has to be
·      Accurate,
·      Possess a high level of Integrity,
·      Consistent across the enterprise,
·      Complete,
·      Valid at the point of delivery,
·      Delivered on time, on request, and
·      Readily accessible to the business.
The first five are about content and structure. However, data serves the business no good if it is error-free, or nearly error-free, yet does not serve any core business purpose. Hence, the last two deal with usability and usefulness.
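To make these dimensions concrete, here is a minimal sketch, in Python, of scoring a batch of records against three of them: completeness, validity, and uniqueness (a proxy for integrity). The field names, sample records, and email rule are illustrative assumptions, not a prescribed implementation.

```python
# Minimal sketch: scoring records against three quality dimensions.
# Records, field names, and the email rule are illustrative assumptions.

records = [
    {"customer_id": 1, "email": "ann@example.com"},
    {"customer_id": 2, "email": ""},
    {"customer_id": 2, "email": "bob@example"},   # no dot after the @
]

def completeness(rows, field):
    """Share of rows where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def email_rule(value):
    """Crude validity rule: an @ followed by a dotted domain."""
    return "@" in value and "." in value.split("@")[-1]

def validity(rows, field, rule):
    """Share of non-empty values that satisfy a business rule."""
    values = [r[field] for r in rows if r.get(field) not in (None, "")]
    return sum(1 for v in values if rule(v)) / len(values) if values else 1.0

def uniqueness(rows, field):
    """Share of rows carrying a distinct value for the field."""
    values = [r[field] for r in rows]
    return len(set(values)) / len(values)

print(f"email completeness: {completeness(records, 'email'):.0%}")
print(f"email validity:     {validity(records, 'email', email_rule):.0%}")
print(f"id uniqueness:      {uniqueness(records, 'customer_id'):.0%}")
```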
The following are industry best practices for enhancing data quality.
Launch a Data Quality Program:
The effort to deliver quality data within an organization should start with the launch of a DQ program. Experience from past projects shows that the most successful DQ programs were launched from the top of the organization, with a corporate-level vision serving as a key driver for successful execution.
As part of launching a DQ initiative, an enterprise-wide data stewardship function should be established, with well-defined roles and responsibilities. These may or may not be full-time roles.
Develop a Project Plan:
Following the launch of a DQ program should come the development of a project plan. There have been instances where organizations started a DQ program with no defined metrics for success and no follow-up. A clear plan, scoped with realistic goals and the expected return on investment (ROI), should be laid out and well documented.
Build a Team:
With a plan in hand, suitable human resources should be identified to fill the roles and undertake the required responsibilities.
Review Business Processes:
A review of how data is used within the various business units should also be carried out, and, more importantly, of how shared data is generated and managed. The underlying architecture that supports the data needs reviewing too, so a data audit of all data repositories is necessary, alongside an evaluation of current and future strategic business processes to establish how to proceed. Such a review should reveal, for example, where and how customer records are stored across the various systems.
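As an illustration of one audit step, the sketch below enumerates the tables in a repository and counts their rows. It uses an in-memory SQLite database purely as a stand-in for a real store; an actual audit would run against every production repository, and the table and column names here are illustrative assumptions.

```python
# Minimal sketch of a data audit step: list every table in a repository
# with its row count. SQLite here is only a stand-in for a real store.
import sqlite3

def audit_repository(conn, label):
    """Print each table in the repository alongside its row count."""
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        (count,) = conn.execute(f'SELECT COUNT(*) FROM "{table}"').fetchone()
        print(f"{label}: table {table!r} holds {count} row(s)")

# Demo against an in-memory database standing in for one repository.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "ann@example.com"), (2, None)])
audit_repository(conn, "crm")
conn.close()
```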
Assess Data Quality:
The general purpose of the assessment is to
(1) Identify common data defects,
(2) Create metrics to detect defects as they enter the data warehouse or other systems, and
(3) Create rules or recommend actions for fixing the data.
Be sure to look out for incomplete, inaccurate, missing, or duplicate values.
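A minimal sketch of such an assessment follows: each rule flags one class of defect, and the resulting counts can feed both the detection metrics and the fix recommendations. The rule names, fields, and reference values are illustrative assumptions.

```python
# Minimal sketch of a data quality assessment: count defects per rule.
# Rule names, fields, and reference values are illustrative assumptions.
from collections import Counter

rows = [
    {"id": 1, "name": "Ann", "country": "GB"},
    {"id": 2, "name": "",    "country": "GB"},
    {"id": 2, "name": "Bob", "country": "XX"},
]

row_rules = {
    "missing_name":    lambda r: r["name"] == "",
    "invalid_country": lambda r: r["country"] not in {"GB", "US", "DE"},
}

defects = Counter()
for r in rows:
    for name, rule in row_rules.items():
        if rule(r):
            defects[name] += 1

# Duplicates need the whole set, so they are counted separately.
ids = [r["id"] for r in rows]
defects["duplicate_id"] = len(ids) - len(set(ids))

for name, count in defects.items():
    print(f"{name}: {count} row(s)")
```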
Cleanse Data:
This step involves cleansing the data identified as not fit-for-purpose. Inaccurate data, incomplete data, duplicates, and inconsistent data could all be classified as not fit-for-purpose. Data should be in a standardized form that accurately reflects the business. Robust data quality checks should be inserted as part of the Extract, Transform, Load (ETL) processes. The end implementation could be Master Data Management (MDM), a data integration project, or simply a data cleansing exercise.
It is good practice to have a temporary storage area (staging area), which serves as a buffer between the main storage area and the source applications. Such a design prevents ‘bad’ data from entering the central business storage area and enables effective tracing and auditing.
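The sketch below illustrates such a staging-area gate: records land in staging, only rows that pass validation are standardized and loaded into the central store, and rejects stay behind with their reasons for tracing and auditing. The validation rules and field names are illustrative assumptions, not a full ETL implementation.

```python
# Minimal sketch of a staging-area gate in an ETL flow.
# Validation rules and field names are illustrative assumptions.

def validate(record):
    """Return the reasons, if any, a record is not fit-for-purpose."""
    problems = []
    if not record.get("email"):
        problems.append("missing email")
    if record.get("country", "").upper() not in {"GB", "US", "DE"}:
        problems.append("unknown country code")
    return problems

def standardize(record):
    """Bring fields into the standard form used across the enterprise."""
    record["email"] = record["email"].strip().lower()
    record["country"] = record["country"].upper()
    return record

staging = [
    {"email": " Ann@Example.COM ", "country": "gb"},
    {"email": "",                  "country": "GB"},
]

warehouse, rejects = [], []
for rec in staging:
    issues = validate(rec)
    if issues:
        rejects.append((rec, issues))   # kept back for tracing and auditing
    else:
        warehouse.append(standardize(rec))

print(f"loaded {len(warehouse)} record(s), quarantined {len(rejects)}")
```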
Monitor Data:
A process to continuously monitor the quality of data items should be put in place. Reports should be generated at intervals to give management oversight of the monitoring process. Service level agreements (SLAs) should be reached on what is and is not acceptable, and alerts should be raised once an agreed threshold is breached.
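As an illustration, the sketch below computes a completeness score at each monitoring interval and raises an alert once it falls below the agreed SLA. The 98% threshold and the field being checked are illustrative assumptions; in production the alert would page a data steward or raise a ticket rather than print.

```python
# Minimal sketch of threshold-based monitoring against an agreed SLA.
# The 98% threshold and the monitored field are illustrative assumptions.

SLA_COMPLETENESS = 0.98

def completeness(rows, field):
    """Share of rows where the field is present and non-empty."""
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows) if rows else 1.0

def monitor(rows, field):
    """Check one interval's batch and alert if the SLA is breached."""
    score = completeness(rows, field)
    if score < SLA_COMPLETENESS:
        print(f"ALERT: {field} completeness {score:.1%} "
              f"is below the agreed SLA of {SLA_COMPLETENESS:.0%}")
    return score

monitor([{"email": "ann@example.com"}, {"email": ""}], "email")
```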
Improve Business Processes:
To ensure the sustained delivery of quality data to the business, appropriate practices should be cultivated within the organization. These include educating and training members of staff. The use of standardized codes, definitions, and rules across the enterprise goes a long way toward improving the overall business processes. Also, creating and enforcing referential integrity will prevent unsuitable data from entering the system.
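As an illustration of the last point, the sketch below enforces referential integrity at the point of entry: an order is accepted only if it references a customer known to the customer master. The record contents and field names are illustrative assumptions.

```python
# Minimal sketch of enforcing referential integrity at the gate.
# Record contents and field names are illustrative assumptions.

customer_master = {101, 102, 103}    # ids held in the customer master

def accept_order(order):
    """Admit an order only if its customer reference resolves."""
    if order["customer_id"] not in customer_master:
        raise ValueError(
            f"order {order['order_id']} references unknown "
            f"customer {order['customer_id']}")
    return order

accept_order({"order_id": 1, "customer_id": 101})      # admitted
try:
    accept_order({"order_id": 2, "customer_id": 999})  # rejected
except ValueError as err:
    print(err)
```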
