IQ AND DATA WAREHOUSING: TRENDS AND BEST PRACTICES
This article briefly reviews information quality in data warehousing and the trends that are increasing business intelligence effectiveness through proactive information quality.
Management’s Need for Quality Business Intelligence
Management has long had pent-up needs to understand how well the enterprise is operating. Operational systems support operations processes, but cannot directly support analysis of those operations. Business Intelligence must deliver information in ways that support the strategic and tactical processes of the enterprise, including fact-based management decisions and understanding trends in customers, products, and services.
In the 1970s, organizations created “Management Information” functions to deliver information. In the 1980s, “Information Centers” sprang up, the precursors to modern Business Intelligence environments. Executive Information Systems (EIS) came into being to extract information and present “key business indicators” to executives, leading to today’s dashboards and data warehouses.
Business Intelligence and Information Quality
Early data warehouses exposed huge information quality issues:
- Excessive redundant and disparately defined operational databases created huge efforts and costs in extracting, transforming, and loading the data warehouses.
- Missing, invalid, and inaccurate data caused huge data cleanup activities, but showed that not all of it could be corrected.
- Duplicate customer records, both within and across multiple databases, caused additional high costs in matching, reconciliation, and synchronization of data, and spectacular failures when two different people’s vital information was consolidated.
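Duplicate-record matching of the kind described above is typically attacked with fuzzy string comparison. The following is a minimal sketch, assuming free-text customer records; the normalization rules, the 0.85 threshold, and the sample data are all illustrative assumptions, not from the article:

```python
from difflib import SequenceMatcher

def normalize(record):
    """Lowercase, strip punctuation, and sort tokens so that trivial
    formatting and word-order differences do not hide a duplicate."""
    tokens = record.lower().replace(",", " ").replace(".", " ").split()
    return " ".join(sorted(tokens))

def similarity(a, b):
    """Similarity ratio in [0, 1] between two normalized records."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def probable_duplicates(records, threshold=0.85):
    """Return index pairs whose similarity exceeds the threshold.
    O(n^2) pairwise comparison -- fine for a sketch, not for
    production data volumes."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i], records[j]) >= threshold:
                pairs.append((i, j))
    return pairs

customers = [
    "John Q. Smith, 12 Oak St",
    "Smith, John Q, 12 Oak Street",  # likely the same person
    "Mary Jones, 4 Elm Ave",
]
print(probable_duplicates(customers))  # → [(0, 1)]
```

Real matching engines add blocking keys, phonetic encodings, and field-by-field weights, but the core idea is the same: score candidate pairs and flag those above a threshold for reconciliation.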
One information quality lesson was that my department’s quality information does not equate to enterprise quality information. “My” data may be okay for our department, but could cause “your” processes to fail. Integration efforts simply could not resolve all issues. These data warehousing experiences proved that the common definition of information quality as “fitness for purpose” was not sufficient. Data warehousing shows that for information to be high quality, it must meet the needs of all information consumers, not just those of the “information producers.”
Organizations with these problems are not managing information as a strategic resource. Neither are they applying sound quality management principles to information as a shared product, required by many information consumers.
The Value Proposition for Quality Information
The driver behind Business Intelligence is a simple fact common to all resources, from financial, human, materials and equipment to information. The value of any resource is derived only when we apply the resource or put that resource to work. Money has value only when we spend or invest it wisely; people contribute value when they perform their “value work”; a manufacturing plant has value when it is in production, producing products.
Information has value only when knowledge workers or applications retrieve it and apply it to perform work correctly or make intelligent decisions.
But there’s a catch. Information without quality is dangerous, misleading even thoughtful knowledge workers into wrong decisions.
The Evolution of Information Quality Practices in Business Intelligence
Most early data warehouse “quality” approaches were reactionary, correcting data in the data warehouse or in the staging area before loading. This early and immature data quality approach parallels early quality practices in manufacturing of the Industrial Age. Quality meant putting inspectors at the end of the assembly line to inspect and pull defects off to be reworked or scrapped if they could not be fixed. Hence, we have the “inspect and correct” or “scrap and rework” approach to quality.
Manufacturing quality management matured, replacing the “inspect and correct” approach with a proactive “design quality in” approach. This eliminated the costs of “inspection” (which does not add value) and of “scrap and rework” (a waste of time and materials that could have been saved by producing products correctly in the first place).
Leading-edge organizations are proving that “designing quality in” to processes and operational systems at the source is a more cost-effective way to solve IQ problems. Some of their lessons learned:
- Use the operational data store (ODS) concept to create an enterprise-strength operational database (ODB), so you can reengineer and replace obsolete, disparately defined data structures in a gradual, phased-in approach. This reduces the costs and risks of integration while eliminating a major cause of poor quality information.
- Solve information quality problems in the source processes—not downstream. This prevents inconsistency problems of data in source and target databases.
- Conduct data cleanup as a one-time project for data in a given dataset, and improve the source process and application with error-proofing techniques to prevent defects from recurring.
- Provide training to information producers and their managers, helping them understand their downstream information consumers’ information quality requirements and equipping them with methods and techniques to improve their processes.
- Implement accountability for information quality in every manager’s job description. Provide them training and give them six months to a year to put IQ principles, process improvement and staff training in place. Then hold them accountable.
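The “error-proof the source” lesson above can be sketched as validation at the point of capture, so defects never enter the database and no downstream cleanup is needed. The field names and quality rules below are illustrative assumptions, not rules from the article:

```python
import re

# Quality rules applied at the source. These specific fields and rules
# are hypothetical examples of "designing quality in".
VALIDATORS = {
    "name": lambda v: bool(v and v.strip()),
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
    "country": lambda v: v in {"US", "CA", "GB", "DE"},  # controlled vocabulary
}

def validate(record):
    """Return the list of defective fields; empty means the record
    meets the defined quality rules."""
    return [field for field, ok in VALIDATORS.items() if not ok(record.get(field))]

def create_customer(store, record):
    """Error-proofed capture: store only defect-free records; report
    defects back to the information producer instead of passing them
    downstream for later scrap-and-rework."""
    defects = validate(record)
    if defects:
        raise ValueError(f"rejected at source, defective fields: {defects}")
    store.append(record)

db = []
create_customer(db, {"name": "Ada Lovelace", "email": "ada@example.com", "country": "GB"})
try:
    create_customer(db, {"name": "", "email": "not-an-email", "country": "XX"})
except ValueError as e:
    print(e)        # defect reported to the producer, never stored
print(len(db))      # → 1
```

The design choice is the one the article argues for: the cost of a rejected entry is paid once, at the source, by the person best placed to correct it, rather than repeatedly by every downstream information consumer.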
Organizations today are doing one of two things: (1) building more and more data warehouses without proactively addressing information quality problems; or (2) solving information quality issues at the source, enabling their organization to become “Intelligent Learning Organizations.”