
New program aims to improve TQM data

Total Quality Management (TQM) programs to improve productivity and customer satisfaction often overlook the quality of the data on which TQM decisions are based, according to Sloan School of Management researchers.

Professors Richard Wang and Stuart Madnick believe new tools and concepts are needed for businesses to deliver timely, cost-effective information. They have initiated the Total Data Quality Management (TDQM) program with the goal of defining a theoretical foundation for data quality research and identifying improvement methods for business and industry.

The program links the MIT Information Technology group, the program's founding industry partners (Fujitsu Personal Systems, Inc., and Bull HN Information Systems, Inc.), and related industry-specific MIT research programs, including the Leaders for Manufacturing Program, the International Financial Services Research Center, and the Center for Transportation Studies.

The TDQM team has already completed more than 10 papers on topics such as defining, measuring, analyzing, and improving data quality. One effort, involving a survey of data consumers, surprisingly showed that users consider the believability, value added, and relevancy of data to be even more important than accuracy when making strategic decisions.

Another effort involves developing quality measures for reports based on raw data that are not defect-free. A data-quality calculus has been developed to assess the quality of such derived data. The calculus computes the quality of the result, allowing the user to judge the risk involved in using the data and to evaluate alternative quality-control strategies.
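
One simple way to picture such a calculus, offered here purely as an illustrative sketch rather than the researchers' actual formulation, is to attach to each source attribute an estimated probability of being correct and, assuming independent errors, multiply those probabilities for any value derived from several sources:

    # Hypothetical sketch of a data-quality calculus. The accuracy
    # figures and the independence assumption are illustrative; the
    # article does not describe the TDQM team's actual formulas.
    from math import prod

    def derived_quality(source_qualities):
        # A derived value is treated as correct only when every one
        # of its (independent) inputs is correct.
        return prod(source_qualities)

    # A report field built from three raw attributes with estimated
    # accuracies of 99%, 95%, and 90%:
    q = derived_quality([0.99, 0.95, 0.90])
    print(f"estimated quality of derived value: {q:.3f}")  # ~0.846

A figure like this lets a data consumer weigh the risk of acting on the report, or compare quality-control strategies such as cleaning the least accurate source first.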

"The TDQM project is a unique research effort that has both a long-term and a short-term focus," said Dr. Wang, codirector of the TDQM research program and assistant professor of management science.

"In the long term, our goal is to create a theory of data quality based on reference disciplines such as computer science, organizational behavior, statistics, accounting, and total quality management. This theory of data quality, in turn, may serve as a foundation for other research efforts that require quality information. In the short term, our goal is to create a center of excellence among practitioners of data quality techniques and to act as a clearinghouse for effective techniques and methods."

The major components of the TDQM research framework, Dr. Wang said, are definition, analysis, and improvement. Definition focuses on defining and measuring data quality. Analysis determines the impact of poor-quality data and the benefit of high-quality information on an organization's effectiveness. Improvement involves redesigning the business and implementing new technologies to significantly upgrade the quality of corporate information.
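
As a concrete illustration of the definition component, quality dimensions such as completeness and validity can be expressed as measurable fractions over a batch of records. The field names and rules below are assumptions made for this sketch, not measures published by the program:

    # Hypothetical measurements for two common data-quality dimensions.
    records = [
        {"customer": "Acme", "address": "1 Main St", "balance": 120.0},
        {"customer": "Bolt", "address": None, "balance": -5.0},
    ]

    def completeness(recs, field):
        # Fraction of records with a non-missing value for the field.
        return sum(r[field] is not None for r in recs) / len(recs)

    def validity(recs, field, rule):
        # Fraction of records whose field value satisfies the rule.
        return sum(rule(r[field]) for r in recs) / len(recs)

    print(completeness(records, "address"))  # 0.5
    print(validity(records, "balance",
                   lambda b: b is not None and b >= 0))  # 0.5

Scores like these give the analysis and improvement steps a baseline against which redesigned processes can be compared.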

Dr. Madnick, John Norris Maguire Professor of Information Technology and Leaders for Manufacturing Professor, who is the other codirector of the TDQM research program, cited a case study in which a transportation company determined that almost 25 percent of its deliveries were delayed due to erroneous, missing, mistrusted, or inappropriate data. This resulted in unnecessary redeliveries, avoidable compensation payments, and dissatisfied and lost customers. The company instituted a comprehensive data quality improvement program, which included advanced data collection techniques. Enhanced data quality resulted in more effective deliveries, estimated cost savings of several million dollars, and the reversal of declining market share.

Lou Panetta, executive vice president of marketing and sales at Fujitsu Personal Systems, said, "Total data quality management of today's corporations requires 100 percent accurate data input from the field sales, service, and support organizations. Erroneous data entering the organization compounds itself as decisions are made based on inaccurate information."

Dr. Robb Wilmot, Fujitsu Personal Systems chairman, agreed: "The prototypical chairman's statement in an annual report mentions 100 percent customer satisfaction by the third paragraph, if not sooner, with no explicit recognition in the enterprise that the data with which this goal is to be achieved is typically between 5 and 10 percent defective. If you analyze the supply-chain costs in a large company, it is not uncommon to find that half of the total cost is rework caused by defective data, with untold competitive costs."

Daryll Wartluft, vice president of Bull Information Systems, noted that "the TDQM effort is an important part of our relationship with MIT and ties in well with Bull's worldwide commitment both to our own internal quality and to assisting our customers in better understanding and improving their information quality."

The research team will disseminate results to sponsors at seminars and symposia, Professor Wang said. Sponsors will participate in on-site research and case-study development. Sam Levine, a leading practitioner in data-quality management, is sponsor liaison for the program. He said that one of TDQM's first activities would be a symposium with corporate sponsors to discuss the business costs of poor-quality data and the benefits that can be realized through using high-quality information. For information on the TDQM program, contact Professor Wang at x3-0442 or write to the TDQM Research Program, Rm E53-317.

A version of this article appeared in the April 14, 1993 issue of MIT Tech Talk (Volume 37, Number 29).
