In the movie WarGames, set in the U.S. Strategic Air Command Center under a mountain near Colorado Springs, there is a critical scene: on a huge map of the globe, arrows indicate large numbers of nuclear missiles approaching the U.S. from the Soviet Union. The military official in charge is trying to decide whether to ask the President for permission to launch a retaliatory attack, because the technicians are telling him the threat is real. At this point, the scientist who originally designed the system (and who knows that something is wrong with the system itself) says, "General, what you see on that board is not reality, it is a computer-generated hallucination!"
Recently, a former Director of the CIA revealed that a real-life version of this fictional scenario was actually played out when a test tape was inadvertently installed and the screen at a similar command center warned of a similar nuclear attack. As computers play increasingly important roles in the real world -- a world in which computer-generated outputs often present a picture of the real world for critical activities -- it is increasingly vital that the pictures being displayed are correct!
In the mid-1970s, a number of my colleagues and I developed a model for information systems that predicted: (1) major computer systems problems involved with making the transition to the year 2000; (2) data quality difficulties in many operational systems being developed at the time; and (3) fundamental issues involved in the accuracy of confidential/secret data.
The theory that allowed us to formulate our predictions involved viewing information systems as subsystems embedded in a larger framework of a real-world feedback-control system (FCS) (Figure 1). Two observations from our work caused us to look at information systems this way: (1) all of the information systems we developed operated in a larger, goal-seeking, organizational environment, and (2) those systems that failed to take into account a larger FCS context were difficult to operate, and their outputs were difficult to reconcile with the real world. We began to see that the data and data quality in our information systems did not exist in a vacuum. As a result, we began to explore the implications of a true systems (cybernetic) model for information systems.
[FIGURE 1 ILLUSTRATION OMITTED]
The principal role of an information system is to present views of the real world so that the people in an organization can create products or make decisions. If those views do not substantially agree with the real world for any extended period of time, then the system is a poor one, and, ultimately, like a delusional psychotic, the organization will begin to act irrationally.
From the FCS standpoint, data quality is actually quite easy to define. Data quality is the measure of the agreement between the data views presented by an information system and that same data in the real world. A system's data quality of 100% would indicate, for example, that our data views are in perfect agreement with the real world, whereas a data quality rating of 0% would indicate no agreement at all.
Now, no serious information system has data quality of 100%. The real concern with data quality is to ensure not that the data quality is perfect, but that the quality of the data in our information systems is accurate enough, timely enough, and consistent enough for the organization to survive and make reasonable decisions.
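The agreement measure described above can be made concrete. The following is a minimal sketch, not from the article, of one way to compute such a score: the fraction of field values in the system's view that match independently verified real-world values. The record and field names are hypothetical, and real audits would also weight fields by importance and account for timeliness.

```python
def data_quality(system_records, real_world_records):
    """Return the agreement between system data views and verified
    real-world values, as a fraction between 0.0 and 1.0."""
    total = 0
    matches = 0
    # Compare each field of each verified real-world record
    # against the corresponding value in the system's view.
    for key, real in real_world_records.items():
        view = system_records.get(key, {})
        for field, true_value in real.items():
            total += 1
            if view.get(field) == true_value:
                matches += 1
    return matches / total if total else 0.0


# Hypothetical example: the system's view vs. an audited real-world sample.
system = {
    "cust-001": {"address": "12 Elm St", "status": "active"},
    "cust-002": {"address": "9 Oak Ave", "status": "active"},
}
real_world = {
    "cust-001": {"address": "12 Elm St", "status": "active"},
    "cust-002": {"address": "9 Oak Ave", "status": "closed"},  # stale status
}

print(data_quality(system, real_world))  # 3 of 4 fields agree -> 0.75
```

A score of 0.75 here reflects the article's point: the question is not whether the number reaches 1.0, but whether it stays high enough for the organization to make reasonable decisions.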
Ultimately, the real difficulty with data quality is change. Data …