eGovernment for Development
eHealth Case Study No.7: Design-Reality Gaps
Computerising a Central Asian Epidemiology Service
Case Study Authors
Valeriya Krasnikova and Richard Heeks
Organisation
The national Epidemiology Service provides plans, reports and programmes about a variety of disease and public health issues.
Application Description
The application was the planned introduction of computers into the Epidemiology Service to replace the previously-manual processes of gathering, processing, storing and reporting disease and public health data. The main software was a series of packages for registration and analysis of various specific diseases and public health risks. These created a single common computer-based information system with local, regional and national databases that relied on common data items.
Application Drivers
Within the Epidemiology Service, there was a general awareness that the existing information systems did not allow the Service to monitor and analyse current health trends properly, or to make public health decisions in an effective and timely manner. Thus, top managers within the Service were a key driving force behind the application as they sought improvements in the Service's performance. There was also general external support for the programme, with the President wedded to modernisation of the public sector, and with citizens - becoming used to growing democratic processes in the country - voicing demands for better information and better health services more generally.
Stakeholders
The project was initiated by senior managers of the Epidemiology Service, and agreed by staff of the Ministry of Health. Key users were most staff at all levels within the Epidemiology Service - managers, health specialists, statistical specialists and (later) information systems personnel - as well as external users in various ministries, local authorities, research institutions and international organisations. Ordinary citizens were the ultimate source of much of the data, and also the ultimate intended beneficiaries of the project.
Design-Reality Gap Analysis
Design-reality gap analysis compares the assumptions/requirements within the application design with the reality pertaining just before that design was implemented along seven 'ITPOSMO' dimensions:
- Information : the design built on the pre-existing data items and systems within the Epidemiology Service but altered these to create an updated and rationalised set of data classes, and a new set of data capture forms. This created a medium design-reality gap on this dimension. Gap score: 5.
- Technology : the design assumed the use of a broad range of software and hardware, with two PCs per statistical department in all the offices of the Service. The initial reality was manual operations: paper supported by typewriters, phone, fax and post. This created a large design-reality gap on this dimension. Gap score: 8.5.
- Processes : the design assumed automation of pre-existing Epidemiology Service processes with some amendments made to the way in which data was gathered, processed, stored and output - with many previously human processes (including checking and retyping of figures) being altered to computerised processes. Almost all of the key public health decisions were to remain as they did prior to computerisation (only assumed to be made more efficiently and effectively). This created a medium design-reality gap on this dimension. Gap score: 5.
- Objectives and values : the design assumed that the objectives of the project (automation of processes, better decision-making) were shared by all stakeholders. In reality, prior to computerisation, most senior officers supported these objectives, since it was they who had initiated the project. However, most staff within the statistical departments initially opposed the system; they feared changes in their working patterns and they feared job losses. Overall, there was a medium design-reality gap on this dimension. Gap score: 5.
- Staffing and skills : the design made two significant assumptions. First, it assumed the ongoing presence during and after implementation of a cadre of staff with strong IT and information systems skills. The initial reality was that no such staff existed in the Epidemiology Service. Second, it assumed a reduction by 50% in the numbers of staff within the statistical departments since human intervention in many data-handling processes would no longer be required. Clearly, this was significantly different from the initial reality before computerisation. The design also assumed the addition of some new skills, but no other changes in large numbers of jobs within the Service. Overall, this created a medium/large design-reality gap on this dimension. Gap score: 7.
- Management systems and structures : the design proposed some cosmetic changes to pre-existing structures, with the statistical departments being renamed information systems departments. Otherwise, though, the Epidemiology Service's systems and structures were designed to remain as they were at the point of initial reality. Overall, this created a small design-reality gap on this dimension. Gap score: 1.5.
- Other resources : the overall design allowed a fairly generous project timescale - approximately two years - which mapped well onto the availability of personnel. The design budget called for expenditure of around US$1.1m, but this was expected to be roughly matched by staff cost savings in the statistical departments. This meant an overall small design-reality gap on this dimension. Gap score: 1.5.
- Overall : there was an overall medium design-reality gap in this case. Total gap score: 33.5.
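The per-dimension scores above can be tallied into the overall gap score. The sketch below assumes the common ITPOSMO convention of rating each dimension from 0 (no gap) to 10 (complete gap); the size labels and their thresholds are illustrative assumptions for this sketch, not figures from the case study itself.

```python
# Gap scores for the seven ITPOSMO dimensions, as given in this case.
scores = {
    "Information": 5.0,
    "Technology": 8.5,
    "Processes": 5.0,
    "Objectives and values": 5.0,
    "Staffing and skills": 7.0,
    "Management systems and structures": 1.5,
    "Other resources": 1.5,
}

def gap_size(score: float) -> str:
    """Label a single dimension's gap; thresholds are illustrative only."""
    if score < 3.5:
        return "small"
    if score < 6.5:
        return "medium"
    return "large"

total = sum(scores.values())
for dimension, score in scores.items():
    print(f"{dimension}: {score} ({gap_size(score)})")
print(f"Total gap score: {total} out of {len(scores) * 10}")
```

Run against the case's figures, the total comes to 33.5 out of a possible 70, matching the overall medium gap reported above.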
Design-Reality Gap Reductions During Implementation
A medium/33.5-score design-reality gap may well bring some problems for an e-health application.
However, the Epidemiology Service took actions during the implementation process that helped to reduce the size of the gaps:
- The automated system was prototyped with various groups of staff. This brought design elements closer to those staff members' real objectives and vision; it actively involved staff in the project, bringing some of their real objectives and values regarding the system more into line with those anticipated within the design; and it reduced the time required to implement the package, bringing it closer to the time actually available.
- The Service's Human Resource department organised a promotional campaign for the new system, informing all staff about its function and value. It created a set of personal reward/motivation plans, and prepared alternative job positions for displaced staff who did not wish to leave the Service. All of these helped shift the reality of staff objectives and values about the system - particularly among staff within the statistical departments - more into line with those required within the automated system's design.
- The HR department also organised a series of training activities, and supported production of a good set of guidance documentation on the system. Both of these brought the reality of staff skills and other competencies more into line with the design requirements.
As a result, what was judged to be a medium design-reality gap at the start of the implementation process had been reduced to a small/medium design-reality gap later in the implementation process.
Evaluation: Failure or Success?
The overall computerisation project was largely successful, as might well be expected with a small/medium final design-reality gap implemented over two years.
The project was completed within time and within budget. All of the installed software systems are in frequent use within the Epidemiology Service itself, though usage rates by external users are lower. There has been a small but consistent and significant increase in usage of data provided by the Service. Perhaps most importantly, the system can be credited with a key role in disease control. For example, shortly after the system's introduction in 1997 a rise in diphtheria cases was detected via the system. Coverage of the vaccination programme was strengthened, and revaccination was organised. By 2000, coverage levels had risen from an average 88% to 99%, and diphtheria case levels had returned to their historical norm. Such actions were possible with the manual system, but automation reduced the decision-making period from 15 to 2 days, and also helped cut costs by allowing better prioritisation, planning and targeting of vaccination.
The only reported problems faced by the project have been constraints due to some rather outdated PCs being included in the installation, and conflicts that initially arose between health/epidemiological staff and IS staff, including conflicts at management level. Of course, those staff who were displaced from the statistical departments might well have a different opinion about the application, and not rate it as 'largely successful' from their perspective.
Recommendations: Reducing Design-Reality Gaps
One of the good practices of this project was that it was participative, ensuring that the design and implementation process involved a broad range of stakeholders. Information needs analysis covered managers, statistical officers, and external users to ensure a small gap between designed and actual information needs. The project was guided by a mixed team of epidemiological and information systems specialists, ensuring that design elements such as processes or skill requirements did not fall too far out of line with existing realities in the Epidemiology Service.
The project - if sensitised to the socio-technical rather than purely technical nature of information systems - could have anticipated some of the objectives/values problems. If so, it could have introduced from the start the improvisations introduced during implementation by the HR department.
Case Details
Author Data Sources/Role : Documents and Interviews; No Direct Role
Outcome : Largely Successful.
Region : Central Asia. Start Date : 1995. Submission Date : January 2003.