The measures used within the work of the IPC program are designed primarily to help sites test the degree to which their ideas for change result in the predicted improvement. Usually, this is done in the context of a series of small tests of change called Plan-Do-Study-Act cycles. By sequentially testing ideas for change under different circumstances, knowledge is built regarding which changes result in the most improvement.
This type of measurement evaluates ideas for change; it does not evaluate whole programs and their system-wide effects over long periods of time. Nor does this approach answer questions about the merit, worth, and significance of the program for the whole system, or address long-term health outcomes, costs, and patient and employee satisfaction. To answer these types of questions, the IPC Evaluation Team, a working group of technical experts from throughout the Indian health system, was convened in 2008. Evaluation efforts can be divided into two phases: the evaluation of IPC 2 and the evaluation of IPC 3.
After convening a meeting of experts in the evaluation of quality improvement in 2008, the Evaluation Team designed a least-cost evaluation strategy using existing data and qualitative inquiry. The aims of the evaluation were:
Aims A and B are primarily addressed through the analysis of existing data sources, which includes:
Quantitative data analyses continue as data for the follow-up periods become available. The IPC 2 sites that have continued in the Learning Network can be followed into the future. Aim C items were addressed through qualitative interviews of staff and managers at selected IPC sites during 2010. This analysis has been completed.
As the IPC program has expanded to IPC 3, a more extensive evaluation is required to respond to the additional information needs of stakeholders and decision makers. With encouragement from IHS headquarters, the Evaluation Team chose to plan the evaluation of IPC 3 using the Framework for Program Evaluation in Public Health devised by the Centers for Disease Control and Prevention. The CDC framework draws from evaluation practice in many fields, from health care to education to criminal justice. The framework-inspired steps for developing and implementing a quality evaluation include:
In this approach, a committee of stakeholders who will make decisions about the program’s future, or who are affected by the program (staff, patients, community leaders), will work together with the evaluation team to plan the evaluation and to interpret and communicate the results. Drawing from a preliminary logic model, the stakeholder committee will decide on the priority questions the evaluation will answer. The qualitative and quantitative methods for the evaluation will then be based on the priority evaluation questions and the relevant measures suggested by the logic model. Relevant measures will be selected from:
Within the limits of the budget, the evaluation of IPC 3 will be open to both quantitative and qualitative methods, and will collect original data from patients and charts. A variety of methodologies will be employed from the fields of health services research, social science evaluation, applied economics, applied anthropology, and epidemiology.
The evaluation effort is accountable to the Office of Clinical and Preventive Services (OCPS). This effort is being planned and partially executed by the Evaluation Team and external academic and consulting partners coordinated by representatives from OCPS.
To help plan the evaluation effort and to interpret and communicate the findings, an evaluation stakeholder committee has been convened. Evaluation stakeholders represent those who will use the findings of the evaluation to make decisions about the program’s future and those affected by the evaluation, namely staff, patients, and tribal leaders.
An evaluation liaison will be selected from each IPC team; through this liaison, the Evaluation Team will interface with IPC sites to collect additional information, procure permissions, and communicate findings. Academic and consulting partners will continue to be important players in the evaluation effort. Whether brought into the effort by funding mechanisms (such as contracts, cooperative agreements, and grants) or as volunteer advisors, external partners will serve on the Evaluation Team.