Process Evaluation

In many cases it's the funder that requires evaluation evidence, so one important improvement could be to identify the most efficient and cost-effective way to gather such data. Alternatively, you might be interested in a different approach that digs a little deeper and explores complementary aspects you're less sure about. For this reason we've tried to include a wide range of tools and techniques for evaluating outreach activities, to suit different needs. The data collection tools we've included are well suited to summative evaluation (e.g. at the end of an activity or event), but can also be useful in formative evaluation (e.g. informing the development of future activities). In Phase 1, data should be collected from a range of different actors within the system.
You might also consider the budgetary and time resources available, and whether there are any ethical considerations you need to take into account. Studying fidelity also involves exploring whether intervention providers have added components to, or subtracted components from, the original intervention design. Such modification may be influenced by providers' training or experiential backgrounds, or by their response to perceived client need, and can have critical positive or negative effects on intervention effectiveness. To tell your story effectively, it's critical to consider what you want to communicate about the results or processes of your project, which audiences are most important to reach, and which methods are most appropriate for disseminating your information. Descriptive analysis of weekly logs focused on which strategies were implemented; evidence from meeting notes was again analysed using NPT, assisted by NVivo as above.
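To make that kind of descriptive analysis concrete, here is a minimal Python sketch that tallies which strategies appear in a set of weekly logs. The log structure and strategy names are hypothetical illustrations, not the study's actual data; the NPT/NVivo analysis of meeting notes is a separate qualitative step not shown here.

```python
from collections import Counter

# Hypothetical weekly logs: each entry lists the strategies a site
# reported implementing that week (names are illustrative only).
weekly_logs = [
    {"week": 1, "strategies": ["staff training", "daily logs"]},
    {"week": 2, "strategies": ["daily logs", "weekly review"]},
    {"week": 3, "strategies": ["daily logs", "staff training"]},
]

# Descriptive analysis: how often was each strategy implemented?
counts = Counter(s for log in weekly_logs for s in log["strategies"])
for strategy, n in counts.most_common():
    print(f"{strategy}: implemented in {n} of {len(weekly_logs)} weeks")
```

Counts like these answer the descriptive question of what was implemented and how often; the qualitative NPT analysis then helps explain why.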

What theory to use

Specific steps in the evaluation are shown in the squares, with directions and prompts to the evaluators at each step provided in italics. The figure underscores the ways in which the outputs of Phase 1 inform the direction and scope of inquiry during Phase 2. Table 2 also shows the role of qualitative methods in a process evaluation and how these map onto the application of concepts from systems thinking and complexity science. No tools currently exist to assess the quality of process evaluations informed by a complex systems perspective; therefore, for this review, we critically appraised how systems thinking and complexity science were employed in each paper.
This included existing frameworks such as the Cynefin framework [48], the Consolidated Framework for Implementation Research [59], a complex adaptive systems framework [54,62], and the socioecological model [41]. Further analysis of observations and interviews with people involved in the intervention is required to gain a field-based understanding of how the intervention evolved, the mechanisms triggering its effects, and actors' perceptions of which elements or moments were crucial in that evolution. Evaluations must therefore be interdisciplinary, combining techniques and methods from a range of sources including project management, anthropology, psychology and the clinical sciences. The process evaluations highlighted a series of implications for projects and policies. For the LHW intervention, a supportive and well-resourced clinic environment, strong management of the Primary Health Care facilities, and motivated staff who relate well to patients were identified as fundamental for successful task shifting. A number of process evaluations also highlighted the importance of continued training, communication and programmatic support.

Process and Outcome Evaluation: A Brief Overview

It must be stressed that the design of a process evaluation requires flexibility: to adapt to time and budget constraints, to creatively use the different kinds of information that may be available, and to respond to changes in project implementation and in the changing environment in which the project operates. For example, a general election may bring in a new government with different priorities for the project, or changing migration patterns or political unrest might affect the design or implementation of the project, or the attitude of the target population. The six-step approach we describe in this blog (see Figure 2) should therefore be considered a design framework that must be adapted to each project context. The findings from most process evaluations demonstrate both the importance and the challenges of adapting initial research plans to accommodate the constraints of a (low-resource) context. Detailed discussions are required to understand the context and the expectations of local stakeholders.
We believe the current approach facilitated more in-depth analyses, thereby enriching the findings of this study. A substantial challenge for implementation researchers is to understand if, why and how an intervention has worked in a real-world context, and to explain why an intervention that systematic testing has shown to be effective in one context may or may not be effective in another context or setting [1]. Process evaluation provides a means by which researchers can explain the outcomes of complex interventions, which often have nonlinear implementation processes.

Process Evaluation vs. Outcome Evaluation

Evaluators may find a number of different data collection methods useful, including, but not limited to, an initial documentary review, interviews, and workshops. The boundary decision and the identification of system elements will inform from whom data are collected and through which methods [14]. Qualitative methods can be used to explore processes in more depth, such as participant perceptions of the intervention and how the intervention and its context interact with each other. These might involve qualitative analyses (for example, thematic analyses) of interviews with participants or with those who delivered the intervention.

Using a mixed-methods approach deepens understanding by providing different perspectives, and enables validation and triangulation across multiple sources [2, 26]. Qualitative analysis enables exploration of the acceptability of an intervention, how it worked, and why [27]. Process evaluation is increasingly recognized as an important component of effective implementation research, and yet there has been surprisingly little work to establish what constitutes best practice. Researchers use different methodologies to describe causal pathways and to understand barriers and facilitators to implementing interventions in diverse contexts and settings.

  • Over-emphasizing outcome evaluation at the cost of other types, especially process evaluation, is a disservice to nonprofits and the sector.
  • Process evaluations, like other theory-based approaches to evaluation (such as realist evaluation and 'theory of change' approaches), investigate the underpinning theory of the intervention.
  • This formative research is aimed at assessing the major components of the intervention and how these components should vary among and between people, countries and cultures.
  • The remainder of this section elaborates on what is meant by complex systems and process evaluations and discusses why qualitative methods are a particular area of interest for public health evaluators interested in complex systems.
  • Finally, further work remains on the ways in which realist and mixed methods approaches can more explicitly contribute to a process evaluation from a complex systems perspective, but it is beyond the scope of this current review.

However, potential moderators of adherence, such as intervention complexity, strategies used to facilitate implementation, quality of delivery and participant responsiveness [172], also need consideration. Several of the identified process evaluations were conducted alongside or after impact/outcome evaluations of the same intervention. Knai and colleagues integrated data from several evaluative strands, including impact and process evaluations [63]. Five studies reported accompanying outcome evaluations, but those results were not presented alongside the process evaluation reports [43,44,46,47,59,64]. Three studies presented outcome data alongside their process evaluations [50–52,60,61]. Finally, two papers reported independent outcome evaluations that were not linked to their own process evaluations [49,58].

Initially there were regular differences between the two coders in the allocation of some codes; as differences arose, the main code was discussed and agreed. Cross-site summaries were made for each NPT dimension by merging the site summaries, so that the range of views expressed across all sites, and the main agreements and differences between sites, could be identified. The number of sites contributing to a particular finding is indicated in brackets at the end of the sentence, for example 'Sites reported some difficulties with assessment (4), weekly reviews (2), and daily logs (1)', or is reported in the text, for example 'Four sites had a toileting regime in place'. We used maximum variance sampling [173] to generate a range of participants in terms of gender, age, ethnicity, type of incontinence and stroke severity. Participants were also chosen to reflect those with a range of outcomes at discharge (defined in terms of the frequency of incontinent episodes).
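To illustrate the sampling step, here is a minimal Python sketch of a greedy maximum variance (maximum variation) sampler over the participant attributes listed above. The attribute values, the dissimilarity measure and the greedy strategy are all illustrative assumptions, not the procedure actually used in the study [173].

```python
# Minimal sketch of greedy maximum variation sampling (illustrative only).
# Each candidate is a dict of categorical attributes; the names and values
# below are assumptions based on the attributes mentioned in the text.

def dissimilarity(a, b):
    """Count the attributes on which two participants differ."""
    return sum(a[k] != b[k] for k in a)

def max_variation_sample(candidates, n):
    """Greedily pick n participants that maximise attribute diversity."""
    sample = [candidates[0]]           # seed with an arbitrary first case
    remaining = list(candidates[1:])
    while len(sample) < n and remaining:
        # Choose the candidate most dissimilar to everyone sampled so far.
        best = max(remaining,
                   key=lambda c: sum(dissimilarity(c, s) for s in sample))
        sample.append(best)
        remaining.remove(best)
    return sample

candidates = [
    {"gender": "F", "age": "65-74", "ethnicity": "White",
     "incontinence": "urge", "severity": "mild"},
    {"gender": "M", "age": "75-84", "ethnicity": "Asian",
     "incontinence": "stress", "severity": "moderate"},
    {"gender": "F", "age": "55-64", "ethnicity": "Black",
     "incontinence": "mixed", "severity": "severe"},
    {"gender": "M", "age": "65-74", "ethnicity": "White",
     "incontinence": "urge", "severity": "severe"},
]

for p in max_variation_sample(candidates, 3):
    print(p)
```

In practice such sampling is often done by hand against a recruitment matrix; a greedy dissimilarity search is just one way to formalise the same intent of maximising diversity across attributes.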
