Implementation Science: Evaluation Stage

== An Implementation Evaluation Mindset ==
The term ''evaluation'' has many meanings and myths associated with it, leading to confusion, fear and resistance to its benefits.<blockquote>Evaluation involves all of the following:  


* Gathering reliable and valid information in a systematic way from all intervention stakeholders
* Attributing value to the intervention implementation process and strategies, or to the outcomes of the implementation process
* Informing future rehabilitation intervention decision-making
</blockquote>The backbone of any implementation evaluation is a clear evaluation purpose and direct, relevant and answerable evaluation questions that are aligned with the evaluation approaches and methods. Evaluations are typically associated with judging the effectiveness of the implementation process, but they can also inform decisions about the implementation process and its outcomes.<ref name=":1">Naccarella, L. Implementation Science: Evaluation Stage. Implementation Science Foundational Training Programme. Physioplus. 2022.</ref>


== Implementation Evaluation and Contextual Challenges ==
“What will affect what you implement?” To refresh your memory, please see [https://www.physio-pedia.com/Implementation_Science:_Pre-Implementation_Stage#Understanding_Context this article] to review the multiple factors that may affect the successful implementation of evidence-based rehabilitation interventions.


It is easy to become overwhelmed given the increasing quantity of information surrounding these multiple contextual challenges to implementing rehabilitation interventions. It is therefore beneficial to use a more comprehensive approach to evaluating which contextual factors influence a particular implementation's outcome. '''Box 10''' provides several key implementation process evaluation questions.<ref name=":1" />
[[File:Implementation science box 10.png|center|thumb|800x800px|These implementation process evaluation questions are based upon several existing implementation science evaluation frameworks listed below.]]
'''Examples of implementation science evaluation frameworks that can identify barriers and facilitators to key implementation outcomes:'''
* [https://www.aacpdm.org/UserFiles/file/IC24-Reedman.pdf Theoretical Domains Framework] (TDF)<ref>Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, Foy R, Duncan EM, Colquhoun H, Grimshaw JM, Lawton R. [https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0605-9 A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems.] Implementation Science. 2017 Dec;12(1):1-8.</ref>
* Integrated Promoting Action on Research Implementation in Health Services (i-PARIHS)<ref>Hunter SC, Kim B, Mudge A, Hall L, Young A, McRae P, Kitson AL. [https://link.springer.com/article/10.1186/s12913-020-05354-8 Experiences of using the i-PARIHS framework: a co-designed case study of four multi-site implementation projects.] BMC Health Services Research. 2020 Dec;20(1):1-4.</ref><ref>Roberts NA, Janda M, Stover AM, Alexander KE, Wyld D, Mudge A. [https://link.springer.com/article/10.1007/s11136-020-02669-1 The utility of the implementation science framework “Integrated Promoting Action on Research Implementation in Health Services” (i-PARIHS) and the facilitator role for introducing patient-reported outcome measures (PROMs) in a medical oncology outpatient department.] Quality of Life Research. 2021 Nov;30(11):3063-71.</ref>
* [https://episframework.com/ Exploration, Preparation, Implementation and Sustainment] (EPIS)




'''There are many other evaluation frameworks that are consistently used to evaluate the effectiveness of evidence-informed rehabilitation interventions. These frameworks include:'''


* The [https://www.re-aim.org/wp-content/uploads/2018/02/Planning-and-Evaluation-Questions-for-Initiatives-Intended-to-Produce-Public-Health-Impact-_Final.pdf Reach, Effectiveness, Adoption, Implementation, Maintenance] (RE-AIM) framework<ref>Glasgow RE, Vogt TM, Boles SM. [https://ajph.aphapublications.org/doi/pdfplus/10.2105/AJPH.89.9.1322 Evaluating the public health impact of health promotion interventions: the RE-AIM framework.] American Journal of Public Health. 1999 Sep;89(9):1322-7.</ref><ref>Bondarenko J, Babic C, Burge AT, Holland AE. [https://openres.ersjournals.com/content/erjor/7/2/00469-2020.full.pdf Home-based pulmonary rehabilitation: an implementation study using the RE-AIM framework.] ERJ Open Research. 2021 Apr 1;7(2).</ref>: used to evaluate the success of implementation and impact of translating research to “real-world” conditions
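To make the RE-AIM dimensions more concrete, the short sketch below shows how the Reach and Adoption dimensions are often summarised as simple proportions (eligible patients who enrol; invited sites that deliver the programme). It is a minimal illustration only: the programme, numbers and function names are hypothetical and are not drawn from the cited studies.

<syntaxhighlight lang="python">
# Minimal, hypothetical sketch: RE-AIM "Reach" and "Adoption" expressed as
# simple proportions. All names and figures are invented for illustration.

def proportion(part: int, whole: int) -> float:
    """Return part/whole, guarding against division by zero."""
    return part / whole if whole else 0.0

# Hypothetical home-based rehabilitation programme
eligible_patients = 240       # patients meeting the referral criteria
enrolled_patients = 150       # patients who actually started the programme
invited_sites = 12            # clinics invited to deliver the programme
delivering_sites = 9          # clinics that delivered at least one session

reach = proportion(enrolled_patients, eligible_patients)    # 0.625
adoption = proportion(delivering_sites, invited_sites)      # 0.75

print(f"Reach: {reach:.0%} of eligible patients enrolled")
print(f"Adoption: {adoption:.0%} of invited sites delivered the programme")
</syntaxhighlight>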


== Implementation Evaluation and Implementation Strategies ==
“What will help what you implement?” To refresh your memory, please see [[Implementation Science: Implementation Stage#Implementation Strategy Classification|this article]] to review the multiple implementation strategies which support the successful implementation of evidence-based rehabilitation interventions.<ref name=":1" /> <blockquote>The five classes of implementation strategies:


# Implementation process strategies
# Dissemination strategies
# Integration strategies
# Capacity building strategies
# Scale up strategies
</blockquote>It is easy to become overwhelmed given all these implementation strategies. It is therefore beneficial to use a more comprehensive approach to evaluating which strategies influence a particular implementation outcome. '''Box 11''' provides several key implementation strategy evaluation questions.<ref name=":1" />
[[File:Implementation science box 11.png|center|thumb|800x800px|Key implementation strategy evaluation questions]]


== Evaluate Implementation Outcomes ==
It is important to acknowledge that an unresolved issue in the field of implementation science is how to evaluate the implementation effectiveness of evidence-based interventions. Distinguishing implementation effectiveness from intervention effectiveness is critical for transporting interventions from laboratory settings to real-world and/or community settings. When such efforts fail, it is important to know if the failure occurred because the intervention was ineffective in the new setting (intervention failure), or if a good intervention was implemented incorrectly (implementation failure).<ref name=":1" /> <blockquote>'''Implementation outcomes''' refer to the effects of deliberate implementation strategies to adopt and embed new interventions, programmes or practices into real-world rehabilitation settings.<ref name=":1" /></blockquote>Three clusters of implementation outcomes have been suggested:<ref name=":0">Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. [https://link.springer.com/article/10.1007/s10488-010-0319-7 Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda.] Administration and Policy in Mental Health and Mental Health Services Research. 2011 Mar;38(2):65-76.</ref>


# Implementation outcomes - the effects of implementation strategies undertaken to implement a new intervention, such as the acceptability, adoption, appropriateness, uptake, feasibility, fidelity, implementation cost, penetration, and sustainability of the evidence-based rehabilitation interventions
# Service system outcomes - the effects of interventions on service outcomes, such as the efficiency, safety, effectiveness, equity, patient centredness, and timeliness of the evidence-based rehabilitation interventions
# Patient outcomes - the effects of the intervention on patient outcomes, such as changes in patient satisfaction, function or symptomatology as a result of the evidence-based rehabilitation interventions




Measuring implementation outcomes in addition to client or service system outcomes is crucial for distinguishing effective or ineffective programmes that are well or poorly implemented. While all three clusters are important, this article will mainly focus on implementation outcomes.<ref name=":1" /> '''Box 12''' provides definitions of implementation outcome dimensions adapted from Proctor et al.<ref name=":0" />
[[File:Implementation science box 12.png|center|thumb|800x800px|Definitions of implementation outcome dimensions<ref name=":0" />]]




Given that there are eight implementation outcomes, it can once again be overwhelming to decide which outcome dimensions to select and how to measure them. There are several factors to consider when choosing which implementation outcomes to evaluate:<ref name=":1" />


* The specific barriers to implementation you have observed
* The novelty of the evidence-based practice you are trying to implement
* The setting in which implementation is taking place
* The resources for and quality of usual training for implementation


The stage of implementation and your unit of analysis can also influence this choice. For example, acceptability may be more appropriate to study during early implementation, while sustainability may be more appropriately measured later in the implementation process.<ref name=":1" />
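As a purely illustrative sketch of the point above, the snippet below pairs each broad implementation stage with the outcome dimensions that are often emphasised at that stage (acceptability early, sustainability late). The stage labels and the mapping itself are assumptions made for demonstration, not a rule taken from the cited frameworks.

<syntaxhighlight lang="python">
# Illustrative only: pairing implementation stages with the outcome dimensions
# often emphasised at that point. The stage labels and mapping are assumptions
# for demonstration, not a prescriptive rule from the cited literature.

OUTCOMES_BY_STAGE = {
    "early": ["acceptability", "appropriateness", "feasibility"],
    "mid":   ["adoption", "fidelity", "implementation cost"],
    "late":  ["penetration", "sustainability"],
}

def suggested_outcomes(stage: str) -> list[str]:
    """Return the outcome dimensions commonly prioritised at a given stage."""
    if stage not in OUTCOMES_BY_STAGE:
        raise ValueError(f"Unknown stage {stage!r}; expected one of {sorted(OUTCOMES_BY_STAGE)}")
    return OUTCOMES_BY_STAGE[stage]

if __name__ == "__main__":
    for stage in OUTCOMES_BY_STAGE:
        print(f"{stage}: {', '.join(suggested_outcomes(stage))}")
</syntaxhighlight>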


== Implementation Fidelity ==
Fidelity translates as “faithfulness”; thus, fidelity of intervention means faithful and correct implementation of the key components of a defined intervention. Unless such an evaluation is made, it cannot be determined whether a lack of impact is due to poor implementation or to inadequacies inherent in the intervention in the real-world setting. Evidence-based practice also assumes that an intervention is being implemented in full accordance with its published details. This is particularly important given the greater potential for inconsistencies when an intervention is implemented in real-world rather than experimental conditions. Evidence-based practice therefore needs a means of evaluating whether the intervention is actually being implemented as the designers intended.<ref name=":1" /><blockquote>Implementation fidelity can be described in terms of three key elements that need to be measured:

# Adherence to an intervention - whether an intervention is being delivered as it was designed or written as far as the content of the intervention is concerned; the exposure or dose of an intervention received by participants
# Intervention complexity - complex interventions have greater scope for variation in their delivery and are therefore more vulnerable to one or more components not being correctly implemented
# Facilitation strategies - the provision of manuals, guidelines, training, monitoring and feedback, capacity building, and incentives:
#* Quality of delivery refers to the manner in which a practitioner, administrator or volunteer delivers an intervention
#* Participant responsiveness measures how far participants respond to, or are engaged by, an intervention
</blockquote>It is easy to become overwhelmed given the large number of elements that may influence implementation fidelity. It is therefore beneficial to use a more comprehensive approach to evaluating their influence on implementation fidelity outcomes. '''Box 13''' provides several key implementation outcome (fidelity) evaluation questions.<ref name=":1" />
[[File:Implementation science box 13.png|center|thumb|800x800px|Key implementation outcome (fidelity) evaluation questions]]
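The following is a minimal, hypothetical sketch of how the adherence and dose elements listed above might be quantified in practice: content adherence as the share of core components delivered as designed, and dose as the share of planned sessions received. The component names, session counts and the 80% threshold are illustrative assumptions, not published fidelity criteria.

<syntaxhighlight lang="python">
# Hypothetical sketch: scoring two fidelity elements, content adherence and
# dose, as simple proportions. Components, session counts and the threshold
# are invented for illustration, not published fidelity criteria.

core_components = {
    "aerobic exercise prescription": True,   # delivered as designed
    "strength training progression": True,
    "patient education session": False,      # omitted at this site
    "home exercise plan review": True,
}

planned_sessions = 16
received_sessions = 12

content_adherence = sum(core_components.values()) / len(core_components)  # 0.75
dose_received = received_sessions / planned_sessions                      # 0.75

FIDELITY_THRESHOLD = 0.8  # arbitrary cut-off for this illustration

print(f"Content adherence: {content_adherence:.0%}")
print(f"Dose received: {dose_received:.0%}")
if min(content_adherence, dose_received) < FIDELITY_THRESHOLD:
    print("Low fidelity: a lack of impact here may reflect implementation failure")
</syntaxhighlight>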


== Summary ==
Given the multiple challenges to implementing rehabilitation interventions, it is important to think about and assess both the implementation process and the outcomes of your implementation process. To inform future implementation facilitation efforts, it is important to know not just ''what'' worked but ''how'' and ''why'' the selected implementation strategies performed.


It is important to distinguish implementation effectiveness from intervention effectiveness. Given that there are multiple implementation effectiveness outcomes, be selective about which implementation outcomes you assess.


== Resources ==
'''Optional Video:'''

Please view this approximately 20-minute video for a detailed description and comparison of implementation science frameworks.<ref>Youtube. Theories and Frameworks in Implementation Science | IRL. Available from: https://www.youtube.com/watch?v=fdaTFgX0II0 [last accessed 11/05/2022]</ref>

'''Recommended Reading:'''
*Allen CG, Barbero C, Shantharam S, Moeti R. [https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6395551/ Is theory guiding our work? A scoping review on the use of implementation theories, frameworks, and models to bring community health workers into health care settings]. Journal of Public Health Management and Practice. 2019 Nov;25(6):571.
*Keith RE, Crosson JC, O’Malley AS, Cromp D, Taylor EF. [https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0550-7 Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation.] Implementation Science. 2017 Dec;12(1):1-2.
*Khadjesari Z, Boufkhed S, Vitoratou S, Schatte L, Ziemann A, Daskalopoulou C, Uglik-Marucha E, Sevdalis N, Hull L. [https://pubmed.ncbi.nlm.nih.gov/32811517/ Implementation outcome instruments for use in physical healthcare settings: a systematic review.] Implementation Science. 2020 Dec;15(1):1-6.

== References ==
<references />