  • Systematic review
  • Open access

Finding the right dose: a scoping review examining facilitation as an implementation strategy for evidence-based stroke care

Abstract

Background

Despite evidence supporting interventions that improve outcomes for patients with stroke, their implementation remains suboptimal. Facilitation can support implementation of research into clinical practice by helping people develop the strategies to implement change. However, variability in the amount (dose) and types of facilitation activities/facilitator roles that make up the facilitation strategies (content) may affect the effectiveness of facilitation. This review aimed to determine if, and how, facilitation dose is measured or reported, and the types of facilitation strategies used to support adoption of stroke interventions in hospital and subacute settings. We also assessed whether the included studies used reporting checklists or guidelines.

Methods

The scoping review was based on Arksey and O’Malley’s framework. Cochrane, CINAHL and MEDLINE databases were searched to identify randomised trials and quasi-experimental studies of stroke interventions published between January 2017 and July 2023. Accompanying publications (quantitative, qualitative, mixed methods or process evaluation papers) from eligible studies were also included. Narrative data synthesis was undertaken.

Results

Ten studies (23 papers) from 649 full-text papers met the inclusion criteria. Only two studies reported the total facilitation dose, measured as the frequency and duration of facilitation encounters. Authors of the remaining eight studies reported only the frequency and/or duration of varying facilitation activities but not the total dose. The facilitation activities included remote external facilitator support via ongoing telecommunication (phone calls, emails, teleconferences), continuous engagement from on-site internal facilitators, face-to-face workshops and/or education sessions from external or internal facilitators. Facilitator roles were broad: site-specific briefing, action planning and/or goal setting; identifying enablers and barriers to change; coaching, training, education or feedback; and network support. Only two studies included reporting checklists/guidelines to support researchers to describe interventions and implementation studies in sufficient detail to enable replication.

Conclusions

There is a paucity of information on the measurement of facilitation dose and reporting on specific details of facilitation activities in stroke implementation studies. Detailed reporting of dose and content is needed to improve the scientific basis of facilitation as strategic support to enable improvements to stroke care. Development of a standardised measurement approach for facilitation dose would inform future research and translation of findings.


Introduction

Achieving successful knowledge translation and implementation of evidence-based interventions in clinical practice is often difficult [1, 2]. In addition to these complexities and challenges, there remains a lack of knowledge about which strategies are most effective in changing clinician behaviour and successfully implementing evidence into practice [3].

Implementation frameworks highlight the need for appropriate facilitation to improve the potential of implementation success [3,4,5]. Facilitation refers to the process of providing help and support to individuals and teams to enable them to achieve a specific goal [6]. It is 'a process of interactive problem solving and support that occurs in a context of a recognized need for improvement and a supportive interpersonal relationship' [7]. Facilitation may occur in various forms, using either an external facilitator, an internal facilitator, or a combination of both [6]. Facilitators are considered 'change agents' or 'champions' and their key roles are to identify, engage, and connect stakeholders; facilitate collaboration, including the development of implementation action plans; support communication and information sharing; and evaluate practice change [6, 8,9,10].

Despite evidence supporting interventions that improve outcomes for patients with stroke, implementation of these evidence-based stroke interventions remains suboptimal [11, 12]. Facilitation has the potential to improve stroke evidence translation and, thus, clinician practice. Facilitator roles that have been examined in published studies of stroke interventions and shown to be effective include external facilitators undertaking telephone contact and on‐site visits with clinicians to facilitate improvement in venous thromboembolism prevention for stroke patients [13]; internal clinical facilitators facilitating improvements in the organisation and delivery of stroke patient care [14]; and internal non‐clinical facilitators facilitating improvements in adherence to clinical processes of care [15].

Evidence in support of facilitation as an implementation strategy for increasing uptake of evidence-based interventions into clinical practice is mixed [13,14,15,16]. The specific reason for this is poorly understood due to a lack of conceptual clarity, but may stem from variability in the amount (dose) and types of facilitation activities/facilitator roles that make up the facilitation strategies (content) [17]. As a result, it is recommended that both the dose and content of facilitation are measured in effectiveness and comparative effectiveness studies [18]. This is particularly important to show the minimal dose and content required to obtain the strongest effect, as well as to better understand the processes and mechanisms by which implementation strategies exert their effects [18, 19]. Despite its potential benefits, the facilitation dose and content required for optimal uptake of interventions, that is, how much facilitation results in successful outcomes, are yet to be thoroughly investigated [20,21,22,23]. Further, how to define or measure facilitation dose and content is unclear, particularly because many studies use facilitation as part of a multifaceted implementation strategy. Recent findings from a case study of the Coordination Toolkit and Coaching Project, which conceptualized facilitation intensity, showed that intensity could be assessed quantitatively by the frequency and duration of facilitation encounters (dose) and qualitatively by review of written facilitator reflections [22].

A multicomponent implementation strategy that comprises facilitation was particularly successful for evidence implementation in stroke care in the landmark Quality in Acute Stroke Care (QASC) Trial and translation studies [24,25,26]. Building on our previous research, the ongoing QASC Australasia Trial [27] is testing two different facilitation intensities or doses to support delivery of the Fever Sugar Swallow Protocols for stroke patients. As part of designing this trial, we undertook a scoping review to examine the evidence regarding how facilitation dose and content are described and reported in studies of evidence implementation in stroke intervention studies. Scoping reviews are well suited to clarifying concepts and definitions in a specific field as well as identifying key characteristics related to a concept [28, 29]. Guided by the Coordination Toolkit and Coaching Project’s quantitative measurement of facilitation dose, the specific research questions examined were:

  1) Was facilitation dose (measured as the frequency and duration of facilitation encounters) reported in the included studies?

  2) In what other ways, if any, was facilitation dose measured or reported in the included studies besides the frequency and duration of facilitation encounters?

  3) What were the types of facilitation strategies (content) used to implement interventions in the included studies?

  4) Did the included studies have reporting checklists or guidelines, and if so, which ones?

The findings from this scoping review will contribute to the body of knowledge on facilitation as an implementation strategy for evidence translation in healthcare settings.

Methods

Study design

This scoping review was conducted following the methodological framework described by Arksey and O'Malley, which consists of five steps: (1) formulating the research question, (2) identifying relevant studies, (3) selecting eligible studies, (4) charting the data and (5) collating, summarising and reporting the results [30]. Reporting of the review complied with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) (Additional file 1) [31].

Protocol and registration

The scoping review was registered with the Open Science Framework (https://doi.org/10.17605/OSF.IO/WD5BJ).

Eligibility criteria

Randomised controlled trials (RCTs) and quasi-experimental studies (non-randomised trials, pre-test and post-test [before-after] designs, interrupted time series) [32] that evaluated facilitation (as defined by authors of the included studies) as an implementation strategy to improve the uptake of stroke and/or transient ischemic attack (TIA) interventions were included in the review. Accompanying publications reporting on secondary outcomes (quantitative, qualitative, mixed methods or process evaluation papers) from eligible RCTs and quasi-experimental studies were also included if either the main results paper was unpublished, or the secondary outcomes papers provided relevant information not published in the main results paper. Studies were included if they were undertaken in acute and/or subacute care settings and evaluated interventions targeted at improving stroke and/or TIA management. Only peer-reviewed journal articles written in English were included. Studies conducted in non-acute care settings only (e.g., primary health care, community clinics, nursing homes, pharmacies) were excluded. Grey literature, such as theses/dissertations, conference abstracts, letters to editors, reports and guidelines, was also excluded.

Information sources

Electronic bibliographic databases Cochrane, CINAHL and MEDLINE were searched from January 2017 to March 2022 (with an updated search in July 2023) to identify eligible studies. The last two decades have seen considerable advancement in the field of implementation science, with a better understanding of implementation strategies [19]. The five-year search period was considered appropriate as research prior to this time was likely to demonstrate a lack of conceptual clarity on discrete implementation strategies, specifically inconsistency in the use of terminology and insufficient description of strategies [7].

Search

The search strategy used different combinations of keywords and medical subject headings (MeSH) search terms with Boolean operators: facilitat* OR knowledge broker OR coach OR consultant OR mentor OR trainer OR implementation practitioner; intensity OR dose OR level OR amount OR type; implementation* OR dissemination; and intervention, quality improvement, and knowledge translation. The search terms were adapted for use with each electronic bibliographic database (Additional file 2). The reference lists of included papers were hand-searched for additional papers. The final search results were exported into EndNote X9.2 (Clarivate Analytics, Philadelphia).
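To illustrate how the four concept blocks above are combined, the following Python sketch builds a single query string by OR-joining terms within each block and AND-joining the blocks. This is an illustration of the Boolean structure only, not the exact syntax used in Cochrane, CINAHL or MEDLINE; truncation symbols and MeSH handling differ per database.

```python
# The four concept blocks from the search strategy, as reported above.
blocks = [
    ["facilitat*", "knowledge broker", "coach", "consultant",
     "mentor", "trainer", "implementation practitioner"],
    ["intensity", "dose", "level", "amount", "type"],
    ["implementation*", "dissemination"],
    ["intervention", "quality improvement", "knowledge translation"],
]

def build_query(blocks):
    """OR terms within each block, then AND the blocks together."""
    return " AND ".join("(" + " OR ".join(b) + ")" for b in blocks)

print(build_query(blocks))
```

In practice each OR-group would be adapted to the database's field tags and subject headings before being combined, as noted in Additional file 2.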

Selection of sources of evidence

The titles and abstracts of all papers retrieved from the electronic databases, together with the additional papers identified through hand-searching, were screened by one team member (HC) for relevance to the review questions and against the inclusion and exclusion criteria. After first-stage screening, the full texts of papers meeting the inclusion criteria were assessed by two team members (HC and OF) to determine the final studies included for analysis. The reference lists of the final included studies were also checked for relevant studies that could be included in the review. Disagreements regarding study selection were resolved by a third team member (SD).

Data extraction

Data extraction was undertaken using Cochrane's data collection form for RCTs and non-RCTs [33], adapted for the purpose of this review. Data extracted included: first author; year of publication; country of study; study participants; study setting; number of participants; study design; facilitation intensity or dose; mode of facilitation (internal, external, remote, or in-person); description of the intervention and facilitation strategy; and study findings. Data extraction was performed independently by three research assistants, with discrepancies resolved by consensus with one team member (HC). Where available, additional information for the included studies was retrieved from their respective study protocols, clinical trial registrations, supplementary files and/or process evaluation papers to obtain detailed descriptions of the intervention and facilitation strategy. Study authors were not contacted for additional information.

Critical appraisal of individual sources of evidence

The methodological quality or risk of bias of the included papers was determined. Critical appraisal of the papers was done by HC and a research assistant using the Mixed Methods Appraisal Tool [34], with consensus on discrepancies resolved by OF. Papers were assessed against five domains depending on the study design and the individual domains were rated as having either high, low, or unclear risk of bias.

Synthesis of results

Narrative synthesis of the data was undertaken according to the Economic and Social Research Council’s guideline on the conduct of narrative synthesis [35]. To guide the analysis, the synthesis was structured around the four research questions. Preliminary synthesis involved the use of tabulation to enable data comparison across the different studies. In addition, textual descriptions of the studies were undertaken to summarise individual study findings and extract information relevant to the research questions. The characteristics of the included studies were explored to identify any similarity and/or differences in the studies in relation to the research questions.

Results

A total of 8783 and 43 papers were identified in database and citation searches, respectively. After screening the titles and abstracts of papers against the inclusion and exclusion criteria, 649 full-text papers were assessed for inclusion. Of these, 23 papers from 10 studies were included in the analysis (Fig. 1).

Fig. 1 PRISMA diagram

Characteristics of sources of evidence

Descriptive characteristics of the 10 studies and their accompanying papers are reported in Table 1. The study designs used in the included studies varied: three cluster RCTs [36,37,38] (the main study results of the Stroke Canada Optimization of Rehabilitation by Evidence Implementation Trial are yet to be published); two non-randomised controlled trials [39, 40]; and five pre-test and post-test (before-after) studies [41,42,43,44,45]. The accompanying papers from the ten studies also used a range of study designs, commonly mixed methods [46,47,48] and qualitative designs [49,50,51]. The main secondary outcomes evaluated in the accompanying papers were process evaluations [23, 38, 52, 53] and evaluations of stakeholder perspectives [47, 48, 50, 51, 54]. The 10 studies were conducted in the United States [39], Australia [36, 37, 40, 41, 45], Canada [38, 42, 43], and the Netherlands [44].

Table 1 Study characteristics

Different stroke and/or TIA interventions were evaluated. Six studies involved stroke rehabilitation interventions [38, 40, 42,43,44,45]. One study [37] focused on stroke intervention implementation in the emergency department while the remaining three studies involved interventions aimed at improving the quality of stroke or TIA inpatient care [36, 39, 41]. Of the 10 studies, seven evaluated interventions in patients with TIA [39], stroke [37, 38, 40, 42], and either stroke or TIA [36, 41]. The remaining three studies involved stroke survivors and carers [44, 45] and physical therapists [43].

The patient and clinical process of care outcomes reported in the studies varied. However, there were a few studies that reported similar outcomes, such as guideline-based stroke or TIA processes of care [23, 36, 47, 53]; administration of rehabilitation walk tests [42, 52]; and mortality at 90 days post-discharge [36, 37, 39].

Facilitation dose reporting and measurement

Table 1 provides information on the facilitation dose reported in the included studies. Of the 10 studies, only two reported the total facilitation dose, quantitatively measured as the frequency and duration of facilitation encounters or activities such as education delivery, coaching, training, barrier and enabler identification, action plan development and data collection.

Thayabaranathan et al. [23] reported the amount of external facilitation as ‘the frequency and duration of professional behaviour change support provided to clinicians, mode of support delivery, and time spent delivering support’ [23]. This was measured for two implementation strategies: one 3-h face-to-face workshop to develop an implementation strategy action plan to improve stroke care, and ongoing phone, email, or face-to-face support. There was a mean of 30 h (standard deviation [SD]: 14) of total facilitation time for the 19 participating hospitals, constituting a mean of 19 h (SD: 11) of face-to-face contact, 5 h (SD: 2) of phone contact, and 7 h (SD: 4) of email contact. There was a clinically significant, but not statistically significant, difference in hours of facilitation time between the 14 hospitals with an implementation strategy action plan (mean: 32, SD: 15) and the 5 hospitals which did not develop an implementation strategy action plan (mean: 25, SD: 10).

Damush et al. [47] defined facilitation dose as interactions between the site team members and an external facilitator (a quality improvement nurse or physician) by phone, email, Skype teleconference, or in person. These interactions (referred to as episodes) were frequently contained in a single email chain but could extend over several weeks, and each concerned a specific request, problem, or question. The facilitation dose was measured for the six participating sites, with each site receiving a mean of 24 episodes of external facilitation before and during the one-year active implementation period [48]. The facilitation dose (episodes) was also measured for specific activities performed by the external facilitator, namely education (mean: 8), quality process monitoring (mean: 10), planning (mean: 12) and networking (mean: 11) [47].

Authors of the remaining eight studies reported on only the frequency and/or duration of individual facilitation encounters with no combined measurement of facilitation dose. For example, the frequency and duration of education sessions or workshops conducted by the facilitator were reported in some of the included studies: one one-hour videoconference and one two-hour in-person workshop [36]; one two-day workshop [38]; and three learning sessions [55].

Authors of several studies also reported staff support time. This was sometimes measured by frequency and duration, for example four hours of internal facilitator support per week for 16 months [38]; or by duration only, for example email or phone support from an external facilitator for 21 months [43, 55], or phone and email contact, newsletters and phone consultations from external facilitators for 13 months [44].

Mentoring, coaching sessions and team meetings by the facilitator were reported by frequency and/or duration. For example, fortnightly or monthly coaching with an external facilitator over three months [40], weekly team meetings for 12 months [42], and fortnightly team meetings with ongoing staff training for five months [45].

While Middleton et al. [37] implied that the regular provision of ongoing intensive structured support to clinical champions by an external facilitator represented the facilitation dose, no overall quantitative measure of facilitation dose was reported in the study. The intensive structured support comprised sustained engagement with direct contact every six weeks (alternating between site visits and teleconferences every three months), facilitation of two one-hour face-to-face multidisciplinary team workshops, and a 30-min education session at each site.
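The dose metric used in the two studies that did report a total, summing frequency multiplied by duration across encounter types, can be sketched as follows. All figures here are invented for illustration and do not correspond to any included study.

```python
# Hypothetical facilitation encounter log for one site:
# (encounter type, number of encounters, hours per encounter).
encounters = [
    ("face-to-face workshop", 1, 3.0),
    ("phone support call", 10, 0.5),
    ("email exchange", 20, 0.25),
]

def total_dose_hours(encounters):
    """Total facilitation dose: sum of frequency x duration
    over all encounter types, in hours."""
    return sum(n * hours for _, n, hours in encounters)

print(total_dose_hours(encounters))  # 3.0 + 5.0 + 5.0 = 13.0 hours
```

As the Discussion notes, a time-only total like this cannot distinguish, say, eight hours of didactic sessions from eight hours of interactive sessions, which is one limitation of measuring dose purely as frequency and duration.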

Facilitation activities and facilitator roles

Although the types of facilitation strategies used to implement stroke and/or TIA interventions were broad, there were similarities between the studies. All studies used external facilitators who either supported implementation of the intervention [36, 40, 41, 45, 47] or supported internal facilitators [44, 50, 55] to implement some or all aspects of the intervention. In the study by Middleton et al. [37], external facilitators performed both roles, while in the study by Moore et al. [42], external facilitators worked collaboratively with the clinical team and leadership to co-create and implement a knowledge translation intervention.

Four studies used internal facilitators, such as local clinicians [37, 43, 44, 50]. Nine studies used remote facilitation via telecommunication (phone calls, emails, teleconferences) [36,37,38,39, 41,42,43,44] or online tools (comments/questions to the research team via Trello) [40]. The studies also used in-person facilitation, such as in-person facilitation of kick-off or planning meetings [36, 37, 39, 41, 44], in-person facilitation of education sessions or workshops [37, 40, 42, 43, 45] or site/outreach visits [37, 44]. All studies combined different modes of facilitation: external and in-person [45]; external, remote and in-person [36, 39,40,41]; internal, external and in-person [42]; internal, external, remote and in-person [37, 38, 43, 44].

Facilitator roles varied and there was an overlap of roles between studies. Roles involved individualised and site-specific briefing, set up, action planning and/or goal setting; problem-solving; or supporting clinicians to identify enablers and barriers to change [36,37,38,39,40,41,42, 44, 45]. Facilitators delivered coaching, training, education, progress feedback, or ongoing support; or provided consultations [36,37,38,39,40,41,42,43,44,45]. Facilitators also undertook site visits as well as monitored and collected data for research or quality improvement processes [37, 44, 45]. Only two studies involved facilitators assisting with the development of implementation resources [40, 45].

Reporting checklists/guidelines

Eight papers [36, 37, 39, 41, 46, 48, 55, 56] from five studies had guidelines or checklists, but only two papers [36, 48] provided completed checklists as supplementary material (Table 2). One paper [55] used the TIDieR guideline and three [39, 46, 48] used the Standards for Reporting Implementation Studies (StaRI) statement, which are intended to support researchers to describe interventions and implementation studies in sufficient detail to enable replication. One paper [36] used the Consolidated Standards of Reporting Trials (CONSORT) statement, which aims to improve reporting of RCTs. One paper [37] used the Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT) statement in the trial protocol, which enhances clinical trial protocol reporting. One paper [56] used the Standards for QUality Improvement Reporting Excellence (SQUIRE) statement, which guides the reporting of system-level initiatives to improve healthcare quality.

Table 2 Guidelines and checklists used in the included studies

Critical appraisal within sources of evidence

Overall, the quality of the included studies was mixed. The five qualitative papers [49,50,51, 53, 55] had a low risk of bias for all five domains assessed (Fig. 2). The two RCTs with published trial results [36, 37] (Fig. 3) and the eight quantitative non-randomised papers [23, 39,40,41,42,43, 54, 56] (Fig. 4) generally had a low or unclear risk of bias for the domains assessed. Three of the seven mixed methods papers had a high risk of bias for the domains of adequately integrating quantitative and qualitative data, explaining where divergence in quantitative and qualitative data occurred, and overall quantitative and qualitative data quality [44, 45, 52]. The remaining four mixed methods papers [46,47,48] had a low or unclear risk of bias for the domains of rationalising the use of mixed methods, integrating quantitative and qualitative data, and overall quantitative and qualitative data quality (Fig. 5). The quantitative descriptive paper [38] had a low risk of bias for all domains (Fig. 6).

Fig. 2 Critical appraisal of qualitative papers

Fig. 3 Critical appraisal of randomised controlled trials

Fig. 4 Critical appraisal of quantitative non-randomised papers

Fig. 5 Critical appraisal of mixed methods papers

Fig. 6 Critical appraisal of quantitative descriptive paper

Discussion

We examined the concepts of facilitation dose and content within the context of implementation of stroke and/or TIA interventions. Our findings revealed a significant gap in the literature regarding both the measurement and reporting of facilitation dose and content. Only two of the 10 studies measured the total facilitation dose while the remaining eight studies reported only on different facilitation encounters/activities with no overall measurement of facilitation dose. We found that the content of the facilitation strategies was broad with an overlap between studies in the roles performed by facilitators. In addition, there was a minimal use of reporting checklists/guidelines, particularly those intended to support researchers to describe interventions and implementation studies in sufficient detail to enable replication.

A notable finding from this scoping review was the lack of a standardised method for measuring facilitation dose in the included stroke/TIA studies. While some studies measured an overall dose for all facilitation encounters or activities, others only self-reported individual facilitation encounters without any quantitative measures. Furthermore, some studies measured the facilitation duration without the frequency and vice versa. Our measurement of facilitation dose was guided by the Coordination Toolkit and Coaching Project which was based on both the frequency and duration of facilitation encounters [22]. However, this measurement approach is limited by using only time as a measure for dose. While time is clearly one dimension of dose, there are other dimensions which might modify the effectiveness of facilitation that are important to also measure. For example, 8 h of didactic sessions are not the same as 8 h of interactive sessions and measuring only the time spent may not account for variability in engagement and other context-dependent factors. There has been a recent call for research across healthcare services to develop measures that assess facilitation intensity beyond its frequency and duration, and take into account the energy (mental, emotional, physical) expended by implementation facilitators during each activity and cumulatively across activities [22].

It is important to note that measurement of the true dose or intensity of facilitation may be challenging, as facilitation is a complex, multifaceted concept which encompasses a broad range of techniques and strategies needed to bring about intervention implementation success [6, 22, 57]. Further, facilitation may take different forms, such as internal facilitation (facilitation by existing staff within the implementation site), external facilitation (facilitation by a person external to the implementation site), or a combination of both [6]. As a result, a considerable amount of facilitation is likely to occur outside the defined prescribed role, particularly for internal facilitation: for example, a staff member employed in a clinical leadership role that involves implementing practice change may also have to perform internal facilitator duties for a quality improvement initiative. The external facilitator role is usually better delineated, as this is typically a dedicated role undertaken by an individual employed by an external organisation [6], and may therefore be easier to measure. In addition, factors such as the specific objectives of the intervention, and the facilitator's interpersonal and communication skills and familiarity with local processes and culture [58], may affect the facilitation process. Therefore, the true facilitation dose may not always be easily and consistently measured or accurately estimated.

To better understand implementation strategies such as facilitation, it is also essential to delve deeper into the content or elements that make up the facilitation strategies. We found that the facilitation activities and roles of facilitators in the included studies were broad, which is consistent with the implementation science literature [6, 8, 9, 20, 59]. Further, facilitation was used in all the studies as part of a multicomponent implementation strategy. Therefore, the facilitator role often involved the incorporation or use of other implementation strategies to provide support for intervention delivery such as conducting educational meetings, outreach visits, training, providing reminders, or undertaking audit and feedback. This complex multifaceted nature of facilitation reflects the diversity of approaches in getting evidence into practice [20, 57]. This complexity also contributes to the challenges of clearly describing, operationalising and measuring facilitation [18].

Process evaluations aim to provide insight into the context in which implementation strategies such as facilitation are applied in real-world settings [60, 61]. They can describe in detail how the strategy was developed and delivered, participants' exposure to and experience of the implementation activities that make up the strategy, as well as the contextual factors impacting on the strategy [61]. Evidence shows that facilitation activities tend to occur flexibly in response to local circumstances, with the ever-evolving context dictating the intensity of most facilitation activities [62]. Furthermore, findings from the process evaluation of a trial which evaluated two facilitation doses with no significant difference between them [21] revealed that the facilitation types were unable to overcome the influence of contextual factors such as limited resources and lack of managerial and staff support [63]. This shows that there are factors which impact on facilitation that cannot necessarily be quantified and are better understood through concurrently undertaken process evaluations.

Our findings also illustrate the lack of reporting on specific details of intervention delivery, thereby highlighting the importance of transparent reporting practices in research [64, 65]. Only eight papers [36, 37, 39, 41, 46, 48, 55, 56] from five studies reported using guidelines or checklists, of which four papers [39, 46, 48, 55] from two studies used either the TIDieR guideline or the StaRI statement, which aim to support researchers to describe interventions and implementation studies in sufficient detail to enable replication. Unsurprisingly, three of these four papers had a low risk of bias across the assessment domains, indicating that studies which adhere to reporting guidelines for implementation studies may be more likely to be reported precisely. Powell et al. [19] noted that implementation strategies are often poorly described in study protocols or empirical studies. This has the potential to limit the reproducibility of research as well as the interpretation of study findings [18, 66].

Another important implication of our findings is the potential impact of facilitation on intervention effectiveness and efficiency. Knowledge of the optimal facilitation dose and type of facilitation strategies required to achieve desired outcomes is important for designing effective interventions and allocating resources efficiently. If researchers cannot draw meaningful conclusions about the impact of facilitation on study outcomes because of inadequate measurement and reporting, they may erroneously attribute differences in outcomes solely to the intervention, resulting in overestimation of intervention effectiveness. Our ongoing cluster randomised controlled QASC Australasia Trial will explore the relationship between facilitation dose and intervention outcomes [27]. Facilitation can also be costly: one study involving four clinics estimated organisational facilitation costs (salary support for internal and external facilitators, facilitation support staff and stakeholders) at as much as US$263,490 over a 28-month period [67], and the higher the facilitation dose, the greater the cost [68]. Given the resourcing implications of higher facilitation doses and content, precise reporting and measurement is warranted to ensure efficient use of limited resources. An economic evaluation is planned as part of the QASC Australasia Trial to estimate the costs of the low- and high-dose facilitation interventions and identify an effective and affordable facilitation model for the implementation of evidence-based stroke protocols. It is hoped that the findings from this trial will generate new knowledge on the impact of facilitation dose on intervention effectiveness and efficiency and contribute to the field of implementation science.

As our scoping review evaluated facilitation as an implementation science strategy to improve the uptake of stroke and/or TIA interventions, we compared our findings to the newly updated Cochrane review by Lynch et al. [69], which evaluated the effects of implementation interventions in improving the delivery of evidence‐based stroke care. Of the seven acute stroke improvement intervention RCTs included in the review, six involved facilitators (referred to as change agents, site champions, quality improvement advisors or quality coordinators) in delivering or supporting the implementation of the intervention [25, 70,71,72,73,74]. These six RCTs reported only the frequency and/or duration of different facilitation activities, for example: a single 2.5 h interactive education session and workshop [73]; two face-to-face workshops and one site meeting [70]; two 30 to 60 min education sessions, one 60 min barrier identification and strategy development workshop and monthly phone or email contact for four months [71]; weekly online sharing and learning sessions [72]; and two one-hour face-to-face multidisciplinary team workshops and a 30-min education session [25]. The roles of facilitators were broad and involved activities such as leading education and working groups, goal setting, team building, performance feedback and action planning. Four RCTs [70,71,72,73] reported using guidelines or checklists, but only two [71, 72] provided completed checklists as supplementary material. These findings are consistent with our scoping review and emphasise an important gap in the implementation science literature regarding measurement and reporting of specific details of implementation strategies.

Overall, our findings are comparable to the non-stroke literature, where facilitation dose was also often measured as the frequency and/or duration (time) of facilitation. Garner et al. [74] tested an implementation and sustainment facilitation strategy for helping HIV organisations implement an intervention to decrease substance use disorders in their clients. Facilitation dose was measured in hours to reflect the frequency and duration of facilitation (maximum possible dose of 30 h, comprising 18 h for up to 18 monthly virtual external facilitation meetings lasting up to 1 h each, and 12 h for up to two in-person facilitation meetings lasting up to 6 h each). Bucknall et al. [75] investigated the effectiveness of a facilitation intervention to improve nurses' response to patient deterioration. Dose was measured as the time allocated for facilitation; for example, the internal hospital facilitator provided 5 h of support per week to intervention wards for 6 months [75]. In the systematic review by Baskerville et al. [76], which evaluated practice facilitation for the implementation of evidence-based practice guidelines within primary care practice settings, the authors measured facilitation intensity or dose by multiplying the mean number of contacts with a practice by the mean meeting time in hours. Sarkies et al. [77] evaluated the effectiveness of a knowledge broker strategy to facilitate evidence-informed resource allocation to inpatient weekend allied health services; facilitation dose was described as the frequency of contacts over a 12-month period [77].
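The two dose calculations described above can be illustrated with a short sketch. The scheduled hours for Garner et al. [74] are as reported in that study; the function names, and the worked intensity example (10 contacts of 1.5 h), are our own and purely illustrative.

```python
# Two ways facilitation dose was quantified in the non-stroke studies above.

def dose_from_schedule(encounters):
    """Total dose in hours from a list of (number_of_encounters, hours_each) pairs."""
    return sum(n * hours for n, hours in encounters)

# Garner et al.: up to 18 monthly virtual meetings of 1 h each,
# plus up to 2 in-person meetings of 6 h each -> maximum dose of 30 h.
garner_max = dose_from_schedule([(18, 1), (2, 6)])
assert garner_max == 30

def dose_as_intensity(mean_contacts, mean_hours_per_contact):
    """Baskerville et al.: intensity = mean number of contacts x mean meeting time (h)."""
    return mean_contacts * mean_hours_per_contact

# Hypothetical example: an average of 10 contacts lasting 1.5 h each.
print(dose_as_intensity(10, 1.5))  # 15.0
```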
Similarly, the facilitation activities and facilitator roles in non-stroke studies were broad – readiness assessment, barrier and facilitator identification, ongoing training and consultation [78]; local needs assessment and plan development [77]; development of quality improvement tools, readiness assessment and barrier identification, identification and preparation of champions [79]; action plan development, auditing, care plan development and staff support to complete assessment forms [21].

Our findings, in addition to those from the non-stroke literature, have implications for the wider field of implementation science, with the potential to contribute to the body of knowledge on facilitation as an implementation strategy. Without understanding the facilitation strategies used, we cannot truly understand facilitation effectiveness. Consequently, this limits our ability to develop effective facilitation processes to maximise research use, guide facilitator behaviours, and determine the appropriate dose and content of facilitation [17]. Given the lack of a consistent measurement approach across stroke and other disciplines, we recommend that implementation science researchers consider the development and validation of standardised methods (quantitative and qualitative) for measuring facilitation dose. In addition, the breadth of facilitation activities and facilitator roles highlights the need for future studies to better operationalise the definition of facilitation by examining which elements of the role and activities could be delivered in fixed and discretionary ways. We also suggest embedding process evaluations into intervention effectiveness studies that use facilitation, either as a discrete strategy or as part of a multifaceted implementation strategy, to better understand the impact of context on facilitation. Encouraging the use of standardised reporting guidelines for implementation studies may also help promote the explicit reporting of facilitation dose and content and improve transparency and rigour in research [80,81,82,83,84,85].
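As a purely hypothetical illustration of what a standardised quantitative dose record might capture, each facilitation encounter could be logged with its mode, facilitator role and duration, from which a site's total dose is derived. Every field and value below is an assumption for illustration, not an established or validated instrument.

```python
from dataclasses import dataclass

@dataclass
class FacilitationEncounter:
    """One logged facilitation contact (hypothetical minimal record)."""
    site: str
    mode: str      # e.g. "phone", "email", "workshop"
    role: str      # e.g. "education", "goal setting", "feedback"
    minutes: int   # duration of the encounter

def total_dose_hours(log, site):
    """Total facilitation dose received by a site, in hours."""
    return sum(e.minutes for e in log if e.site == site) / 60

# Hypothetical log across two sites.
log = [
    FacilitationEncounter("Site A", "workshop", "education", 120),
    FacilitationEncounter("Site A", "phone", "feedback", 30),
    FacilitationEncounter("Site B", "email", "goal setting", 15),
]
print(total_dose_hours(log, "Site A"))  # 2.5
```

Recording mode and role alongside duration would also let reviewers summarise facilitation content, not just dose.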

Our scoping review is limited by the inclusion of studies which evaluated interventions in patients with stroke and/or TIA in acute and/or subacute care settings. Therefore, our findings may lack generalisability beyond these settings. While measurement of facilitation dose has been attempted in general practice [76] and long-term care [21] settings, the focus of this review was on stroke implementation interventions which are typically provided in acute and subacute care (in-patient) settings. Given the wide variation in terminology used to describe different implementation support roles [86], we may have missed studies that used facilitation as an implementation strategy despite including common terms for this concept in our search strategy. Despite these limitations, our scoping review contributes to the body of knowledge on facilitation as an implementation strategy for evidence translation and sheds light on a critical aspect of implementation science – dose.

Conclusion

This scoping review examined the evidence regarding the concepts of facilitation dose and content for implementing evidence-based stroke interventions. The findings of this review have the potential to better operationalise the measurement and reporting of facilitation. Further research on the impact of facilitation dose and content on intervention effectiveness and efficiency is needed to advance the field of implementation science.

Data availability

Not applicable.

Abbreviations

CONSORT:

Consolidated Standards of Reporting Trials

QASC:

Quality in Acute Stroke Care

RCTs:

Randomised controlled trials

SPIRIT:

Standard Protocol Items: Recommendations for Interventional Trials

SQUIRE:

Standards for QUality Improvement Reporting Excellence

StaRI:

Standards for Reporting Implementation Studies

TIA:

Transient ischemic attack

TIDieR:

Template for Intervention Description and Replication

References

  1. Curtis K, Fry M, Shaban RZ, Considine J. Translating research findings to clinical nursing practice. J Clin Nurs. 2017;26(5–6):862–72.

  2. Quinn JM, Gephart SM, Davis MP. External Facilitation as an Evidence-Based Practice Implementation Strategy During an Antibiotic Stewardship Collaborative in Neonatal Intensive Care Units. Worldviews Evid Based Nurs. 2019;16(6):454–61.

  3. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci. 2008;3(1):1–12.

  4. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17(1):75.

  5. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. Am J Community Psychol. 2012;50:462–80.

  6. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace CM. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1(1):1–15.

  7. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):1–14.

  8. Bornbaum CC, Kornas K, Peirson L, Rosella LC. Exploring the function and effectiveness of knowledge brokers as facilitators of knowledge translation in health-related settings: a systematic review and thematic analysis. Implement Sci. 2015;10(1):1–12.

  9. Dogherty EJ, Harrison MB, Baker C, Graham ID. Following a natural experiment of guideline adaptation and early implementation: a mixed-methods study of facilitation. Implement Sci. 2012;7(1):1–12.

  10. Miech EJ, Rattray NA, Flanagan ME, Damschroder L, Schmid AA, Damush TM. Inside help: an integrative review of champions in healthcare-related implementation. SAGE Open Med. 2018;6:2050312118773261.

  11. Machline-Carrion MJ, Santucci EV, Damiani LP, Bahit MC, Málaga G, Pontes-Neto OM, Martins SCO, Zétola VF, Normilio-Silva K, De Freitas GR. Effect of a quality improvement intervention on adherence to therapies for patients with acute ischemic stroke and transient ischemic attack: a cluster randomized clinical trial. JAMA Neurol. 2019;76(8):932–41.

  12. Donnellan C, Sweetman S, Shelley E. Health professionals’ adherence to stroke clinical guidelines: a review of the literature. Health Policy. 2013;111(3):245–63.

  13. Williams L, Daggett V, Slaven JE, Yu Z, Sager D, Myers J, Plue L, Woodward-Hagg H, Damush TM. A cluster-randomised quality improvement study to improve two inpatient stroke quality indicators. BMJ Qual Saf. 2016;25(4):257–64.

  14. Cadilhac DA, Purvis T, Kilkenny MF, Longworth M, Mohr K, Pollack M, Levi CR. Evaluation of rural stroke services: does implementation of coordinators and pathways improve care in rural hospitals? Stroke. 2013;44(10):2848–53.

  15. Purvis T, Moss K, Francis L, Borschmann K, Kilkenny MF, Denisenko S, Bladin CF, Cadilhac DA. Benefits of clinical facilitators on improving stroke care in acute hospitals: a new programme for Australia. Intern Med J. 2017;47(7):775–84.

  16. Bidassie B, Williams LS, Woodward-Hagg H, Matthias MS, Damush TM. Key components of external facilitation in an acute stroke quality improvement collaborative in the Veterans Health Administration. Implement Sci. 2015;10(1):1–9.

  17. Berta W, Cranley L, Dearing JW, Dogherty EJ, Squires JE, Estabrooks CA. Why (we think) facilitation works: insights from organizational learning theory. Implement Sci. 2015;10:1–13.

  18. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139.

  19. Powell BJ, Fernandez ME, Williams NJ, Aarons GA, Beidas RS, Lewis CC, McHugh SM, Weiner BJ. Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda. Front Public Health. 2019;7:3.

  20. Dogherty EJ, Harrison MB, Graham ID. Facilitation as a role and process in achieving evidence-based practice in nursing: A focused review of concept and meaning. Worldviews Evid Based Nurs. 2010;7(2):76–89.

  21. Seers K, Rycroft-Malone J, Cox K, Crichton N, Edwards RT, Eldh AC, Estabrooks CA, Harvey G, Hawkes C, Jones C. Facilitating Implementation of Research Evidence (FIRE): an international cluster randomised controlled trial to evaluate two models of facilitation informed by the Promoting Action on Research Implementation in Health Services (PARIHS) framework. Implement Sci. 2018;13(1):1–11.

  22. Olmos-Ochoa TT, Ganz DA, Barnard JM, Penney L, Finley EP, Hamilton AB, Chawla N. Sustaining implementation facilitation: a model for facilitator resilience. Implement Sci Commun. 2021;2(1):1–9.

  23. Thayabaranathan T, Andrew NE, Grimley R, Stroil-Salama E, Grabsch B, Hill K, Cadigan G, Purvis T, Middleton S, Kilkenny MF, et al. Understanding the Role of External Facilitation to Drive Quality Improvement for Stroke Care in Hospitals. Healthcare (Basel). 2021;9(9):25.

  24. Middleton S, Lydtin A, Comerford D, Cadilhac DA, McElduff P, Dale S, Hill K, Longworth M, Ward J, Cheung NW. From QASC to QASCIP: successful Australian translational scale-up and spread of a proven intervention in acute stroke using a prospective pre-test/post-test study design. BMJ Open. 2016;6(5): e011568.

  25. Middleton S, McElduff P, Ward J, Grimshaw JM, Dale S, D’Este C, Drury P, Griffiths R, Cheung NW, Quinn C. Implementation of evidence-based treatment protocols to manage fever, hyperglycaemia, and swallowing dysfunction in acute stroke (QASC): a cluster randomised controlled trial. Lancet. 2011;378(9804):1699–706.

  26. Middleton S, Dale S, McElduff B, Coughlan K, McInnes E, Mikulik R, Fischer T, Van der Merwe J, Cadilhac D, D’Este C, et al. Translation of nurse-initiated protocols to manage fever, hyperglycaemia and swallowing following stroke across Europe (QASC Europe): A pre-test/post-test implementation study. Eur Stoke J. 2022;8(1):132–47.

  27. Fasugba O, Dale S, McInnes E, Cadilhac DA, Noetel M, Coughlan K, McElduff B, Kim J, Langley T, Cheung NW, et al. Evaluating remote facilitation intensity for multi-national translation of nurse-initiated stroke protocols (QASC Australasia): a protocol for a cluster randomised controlled trial. Implement Sci. 2023;18(1):2.

  28. Munn Z, Peters MD, Stern C, Tufanaru C, McArthur A, Aromataris E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med Res Methodol. 2018;18:1–7.

  29. Pollock D, Davies EL, Peters MD, Tricco AC, Alexander L, McInerney P, Godfrey CM, Khalil H, Munn Z. Undertaking a scoping review: A practical guide for nursing and midwifery students, clinicians, researchers, and academics. J Adv Nurs. 2021;77(4):2102–13.

  30. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19–32.

  31. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, Moher D, Peters MD, Horsley T, Weeks L. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–73.

  32. Handley MA, Lyles CR, McCulloch C, Cattamanchi A. Selecting and improving quasi-experimental designs in effectiveness and implementation research. Annu Rev Public Health. 2018;39:5–25.

  33. The Cochrane Collaboration. 2011. Data collection form - form for RCTs and non-RCTs. Available from: https://view.officeapps.live.com/op/view.aspx?src=https%3A%2F%2Ftraining.cochrane.org%2Fsites%2Ftraining.cochrane.org%2Ffiles%2Fpublic%2Fuploads%2Fresources%2Fdownloadable_resources%2FEnglish%2FCollecting%2520data%2520-%2520form%2520for%2520RCTs%2520and%2520non-RCTs.doc&wdOrigin=BROWSELINK.

  34. Hong QN, Pluye P, Fabregues S, Barlett G, Boardman F, Cargo M, Dagenais P, Gagnon M-P, Griffiths F, Nicolau B, et al. 2018. MIXED METHODS APPRAISAL TOOL (MMAT). Available from: http://mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/127916259/MMAT_2018_criteria-manual_2018-08-01_ENG.pdf.

  35. Popay J, Roberts H, Sowden A, Petticrew M, Arai L, Rodgers M, Britten N, Roen K, Duffy S. 2006. Guidance on the Conduct of Narrative Synthesis in Systematic Reviews: A Product from the ESRC Methods Programme. Available from: https://www.lancaster.ac.uk/media/lancaster-university/content-assets/documents/fhm/dhr/chir/NSsynthesisguidanceVersion1-April2006.pdf.

  36. Cadilhac DA, Marion V, Andrew NE, Breen SJ, Grabsch B, Purvis T, Morrison JL, Lannin NA, Grimley RS, Middleton S, Kilkenny MF. A Stepped-Wedge Cluster-Randomized Trial to Improve Adherence to Evidence-Based Practices for Acute Stroke Management. Jt Comm J Qual Saf. 2022;48(12):653–64.

  37. Middleton S, Dale S, Cheung NW, Cadilhac DA, Grimshaw JM, Levi C, McInnes E, Considine J, McElduff P, Gerraty R, et al. Nurse-Initiated Acute Stroke Care in Emergency Departments: The Triage, Treatment, and Transfer Implementation Cluster Randomized Controlled Trial. Stroke. 2019;50(6):1346–55.

  38. Salbach NM, Wood-Dauphinee S, Desrosiers J, Eng JJ, Graham ID, Jaglal SB, Korner-Bitensky N, MacKay-Lyons M, Mayo NE, Richards CL, et al. Facilitated interprofessional implementation of a physical rehabilitation guideline for stroke in inpatient settings: process evaluation of a cluster randomized trial. Implement Sci. 2017;12:1–11.

  39. Bravata DM, Myers LJ, Perkins AJ, Zhang Y, Miech EJ, Rattray NA, Penney LS, Levine D, Sico JJ, Cheng EM, Damush TM. Assessment of the Protocol-Guided Rapid Evaluation of Veterans Experiencing New Transient Neurological Symptoms (PREVENT) Program for Improving Quality of Care for Transient Ischemic Attack: A Nonrandomized Cluster Trial. JAMA Netw Open. 2020;3(9):e2015920–e.

  40. Jolliffe L, Hoffmann T, Churilov L, Lannin NA. What is the feasibility and observed effect of two implementation packages for stroke rehabilitation therapists implementing upper limb guidelines? A cluster controlled feasibility study. BMJ Open Qual. 2020;9(2):05.

  41. Cadilhac DA, Grimley R, Kilkenny MF, Andrew NE, Lannin NA, Hill K, Grabsch B, Levi CR, Thrift AG, Faux SG, et al. Multicenter, Prospective, Controlled, Before-and-After, Quality Improvement Study (Stroke123) of Acute Stroke Care. Stroke. 2019;50(6):1525–30.

  42. Moore JL, Virva R, Henderson C, Lenca L, Butzer JF, Lovell L, Roth E, Graham ID, Hornby TG. Applying the Knowledge-to-Action Framework to Implement Gait and Balance Assessments in Inpatient Stroke Rehabilitation. Arch Phys Med Rehabil. 2020;27:27.

  43. Salbach NM, MacKay-Lyons M, Howe JA, McDonald A, Solomon P, Bayley MT, McEwen S, Nelson M, Bulmer B, Lovasi GS. Assessment of Walking Speed and Distance Post-Stroke Increases After Providing a Theory-Based Toolkit. J Neurol Phys Ther. 2022;46(4):251–9.

  44. Wielaert S, van de Sandt-Koenderman MWME, Dammers N, Sage K. ImPACT: a multifaceted implementation for conversation partner training in aphasia in Dutch rehabilitation settings. Disabil Rehabil. 2018;40(1):76–89.

  45. Levy T, Killington M, Laver K, Lannin NA, Crotty M. Developing and implementing an exercise-based group for stroke survivors and their carers: the Carers Count group. Disabil Rehabil. 2022;44(15):3982–91.

  46. Bravata DM, Miech EJ, Myers LJ, Perkins AJ, Zhang Y, Rattray NA, Baird SA, Penney LS, Austin C, Damush TM. The Perils of a “My Work Here is Done” perspective: a mixed methods evaluation of sustainment of an evidence-based intervention for transient ischemic attack. BMC Health Serv Res. 2022;22(1):857.

  47. Damush TM, Penney LS, Miech EJ, Rattray NA, Baird SA, Cheatham AJ, Austin C, Sexson A, Myers LJ, Bravata DM. Acceptability of a complex team-based quality improvement intervention for transient ischemic attack: a mixed-methods study. BMC Health Serv Res. 2021;21(1):453.

  48. Penney LS, Damush TM, Rattray NA, Miech EJ, Baird SA, Homoya BJ, Myers LJ, Bravata DM. Multi-tiered external facilitation: the role of feedback loops and tailored interventions in supporting change in a stepped-wedge implementation trial. Implement Sci Commun. 2021;2(1):82.

  49. Rattray NA, Damush TM, Miech EJ, Homoya B, Myers LJ, Penney LS, Ferguson J, Giacherio B, Kumar M, Bravata DM. Empowering Implementation Teams with a Learning Health System Approach: Leveraging Data to Improve Quality of Care for Transient Ischemic Attack. J Gen Intern Med. 2020;35(2):823–31.

  50. Munce SEP, Graham ID, Salbach NM, Jaglal SB, Richards CL, Eng JJ, Desrosiers J, MacKay-Lyons M, Wood-Dauphinee S, Korner-Bitensky N, et al. Perspectives of health care professionals on the facilitators and barriers to the implementation of a stroke rehabilitation guidelines cluster randomized controlled trial. BMC Health Serv Res. 2017;17:1–13.

  51. Wielaert SM, Berns P, van de Sandt-Koenderman MWME, Dammers N, Sage K. ‘Now it is about me having to learn something ….’ Partners’ experiences with a Dutch conversation partner training programme (PACT). Int J Speech Lang Pathol. 2017;52(2):143–54.

  52. Salbach NM, MacKay-Lyons M, Solomon P, Howe J-A, McDonald A, Bayley MT, Veitch S, Sivarajah L, Cacoilo J, Mihailidis A. The role of theory to develop and evaluate a toolkit to increase clinical measurement and interpretation of walking speed and distance in adults post-stroke. Disabil Rehabil. 2022;44(14):3719–35.

  53. McInnes E, Dale S, Craig L, Phillips R, Fasugba O, Schadewaldt V, Cheung NW, Cadilhac DA, Grimshaw JM, Levi C, et al. Process evaluation of an implementation trial to improve the triage, treatment and transfer of stroke patients in emergency departments (T3 trial): a qualitative study. Implement Sci. 2020;15(1):99.

  54. Wielaert SM, Sage K, Heijenbrok-Kal MH, Van De Sandt-Koenderman MWME. Candidacy for conversation partner training in aphasia: findings from a Dutch implementation study. Aphasiology. 2016;30(6):699–718.

  55. Salbach NM, McDonald A, MacKay-Lyons M, Bulmer B, Howe J-A, Bayley MT, McEwen S, Nelson M, Solomon P. Experiences of Physical Therapists and Professional Leaders With Implementing a Toolkit to Advance Walking Assessment Poststroke: A Realist Evaluation. Phys Ther. 2021;101(12):1–11.

  56. Cadilhac D, Stroil Salama E, Hill K, Middleton S, Horton E, Meade I, Kuhle S, Nelson M, Grimley R. Improving discharge care: the potential of a new organisational intervention to improve discharge after hospitalisation for acute stroke, a controlled before–after pilot study. BMJ Open. 2017;7(8):e016010.

  57. Ritchie MJ, Parker LE, Kirchner JE. From novice to expert: a qualitative study of implementation facilitation skills. Implement Sci Commun. 2020;1:25.

  58. Hunter SC, Young JA, Lawless MT, Kitson AL, Feo R. Introducing an interactional approach to exploring facilitation as an implementation intervention: examining the utility of Conversation Analysis. Implement Sci Commun. 2020;1:1–11.

  59. Cranley LA, Cummings GG, Profetto-McGrath J, Toth F, Estabrooks CA. Facilitation roles and characteristics associated with research use by healthcare professionals: a scoping review. BMJ Open. 2017;7(8):e014384.

  60. Ellard D, Parsons S. Process evaluation: understanding how and why interventions work. In: Thorogood M, Coombes Y, editors. International review of cytology. London: Oxford Academic; 2010. p. 87–104.

  61. Hulscher M, Wensing M. Process evaluation of implementation strategies. In: Wensing M, Grol R, Grimshaw J, editors. Improving Patient Care: The Implementation of Change in Health Care. United States: Wiley; 2020. p. 369–87.

  62. Ritchie MJ, Kirchner JE, Parker LE, Curran GM, Fortney JC, Pitcock JA, Bonner LM, Kilbourne AM. Evaluation of an implementation facilitation strategy for settings that experience significant implementation barriers. Implement Sci. 2015;10:1–3.

  63. Rycroft-Malone J, Seers K, Eldh AC, Cox K, Crichton N, Harvey G, Hawkes C, Kitson A, McCormack B, McMullan C. A realist process evaluation within the Facilitating Implementation of Research Evidence (FIRE) cluster randomised controlled international trial: an exemplar. Implement Sci. 2018;13:1–15.

  64. Altman DG, Moher D: Importance of Transparent Reporting of Health Research. In Guidelines for Reporting Health Research: A User’s Manual. 2014: 1–13

  65. Simera I, Moher D, Hirst A, Hoey J, Schulz KF, Altman DG. Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network. BMC Med. 2010;8(1):24.

  66. Michie S, Fixsen D, Grimshaw JM, Eccles MP. Specifying and reporting complex behaviour change interventions: the need for a scientific method. Implement Sci. 2009;4:40.

  67. Ritchie MJ, Kirchner JE, Townsend JC, Pitcock JA, Dollar KM, Liu C-F. Time and organizational cost for facilitating implementation of primary care mental health integration. J Gen Intern Med. 2020;35:1001–10.

  68. Reilly KL, Reeves P, Deeming S, Yoong SL, Wolfenden L, Nathan N, Wiggers J. Economic analysis of three interventions of different intensity in improving school implementation of a government healthy canteen policy in Australia: costs, incremental and relative cost effectiveness. BMC Public Health. 2018;18(1):378.

  69. Lynch EA, Bulto LN, Cheng H, Craig L, Luker JA, Bagot KL, Thayabaranathan T, Janssen H, McInnes E, Middleton S, Cadilhac DA. Interventions for the uptake of evidence-based recommendations in acute stroke settings. Cochrane Database Syst Rev. 2023;8(8):Cd012520.

  70. Levi CR, Attia JA, D’Este C, Ryan AE, Henskens F, Kerr E, Parsons MW, Sanson-Fisher RW, Bladin CF, Lindley RI. Cluster-Randomized trial of thrombolysis implementation support in metropolitan and regional Australian stroke centers: lessons for individual and systems behavior change. J Am Heart Assoc. 2020;9(3):e012732.

  71. Lynch EA, Cadilhac DA, Luker JA, Hillier SL. Education-only versus a multifaceted intervention for improving assessment of rehabilitation needs after stroke; a cluster randomised trial. Implement Sci. 2016;11(1):120.

  72. Power M, Tyrrell PJ, Rudd AG, Tully MP, Dalton D, Marshall M, Chappell I, Corgié D, Goldmann D, Webb D, et al. Did a quality improvement collaborative make stroke care better? A cluster randomized trial. Implement Sci. 2014;9(1):40.

  73. Shrubsole K, Worrall L, Power E, O’Connor DA. The Acute Aphasia IMplementation Study (AAIMS): a pilot cluster randomized controlled trial. Int J Lang Commun Disord. 2018;53(5):1021–56.

  74. Garner BR, Gotham HJ, Chaple M, Martino S, Ford JH, Roosa MR, et al. The implementation and sustainment facilitation strategy improved implementation effectiveness and intervention effectiveness: Results from a cluster-randomized, type 2 hybrid trial. Implementation Research and Practice. 2020;1:1–23.

  75. Bucknall TK, Considine J, Harvey G, Graham ID, Rycroft-Malone J, Mitchell I, Saultry B, Watts JJ, Mohebbi M, Mudiyanselage SB. Prioritising Responses Of Nurses To deteriorating patient Observations (PRONTO): a pragmatic cluster randomised controlled trial evaluating the effectiveness of a facilitation intervention on recognition and response to clinical deterioration. BMJ Qual Saf. 2022;31(11):818–30.

  76. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63–74.

  77. Sarkies MN, Robins LM, Jepson M, Williams CM, Taylor NF, O’Brien L, Martin J, Bardoel A, Morris ME, Carey LM. Effectiveness of knowledge brokering and recommendation dissemination for influencing healthcare resource allocation decisions: A cluster randomised controlled implementation trial. PLoS Med. 2021;18(10):e1003833.

  78. Smith SN, Liebrecht CM, Bauer MS, Kilbourne AM. Comparative effectiveness of external vs blended facilitation on collaborative care model implementation in slow-implementer community practices. Health Serv Res. 2020;55(6):954–65.

  79. Garner BR, Gotham HJ, Chaple M, Martino S, Ford JH, Roosa MR, Speck KJ, Vandersloot D, Bradshaw M, Ball EL. The implementation and sustainment facilitation strategy improved implementation effectiveness and intervention effectiveness: Results from a cluster-randomized, type 2 hybrid trial. Implement Res Pract. 2020;1:2633489520948073.

  80. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, McKibbon KA. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9(1):781.

  81. Bragge P, Grimshaw JM, Lokker C, Colquhoun H, Albrecht L, Baron J, Dadich A, Damschroder L, Danko K, Fernandez ME, et al. AIMD - a validated, simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. BMC Med Res Methodol. 2017;17(1):38.

  82. Hoffmann TC, Glasziou PP, Boutron I, Milne R, Perera R, Moher D, Altman DG, Barbour V, Macdonald H, Johnston M, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348: g1687.

  83. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ, Rycroft-Malone J, Meissner P, Murray E, Patel A, et al. Standards for Reporting Implementation Studies (StaRI) Statement. BMJ. 2017;356: i6795.

  84. Albrecht L, Archibald M, Arseneau D, Scott SD. Development of a checklist to assess the quality of reporting of knowledge translation interventions using the Workgroup for Intervention Development and Evaluation Research (WIDER) recommendations. Implement Sci. 2013;8:52.

  85. Scott SD, Albrecht L, O’Leary K, Ball GDC, Hartling L, Hofmeyer A, Jones CA, Klassen TP, Burns KK, Newton AS, et al. Systematic review of knowledge translation strategies in the allied health professions. Implement Sci. 2012;7(1):70.

  86. Albers B, Metz A, Burke K. Implementation support practitioners–a proposal for consolidating a diverse evidence base. BMC Health Serv Res. 2020;20(1):368.

Acknowledgements

The authors thank research assistants Nhat Nguyen, Tan Bui and Sawsan Saheb for assisting with data charting and critical appraisal of evidence.

Dr Fasugba, Ms Dale, Ms Coughlan, Prof Cadilhac, Mr Hill and Prof Middleton acknowledge support from the Centre of Research Excellence to Accelerate Stroke Trial Innovation and Translation, Australian National Health and Medical Research Council, grant number 2015705.

Funding

This study is funded by a National Health and Medical Research Council Investigator Grant (Grant ID: APP1196352) awarded to SM. The funding body had no role in the design of the study; the collection, analysis, and interpretation of data; or the writing of the manuscript.

Author information

Contributions

OF conceived the study. SM, EM, SD and KC helped with conception of the study. OF, HC and SD coordinated the study. EM provided scoping review and qualitative research methods expertise. JMG, DAC, KH, CL, AR, NWC, KH, KP, ES, EN, VP, JS and EG participated in the design of the study. HC, OF and SD conducted the database searches, title and abstract screening and full text review. OF and HC prepared the first draft of the manuscript. JMG, DAC, CL, AR, NWC, KH, SD, KC, KP, ES, EN, VP, JS, EG, EM and SM provided critical input and revised the manuscript draft. All authors read and approved the final version.

Corresponding author

Correspondence to Sandy Middleton.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1. Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist

Additional file 2. Search strategy

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Fasugba, O., Cheng, H., Dale, S. et al. Finding the right dose: a scoping review examining facilitation as an implementation strategy for evidence-based stroke care. Implementation Sci 20, 4 (2025). https://doi.org/10.1186/s13012-025-01415-w
