Simple rules for evidence translation in complex systems: A qualitative study

Abstract

Background

Ensuring patients benefit from the latest medical and technical advances remains a major challenge, with rational-linear and reductionist approaches to translating evidence into practice proving inefficient and ineffective. Complexity thinking, which emphasises interconnectedness and unpredictability, offers insights to inform evidence translation theories and strategies. Drawing on detailed insights into complex micro-systems, this research aimed to advance empirical and theoretical understanding of the reality of making and sustaining improvements in complex healthcare systems.

Methods

Using analytical auto-ethnography, including documentary analysis and literature review, we assimilated learning from 5 years of observation of 22 evidence translation projects (UK). We used a grounded theory approach to develop substantive theory and a conceptual framework. Results were interpreted using complexity theory and ‘simple rules’ were identified reflecting the practical strategies that enhanced project progress.

Results

The framework for Successful Healthcare Improvement From Translating Evidence in complex systems (SHIFT-Evidence) positions the challenge of evidence translation within the dynamic context of the health system. SHIFT-Evidence is summarised by three strategic principles, namely (1) ‘act scientifically and pragmatically’ – knowledge of existing evidence needs to be combined with knowledge of the unique initial conditions of a system, and interventions need to adapt as the complex system responds and learning emerges about unpredictable effects; (2) ‘embrace complexity’ – evidence-based interventions only work if related practices and processes of care within the complex system are functional, and evidence-translation efforts need to identify and address any problems with usual care, recognising that this typically includes a range of interdependent parts of the system; and (3) ‘engage and empower’ – evidence translation and system navigation requires commitment and insights from staff and patients with experience of the local system, and changes need to align with their motivations and concerns. Twelve associated ‘simple rules’ are presented to provide actionable guidance to support evidence translation and improvement in complex systems.

Conclusion

By recognising how agency, interconnectedness and unpredictability influence evidence translation in complex systems, SHIFT-Evidence provides a tool to guide practice and research. The ‘simple rules’ have the potential to provide a common platform for academics, practitioners, patients and policymakers to collaborate when intervening to achieve improvements in healthcare.

Background

There is an urgent need to improve the delivery of high quality healthcare, including the need to improve patient safety and reduce harm [1,2,3], to ensure care is patient centred and compassionate [4, 5], to improve health and wellbeing [6], and to reduce inequalities at the local, regional, national and global scale [7,8,9], all within an increasingly constrained financial environment [10, 11].

To address these challenges, there is a need to bridge the gap between the production of research evidence and the consistent delivery of evidence-based care in routine practice [12,13,14,15]. There is growing acknowledgement that translation of evidence is often ineffective and inefficient, and there is a need to develop a scientific and practical understanding of how to implement evidence into practice and achieve fast and reliable improvements in care [16,17,18].

Traditional approaches to translating evidence into practice have taken a rational-linear approach (where knowledge is created by one set of experts and passed on to another set to be implemented) [19, 20]. Evaluations have focused on identifying simple causal relationships between interventions and outcomes, aiming to produce generalisable knowledge about what works [16]. To establish causal relationships, studies tend to be conducted in controlled environments, where interference from context variables is considered problematic and controlled for by randomisation and protocol design [17].

It is increasingly recognised that context matters; an ‘appropriate’ context can help an intervention achieve its outcome [21]. Approaches to translating evidence into practice have taken an interest in how interventions can be adapted to work in different settings [22, 23], and many researchers have turned to realist evaluations in an attempt to understand ‘what works, for whom, in what settings’ and to establish more nuanced and caveated causal statements [21, 24].

When designing intervention and implementation strategies, as well as when conducting rigorous evaluations, there is a tendency to reduce messy real-world situations to their individual component parts in an attempt to determine the relationships between them. Doing so risks overlooking the complex and intricate patterns that emerge from their interactions.

Complexity sciences provide an alternative approach to studying interventions in complex systems such as healthcare. Complexity science originated in physical chemistry as a ‘push-back’ against traditional reductionist approaches [25]. Put simply, life is more than molecules and atoms – it is the complex patterns of organisation that emerge between them [26, 27]. Similarly, it has been proposed that healthcare can be considered as a complex system [28, 29] (or complex adaptive system) [30, 31], with the whole being more than simply the sum of its parts. On their own, the professionals, equipment and devices in any healthcare setting achieve nothing; it is the interactions between them, and with patients, that result in the delivery of care.

Complex systems are characterised as a dynamic network of agents acting in parallel, constantly reacting to what the other agents are doing, which in turn influences the behaviour of the network as a whole [32]. The interconnected nature of their interactions can lead to uncertainty and surprise as systems self-organise and evolve over time in response to internal and external stimuli and feedback loops [28, 33]. This non-linearity means that complex systems can defy orchestrated intervention, whereby seemingly obvious solutions can have minimal impact on system behaviour (e.g. policy resistance) [34], whilst small changes can have big unanticipated consequences. Such systems have strong historical path dependencies, meaning that initial conditions are shaped by historic events and patterns, and that these initial conditions can markedly influence what happens in the future.

On the one hand, complex systems are highly dynamic, continually responding and adapting to internal and external stimuli; on the other, they can demonstrate inertia, where embedded behaviours remain unchanged and even temporary perturbations or major structural alterations fail to disrupt existing norms [34, 35]. From these unpredictable and evolving systems emerge patterns, behaviours, structures and routines that define the system and guide behaviours within it [33, 36]. Complexity theorists propose that ‘simple rules’ offer a means of understanding and managing the emergent behaviour of complex systems [26, 34].

The use of complexity science as a lens to understand healthcare systems is increasing [36]. To date, research studies have predominantly focused on describing healthcare systems as complex, yet there is less understanding of how to predict or intervene [37]. Advances have tended to be theoretical, with the purpose of guiding evaluations or further research [38, 39]. Whilst there is increased use of the term complexity, there is little evidence that the concepts of complex systems have been applied to the design of interventions or implementation strategies [40]. As such, Braithwaite et al. [36] have called for greater clarity about how to study and apply the principles of complex systems in practice.

This study aims to develop a deeper explanation of evidence translation in healthcare using a complex systems lens, thereby contributing to both the fields of implementation science and complexity science. Drawing on detailed insights into complex micro-systems, this research advances empirical and theoretical understanding. A primary focus is given to understanding the implications of complexity theory with an objective of identifying a series of ‘simple rules’ about how to intervene in complex systems. The ‘simple rules’ aim to make complexity navigable (whilst recognising that it will never be simple), providing actionable guidance to both practice and research.

Methods

Study design

The study was conducted using an analytical auto-ethnography and grounded theory approach (Fig. 1). An analytical auto-ethnographic approach was adopted reflecting that the authors of this paper were full members of the research setting (conducting ethnography of ‘our own people’ as members of the ‘core team’ (Fig. 1)), visible as such members in published texts, and committed to developing theoretical understandings of broader social phenomena [41].

Fig. 1

A schematic representation of data collection and coding approach

Empirical data was collected through participant observation and document analysis of the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care (CLAHRC), Northwest London (NWL) programme (UK), and 22 evidence translation projects (Additional file 1). This allowed direct access to and observation of actions, events, scenes and people in real time over a 5-year period, with opportunities to follow up on emergent patterns and problems. Concurrently, extensive literature was reviewed using a snowballing approach to identify frameworks, models, systematic reviews and other relevant literature (further details on data collection and literature review can be found in Additional file 2).

A grounded theory approach guided the data collection and analysis [42, 43]. Data was analysed using open, axial and selective coding, in parallel with theoretical sampling, to explore emergent categories and themes over time. This iterative analysis led to a process of ‘abduction’ to make sense of material that did not ‘fit’ into pre-established categories (including published frameworks and theories), thereby reconceptualising the challenge of evidence translation and improvement into a new substantive theory (to provide explanations and predictions related to the specific context of study) and conceptual framework (indicating how aspects of the theory are connected to each other). Further details are provided in Additional files 2 and 3.

This exploratory research approach was chosen to ensure that the resulting findings were empirically informed and theoretically grounded in the practical reality of evidence translation and improvement in real-world (complex) settings. We chose not to build exclusively on any existing theory, as no single existing framework fitted well with our experiences. Whilst several fields of study were relevant, no single framework brought together concepts from the different fields, including knowledge translation, implementation, improvement and complexity.

Results from the grounded theory analysis were interpreted through complex systems thinking [26, 28, 34, 35]. Emphasis was placed on developing a series of ‘simple rules’, which were identified through establishing relationships between challenges experienced by the project teams, and the actions and strategies that, if taken, had a positive effect on project progress and outcomes or, if they were absent or overlooked, were observed to have a detrimental impact.

Setting

The NIHR established the CLAHRC programme in England to accelerate the translation of evidence into practice for the benefit of patients. Thirteen regional CLAHRC programmes were funded, each led by academic and healthcare partnerships and with autonomy to decide how they would approach ‘closing’ the translational gap [44,45,46].

The CLAHRC NWL approach brought together healthcare staff, including clinical, managerial and support staff (hereafter referred to as ‘staff’) with patients, carers, family members and the wider community (hereafter, ‘patients’) and academic partners from a diverse range of disciplines (hereafter, ‘academics’) into project teams of 5–15 people to translate evidence into practice in their local micro-systems. Project teams used a suite of quality improvement tools and methods, including the model for improvement, action-effect diagrams and plan-do-study-act cycles, process mapping, statistical process control, stakeholder engagement, and patient and public involvement combined with iterative evaluation, to guide and support the implementation process [47,48,49,50,51].

During the first 5 years of the CLAHRC NWL (2008–2013), 22 diverse topics considered of clinical importance were explored with 55 teams over four rounds of 18-month projects (Fig. 1) in various settings (acute, community, primary care, mental health, etc.) (Additional file 1). All projects had a common goal of translating existing evidence into practice to achieve improvements in quality of care provision, with the aspiration of delivering corresponding improvements in patient outcomes. Two detailed case study examples are presented in the results section (Boxes 1 and 2).

This paper represents a consolidation of cross-project learning from the programme and peer-reviewed literature. Existing publications relating to evaluation of individual projects, cross-project analysis, use of quality improvement approaches, and external programme evaluations are listed in Additional file 1.

Results

Results are divided into two sections. Firstly, the new conceptual framework Successful Healthcare Improvement From Translating Evidence in complex systems (SHIFT-Evidence) is presented, introducing the three strategic principles of the framework, namely ‘act scientifically and pragmatically’, ‘embrace complexity’ and ‘engage and empower’, and the 12 ‘simple rules’.

Secondly, there is a detailed presentation of the 12 ‘simple rules’ and accompanying substantive theory. The results demonstrate how the theory and rules emerged from the empirical data and how understanding is enhanced by application of a complex systems lens. The presentation of the rules and substantive theory is accompanied by two illustrative case examples from CLAHRC NWL projects to bring to life the practical reality of evidence translation.

A conceptual framework for SHIFT-Evidence

The theory of SHIFT-Evidence can be summarised as follows: to achieve successful improvements from evidence translation in healthcare, it is necessary to ‘act scientifically and pragmatically’ whilst ‘embracing the complexity’ of the setting in which change takes place and ‘engaging and empowering’ those responsible for and affected by the change.

SHIFT-Evidence reflects the nature of work and breadth of effort required to translate evidence into complex systems. The findings revealed that attention and effort were often drawn away from the original project focus in directions that were not anticipated in advance, such as dependent issues relating to people, processes or structures, or the need to resolve existing problems with ‘usual care’. We established that failure to resolve these issues compromised the success of an intervention and diminished the ability to draw useful conclusions about the effectiveness of any intervention in a real-world setting. As such, the SHIFT-Evidence framework is conceptually based on the premise that the implementation of evidence-based interventions is not necessarily sufficient to achieve improvements in care, and that it is not possible to fully anticipate what changes will be required in any individual setting. In short, evidence translation and wider systems improvement are inextricably linked within complex systems.

The accumulating data about the ‘daily realities’ of evidence translation and improvement required us to reconceptualise our understanding of the problem, and associated potential solutions. Our focus moved from evidence-based medicine and interventions, to focusing on the complexity of the systems within which we hoped to intervene. As such, literature relating to complex systems thinking grew in importance over time to become the primary lens by which we were able to make sense of our experiences (further details on this process of reconceptualisation are provided in Additional file 2).

Reflecting this reconceptualisation, ‘act scientifically and pragmatically’ was identified as the core category for selective coding. It was chosen to reflect the interaction between our starting world view (the need to use scientific evidence) and our core learning (the need to understand and respond to the constraints and opportunities of the local system). Our analysis indicated the tension between these perspectives, and also the opportunity for increased synergy between them, as follows:

  • An underlying tension was observed in the literature and in our empirical data between the production and use of generalisable knowledge (influenced by positivist and realist philosophical perspectives) and local context-specific problem solving (influenced by pragmatist and participatory philosophical perspectives).

  • We recognised the value of drawing insights from both perspectives. Effective improvement initiatives can benefit from drawing on a scientific knowledge base (evidence-based medicine, or other knowledge of effective interventions or change processes), and from making pragmatic adjustments in line with the opportunities and constraints of the actual setting for the change.

  • The change process can be guided by applying aspects of the scientific method at a local level so that clear aims and measures guide structured experimental processes to assess, learn and inform next steps. This resonates with the pragmatist notion of science to solve local problems of societal importance [52], and with the complexity literature notion of the “science of muddling through” in dynamic and evolving systems [53].

Two further important key categories were identified, namely ‘embrace complexity’ and ‘engage and empower’. These three high level conceptual categories are termed strategic principles, reflecting the guidance on how to conduct and research evidence translation and improvement in complex systems. These principles are underpinned by 12 associated ‘simple rules’, which describe the actions required to achieve each strategic principle.

The three strategic principles and 12 ‘simple rules’ are the following:

Act scientifically and pragmatically: Knowledge of existing evidence needs to be combined with knowledge of the unique initial conditions of a system. Interventions need to adapt as the complex system responds and learning emerges about unpredictable effects. This strategic principle reflects the high level stages of an improvement initiative through the four simple rules:

  • Understand problems and opportunities

  • Identify, test and iteratively develop potential solutions

  • Assess whether improvement is achieved, and capture and share learning

  • Invest in continual improvement

Embrace complexity: Evidence-based interventions only work if related practices and processes of care within the complex system are functional. Evidence-translation efforts need to identify and address existing problems with usual care, recognising that this typically includes a range of interdependent parts of the system. This principle emphasises the need to investigate and understand the uniqueness of each local system and respond to complexity from the micro- to macro-system as reflected by the four rules:

  • Understand processes and practices of care

  • Understand the types and sources of variation

  • Identify systemic issues

  • Seek political, strategic and financial alignment

Engage and empower: Evidence translation and system navigation requires commitment and insights from staff and patients with experience of the local system. Changes need to align with their motivations and concerns. The four rules reflect factors that influence engagement at an individual and team level through to supporting infrastructure and organisational level:

  • Actively engage those responsible for and affected by change

  • Facilitate dialogue

  • Foster a culture of willingness to learn and freedom to act

  • Provide headroom, resources, training and support

Relationship between SHIFT-Evidence principles: The process of evidence translation and improvement, as represented in SHIFT-Evidence, is intended to be a progressive iterative process. The ‘simple rules’ provide a conceptual framework to guide practice and research in complex systems, responding to emergent challenges and capturing generative learning (Fig. 2). In practice, feedback loops exist between each of the rules as learning emerges about the changes required and effectiveness of interventions. Few improvement initiatives follow a smooth linear pattern.

Fig. 2

A schematic representing the SHIFT-Evidence conceptual framework including the three strategic principles (act scientifically and pragmatically, embrace complexity, and engage and empower) with the 12 associated ‘simple rules’. The diagram represents the continual iterative process of evidence translation and improvement in healthcare

Our hypothesis is that all SHIFT-Evidence strategic principles and ‘simple rules’ are necessary to achieve successful and sustained improvements in care, and that they are interdependent. For example, ‘active engagement’ of healthcare professionals and patients is necessary to fully ‘understand processes and practices of care’. Equally, ‘active engagement’ of staff may reveal that their priorities do not align with current ‘strategic, political and financial’ incentives, and vice versa. Our hypothesis implies that such tensions, if unresolved, will negatively impact success.

Project narratives, common challenges and simple rules

Two of the 22 CLAHRC NWL project narratives are presented as detailed examples to illustrate the practical reality of evidence translation and improvement (Boxes 1 and 2). Both demonstrated measurable success against their original objectives, although each encountered unexpected obstacles. This is followed by presentation of the 12 simple rules, describing how the simple rules relate to the project narratives and substantive theory (Tables 1, 2 and 3), and reflecting on insights provided by complex systems thinking.

Table 1 Substantive theory for acting scientifically and pragmatically – challenges and corresponding actions required for successful evidence translation and improvement
Table 2 Substantive theory for embracing complexity – challenges and corresponding actions required for successful evidence translation and improvement
Table 3 Substantive theory for engaging and empowering – challenges and corresponding actions required for successful evidence translation and improvement

The project narrative in Box 1 outlines the challenges to embedding evidence-based practices and achieving improvements in care quality for patients with community-acquired pneumonia (CAP).

The second project (Box 2) illustrates the complexity of healthcare systems and how this was experienced by a clinical team attempting to improve medicines management (MM) for patients following discharge from hospital.

Act scientifically and pragmatically

The strategic principle ‘act scientifically and pragmatically’ demonstrates that knowledge of existing evidence is only one part of the effort required to achieve sustainable improvements in care in complex systems.

Understand problems and opportunities

The two case studies reveal the challenges of introducing evidence-based practices or interventions into complex systems, and demonstrate how any intervention is sensitive to the unique initial conditions of the local system.

The MM project narrative shows how the interconnectivity of different system elements influenced the work that was required to improve the system; the desired intervention (a follow-up phone call) could not be initiated until dependent processes (medicines reconciliation at discharge from hospital) were improved.

The CAP project narrative demonstrates how the autonomy of individual agents working in the system challenged the introduction of the care bundle intervention; at the outset of the project, there was little incentive or motivation to take action to address a problem many perceived did not exist. Baseline data was required to help create tension for change by demonstrating the extent of the local problem.

Linear models for the spread and scale-up of evidence-based practices assume the same intervention can be applied to the same problem in multiple settings. Understanding the consequences of working in complex systems challenges these assumptions; the historical origins and path-dependency of any given system means that somewhat different problems or configurations of problems and opportunities will exist in any given setting [35]. To gain traction, effort needs to be invested in understanding priority issues and areas for improvement within the local system, and any interventions need to be perceived as relevant and actionable by system agents [54].

Identify, test and iteratively develop potential solutions

Both project narratives reveal how system interconnectedness presented a challenge to fully anticipating which changes were required. This was reflected at two levels. Firstly, each intervention needed to be refined and adapted in response to emergent learning about local practice and to fit with established processes (e.g. modifications to the design of the CAP care bundle or the MM reconciliation form). Secondly, each project needed to address multiple parallel or dependent issues beyond its original scope in order to achieve its improvement goal (e.g. the CAP project needed to address antibiotic prescribing policies and microbiological test ordering processes; the MM project needed to address pharmacy staffing rotas and the roles and responsibilities of junior doctors).

Observing evidence translation through a complexity lens therefore suggests the need to consider multiple strategies for intervening, and the considerable effort required to support the uptake of any specific evidence-based practice. System understanding emerges over time, and often in unexpected ways, through testing intervention ideas in practice and responding to insights and challenges that are often difficult to anticipate, reflecting tacit knowledge or deep-seated routines and cultural practices [55].

Assess whether improvement is achieved, and capture and share learning

Both project narratives reveal the challenges of gauging performance in a complex system from an individual perspective. In both instances, objective measurement revealed that standards of care were lower than anticipated (the proportion of CAP patients receiving evidence-based care; the proportion of MM patients with fully reconciled medicines at discharge). These findings provided insight into ‘hidden’ system performance and reflect, despite the good intentions and hard work of individual agents, the challenges of coordinating the collective behaviour of agents towards a common goal.

The need for measurement to guide improvement efforts also applied when sharing learning. As the CAP care bundle was rolled out to local hospitals, the original site shared its experiences of developing and implementing the intervention. Whilst some learning was captured formally in versions of the care bundle form and in summaries of the actions taken, the written material provided only a partial representation of the issues encountered and their resolution; much of the learning about what had happened was shared through dialogue. Even armed with this learning, local sites essentially started from the beginning: understanding their own local problems and opportunities, building will and motivation to adopt new ways of working, and adapting intervention concepts to work in their local setting.

Given the uncertainty and unpredictability of intervening in complex systems, objective measures can provide a driving force to inform project progress. Rather than assuming that interventions were used and effective, measurement supported teams to accurately assess progress towards their goal and revise and adapt interventions and implementation approaches in light of results [56].

Invest in continual improvement

The challenge of sustaining initial improvements required teams to navigate both system inertia, which pulled practices back to ‘the way things have always been done’, and system evolution in response to internal and external stimuli.

Whilst all CAP sites achieved initial success, not all sustained those gains. High staff turnover was a persistent challenge to maintaining improvements, with systems suffering ‘memory loss’, particularly when junior doctors left en masse during clinical rotations. Other challenges included the consistency of clinical and managerial leadership, their ability to maintain a high profile for the work, and their capacity to cope when emerging and often competing priorities drew attention to other parts of the system. Sites that did sustain the gains were able to connect care bundle use to other substantive practices, such as standardised admission processes and a history of care bundle use for other clinical presentations.

This learning demonstrates that improvements in care are not static; indeed, the complex and adaptive nature of healthcare systems means emergent events may threaten or enhance achievements [57]. Translation cannot be seen as a one-off activity and on-going monitoring and review needs to guide actions to adapt to system dynamics and support long-term success [58]. This learning is summarised in our substantive theory presented in Table 1.

Embrace complexity

The strategic principle ‘embrace complexity’ demonstrates that evidence-based interventions only work if supporting or dependent practices and processes of care are working sufficiently well.

Understand processes and practices of care

The project narratives demonstrate that interventions do not exist in isolation, but need to fit with, and are dependent on, other practices and processes of care.

Initially, the project team leaders and other clinicians tended to view interventions in isolation from the system (the MM team perceived the follow-up phone call as a standalone intervention to improve patient understanding of their medicines, and the initial work of the CAP team focused exclusively on developing and perfecting the details of the paper care bundle form). Once the MM project team had identified the interdependency of the follow-up phone call with medicines reconciliation processes at discharge, they sought to understand why current practices were not working. They found that, although separate processes for documenting medicines reconciliation were routinely used by each individual staff group, these did not support communication and consolidation between staff groups; this was left to serendipity (e.g. being on the ward at the same time as another staff member) and to personal effort to communicate and exchange information between professional groups. This insight led them to develop an additional intervention, namely a new shared form for medicines reconciliation to be used by all four professional groups.

Complexity theories suggest that it is not possible to understand a system, or how to influence it, by reducing the system to its individual parts. As the projects progressed, it became increasingly apparent that project teams needed to look beyond individual competence or actions, to understand the complex interactions between individual agents, and the resulting patterns, that determine the quality of care [28].

Understand the types and sources of variation

A major challenge faced by the project teams was the recognition that there is no single standardised way in which care is delivered. Whilst complex systems can give rise to regular patterns and ingrained behaviours, these are constantly perturbed by internal and external stimuli to which systems adapt and respond.

As the baseline data demonstrated, doctors’ knowledge of appropriate treatment for CAP patients did not translate into high-quality care. The delivery of care needed to be reconceptualised as a series of hand-offs and interactions between multiple healthcare professionals (doctors, nurses, pharmacists, porters), each of which could be subject to various interruptions and delays whilst healthcare staff dealt with multiple patients and competing priorities. Influencing factors ranged from small acts of individual discretion (e.g. when a staff member took a lunch break, how long they stopped to talk to a patient, or the order in which patients were seen) to factors outside any individual’s immediate control (how many patients were admitted that day, the experience level of staff on shift, temporary staffing shortages (sickness, compassionate leave), chronic staffing shortages (funding, staff training and retention), and crisis events).

Investigation revealed that there were no routine processes for treating CAP. Each member of staff had developed an individual approach, reflecting their personal knowledge of the system and of the relationships within it necessary to coordinate and deliver patient care. Introducing a shared standardised practice (the care bundle) helped to reduce variation, but it was not fail-safe and variation was still apparent, influenced by the factors listed above. The care bundle contributed to creating a more resilient process, less likely to be affected by everyday events such as interruptions or communication failures.

Intervening in complex systems requires an understanding of the variation inherent in all healthcare systems. Complex systems are dynamic and fluctuating, continually responding to internal and external stimuli, which means people have to make decisions and take action in real-world conditions. Rather than assuming standardised, idealised processes exist, it is necessary to understand and work with the complex reality of the settings in which care is delivered [59].

Identify systemic issues

The project narratives demonstrated that, even once interconnected and dependent processes and systems are identified, it cannot be assumed that they are working well.

The MM team discovered whole-system issues with chains of dependencies: phone calls depended on accurate information, accurate information depended on medicines reconciliation, and medicines reconciliation depended on staff coordination and joined-up procedures. Not all of these dependent, problematic areas were within the team’s direct control, and relationships had to be fostered with other key agents (e.g. educational leads, executive managers) to influence areas of concern. Some problems were considered unresolvable within the project’s sphere of influence and timescales (e.g. interoperability of primary care and secondary care electronic health records) and were ‘parked’, or workarounds were developed (e.g. patients involved with the project developed a solution, patient-held medication records, that was not constrained by organisational or professional boundaries).

This demonstrates the nature of working in an open system. Not only is there interconnectedness within a system, but between various nested systems which connect and interact in a multitude of ways (e.g. the pharmacy system interacts with and is influenced by wider hospital systems, medical education systems, electronic record systems, etc.). Achieving an overall improvement required many other aspects of the system, and related systems, to be ‘fixed’. The original evidence-based intervention acted as a catalyst for a more comprehensive, complex and challenging system-wide analysis and an improvement process that required support and action from the wider organisation [60, 61].

Seek political, strategic and financial alignment

A persistent challenge faced by the project teams was that their individual areas of interest and interventions needed to compete for attention and resources with other initiatives or requirements.

Both projects were initially facilitated by financial support from the NIHR CLAHRC NWL programme, which created space and resource to test and develop interventions and to capture an evidence base of their effectiveness. However, engaging already busy and fully committed clinical staff proved challenging and, given the system interdependencies, project teams needed to build strategic and political alignments with other system stakeholders to influence areas beyond their control.

The long-term sustainability of the projects was influenced by political, strategic and financial alignment. MM took advantage of changing political priorities to secure resources to support new ways of working and to increase awareness and perceptions of importance in frontline staff, and was able to sustain new medicines reconciliation practices. The sustainability of the CAP care bundle was variably influenced in the different organisations by their ability to align with key performance indicators, financial incentives or cost-saving initiatives.

Understanding complexity also means being aware of the constraints within the system. If more resources are consumed in one area, then another area will receive less. The finite amount of time, resource and attention within a system is already heavily committed with other wider organisational priorities, including managing service capacity to meet demand, achieving performance targets and responding to policy changes [62, 63], and implementation of multiple sources of evidence and innovations [64]. Evidence translation processes must consider organisational operating pressures and carefully reflect on where resources should be focused to achieve the maximum impact.

This learning is summarised in our substantive theory presented in Table 2.

Engage and empower

The strategic principle ‘engage and empower’ demonstrates that evidence translation and system navigation requires commitment and insights from staff and patients with experience of the local care settings, and changes to a complex system need to align with their motivations and concerns.

Actively engage those responsible for and affected by change

Both projects experienced the harsh reality that, if people are not motivated, change will not take place. The teams realised it was necessary to align with existing personal drivers, or to build motivation for change, in order to get people to adopt new ways of working and to contribute insights and support to problem solving and overcoming obstacles.

In the CAP project, despite having motivated and embedded clinical leadership and support from a multidisciplinary team, it was a challenge to engage other staff, in particular other senior doctors. Doctors who believed they already knew how to treat CAP patients were sceptical about the value of the intervention and concerned that the care bundle was ‘dumbing down’ complex medical knowledge for junior doctors. Producing the care bundle intervention was not sufficient to instigate behaviour change, and it was rarely used. Engaging staff to understand and respond to their concerns, combined with regular use of measurement and feedback, supported ongoing learning and generated local evidence that convinced more sceptical individuals that the care bundle increased reliable delivery of evidence-based care. Investing time to engage staff was critical to the use of the intervention.

This example provides a powerful demonstration of the agency of individuals in a complex system. Individuals are highly autonomous, skilled and opinionated, with significant discretion over what they do and how they do it. This enables them to evade new practices they have not bought into (or to adopt them only in a tokenistic manner), whether initiated by colleagues or through top-down directives.

Whilst engaging people can be challenging, the insights they provide are critical to understanding problems and opportunities, evolving the intervention design and identifying dependent problems to address. In the MM project, patients provided insight into system problems that professionals had not been aware of. Much of the knowledge needed to understand why the problem existed, and how to overcome it, was held tacitly by frontline staff.

‘Seeing’ a complex system is hard. It is necessary to draw on local knowledge and practical wisdom to understand how different elements of care fit together, whilst recognising that each individual only experiences the aspects of a system with which they interact directly. No individual is capable of knowing all parts of a system.

Frontline staff and patients need to be central in the planning, design and conduct of evidence translation and quality improvement endeavours [65, 66]. People affected by change are those most invested in taking ownership and overcoming obstacles and barriers to ensure changes function at a local level [67, 68]. Identifying personal and emotional drivers, and aligning changes to those drivers, can ensure people remain motivated and persistent at times of challenge.

Facilitate dialogue

Bringing together different professional groups and patients may sound straightforward, but it was frequently experienced as challenging, and project teams learnt to anticipate conflict or tensions between different agents. For example, patients with CAP are routinely transferred within the critical first 4 h; therefore, treatment required coordination between the emergency department and the acute medical unit. Although staff from both departments were involved in the project, disputes emerged about who was responsible for initiating and completing the care bundle. Division of labour (partly driven by increasing specialisation) had exacerbated the boundaries between professions, units and organisations, each with their own beliefs, performance expectations and ‘territory’ to protect. Changes to established routines were perceived as threatening or distracting, or as compromising professionals’ autonomy and ability to perform their established roles effectively. Dialogue between different ‘communities of practice’ and collaboration between professionals and patients often required facilitation [69, 70].

In complex systems, time is required to facilitate social sense-making, to increase understanding of each other’s perspectives and motives, and to learn how these can better coexist in the same system [71]. Whilst agents may frequently interact with one another, they rarely understand each other’s experiences of being in the system or the expectations, pressures and uncertainties each may face. Change affects individuals in different ways. Patients need to know how new care processes will affect them; staff need to understand how new processes can be incorporated into their current workload and how they will affect their status or professional identity [72].

Build a culture of willingness to learn and freedom to act

The teams we observed tended to work in high-pressure environments with constrained resources and high performance standards and expectations. There were underlying expectations to get things right first time, and quickly, which often suppressed people’s willingness to admit uncertainty or to acknowledge when things were not working well.

These behaviours were reflected in some project team members’ command-and-control leadership styles, a product of traditional hierarchies. Team members also tended to expect that change would be easy and quick. Many teams were demoralised when their initial change ideas did not work straight away, or by the large number of barriers and obstacles that had to be overcome in the process.

Successful teams tended to show curiosity and persistence in the face of unexpected learning or setbacks. They also tended to be less hierarchical: the views of all team members were listened to and valued, and people were empowered to explore and solve problems. For example, the MM project discovered that, although individual professions were working hard, their collective endeavours failed to consistently deliver the high-quality care they valued. This was disappointing to the staff, but the team transformed the disappointment into energy for change. A culture focused on performance management might have suppressed this discovery, denying the organisation an important opportunity to learn.

This reflects the inability to ‘control’ complex systems, or to reliably predict how to intervene to achieve a desired outcome. To be successful, it is necessary to have the humility to accept that the answers cannot be fully known in advance, to be willing to learn from experiments conducted within the local system, and to distribute leadership, engaging agents from across the system in the act of improving the system [73, 74].

Provide headroom, resources, training and support

Improving complex systems takes time, effort and reflection. Whilst healthcare professionals work to deliver care to the best of their ability within many constraints, they have little time to consider how the whole system functions. Many of the skills required to understand and intervene in complex systems (e.g. understanding processes and variation, teamwork) are not commonly taught to healthcare professionals or patients, and represent new ways of thinking that are often counter-cultural to prevailing norms [75].

These project narratives highlight that translation and improvement require space and time. Staff needed ‘headroom’ away from busy practice, time to think, to engage with peers and patients to investigate how their routine processes fit within the overall care system, and to explore potential improvements.

To support the conduct of improvement initiatives, project teams received training from CLAHRC NWL on improvement skills. Teams had limited prior experience and required encouragement and support to use quality improvement methods. Skills in team working and project management were also provided by CLAHRC NWL through on-going coaching and expert input.

One of the major features of complex systems is that they are self-organising. Healthcare professionals and patients are a critical resource for understanding and effecting change within complex systems, but meaningful engagement requires training, support, resources and headroom, and builds skills they can transfer to other implementation and improvement work [76, 77]. This learning is summarised in our substantive theory presented in Table 3.

Discussion

SHIFT-Evidence provides a comprehensive overview of the challenges and corresponding actions required for successful implementation and improvement. These are summarised as three strategic principles and 12 ‘simple rules’. Exploration of the practical reality of making changes in frontline care settings reveals the need to reconceptualise the challenge of evidence translation to take account of system complexity.

Systems evolve over time and have historical path dependencies

Our findings demonstrate that intervening in complex systems requires an understanding of the unique initial conditions (problems, opportunities, people, practices and patterns) in each local setting that are influenced by historical path dependencies. Scientific evidence about which interventions to use needs to be balanced with local system requirements, rather than assuming the starting point will be the same in each setting, and a commitment to continual improvement is needed to allow for the fact that systems evolve and adapt over time. This temporal dimension of systems thinking is reflected in the SHIFT-Evidence framework by the strategic principle ‘act scientifically and pragmatically’.

These findings challenge current conventions of seeing implementation as a one-off or time-limited activity, and build on Hawe et al.’s [61] proposal that interventions are ‘events in systems’. Further, a single pre-planned intervention, or set of interventions, is unlikely to be sufficient to achieve evidence implementation and improvements. Instead, multiple interventions are likely to be required; with the need emerging only as changes are implemented and system understanding grows. This builds on quality improvement approaches that promote iterative development over time [59, 78, 79], and organisational learning perspectives that value generative learning (e.g. double (and triple) loop learning) [73, 80].

We propose, in light of these findings, that terminology shifts from the noun ‘intervention’ to the verb ‘intervening’. We believe that the concept of ‘intervening to achieve an improvement’ better reflects the iterative and negotiated process required to test multiple interventions whilst noticing and responding to local system requirements over an extended period of time (cf. Snowden’s probe-sense-respond) [81].

Systems are dynamic and interconnected

Interventions cannot be considered in isolation from the system they are implemented into. The uptake and effective utilisation of any specific intervention is dependent on established practices and processes of care. These practices and processes of care cannot be assumed to be working well, and often additional interventions will be necessary to address related and systemic problems. Intervening in complex systems requires an understanding of these dynamic and fluctuating processes. Understanding of system dynamics and interconnectivity is represented by the strategic principle ‘embrace complexity’.

This challenges current conventions of seeing interventions as bounded and discrete, and anticipating that such interventions will be used by people working in a rational linear manner. Interventions are inherently dependent on the context that they are used in, and it cannot be assumed that dependent processes and practices are working well. This builds on literature from operations management and patient safety in valuing an understanding of ‘work as is’ as opposed to ‘work as imagined’ [82, 83]; people in complex systems are challenged with making decisions in real-world conditions, under high pressure, with constrained time and resources, whilst balancing multiple priorities [84].

Systems are made up of individual agents capable of self-organisation

The implications of systems evolving over time, and of their dynamic, interconnected nature, are that capacity and capability need to be built into the system to reflect, experiment and learn about intervening within the system over time. The strategic principle ‘engage and empower’ emphasises the critical role local system members play in identifying and solving local problems (although each person individually can only partially know or see the whole system), and the necessity of their willingness and motivation to adopt new ways of working.

This challenges current conventions of implementation activities being designed and conducted by people outside of the system, and draws attention to the unique insights provided by people within the local system (healthcare professionals, patients, managers) about how they self-organise and how they experience attempts to intervene. This builds on literature on co-production [65, 66, 85] and co-design [86] that emphasises the importance of engaging local stakeholders to solve problems that matter to them in their local setting and acknowledges the extensive work on understanding individual psychology of behaviour change and group dynamics [87, 88].

Value of SHIFT-Evidence to practitioners and academics

SHIFT-Evidence is the first empirically grounded framework for evidence translation in complex systems that can help make predictions and provide explanations about challenges and influences on success.

This research adds to the complexity science literature, initiated by Plsek and Greenhalgh [28], which describes healthcare as a complex system. Building on this perspective, it makes a unique contribution in considering the implications of complexity for deliberate attempts to intervene and introduce evidence-based practices [36, 37]. Our study focused on micro-level initiatives, but the findings resonate with existing literature on complexity in relation to macro-level initiatives (e.g. policy, systems design) [29, 34]. By illuminating the ‘sharp end’ of practice, SHIFT-Evidence can offer policymakers and system leaders insights into how ‘top down’ initiatives might be received in complex systems.

This study also contributes to the literature on evidence translation and implementation. It advances research by Craig et al. [89] on complex interventions and by McCormack et al. [90], amongst others, which recognised the importance of context in the uptake of evidence-based practices, and by May et al. [38], who expanded their theory of implementation to consider context as a complex adaptive system. SHIFT-Evidence builds on these views to consider interventions, implementation strategies and context as inseparable and interacting components of a complex system. This view is reinforced by complexity thinking, which resists the temptation to isolate or reduce a system to its component parts and instead takes interest in the interactions and patterns that emerge across the whole system.

For academics, SHIFT-Evidence provides an explanatory and predictive framework. The substantive theory explains the challenges encountered during evidence translation in complex systems and provides a rationale for strategies and actions to overcome them. The ‘simple rules’ provide testable hypotheses about the actions conducive to success that can be tested through future research. In demonstrating the magnitude of the challenge faced, SHIFT-Evidence makes clear the need for interdisciplinary enquiries to advance understanding and practice.

For patients, practitioners, managers, policymakers and academics involved with designing, conducting or evaluating healthcare improvement initiatives, SHIFT-Evidence provides a common framework to guide their work and ensure they are considering the breadth of the practical realities of evidence translation and improvement. The strategic principles (‘act scientifically and pragmatically’, ‘embrace complexity’ and ‘engage and empower’) were designed to be intuitive, accessible and memorable. A common framework that represents the complex and dynamic nature of improvement should help practitioners, academics and patients collaborate more effectively to increase the likelihood of success. If practitioners and patients can easily access practical knowledge, they may be more willing to contribute to the creation of new knowledge and to participate in the design, conduct and evaluation of future change experiments. If researchers understand how their work directly helps practitioners achieve improvements, and influence the lives of patients, they may be more likely to produce outputs which in turn increase practitioner receptivity and access to research settings.

For policymakers, funders and senior managers, SHIFT-Evidence emphasises the significant investment required at all stages of improvement efforts, including providing frontline practitioners with the time to step back from their day-to-day activities and the support needed to overcome barriers and obstacles to improvement. Such resource commitment is often seen as a luxury rather than as essential. Using this structured approach to support funding and prioritisation decisions may allow optimal investment of available resources, and disinvestment in initiatives that add little value.

Limitations and future research

The quality of a theory should be assessed by how useful it is in solving societal problems, recognising that “the published word is not the final one, but only a pause in the never-ending process of generating theory” [91]. Therefore, rather than being seen as a finalised theory or a perfect set of ‘simple rules’, SHIFT-Evidence should be assessed through its usefulness in practice (and research), and should act as a catalyst for further improvement and refinement of the theory as its predictions are tested.

The first limitation of this work is the transferability of the substantive (context-specific) theory to settings beyond NWL and a UK cultural context. While the research drew on a range of real-world improvement studies from different settings and on diverse clinical topics, all cases were from a single region (London, UK). Our wider author and team experiences suggest the cases represent wider national and global challenges (e.g. [92]). However, there is a need to explore the transferability of SHIFT-Evidence to other global settings and to continue to assess the comparative importance of the individual principles in different contexts.

The second limitation of this work is the transferability of findings to different intervention types and implementation and improvement approaches. All of the projects included in the empirical study were led by clinical leaders who voluntarily took on the role and defined the improvement area and evidence-based solutions; in many instances, this was done in collaboration with their teams and wider stakeholders. In addition, the use of a specific quality improvement approach was promoted and supported in all project teams, although actual use of the approach was variable [93]. Further work is required to explore the transferability of SHIFT-Evidence to a greater diversity of intervention types (including organisational, system or policy level change) and implementation and improvement approaches.

The third limitation is methodological. The auto-ethnographic role of the researchers provided benefits, including proximity to the subject matter, extensive contact with project teams and long-term relationships through which to explore how issues evolved over time. As all authors were senior members of the programme, there is a risk that their access to conversations, and their perceptions and interpretations of the findings, were affected by their status. The collaborative approach adopted between the authors and other members of the CLAHRC NWL team (including more junior staff) allowed access to feedback from other programme participants and to different types of conversations and ‘behind the scenes’ encounters. In addition, regular engagement with CLAHRC NWL project team members helped to triangulate findings and gain different perspectives. As such, the findings represent a culmination of discussion and sense-making between the researchers and participants over an extended period of time. Evidence of these shared reflections exists in the publications co-authored with project teams that demonstrate insights into the challenges and complexity experienced (e.g. [94,95,96]). Interpretation of the results was further triangulated with other experts in the field and through analysis of extensive literature to support reflexivity and to increase the reliability and validity of the findings. However, further research is required to explore how different methodological or theoretical perspectives produce convergent or divergent findings.

Further research is required to examine how to effectively operationalise the SHIFT-Evidence ‘simple rules’ in practice [97]. For example, knowing that ‘understanding problems and opportunities’ is important does not provide detailed guidance on how to engage relevant stakeholders to access local knowledge nor how to make sense of complex system interactions. Many approaches, tools and methods for operations research [98, 99], network analysis [36], implementation and quality improvement have already been developed and studied [100,101,102,103,104], and this knowledge should inform the generation of structured and practical approaches that enable the SHIFT-Evidence ‘simple rules’ to be enacted in practice. It will also be necessary to work with advances in complexity sciences to develop new approaches to the practice and research of intervening in complex systems.

Conclusion

SHIFT-Evidence is a unique framework with explanatory and predictive power grounded in the practical reality of evidence translation and improvement in healthcare. It advances thinking about how to intervene in complex systems, namely that, to achieve successful improvements from evidence translation in healthcare, it is necessary to ‘act scientifically and pragmatically’ whilst ‘embracing the complexity’ of the setting in which change takes place and ‘engaging and empowering’ those responsible for and affected by the change.

A series of 12 action-orientated ‘simple rules’ are proposed to guide patients, practitioners, managers, policymakers and academics to intervene in complex systems. We propose that efforts to translate evidence into practice should be reconceptualised from focusing on simple relationships between interventions and outcomes to understanding the complex and nuanced work required when ‘intervening to achieve an improvement’. This better reflects the iterative and negotiated process required to test multiple interventions whilst noticing and responding to learning that emerges from the system over an extended period of time.

Abbreviations

CAP: community-acquired pneumonia

CLAHRC: Collaboration for Leadership in Applied Health Research and Care

MM: medicines management

NIHR: National Institute for Health Research

NWL: Northwest London

SHIFT-Evidence: Successful Healthcare Improvement From Translating Evidence in complex systems

References

  1. 1.

    Vincent C, Aylin P, Franklin BD, Holmes A, Iskander S, Jacklin A, Moorthy K. Is health care getting safer? BMJ. 2008;337:a2426.

  2. 2.

    Sutcliffe KM, Paine L, Pronovost PJ. Re-examining high reliability: actively organising for safety. BMJ Qual Saf. 2017;26:248–251.

  3. 3.

    Makary MA, Daniel M. Medical error - the third leading cause of death in the US. BMJ. 2016;353:i2139.

  4. 4.

    Francis R. Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry. London: The Stationery Office; 2013.

  5. 5.

    Barry MJ, Edgman-Levitan S. Shared Decision Making — The Pinnacle of Patient-Centered Care. N Engl J Med. 2012;366(9):780–1.

  6. 6.

    The Health Foundation. Healthy lives for people in the UK. London: The Health Foundation; 2017.

  7. 7.

    Gray M, DaSilvia P, Laitner S. NHS Atlas of Variation in Healthcare. London: NHS; 2010.

  8. 8.

    Boerma T, Eozenou P, Evans D, Evans T, Kieny M-P, Wagstaff A. Monitoring progress towards universal health coverage at country and global levels. PLoS Med. 2014;11(9):e1001731.

  9. 9.

    Kruk ME. More health for the money—toward a more rigorous implementation science. Sci Transl Med. 2014;6(245):245ed217.

  10. 10.

    Appleby J, Galea A, Murray R. The NHS Productivity Challenge: Experience from the Front Line. London: The Kings Fund; 2014.

  11. 11.

    Aaron HJ. Budget crisis, entitlement crisis, health care financing problem—which is it? Health Aff. 2007;26(6):1622–33.

  12. 12.

    Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.

  13. 13.

    Cooksey SD. A Review of UK Health Research Funding. London: Department of Health; 2006.

  14. 14.

    McGlynn EA, Asch SM, Adams JL, Keesey J, Hicks J, DeCristofaro AH, Kerr EA. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348(26):2635–45.

  15. 15.

    Greenhalgh T, Howick J, Maskrey N. Evidence based medicine: a movement in crisis? BMJ. 2014;348:g3725.

  16. 16.

    Grimshaw JM, Thomas RE, MacLennan G, Fraser C, Ramsay CR, Vale L, Whitty P, Eccles MP, Matowe L, Shirran L, Wensing M, Dijkstra R, Donaldson C. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8(6):iii–v. 1–72

  17. Auerbach AD, Landefeld CS, Shojania KG. The tension between needing to improve care and knowing how to do it. N Engl J Med. 2007;357(6):608–13.

  18. Geng EH, Peiris D, Kruk ME. Implementation science: relevance in the real world without sacrificing rigor. PLoS Med. 2017;14(4):e1002288.

  19. Kitson AL. The need for systems change: reflections on knowledge translation and organizational change. J Adv Nurs. 2009;65(1):217–28.

  20. Greenhalgh T, Wieringa S. Is it time to drop the ‘knowledge translation’ metaphor? A critical literature review. J R Soc Med. 2011;104(12):501–9.

  21. Bate P, Robert G, Fulop N, Øvretveit J, Dixon-Woods M. Perspectives on Context. London: The Health Foundation; 2014.

  22. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.

  23. Denis J-L, Hébert Y, Langley A, Lozeau D, Trottier LH. Explaining diffusion patterns for complex health care innovations. Health Care Manag Rev. 2002;27(3):60–73.

  24. Pawson R, Tilley N. Realistic Evaluation. London: Sage Publications; 1997.

  25. Prigogine I. Order Out of Chaos: Man's New Dialogue with Nature. New York: Bantam Books; 1984.

  26. Holden LM. Complex adaptive systems: concept analysis. J Adv Nurs. 2005;52(6):651–7.

  27. Capra F. The Web of Life: A New Scientific Understanding of Living Systems. New York: Anchor Books; 1997.

  28. Plsek PE, Greenhalgh T. Complexity science: the challenge of complexity in health care. BMJ. 2001;323:625–8.

  29. De Savigny D, Adam T. Systems Thinking for Health Systems Strengthening. Geneva: World Health Organization; 2009.

  30. Greenhalgh T, Plsek P, Wilson T, Fraser S, Holt T. Response to ‘The appropriation of complexity theory in health care’. J Health Serv Res Policy. 2010;15(2):115–7.

  31. Paley J. The appropriation of complexity theory in health care. J Health Serv Res Policy. 2010;15:59–61.

  32. Holland JH. Studying complex adaptive systems. J Syst Sci Complex. 2006;19(1):1–8.

  33. McDaniel RR Jr, Lanham HJ, Anderson RA. Implications of complex adaptive systems theory for the design of research on health care organizations. Health Care Manag Rev. 2009;34(2):191.

  34. Sterman JD. Business Dynamics: Systems Thinking and Modeling for a Complex World. Boston: McGraw-Hill Education; 2000.

  35. Cilliers P, Spurrett D. Complexity and post-modernism: understanding complex systems. S Afr J Philos. 1999;18(2):258–74.

  36. Braithwaite J, Churruca K, Ellis LA, Long J, Clay-Williams R, Damen N, Herkes J, Pomare C, Ludlow K. Complexity Science in Healthcare – Aspirations, Approaches, Applications and Accomplishments: A White Paper. Sydney: Australian Institute of Health Innovation, Macquarie University; 2017.

  37. Thompson DS, Fazio X, Kustra E, Patrick L, Stanley D. Scoping review of complexity theory in health services research. BMC Health Serv Res. 2016;16(1):87.

  38. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11(1):141.

  39. Pfadenhauer LM, Mozygemba K, Gerhardus A, Hofmann B, Booth A, Lysdahl KB, Tummers M, Burns J, Rehfuess EA. Context and implementation: a concept analysis towards conceptual maturity. Z Evid Fortbild Qual Gesundhwes. 2015;109(2):103–14.

  40. Brainard J, Hunter PR. Do complexity-informed health interventions work? A scoping review. Implement Sci. 2015;11(1):127.

  41. Anderson L. Analytic autoethnography. J Contemp Ethnogr. 2006;35(4):373–95.

  42. Strauss A, Corbin J. Grounded theory methodology: an overview. In: Denzin N, Lincoln Y (eds.) Handbook of Qualitative Research. Thousand Oaks: Sage Publications; 1994. p. 273–85.

  43. Strübing J. Research as pragmatic problem-solving: the pragmatist roots of empirically-grounded theorizing. In: Bryant A, Charmaz K (eds.) The SAGE Handbook of Grounded Theory. London: Sage Publications; 2007. p. 580–602.

  44. NIHR. Collaborations for Leadership in Applied Health Research and Care (CLAHRCs). London: National Institute for Health Research; 2011.

  45. Currie G, Lockett A, Enany NE. From what we know to what we do: lessons learned from the translational CLAHRC initiative in England. J Health Serv Res Policy. 2013;18(3 Suppl):27–39.

  46. Evans S, Scarbrough H. Supporting knowledge translation through collaborative translational research initiatives: ‘bridging’ versus ‘blurring’ boundary-spanning approaches in the UK CLAHRC initiative. Soc Sci Med. 2014;106:119–27.

  47. Howe CA, Randall K, Chalkley SR, Bell D. Supporting improvement in a quality collaborative. Br J Health Care Manag. 2013;19(9):434–42.

  48. Reed JE, McNicholas C, Woodcock T, Issen L, Bell D. Designing quality improvement initiatives: the action effect method, a structured approach to identifying and articulating programme theory. BMJ Qual Saf. 2014;23(12):1040–8.

  49. Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf. 2014;23(4):290–8.

  50. Renedo A, Marston CA, Spyridonidis D, Barlow J. Patient and public involvement in healthcare quality improvement: how organizations can help patients and professionals to collaborate. Public Manag Rev. 2015;17(1):17–34.

  51. Langley GJ, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. 2nd ed. San Francisco: Jossey-Bass Publishers; 2009.

  52. Dewey J. The Essential Dewey, Volume I: Pragmatism, Education, Democracy. Bloomington: Indiana University Press; 1998.

  53. Greenhalgh T, Hawe P, Leykum L. Preface to Complexity Science in Healthcare – Aspirations, Approaches, Applications and Accomplishments: A White Paper. Sydney: Australian Institute of Health Innovation, Macquarie University; 2017.

  54. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q. 2011;89(2):167–205.

  55. Gabbay J, Le May A. Practice-based Evidence for Healthcare: Clinical Mindlines. Oxford: Routledge; 2010.

  56. Provost LP. Analytical studies: a framework for quality improvement design and analysis. BMJ Qual Saf. 2011;20(Suppl 1):i92–6.

  57. Kannampallil TG, Schauer GF, Cohen T, Patel VL. Considering complexity in healthcare systems. J Biomed Inform. 2011;44(6):943–7.

  58. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8(1):117.

  59. Perla RJ, Provost LP, Parry GJ. Seven propositions of the science of improvement: exploring foundations. Qual Manag Healthc. 2013;22(3):170–86.

  60. Dixon-Woods M, Martin G, Tarrant C, Bion J, Goeschel C, Pronovost P, Brewster L, Shaw L, Sutton L, Willars J. Safer Clinical Systems: Evaluation Findings. London: The Health Foundation; 2014.

  61. Hawe P, Shiell A, Riley T. Theorising interventions as events in systems. Am J Community Psychol. 2009;43(3–4):267–76.

  62. Lewis S. Toward a general theory of indifference to research-based evidence. J Health Serv Res Policy. 2007;12(3):166–72.

  63. Ferlie EB, Shortell SM. Improving the quality of health care in the United Kingdom and the United States: a framework for change. Milbank Q. 2001;79:281–315.

  64. Bastian H, Glasziou P, Chalmers I. Seventy-five trials and eleven systematic reviews a day: how will we ever keep up? PLoS Med. 2010;7(9):e1000326.

  65. Holmes BJ, Best A, Davies H, Hunter D, Kelly MP, Marshall M, Rycroft-Malone J. Mobilising knowledge in complex health systems: a call to action. Evid Policy. 2017;13(3):539–60.

  66. Ocloo J, Matthews R. From tokenism to empowerment: progressing patient and public involvement in healthcare improvement. BMJ Qual Saf. 2016;25:626–32. Available from: https://doi.org/10.1136/bmjqs-2015-004839.

  67. von Hippel E, Thomke S, Sonnack M. Creating breakthroughs at 3M. Harv Bus Rev. 1999;77(5):47–57.

  68. Ferlie E, Fitzgerald L, Wood M, Hawkins C. The nonspread of innovations: the mediating role of professionals. Acad Manag J. 2005;48(1):117–34.

  69. Brooks I, Brown RB. The role of ritualistic ceremonial in removing barriers between subcultures in the National Health Service. J Adv Nurs. 2002;38(4):341–52.

  70. Brown JS, Duguid P. Organizational learning and communities-of-practice: toward a unified view of working, learning, and innovation. Organ Sci. 1991;2(1):40–57.

  71. Weick KE, Sutcliffe KM, Obstfeld D. Organizing and the process of sensemaking. Organ Sci. 2005;16(4):409–21.

  72. Stetler CB, Legro MW, Rycroft-Malone J, Bowman C, Curran G, Guihan M, Hagedorn H, Pineros S, Wallace CM. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1:23.

  73. Senge P. The Fifth Discipline: The Art and Practice of the Learning Organisation. London: Random House; 2006.

  74. Deming WE. Out of the Crisis. Cambridge: The MIT Press; 2000.

  75. Dekker SW, Leveson NG. The systems approach to medicine: controversy and misconceptions. BMJ Qual Saf. 2015;24:7–9. Available from: https://doi.org/10.1136/bmjqs-2014-003106.

  76. Gabbay J, le May A, Connell C, Klein JH. Balancing the skills: the need for an improvement pyramid. BMJ Qual Saf. 2018;27(1):85–9.

  77. Kaplan HC, Provost LP, Froehle CM, Margolis PA. The Model for Understanding Success in Quality (MUSIQ): building a theory of context in healthcare quality improvement. BMJ Qual Saf. 2012;21:13–20.

  78. Taylor M, McNicholas C, Nicolay C, Darzi A, Bell DR, Reed JE. Systematic review of the application of Plan-Do-Study-Act method to improve quality in healthcare. BMJ Qual Saf. 2013;23:290–8.

  79. Langley GJ, Moen RD, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. San Francisco: Jossey-Bass Publishers; 1996.

  80. Argyris C. Double loop learning in organizations. Harv Bus Rev. 1977;55(5):115–25.

  81. Snowden D. Cynefin: a sense of time and space, the social ecology of knowledge management. In: Despres C, Chauvel D (eds.) Knowledge Horizons: The Present and the Promise of Knowledge Management. Boston: Butterworth-Heinemann; 2000.

  82. Jacka JM, Keller PJ. Business Process Mapping: Improving Customer Satisfaction. 2nd ed. Hoboken: Wiley; 2009.

  83. Braithwaite J, Wears RL, Hollnagel E. Resilient Health Care, Volume 3: Reconciling Work-as-Imagined and Work-as-Done. Boca Raton: CRC Press; 2016.

  84. Reason J. Human error: models and management. BMJ. 2000;320(7237):768–70.

  85. Best A, Holmes B. Systems thinking, knowledge and action: towards better models and methods. Evid Policy. 2010;6(2):145–59.

  86. Bate P, Robert G. Experience-based design: from redesigning the system around the patient to co-designing services with the patient. Qual Saf Health Care. 2006;15:307–10.

  87. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, on behalf of the “Psychological Theory” Group. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14(1):26–33.

  88. Davidoff F. Shame: the elephant in the room. Qual Saf Health Care. 2002;11(1):2–3.

  89. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337:a1655.

  90. McCormack B, Kitson A, Harvey G, Rycroft-Malone J, Titchen A, Seers K. Getting evidence into practice: the meaning of 'context'. J Adv Nurs. 2002;38(1):94–104.

  91. Glaser B, Strauss A. Applying grounded theory. In: The Discovery of Grounded Theory: Strategies for Qualitative Research. Hawthorne: Aldine Publishing Company; 1967. p. 237–51.

  92. Papoutsi C, Poots A, Clements J, Wyrko Z, Offord N, Reed JE. Improving patient safety for older people in acute admissions: implementation of the Frailsafe checklist in 12 hospitals across the UK. Age Ageing. 2018;47(2):311–7.

  93. Howe C, Bell D. Improving engagement in a quality collaborative. Br J Healthc Manag. 2014;20(11):528–35.

  94. Marvin V, Kuo S, Poots AJ, Woodcock T, Vaughan L, Bell D. Applying quality improvement methods to address gaps in medicines reconciliation at transfers of care from an acute UK hospital. BMJ Open. 2016;6(6):e010230.

  95. Lennox L, Green S, Howe C, Musgrave H, Bell D, Elkin S. Identifying the challenges and facilitators of implementing a COPD care bundle. BMJ Open Respir Res. 2014;1(1):e000035.

  96. Jones SE, Green SA, Clark AL, Dickson MJ, Nolan A-M, Moloney C, Kon SSC, Kamal F, Godden J, Howe C, et al. Pulmonary rehabilitation following hospitalisation for acute exacerbation of COPD: referrals, uptake and adherence. Thorax. 2014;69:181–2.

  97. Walshe K. Understanding what works – and why – in quality improvement: the need for theory-driven evaluation. Int J Qual Health Care. 2007;19(2):57–9.

  98. Crowe S, Brown K, Tregay J, Wray J, Knowles R, Ridout DA, Bull C, Utley M. Combining qualitative and quantitative operational research methods to inform quality improvement in pathways that span multiple settings. BMJ Qual Saf. 2017;26:641–52. Available from: https://doi.org/10.1136/bmjqs-2016-005636.

  99. Pitt M, Monks T, Crowe S, Vasilakis C. Systems modelling and simulation in health service design, delivery and decision making. BMJ Qual Saf. 2016;25(1):38–45.

  100. Reed JE, Card AJ. The problem with Plan-Do-Study-Act cycles. BMJ Qual Saf. 2016;25:147–52. Available from: https://doi.org/10.1136/bmjqs-2015-005076.

  101. Waring JJ, Bishop S. Lean healthcare: rhetoric, ritual and resistance. Soc Sci Med. 2010;71(7):1332–40.

  102. Schroeder RG, Linderman K, Liedtke C, Choo AS. Six Sigma: definition and underlying theory. J Oper Manag. 2008;26(4):536–54.

  103. Lennox L, Doyle C, Reed JE, Bell D. What makes a sustainability tool valuable, practical and useful in real-world healthcare practice? A mixed-methods study on the development of the Long Term Success Tool in Northwest London. BMJ Open. 2017;7(9):e014417.

  104. Doyle C, Howe C, Woodcock T, Myron R, Phekoo K, McNicholas C, Saffer J, Bell D. Making change last: applying the NHS Institute for Innovation and Improvement sustainability model to healthcare improvement. Implement Sci. 2013;8:127.

  105. Lim WS, Baudouin S, George R, Hill A, Jamieson C, Le Jeune I, Macfarlane J, Read R, Roberts H, Levy M. BTS guidelines for the management of community acquired pneumonia in adults: update 2009. Thorax. 2009;64(Suppl 3):iii1–iii55.

  106. Gao F, Melody T, Daniels DF, Giles S, Fox S. The impact of compliance with 6-hour and 24-hour sepsis bundles on hospital mortality in patients with severe sepsis: a prospective observational study. Crit Care. 2005;9(6):R764.

  107. Guerin K, Wagner J, Rains K, Bessesen M. Reduction in central line-associated bloodstream infections by implementation of a postinsertion care bundle. Am J Infect Control. 2010;38(6):430–3.

  108. Rello J, Lode H, Cornaglia G, Masterton R. A European care bundle for prevention of ventilator-associated pneumonia. Intensive Care Med. 2010;36(5):773–80.

  109. Elkin S. Improving the management of community acquired pneumonia (CAP) in a 3-site London teaching hospital. In: European Respiratory Society Annual Congress 2010, 18–22 September 2010, Barcelona, Spain. Sheffield: European Respiratory Society Publication Office; 2010.

  110. Melnyk PS, Shevchuk YM, Remillard AJ. Impact of the dial access drug information service on patient outcome. Ann Pharmacother. 2000;34(5):585–92.

  111. Bertsche T, Hämmerlein A, Schulz M. German national drug information service: user satisfaction and potential positive patient outcomes. Pharm World Sci. 2007;29(3):167–72.

  112. Barber S, Thakkar K, Marvin V, Franklin BD, Bell D. Evaluation of My Medication Passport: a patient-completed aide-memoire designed by patients, for patients, to help towards medicines optimisation. BMJ Open. 2014;4(8):e005608.

  113. Jubraj B, Blair M. Use of a medication passport in a disabled child seen across many care settings. BMJ Case Rep. 2015. Available from: https://doi.org/10.1136/bcr-2014-208033.

  114. Marvin V, Park C, Vaughan L, Valentine J. Phone calls to a hospital medicines information helpline: analysis of queries from members of the public and assessment for harm from their medicines. Int J Pharm Pract. 2011;19(2):115–22.

  115. Hopkinson NS, Englebretsen C, Cooley N, Kennie K, Lim M, Woodcock T, Laverty AA, Wilson S, Elkin SL, Caneja C, et al. Designing and implementing a COPD discharge care bundle. Thorax. 2012;67(1):90–2.

  116. Spyridonidis D, Hendy J, Barlow J. Leadership for knowledge translation: the case of CLAHRCs. Qual Health Res. 2015;25(11):1492–505. Available from: https://doi.org/10.1177/1049732315583268.

  117. Peck E. The performance of an NHS Trust Board: actors’ accounts, minutes and observation. Br J Manag. 1995;6(2):135–56.

  118. Parker LD. Boardroom operational and financial control: an insider view. Br J Manag. 2008;19(1):65–88.

  119. Pugliese A, Nicholson G, Bezemer P-J. An observational analysis of the impact of board dynamics and directors’ participation on perceived board effectiveness. Br J Manag. 2015;26(1):1–25.


Acknowledgements

The authors would like to thank Dr Catherine French, Stuart Green, Rachel Matthews, Laura Lennox, Ganesh Sathyamoorthy and Dr Tom Woodcock for their advice and support in developing the framework and reviewing the text, and Dr Vanessa Marvin and Dr Sarah Elkin, who led the Medicines Management and Community Acquired Pneumonia projects and reviewed and commented on the case studies used in this paper. This paper would not have been possible without the hard work of these colleagues and the whole CLAHRC NWL community. The authors would also like to thank Lizzie Raby for the SHIFT-Evidence graphic design.

Funding

This article is based on independent research commissioned by the National Institute for Health Research (NIHR) under the Collaborations for Leadership in Applied Health Research and Care (CLAHRC) programme for North West London. JR was also financially supported by an Improvement Science Fellowship from the Health Foundation. The views expressed in this publication are those of the authors and not necessarily those of the Health Foundation, the NHS, the NIHR or the Department of Health and Social Care.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Author information

JR, CD and DB conceived the research. JR, CH and CD conducted the auto-ethnographic observations, analysed the data and carried out the extensive literature review. JR drafted the first version of the paper. All authors made major contributions to writing the manuscript. All authors read and approved the final manuscript.

Correspondence to Julie E. Reed.

Ethics declarations

Ethics approval and consent to participate

All CLAHRC NWL projects discussed in this manuscript independently applied for ethics approval (e.g. [114]) or obtained ethics waivers (e.g. [115]).

The independent evaluation of CLAHRC NWL by Imperial College Business School was approved by Central London Research Ethics Committee (REC approval number 09/H0718/35) [116].

Information from these studies was reviewed using secondary analysis to inform the research presented in this paper.

In line with precedent for auto-ethnographic observations in organisations (e.g. [117,118,119]), further ethical approval was not obtained for the research presented in this paper. Field permissions were obtained from project teams and organisational leaders at CLAHRC NWL.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Project and programme publications: a list of CLAHRC NWL projects conducted between 2008 and 2013, and supporting publications from the CLAHRC NWL programme (DOCX 32 kb)

Additional file 2:

Research methods (DOCX 90 kb)

Additional file 3:

Example of coding and theory development (DOCX 14 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.


About this article


Keywords

  • Complex systems
  • Complexity theory
  • Complex adaptive systems
  • Framework
  • Evidence translation
  • Implementation
  • Quality improvement