Knowledge Translation

Our toolbox for Implementation Science


Get your manuscripts, blog posts, and content right


Press releases, blog posts, white papers


Translation across 11 languages, all toward common human understanding


Infographics, illustrations, tables, charts, journal art & more

Social Media

Twitter, Facebook, Reddit, Instagram, and more


State-of-the-art websites for science communication

Content Creation

Build your brand’s authority and legitimacy.

Integrated knowledge translation (IKT)

A big problem

The failure to put research findings into action is a major societal issue, contributing to an estimated $200B (USD) in wasted research funding.

RE-AIM framework

All of our tools are built around the idea that research should be disseminated for maximum ROI. This includes five parts: Reach, Effectiveness, Adoption, Implementation, and Maintenance.

Research Effectiveness

We tackle the issues around the dissemination of your research.

Effective Design

Our services are designed to tell a story. Our professional designers, illustrators, and web experts give you the expertise in each medium that you need.

Scientific Expertise

Our researchers, medical doctors, and staff are on hand to ensure the scientific validity of KT and design work.

True Impact

By leveraging the best modern digital tools, such as digital design, websites, and social media, your research can be seen and known around the world.


Your valuable grant dollars and research work need to be effective. ConductScience KT products help you reach your goals, both for promotion and for societal impact.


    Macleod MR, Michie S, Roberts I, Dirnagl U, Chalmers I, Ioannidis JPA, et al. Biomedical research: increasing value, reducing waste. Lancet. 2014;383:

    Implementation Science

    Implementation Science: Introduction

    The complexity and dynamics of today’s healthcare environments, combined with financial constraints and political imperatives, require healthcare systems to maximize their potential and provide high-quality care. Implementation science is therefore becoming a fundamental tool to bridge the gap between research and practice and to optimize healthcare value around the globe.

    Implementation science can be defined as the study of effective strategies and evidence-based programs (EBPs) to translate research knowledge into practice. To be more precise, implementation science is defined as “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based programs into routine practice, and, hence, to improve the quality and effectiveness of health services” (Bauer et al., 2015). This definition implies that implementation science addresses not only the empirical outcomes of clinical research but the optimization of services at different healthcare levels (e.g., front-line clinicians, administrators, economists, consumers).

    Implementation Science: A Rising Concept among Overlapping Terminology

    With the increasing complexity of healthcare systems worldwide, scientific terminology is also becoming more complicated. For example, terms such as implementation science, quality improvement, and knowledge translation are often used interchangeably. Nevertheless, some subtle differences should not be ignored. While implementation science aims to address the underutilization of an evidence-based program and the resultant know-do gaps in practice, quality improvement starts with a specific problem in a given system and continues with targeted strategies to address that problem. Dissemination, on the other hand, refers to the spread of information about a given intervention across practice settings.

    Furthermore, implementation science aims to promote the effective adoption of evidence-based programs, which is a robust ongoing process. Implementation science research encompasses more than the simple adoption of innovations; it utilizes hybrid designs and systematic approaches to identify risks and success factors to improve both reimbursements and health outcomes, as well as to sustain quality care. Note that there is a difference between the terms adoption, adaptation, scale-up, spread, and sustainability (Ilott et al., 2013). While adoption is often described by Rogers’ Diffusion of Innovations theory as a five-stage decision-making process, adaptation requires innovations to be tailored to the local context. Scale-up, in turn, is defined as a top-down diffusion to improve local, national, and global systems, whereas spread is a horizontal diffusion of knowledge. Sustainability, as contrasted with decay, is another crucial factor, as most systems fail to sustain innovations. In the UK, for instance, 70% of all organizational change strategies fail to survive.

    Theoretical Foundations behind Implementation Science Research

    The interest in why some innovations fail and others make history has led to the development of several theories and publications, which keep influencing implementation science research (Bodenheimer, 2007):

    • One of the most cited theories is Everett Rogers’ Diffusion of Innovations theory, which describes a health campaign in Peru to prevent water-borne infections and infant mortality. Rogers claims that four elements are needed for an innovation to take off: 1) an innovation should be better than the status quo but simple to understand; 2) an effective communication channel should be established; 3) time is required, and this period is often described by an S-shaped curve, in which people can be innovators, early adopters, the early majority, the late majority, or laggards; 4) the structure of the social system should be considered, as it impacts the spread of innovations. The US Institute for Healthcare Improvement applies Rogers’ ideas in practice and proposes that spread is a leadership responsibility. Note that the spread of knowledge should take place only if an innovation proves to be beneficial in practice.
    • Another important concept in implementation science is the tipping point described by Malcolm Gladwell. Gladwell emphasizes that a sticky message is required to tackle the concerns of health professionals who are not innovators or early adopters. An example of a sticky message to motivate health professionals is, “This change will help get you home half an hour earlier.”
    • Paul Plsek’s ideas are also beneficial to implementation science. He bases his work on Rogers’ and Gladwell’s research, claiming that once 10%-20% of the target population has adopted an innovation, the tipping point of success has been reached. Interestingly, Plsek places people in the following categories: pre-contemplation, contemplation, action, and maintenance, each requiring a different message to implement knowledge into care.
    • Sarah Fraser, on the other hand, insists that implementation consists of continuous small changes and should focus on reducing costs for the organization spreading the innovation. Fraser also states that innovators should not look down on the majority of people, as this majority actually holds the health system together (e.g., people who work with patients on a daily basis).

    Implementation Science: Factors to Consider and Types of Evaluation

    Implementation science seeks to enhance the use of evidence-based strategies that have been shown to be effective and to improve evidence-based decision-making systems, with a particular focus on low-income countries. To accomplish these goals, implementation science research must account for a wide range of factors and barriers, such as the characteristics of a given intervention, the local context, and stakeholders’ beliefs. For example, researchers must establish not only a program’s efficacy, or what can work in ideal conditions, but also its effectiveness, or what works in routine practice. While models and frameworks play a crucial role in implementation science research, experts should focus on testing and refining existing models rather than developing new and disconnected theories, in order to improve routine care and administration. Thus, tailoring the spread of knowledge to the local context is essential, with patient involvement, the economics of implementation science, and personalized care being major areas of interest. Another factor to consider is the actual implementation process via interventions or strategies. Note that an implementation intervention is defined as “a single method or technique to facilitate change,” while an implementation strategy is discussed in terms of “an integrated set, bundle, or package of discrete implementation interventions ideally selected to address specific identified barriers to implementation success.”

    The different types of evaluation employed in implementation science research are also crucial (Bauer et al., 2015). Here we should note that, in contrast to clinical research and randomized controlled trials, implementation science research focuses not only on the effects of a given evidence-based practice but also on the quality of its use (e.g., increasing use of beta-blockers in myocardial infarction, the proportion of clinicians providing therapy to bipolar patients, etc.). To assess the use of evidence-based practices in routine care, researchers can employ three types of evaluation, based on both quantitative (e.g., administrative data, surveys) and qualitative (e.g., semi-structured interviews, focus groups) measures:

    • Process evaluation: This type of evaluation describes the characteristics of the use of a program without providing feedback and without changing the process. For example, process evaluation can happen during an observational study.
    • Formative evaluation: This evaluation is similar to process evaluation, but feedback is provided during the study to improve the actual implementation process. Note that formative evaluation is included a priori in the study hypothesis.
    • Summative evaluation: This type of evaluation tackles the impact of the given implementation strategy, as well as its economic outcomes.

    Implementation Science: The Key to Success

    Implementation science research is not just another scientific field without any practical applications. Success stories across the globe prove that via the effective implementation of evidence-based programs, organizations can provide high-quality care and improve health outcomes. To provide an example, the US Veteran Health Administration system has transformed from a deteriorated system to one of the most prominent nationwide systems, serving more than five million patients.

    Interestingly, the Veteran Health Administration reduced hospitalization rates after increasing vaccination rates. One of the most significant improvements was the adoption of advanced access, as appointment delays often result in poor outcomes, especially for patients with chronic conditions. In this case, the idea of advanced access was adopted by central leaders and mandated top-down; yet individual sites had the freedom to adapt the idea to their needs and make it happen (Bodenheimer, 2007).

    Implementation Science: Conclusion

    Implementation science is one of the main tools to bridge the gap between research and effective routine care by studying the spread of innovations and evidence-based programs. Implementation science researchers must identify factors for success and barriers to support both the adoption and sustainability of programs. Researchers should focus on assessing existing models rather than creating disconnected theories that suit their favored approach.

    Most of all, implementation science researchers should acknowledge that people at all levels must be ready to change. While champions are the engines who initiate a change, leadership is responsible for creating an environment ready to accept the change, and front-line providers keep the organization moving – all with the sole purpose of improving global health.


    1. Bauer, M., Damschroder, L., Hagedorn, H., Smith, J., & Kilbourne, A. (2015). An introduction to implementation science for the non-specialist. BMC Psychology, 3 (1).
    2. Bodenheimer, T. (2007). The Science of Spread: How Innovations in Care Become the Norm.
    3. Ilott, I., Gerrish, K., Pownall, S., Eltringham, S., & Booth, A. (2013). Exploring scale-up, spread, and sustainability: an instrumental case study tracing an innovation to enhance dysphagia care. Implementation Science, 8.