
Increasing the use of research in population health policies and programs: a rapid review

September 2018, Volume 28, Issue 3

Danielle M Campbell, Gabriel Moore

Published 27 September 2018. https://doi.org/10.17061/phrp2831816
Citation: Campbell DM, Moore G. Increasing the use of research in population health policies and programs: a rapid review. Public Health Res Pract. 2018;28(3):e2831816.

About the author/s

Danielle M Campbell | Centre for Epidemiology and Evidence, NSW Ministry of Health, Sydney, Australia

Gabriel Moore | Sax Institute, Sydney, NSW, Australia

Corresponding author

Danielle M Campbell | [email protected]

Competing interests

None declared.

Author contributions

DC drafted the manuscript. DC and GM were jointly responsible for the study design, analysis of data and editing of the manuscript.

Abstract

Background and objectives: Although the body of literature on factors that impede and enhance the use of research in policy making continues to expand, there is limited evidence about strategies that are effective at fostering the use of research in population health policy and programs. Building on previous reviews, we reviewed the published literature to identify and assess papers describing intervention studies that had outcome measures relating to research use.

Study type: Rapid review.

Methods: We searched four academic databases and Google Scholar to identify papers published between 2009 and 2015. Our focus was on strategies relevant to population health policy and program delivery. For studies that tested strategies to increase the use of research, we extracted details about the intervention, participants, study sites and methods, and primary and other outcomes.

Results: We identified 14 articles reporting on 13 intervention studies. The studies were relatively weak methodologically and together provide few indications of effect. Only one study used an experimental design and one other used pre-/post-test design; the remaining studies were characterised by an absence of control groups, small sample sizes, and self-report data. Of the 13 studies: four intervention studies were related to the theme ‘relevant, useful, accessible research’; five studies (described in six papers) tested strategies that facilitated interaction between researchers and research users; three studies assessed strategies aimed at enhancing the capacity of organisations to use research; and one intervention study was related to the theme ‘funding research infrastructure and research projects’.

Conclusion: The level of evidence for the effectiveness of strategies to improve the use of research in policy making is low, and there remains a need for well-designed empirical studies that evaluate interventions. In the absence of strong evidence, efforts to enhance research use should be tailored to organisational needs and may incorporate capability development, improved access to targeted research summaries and syntheses, and greater interaction and collaboration with researchers.

Full text

Key points

  • There is a need for rigorous research evaluating the impact of strategies aimed at increasing the use of research in policy making
  • Most evaluation studies in this field are relatively weak methodologically
  • In the absence of strong evidence, it is suggested that efforts to improve research use should be tailored to the needs and priorities of organisations and individuals, and may include building capability to use research, facilitating access to targeted research summaries, participatory and collaborative processes, and enabling active involvement of policy makers in research

Introduction

It is increasingly expected that population health policies and programs are informed by the best available evidence from research. An evidence-informed approach to population health has the potential to increase the likelihood that successful programs and policies are implemented, and to ensure more efficient use of both public and private resources.1

In 2009, the NSW Ministry of Health commissioned a rapid review to identify strategies that could be used to foster the use of research in population health policy and programs. The review included papers published between 1999 and 2009, and identified five intervention studies that evaluated the impact of strategies to improve research use.2 These five studies pointed to the potential value of: tailored, targeted messages alerting recipients to relevant research and enabling access to the research; using knowledge brokers to build research capability in organisations with low research receptivity; and individual-level training in critical appraisal and research literacy. However, the review provided only limited evidence of effectiveness due to the small number of studies and their methodological limitations.

The number of empirical studies of factors that impede or enhance the use of evidence in policy decision making is rapidly expanding.3 In light of the growth of the evidence base, we conducted a rapid review of the literature published between 2009 and 2015 to identify new evidence about strategies that are effective for increasing the use of research in population health policies and programs.

Methods

We searched Medline, CINAHL, Informit Online, PubMed and Google Scholar for papers published in English between October 2009 and July 2015. The initial search consisted of combinations of several terms including: ‘health’, ‘policy’, ‘evidence based policy’, ‘government’, ‘research utilisation’, ‘knowledge mobilisation’, ‘knowledge translation’, ‘knowledge exchange’, ‘rapid review’, ‘commissioned review’, ‘knowledge management’, ‘knowledge use’, ‘knowledge brokering’, ‘organisational readiness’ and ‘research capacity building’. As we found few articles relating to strategies for generating new research, we expanded our search to include the terms: ‘research funding’, ‘funded research’, ‘commissioned research’, ‘collaboration’, ‘partnership’, ‘coproduction’, ‘government research’, ‘research institute’ and ‘research centre’. For the Google Scholar search we reviewed up to 300 titles per search term, stopping when no new relevant papers were retrieved. The detailed searches are documented in: www.health.nsw.gov.au/research/Documents/increasing-the-use-of-research.pdf.4

Published, peer-reviewed articles were included if they focused on strategies aiming to increase research use (including both use of existing research and generation of new research) and factors associated with these strategies that may influence research use. Our focus was specifically on strategies that would be relevant to population health policy and program delivery by agencies like NSW Health. For the purposes of this review, population health policy and program delivery included the development, implementation and evaluation of policies and programs at national or state/regional level, and excluded implementation of initiatives at community and local level. We included: articles describing strategies implemented by research funding agencies and organisations implementing or supporting knowledge translation strategies; articles describing strategies used by researchers, academics or universities if they included a focus on policy makers and program managers; and articles about the use of evidence from evaluation including economic evaluation.

We excluded: conference abstracts, editorials, book chapters, grey literature, and publications about low- and middle-income countries; articles focusing on health technology assessment, basic science, genomics and pandemics unless they focused on health protection; articles relating to sectors other than health unless they described multisectoral initiatives; articles about community-level interventions, those pertaining to local governments or that are primarily the responsibility of other entities such as universities; articles about health service delivery, clinical guidelines, clinical practice or clinical conditions, unless they focused on policy or population-level strategies; articles about community–academic partnerships that did not include policy makers or program managers; and articles on organisational systems and processes, or training and professional development for health policy makers or practitioners, unless there was a focus on strategies to support research use.

Of the 5934 citations identified through the database searches, 1545 duplicates were removed (Figure 1). Both authors separately screened all remaining papers by title and abstract and excluded 4067 further papers. Following independent assessment by both authors of the full text of the 322 remaining papers, another 18 papers that did not meet the inclusion criteria were excluded.

Figure 1.     Summary of article search and assessment process

Of the 304 included papers, 187 described primary research. The remainder were literature reviews (n = 38), professional commentaries (n = 69) and protocols (n = 10). We categorised the 187 primary research papers as either descriptive studies, which included surveys, interviews and document analysis (n = 109) and descriptions of strategies, activities and programs (n = 64); or studies testing strategies to increase research use (n = 14). Given the small number of evaluations conducted in this field, we applied a broad definition to ‘studies testing strategies’. We included all studies in which there was an attempt to assess the impact of an intervention related to increasing research use, including those in which there was no control group or pre-intervention measurement. The 14 articles we identified described 13 intervention studies. For these articles, we extracted data on the intervention, study design, sample selection and size, methods, primary and other outcomes, and timing of outcome measurements (see Supplementary Table 1, available from: figshare.com/articles/Supplementary_table_1_-_quality_assessment_docx/6977789). Study quality was assessed using the Mixed Methods Appraisal Tool (MMAT) Version 2011, which is designed for use in systematic literature reviews that include qualitative, quantitative and mixed methods studies.5 The MMAT appraises the methodological quality of studies according to criteria such as the relevance of data sources and analytical approaches (qualitative studies), the appropriateness of measurements and acceptability of response rates (quantitative studies), and the relevance of research designs and integration of data (mixed methods studies). MMAT scores range from * (one quality criterion met) to **** (all criteria met).

The 187 primary research papers were classified according to the study’s main theme (see Table 1). Three of the themes (relevant, useful, accessible research; interaction, partnerships and research coproduction; increasing organisational capacity to use research) were drawn from the 2009 review2, and two additional themes (funding research infrastructure and research projects; research priority setting) emerged through appraisal of the papers.


Table 1.     Primary research papers, classified by main theme

Theme | All primary research, n (%) | Studies testing strategies, n (%)
Relevant, useful, accessible research | 62 (33.2) | 4 (28.6)
Interaction, partnerships and research coproduction | 49 (26.2) | 6 (42.9)
Increasing organisational capacity to use research | 45 (24.1) | 3 (21.4)
Funding research infrastructure and research projects | 28 (15.0) | 1 (7.1)
Research priority setting | 3 (1.6) | 0 (0.0)
Total | 187 (100) | 14 (100)

Results

The focus of this paper is on the 14 papers reporting findings from 13 studies that tested strategies to increase research use in policy making. Although most of the studies identified a range of outcome measures, findings are presented only for outcomes relating to research use (e.g. impacts of research use) and factors that influence research use (e.g. commitment to using research, likelihood of using research, capacity to use research).

Study design and quality assessment

Of the 14 papers reporting findings from studies that tested a strategy, only one6 used an experimental design. One study used a pre-/post-test design with mixed methods7, and two used a case study approach with qualitative methods.8,9 The remaining 10 studies used a post-test design with mixed methods (n = 6)10-15, qualitative methods (n = 3)16-18 or quantitative methods (n = 1).19 Most studies fulfilled either three (n = 9) or all four (n = 4) of the quality criteria specified by the MMAT for each study design.

Study findings

Relevant, useful, accessible research

The four papers that fitted the theme of relevant, useful, accessible research are described below.

Brennan et al.16 conducted individual and group interviews with 43 policy makers to assess the Policy Liaison Initiative, a strategy aimed at supporting the use of Cochrane systematic reviews in health policy making. Policy makers indicated that research summaries were essential for increasing access to research, and preferred layered (‘graded-entry’) formats ranging from short summaries to detailed reports. The absence of a mechanism to capture policy makers’ changing research needs, and thus to ensure timely and targeted provision of relevant syntheses, was an impediment to research use.

Campbell et al.10 conducted structured interviews with eight policy makers to evaluate the process and outcomes associated with Evidence Check, an Australian program managed by the Sax Institute and designed to support policy makers to commission rapid reviews of research. Reviews commissioned through the program were mostly perceived as useful for policy decision making. While all eight commissioned reviews had indirect impacts (e.g. informing deliberations, identifying evidence gaps), only two direct policy impacts (i.e. on policy decisions) were noted.

Brownson et al.6 used a survey to explore the impact of four types of policy briefs on the likelihood that policy makers (staffers, legislators, executives) would use and share the briefs. The briefs focused on mammography screening, were either data- or story-focused, and used either state- or local-level data. Based on 291 survey responses, the likelihood of using the brief in a policy setting differed across policy groups, with staffers most likely to report a preference for using the story-focused/state-level brief and legislators most likely to prefer the data-focused/state-level brief.

In their study in the Netherlands, van der Heide et al.14 interviewed and surveyed ‘knowledge workers’ to evaluate the implementation of a Writing on Effectiveness tool across a government public health institute. The web-based tool aimed to improve communication about the effectiveness of interventions to facilitate the use of evidence in policy and practice. Knowledge workers felt the tool could lead to improvements in research products but perceived it as more useful for scientific than policy outputs. Time investment was a practical barrier to use of the tool.

Interaction, partnerships and research coproduction

The review included six papers describing interventions with the theme of interaction, partnerships and research coproduction, as detailed below.

Dwan et al.19 surveyed policy makers (n = 979) who participated in Australian Primary Health Care Research Institute Conversations, through which researchers presented their findings directly to policy makers in traditional seminars or more interactive roundtables. The policy maker participants felt the presentations stimulated their thinking and broadened their knowledge; most agreed they could use the knowledge presented and that the content was directly applicable to their work. Interactive roundtables were perceived as being more relevant than didactic seminars; facilitated engagement was associated with increased perceived relevance and effectiveness of presentations.

Househ et al.8 used data from observation notes, interviews and meeting transcripts to examine the impacts of conferencing technologies to support knowledge exchange in three drug policy groups. Although the groups adapted to using each type of technology to facilitate communication, the technology did affect how evidence was shared within groups (e.g. web conferencing limited the amount of information that could be shared). The importance of skilled facilitators was emphasised, particularly for mediums that do not convey nonverbal cues (such as web conferencing).

Findings from an evaluation of the Service Delivery and Organisation Management Fellowships program in the UK are described by Bullock et al.17 and Morris et al.9 The program supported managers to participate in funded research projects to improve research quality and relevance, develop research capacity, and encourage linkage and exchange between the research and practice communities. Interviews with participants at 10 case study sites indicated that involvement of management fellows improved the relevance, validity and credibility of research, and fellows also improved their individual capacity to use research through better understanding of research processes.17 The ‘fit’ between the management fellow and the research project and team was an important success factor. Data from eight sites suggested that management fellows benefited from linkage with researchers through exposure to knowledge to inform their practice. Some fellows reported taking insights from projects to workplace colleagues.9

Wathen et al.15 used postforum surveys and follow-up interviews to assess the knowledge translation and exchange (KTE) processes undertaken during a series of studies on screening women for exposure to intimate partner violence. KTE activities included collaboration with research users to develop key messages, stakeholder workshops and exchange forums, and an online community of interest. At 3–6 months after the event, most workshop (88%) and forum (79%) participants reported sharing research findings with people in their organisations, but fewer workshop (40%) and forum (37%) participants reported having used research knowledge from the event. Symbolic use of the research (e.g. to reinforce current policies or programs) and/or conceptual use (e.g. to increase understanding of issues) were more common than instrumental use (e.g. to update clinical protocols or guidelines).

Kothari et al.13 evaluated PreVAiL (Preventing Violence Across the Lifespan), an international, interdisciplinary research network that provides seed grants and supports attendance of network members (researchers and policy partners) at exchange meetings with the aim of involving policy makers in knowledge generation, dissemination and usage. A survey of 37 network members and interviews with 19 members indicated that policy partners valued the connections to researchers afforded by the network. Most policy makers used findings generated by the network conceptually (e.g. to augment their understandings), and some cited instrumental use (e.g. to shape organisational directions).

Increasing organisational capacity to use research

Three studies fitted the theme of increasing organisational capacity to use research.

A Canadian study by Dilworth et al.11 surveyed 72 ‘organisational champions’ and conducted an online focus group with 11 steering committee members to explore the impacts of becoming a Best Practice Spotlight Organisation Candidate. Candidacy involved implementing and evaluating best practice guidelines in health promotion programs, supported by a team of champions. A large majority of champions agreed that, as a result of guideline implementation, evidence was used to inform practice (86%) and evidence-informed practice was part of the organisation’s culture (80%).

Jansen and Hoeijmakers12 evaluated a research masterclass for public health professionals and policy makers involving a series of structured sessions that supported a practice-based research project. Findings from a survey and focus groups with 16 participants indicated that most expected to disseminate and implement the results of the research after completion of the masterclass. However, at 6-month follow-up, implementation of practice change was minimal. Support from participants’ managers was viewed as an important success factor.

Yost et al.7 evaluated the impact of an intensive 5-day Canadian educational workshop on evidence-informed decision making (EIDM) knowledge, skills and behaviours. Forty participants completed preworkshop, postworkshop and 6-month follow-up measures, and eight completed telephone interviews after the 6-month data collection. There was a significant increase in EIDM knowledge and skills from pre- to postworkshop, and from preworkshop to 6 months, but a decrease from postworkshop to 6 months. An increase in EIDM behaviours from preworkshop to 6 months was not significant.

Funding research infrastructure and research projects

A study in the Netherlands used in-depth interviews and focus groups to evaluate the impact of a grant-funded, institutional-level, collaborative centre involving regional public health services, municipal departments and university departments. Hoeijmakers et al.18 reported that although the centre provided a platform for interaction, collaboration within specific research projects did not evolve into broader collaboration. While immediately applicable research results remained scarce, there were several examples of the uptake of findings from the studies of doctoral students to adjust working methods and procedures.

Discussion

This rapid review identified 14 papers describing 13 studies that tested strategies with outcome measures relating to research use or factors that influence research use. Consistent with the 2009 review2, this review suggests that efforts to build individual research capability may help in supporting practitioners to understand research, particularly in the short term. This review provides additional preliminary support for the potential value of: a system for commissioning rapid reviews; tailored approaches to presenting written research findings to policy audiences; the involvement of policy makers in research teams and networks; interactive seminars and conferencing technology for communicating about evidence; organisation-wide capability development initiatives; and funded institutional-level collaborations.

This review highlights strong interest in fostering interaction and partnerships between researchers and policy makers, with almost half of the studies assessing strategies relevant to this theme. This focus is consistent with findings from systematic reviews that identify collaboration between researchers and policy makers as an important facilitator of research uptake.20,21 At the most intensive end of the research partnership continuum, coproduction involves the active collaboration of research producers and users in all parts of the research process, from shaping the research questions and methodology, to data collection, dissemination, interpretation and implementation of results.22 There is currently limited evidence supporting this type of integrated approach to knowledge translation as a mechanism for increasing research use.23 However, evaluations of the UK National Institute for Health Research-funded Collaborations for Leadership in Applied Health Research and Care (CLAHRCs) suggest that programmatic partnerships between universities and health services may increase research use in a clinical healthcare context.24

It is of interest that, while the volume of literature has continued to expand, there are still relatively few intervention studies. In this review, 5% (n = 14) of 304 articles evaluated interventions; this is the same proportion reported by Moore and colleagues25, who identified five intervention studies from 106 papers published between 1999 and 2009. Much of the primary research in this field is descriptive and focuses on the perspectives of practitioners regarding barriers to or facilitators of research use, drawn from surveys or interviews.26 The challenges associated with conducting intervention research in this field should be acknowledged. For example, there is a need for greater conceptual clarity around the meaning of research use in a policy context, and metrics to measure it.3 Other difficulties relate to issues such as the potentially long time lag between research being conducted and its impact on policy, and the developmental nature of research use over time.27

Strengths and limitations

The review process was comprehensive and followed rigorous criteria for identifying and assessing relevant papers. This review represents an update of a previous rapid review2, and so provides a contemporary and comparable overview of the state of the evidence. However, the studies included in this review are all relatively weak methodologically and so together provide only limited indications of effect. Only one of the 14 included papers used an experimental design and one other used a pre-/post-test design; the remaining studies were characterised by an absence of control groups, small sample sizes, and self-report data. There is a need for well-conducted, rigorous research that is capable of evaluating interventions delivered in often complex organisational settings, assessing the context in which the program is delivered, and measuring research use with validated tools.28

Conclusion

Despite a rapid expansion in research literature about strategies to increase research use, there is very little evidence of the effectiveness of different strategies; there remains an urgent need for well-designed empirical studies to evaluate interventions, along with practicable metrics to assess research use. In the absence of strong evidence, the findings presented here suggest that efforts to enhance research use should be tailored to the evolving needs and priorities of organisations and individuals. Such efforts may include building capability to use research, facilitating access to targeted research summaries and syntheses, participatory and collaborative processes such as interactive forums and networks, and enabling active involvement of policy makers in research.

Acknowledgements

The Sax Institute receives core funding from the NSW Ministry of Health. The NSW Ministry of Health contributed specific funding to the Sax Institute for the coconduct of the rapid review on which this paper is based.

Peer review and provenance

Externally peer reviewed, commissioned.

Copyright

© 2018 Campbell and Moore. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Licence, which allows others to redistribute, adapt and share this work non-commercially provided they attribute the work and any adapted version of it is distributed under the same Creative Commons licence terms.

References

  • 1. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175–201. CrossRef | PubMed
  • 2. Moore G, Todd A, Redman S. Strategies to increase the use of evidence from research in population health policy and programs. Sydney: Sax Institute; 2009 [cited 2018 Aug 8]. Available from: www.health.nsw.gov.au/research/Documents/10-strategies-to-increase-research-use.pdf
  • 3. Oliver K, Lorenc T, Innvær S. New directions in evidence-based policy research: a critical analysis of the literature. Health Res Policy Syst. 2014;12:34. CrossRef | PubMed
  • 4. Moore D, Campbell D. Evidence Check: Increasing the use of research in policymaking. Sydney: NSW Ministry of Health and the Sax Institute; 2017 [cited 2018 Aug 20]. Available from: https://www.health.nsw.gov.au/research/Documents/increasing-the-use-of-research.pdf
  • 5. Pluye P, Robert E, Cargo M, Bartlett G, O’Cathain A, Griffiths F, et al. Proposal: a mixed methods appraisal tool for systematic mixed studies reviews. Canada: McGill University; 2011 [cited 2018 Jun 28]. Available from: mixedmethodsappraisaltoolpublic.pbworks.com/w/file/fetch/84371689/MMAT%202011%20criteria%20and%20tutorial%202011-06-29updated2014.08.21.pdf
  • 6. Brownson RC, Dodson EA, Stamatakis KA, Casey CM, Elliott MB, Luke DA, et al. Communicating evidence-based information on cancer prevention to state-level policy makers. J Natl Cancer Inst. 2011;103(4):306–16. CrossRef | PubMed
  • 7. Yost J, Ciliska D, Dobbins M. Evaluating the impact of an intensive education workshop on evidence-informed decision making knowledge, skills, and behaviours: a mixed methods study. BMC Med Educ. 2014;14:13. CrossRef | PubMed
  • 8. Househ MS, Kushniruk A, Maclure M, Carleton B, Cloutier-Fisher D. The use of conferencing technologies to support drug policy group knowledge exchange processes: an action case approach. Int J Med Inform. 2011;80(4):251–61. CrossRef | PubMed
  • 9. Morris ZS, Bullock A, Atwell C. Developing engagement, linkage and exchange between health services managers and researchers: experience from the UK. J Health Serv Res Policy. 2013;18(Suppl 1):23–9. CrossRef | PubMed
  • 10. Campbell DM, Donald B, Moore G, Frew D. Evidence Check: knowledge brokering to commission research reviews for policy. Evid Policy. 2011;7(1):97–107. CrossRef
  • 11. Dilworth K, Tao M, Shapiro S, Timmings C. Making health promotion evidenced-informed: an organizational priority. Health Promot Pract. 2013;14(1):139–45. CrossRef | PubMed
  • 12. Jansen MWJ, Hoeijmakers M. A masterclass to teach public health professionals to conduct practice-based research to promote evidence-based practice: a case study from the Netherlands. J Public Health Manag Pract. 2013;19(1):83–92. CrossRef | PubMed
  • 13. Kothari A, Sibbald SL, Wathen CN. Evaluation of partnerships in a transnational family violence prevention network using an integrated knowledge translation and exchange model: a mixed methods study. Health Res Policy Syst. 2014;12:25. CrossRef | PubMed
  • 14. van der Heide I, van der Noordt M, Proper KI, Schoemaker C, van den Berg M, Hamberg-van Reenen HH. Implementation of a tool to enhance evidence-informed decision making in public health: identifying barriers and facilitating factors. Evid Policy. 2016;12(2):183–97. CrossRef
  • 15. Wathen CN, Sibbald SL, Jack SM, MacMillan HL. Talk, trust and time: a longitudinal study evaluating knowledge translation and exchange processes for research on violence against women. Implement Sci. 2011;6:102. CrossRef | PubMed
  • 16. Brennan SE, Cumpston M, Misso ML, McDonald S, Murphy MJ, Green SE. Design and formative evaluation of the Policy Liaison Initiative: a long-term knowledge translation strategy to encourage and support the use of Cochrane systematic reviews for informing health policy. Evid Policy. 2016;12(1):25–52. CrossRef
  • 17. Bullock A, Morris ZS, Atwell C. Collaboration between health services managers and researchers: making a difference? J Health Serv Res Policy. 2012;17(Suppl 2):2–10. CrossRef | PubMed
  • 18. Hoeijmakers M, Harting J, Jansen M. Academic Collaborative Centre Limburg: a platform for knowledge transfer and exchange in public health policy, research and practice? Health Policy. 2013;111(2):175–83. CrossRef | PubMed
  • 19. Dwan KM, McInnes P, Mazumdar S. Measuring the success of facilitated engagement between knowledge producers and users: a validated scale. Evid Policy. 2015;11(2):239–52. CrossRef
  • 20. Innvær S, Vist G, Trommald M, Oxman A. Health policy-makers’ perceptions of their use of evidence: a systematic review. J Health Serv Res Policy. 2002;7:239–44. CrossRef | PubMed
  • 21. Oliver K, Innvær S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res. 2014;14:2. CrossRef | PubMed
  • 22. Graham ID, Tetroe J. How to translate health research knowledge into effective healthcare action. Healthc Q. 2007;10(3):20–2. CrossRef | PubMed
  • 23. Graham ID, Kothari A, McCutcheon C, on behalf of the Integrated Knowledge Translation Research Network Project Leads. Moving knowledge into action for more effective practice, programmes and policy: protocol for a research programme on integrated knowledge translation. Implement Sci. 2018;13:22. CrossRef | PubMed
  • 24. Soper B, Yaqub O, Hinrichs S, Marjanovich S, Drabble S, Hanney S, Nolte E. CLAHRCs in practice: combined knowledge transfer and exchange strategies, cultural change, and experimentation. J Health Serv Res Policy. 2013;18(Suppl 3):53–64. CrossRef | PubMed
  • 25. Moore G, Redman S, Haines M, Todd A. What works to increase the use of research in population health policy and programmes: a review. Evid Policy. 2011;7(3):277–305. CrossRef
  • 26. Cairney P, Oliver K. Evidence-based policymaking is not like evidence-based medicine, so how far should you go to bridge the divide between evidence and policy? Health Res Policy Syst. 2017;15:35. CrossRef | PubMed
  • 27. Penfield T, Baker MJ, Scoble R, Wykes MC. Assessment, evaluations, and definitions of research impact: a review. Res Eval. 2014;23(11):21–32. CrossRef
  • 28. CIPHER Investigators. Supporting Policy In health with Research: an Intervention Trial (SPIRIT) – protocol for a stepped wedge trial. BMJ Open. 2014;4(7):e005293. CrossRef | PubMed