Aim: Literature about research use suggests that certain characteristics or capabilities may make policy agencies more evidence attuned. This study sought to determine policy makers’ perceptions of a suite of organisational capabilities identified from the literature as potentially facilitating research uptake in policy decision making.
Method: A literature scan identified eight key organisational capabilities that support research use in policy making. To determine whether these capabilities were relevant, practical and applicable in real world policy settings, nine Australian health policy makers were consulted in September 2011. We used an open-ended questionnaire asking what facilitates the use of research in policy and program decision making, followed by specific questions rating the proposed capabilities. Interviews were transcribed and the content analysed.
Results: There was general agreement that the capabilities identified from the literature were relevant to real world contexts. However, interviewees varied in whether they could provide examples of experiences with the capabilities, how essential they considered the different capabilities to be and how difficult they thought the capabilities would be to achieve.
Conclusion: Efforts to improve the use of research in policy decision making are likely to benefit from targeting multiple organisational capabilities, including staff skills and competence, tools such as templates and checklists to aid evidence use and leadership support for the use of research in policy development. However, such efforts should be guided by an understanding of how policy agencies use evidence and how they view their roles, and external factors such as resource constraints and availability of appropriate research.
Research utilisation science seeks to develop our understanding of what works to improve research use in policy. It is widely recognised that evidence-informed policy decision making benefits health outcomes, but also that individual, organisational and environmental barriers can prevent optimal use of research in policy processes.1-4 Consequently, there is growing interest in trialling interventions to improve research use in health policy and program development.5,6
Designing interventions that aim to build the capacity of policy organisations to use research requires an understanding of how policy agencies function. Of particular interest are the characteristics that make them more attuned to, and capable of, evidence-informed decision making.1,7 Although a substantial body of literature addresses ways to make research more palatable for policy decision makers, the actual policy environment is still described by many as a ‘black box’.8
The literature suggests that certain characteristics or capabilities may make policy agencies more attuned to evidence. However, this is often based on localised commentaries, case studies or exploratory qualitative research, or may be proposed as a possible explanatory factor for research uptake.1 In many instances, it is not clear whether the capabilities proposed in the literature are valid in real world policy settings.
We sought to test the validity of a suite of organisational capabilities that were identified from the literature as having the potential to facilitate research use in policy decision making. As part of the formative phase of a trial aimed at increasing the use of research in health policy making, we aimed to determine whether these capabilities are relevant, practical and applicable in real world policy settings. It was intended that findings from this work would inform the development of a multicomponent intervention trial.
The questions we wanted to answer were whether the proposed capabilities are relevant, practical and applicable in real world policy settings, and how they should be refined to inform intervention development.
We used purposive sampling to identify comprehensive reviews on research use published in the past 10 years.4,6,7,9,10 From the reviews, factors identified as associated with increased research use were consolidated into eight organisational attributes and capabilities that have the potential to facilitate research use.
To test whether the organisational attributes and capabilities found in the literature were relevant, applicable and practical in real world settings, a purposive sample of senior Australian health policy makers (n = 9) was consulted using informal semi-structured telephone interviews in September 2011. Interviewees were selected for their expert understanding of health policy environments, their known commitment to using research in policy decision making, and their knowledge of specific state and federal health policy and program agencies that were indicative of a potential target population for the proposed intervention trial. Interviewees held positions at the level of policy unit management or higher, and all had at least 10 years’ experience working in public health agencies. They worked in a range of areas related to health policy, including healthcare quality and safety, population health and preventive health. All invited policy makers agreed to participate.
Interviewees were emailed the eight proposed capabilities before the interview. Each interview began with an open-ended question – “From your experience, what organisational strategies facilitate the use of research in policy and program decision making?”– followed by specific questions about the relevance, applicability and importance of the proposed capabilities (see Box 1).
All interviews were digitally recorded and de-identified for transcription and analysis. Content of the interviews was analysed by tallying participant views of the proposed capabilities to refine the suite of capabilities, and by using thematic coding to identify new capabilities and associated themes and caveats.
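As an illustration only, the tallying step described above could be sketched as a simple count of which capabilities each interviewee raised. The capability names and coded responses below are hypothetical stand-ins for coded transcript data, not the study's actual records.

```python
from collections import Counter

# Hypothetical coded transcripts: for each interviewee, the set of
# capabilities raised during the non-prompted part of the interview.
# Using sets avoids double-counting repeated mentions by one person.
coded_interviews = [
    {"training", "access", "leaders"},
    {"training", "policies"},
    {"training", "relationships", "access"},
]

# Tally how many interviewees raised each capability.
tally = Counter()
for raised in coded_interviews:
    tally.update(raised)

print(tally["training"])  # 3: raised by all three hypothetical interviewees
```

In practice, a second tally of the same form would be kept for the prompted ratings, giving the two counts reported per capability in Table 1.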
Box 1. Interview questions

1) From your experience, what organisational strategies facilitate the use of research in policy and program decision making?

Rating the proposed capabilities:

2) Do you think these eight organisational capabilities and systems can actually facilitate research use in policy?

3) Which of the capabilities and systems are particularly important for fostering evidence-informed practice in policy organisations?

4) Do you think there are particular capabilities and systems that will be difficult for policy agencies to develop their capacity in?
Our literature scan identified eight broad groupings of capabilities purported to encourage research use in policy environments (Table 1). The ‘policies’ capability is an institutional feature that promotes evidence uptake by mandating or otherwise incentivising the examination of research evidence at various stages of policy development.11–13 The ‘analysis’ capability refers to institutionalised arrangements in a policy organisation that strategically assess how evidence is used, and can be best used, to inform policy decisions.4,14–16 Other proposed capabilities focused more on ensuring skills and resources at individual and team levels, including ‘access’ and ‘training’.17–20 Capabilities that, directly or indirectly, refer to ensuring the closure of ‘know–do’ gaps, such as ‘knowledge generation’ and ‘evaluation’, were also found in the literature.6,21–23 In addition, there were frequent references to the value of ‘leaders’ and ‘champions’ in promoting evidence uptake.24–27 Finally, ‘relationships’ with researchers, which can take various forms and levels of formality, were identified as a key capability that promotes the uptake of research in decision making.28,29
Table 1. Organisational capabilities identified from the literature as having the potential to facilitate research use

| Capability | Description | No. of respondents raising capability in non-prompted part of interview (n = 9) | No. of respondents rating capability as particularly important (n = 9) |
|---|---|---|---|
| Training | Strategies are in place to provide staff with training in using research and maintaining these skills | 7 | 5 |
| Access | The organisation has formal methods that it uses to access research | 5 | 4 |
| Policies | Organisational policies exist that encourage or mandate the examination of research in policy and program development | 6 (in the form of tools, templates and guidelines) | |
| Leaders | Leaders of the organisation actively support the use of research in policy and program development | 4 | 7 |
| Analysis | Systems are in place for analysing the ways that research can inform policies and programs | 2 | 3 |
| Generate | The organisation has methods that it uses to generate research to inform its work | 3 | 3 |
| Evaluate | The organisation has methods that it uses to ensure adequate evaluations of its policies and programs | 0 | 4 |
| Relationships | Strong relationships exist with a diverse range of researchers | 4 | 5 |
Over the course of the interviews, respondents indicated that all eight capabilities were relevant to real world contexts and were legitimate goals for interventions aimed at increasing the use of research in health policy and program decision making. All capabilities were nominated as important by at least one interviewee. However, interviewees varied in how often they raised a particular capability during the open-ended part of the interview, how essential they considered each capability to be and how achievable they considered each capability.
Staff competence was raised more frequently than any other issue by interviewees in response to the initial, open-ended question (seven interviewees). Training was also rated as one of the three most important capabilities (along with leadership and relationships) when interviewees were asked to rate each capability. Interviewees argued that training was a means of achieving improved competency rather than an end in itself. Other mechanisms included staff recruitment policies, and the development of dedicated research and evaluation roles. Effective policy work required practical skills in appraising and synthesising research rapidly, but also required rhetorical skills in framing arguments and integrating research in a way that made it accessible and palatable to senior decision makers.
Unless you have people who have the time and the skills to generate research or interpret research and feed that into the decision making process, you’re going to get very little connection between the research that’s out there and the policy decision.
If the staff don’t have an understanding or an interest, then I guess you could have all the systems and methods and policies in the world, but how well that will be implemented or how well that will come together could be questionable.
Formal organisational policies and guidelines for encouraging or mandating research use were thought to contribute to the ethos of ‘evidence-based policy’, reduce obstacles to research use and ensure that policy positions cannot be advanced without some consideration of research.
The organisation … needs to have an explicit policy which in a sense mandates the consideration or includes the consideration of evidence in decision making.
This issue was raised by six interviewees during the non-prompted part of the interview, but without specifying ‘policies’ as such. Rather, interviewees referred to concrete resources such as templates and checklists to help policy makers with planning, implementation and decision making: “so it’s tools, checklists, templates, those sorts of things which translate the policy into a mechanism of practice”. Some mentioned examples of tools that are prominent in the literature.30
Good access to research was considered to be fundamental, and was raised by five interviewees during the open-ended question part of the interview: “… obviously something that is really critical as well is that people are actually able to access information”.
When asked the initial open-ended question, four interviewees mentioned ‘leadership’ as a key capability. This was also rated as one of the important factors when rating the proposed list of capabilities drawn from the literature. Leadership was considered key for promoting the value of research use within the organisational culture: “the key factor is the attitude and expectations of senior decision makers and the extent to which they value evidence”. Interviewees believed leadership had an important role to play in cultural change: “organisational culture is also about what people above you will agree to or not”.
Analysis was not raised in response to the initial open-ended question. Only two interviewees were able to give examples of organisation-level analysis of how research is (and could be) used strategically within the policy environment, when reviewing the list of capabilities provided. Some commented that this proposed capability was ambiguously defined, so, unsurprisingly, there was a lack of consensus about what it meant in practice.
Although interviewees recognised that conducting and commissioning research are important skills, most considered that generating knowledge had less impact on an organisation’s uptake of research, especially in relation to the other capabilities. Two interviewees talked about the value of locally generated research, but they expressed scepticism about policy staff having the ability and time to produce research.
Evaluation of policies and programs was considered to be important for well-informed decision making, but was regarded as particularly challenging: “we do need that but I’ve got to say, I think by and large the health system is weaker at [evaluation] than the up-front process”. This was linked to the need for financial and infrastructure support, and the importance of staff competence as a core capability:
Sometimes the reason why evaluation doesn’t happen isn’t about politics, it’s about funding or it’s about someone not actually knowing how to do it. It does go back to a skill issue.
Relationships with researchers featured prominently (raised by four interviewees during the open-ended part of the interview and rated of high importance when reviewing the list of capabilities). Interviewees described the need to access authoritative experts who can give clear, succinct answers to policy problems. Genuinely valuable relationships, however, were often collaborative and took time:
… I have seen examples of where there have been established relationships and built over time where a research program is developed collaboratively to support a reform agenda, it’s been very useful.
Problems in identifying appropriate researchers, and the potential to become too ‘cosy’ in the relationship were also noted:
People develop relationships with particular groups but might not go beyond those groups and so you know you fall back on old friendships I suppose. And then that question is how do you know who are the researchers in a particular field?
Our study sought to determine whether organisational capabilities for improved research use proposed in the literature appear valid to people with extensive insider knowledge of policy organisations. We found it necessary to refine key capabilities to improve their clarity and applicability in policy settings. We also identified several caveats that should be considered when applying the results in the real world.
There was general agreement among interviewees that the capabilities identified from the literature were relevant to real world contexts. However, three key points could serve to refine the capabilities before using them to inform the development of interventions.
First, training, in itself, was not considered to be a key capability for facilitating research use. Rather, a variety of staff skills and competencies are needed, and training may be one means of achieving this. During the open-ended question part of our interviews, staff skills were the most frequently cited factor determining research use, and were also considered among the most important when interviewees reviewed our suite of capabilities drawn from the literature.
Second, templates and checklists were identified as important tools for enacting organisational policies but were not considered to fall within the ‘policy’ capability, and so may need to be considered separately.
Third, with regard to the ‘analysis’ capability, confusion about the value of analysing current and potential use of research by an organisation suggests that more work needs to be done to define this capability and to explain its value and how it would operate in practice.
Thus, we recommend that, if our suite of organisational capabilities is to be used as a basis for studies examining how to increase the use of research in health policy and program development, it should be refined to accommodate these priorities and variations in understanding.
Our interviews were also an opportunity to assess the real world relevance and applicability of policy agency capabilities that have been proposed in the literature. Interviewees agreed that proposed capabilities were relevant to real world contexts; however, there were caveats about the manner in which they were applicable and practical.
Several interviewees noted that policy makers are expected to use evidence in decision making – “it is a given” – so that, generally, evidence “ … just flows into what we do”. They emphasised, however, that evidence has many sources and should not be defined as academic research alone. Further, evidence feeds into decision making both directly and indirectly, and is not always measurable or easily identified.
The construct of ‘a policy maker’ was also challenged. Interviewees explained that not all staff working in public agencies would consider the remit of their work to be policy making per se. Different terms may actually refer to the same things, depending on the setting. There is a need to design research use strategies and use language that is sensitive to variation in staff and organisations.
A planned approach to building research use capabilities was considered desirable, but interviewees noted that capacity development was likely to depend on resources. Aspects such as time and budget, both of which are subject to shifting external forces and thus in constant flux, can make it hard to predict the need for, and feasibility of, particular intervention strategies.
Research use was described as complex and mediated by external factors such as the frustrating lack of relevant, timely research. This concern went beyond research translation (e.g. synthesising research and providing it in policy-friendly language and formats), to the type of knowledge generated, including the research focus (e.g. implementation challenges) and appropriateness of the methods.
Lastly, interviewees drew no clear causal link between specific capabilities and outcomes, and sometimes struggled to articulate what the capability they were espousing would contribute to research uptake and use. Organisational culture can create tacit norms that support an environment receptive to research, even when seemingly obvious facilitators such as research use policies, highly skilled staff or ready access to research are missing.
Our study made initial inroads into testing the validity of organisational capabilities, identified from the literature, that have the potential to facilitate research use in policy decision making. Interviewees were all employed in policy agencies in Australia and were purposively sampled for their experience and commitment to using research in policy decision making. Although we suggest that the results would reflect the views of managers and directors in similar situations, we recognise that this may vary between policy settings and content areas, and with variation in individual experience of research use. In addition, some responses may have reflected specific local practice, such as how policy makers access research, develop relationships with researchers and conceptualise their own roles. This could be explored further in other settings.
This study formed part of the formative phase of a trial aimed at increasing the use of research in health policy making. Findings suggest that efforts to improve the use of research in policy decision making are likely to benefit from targeting multiple capabilities, including staff skills and competence, access to research, internal policies that mandate or encourage the use of research, tools such as templates and checklists that help put these policies into practice, leadership that supports research use, collaborative relationships with researchers, and generating new knowledge through research and evaluation. However, such efforts should be guided by an understanding of how policy agencies use evidence, how they view their roles, and external factors such as resource constraints and availability of appropriate research. The diffuse nature of research use and the complexity of the policy environment remain a challenge.
This work was partially funded by the Centre for Informing Policy in Health with Evidence from Research (CIPHER), an Australian National Health and Medical Research Council Centre for Research Excellence (APP1001436) grant, administered by the University of Western Sydney. The Sax Institute receives core funding from the NSW Ministry of Health.
© 2014 Huckel Schneider et al. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International Licence, which allows others to redistribute, adapt and share this work non-commercially provided they attribute the work and any adapted version of it is distributed under the same Creative Commons licence terms.