Research strategies for policy relevance.

Author: Wolf, Amanda

Abstract

Research can fail to be policy relevant when too little attention is paid to the "why" and "how" of policy change in the real world, and when relevant information remains elusive due to the complexity of social reality. Five strategies for researchers to consider when conceptualising new research are proposed, each of which addresses something about the mechanism of policy change. The strategies address issues concerned with both the availability of information and the fit of that information with policy argument needs. The five strategies serve to (1) generate new ideas about "what works" or what accounts for policy-relevant effects; (2) accrue ideas about the way mechanisms work for different people and in different circumstances; (3) improve understanding about why and how one mechanism works, and how it works in comparison with other mechanisms; (4) reveal the indirect mechanisms at work in a policy system; and (5) reinforce a realistic view of "causality" that supports timely action.

INTRODUCTION

Social science researchers, policy analysts and policy decision makers have long shared a commitment to better link research with policy. In historical perspective, today's focus on research into "what works" renews attention to evidence after a long reign of efficiency in the policy spotlight. Evidence-based (sometimes softened to "evidence-aware") policy is a recent term, but the idea is not new. At least since the 1930s, social scientists have provided input for policy development (Parsons 1995:20). In the 1960s and 1970s, these efforts emerged as a distinct field of enquiry, named policy analysis. Government agencies hired policy analysts in the genuine hope that they would crack the toughest policy nuts, such as poverty and illegal drug use, thus paving the way for effective interventions.

Today's proponents of evidence-based policy are wiser. For instance, Nutley et al. (2003) caution against over-expectation when they add a contextual qualifier in their stock phrase "what works, for whom, and in which circumstances". The false promise, however, persists. Proponents of evidence-based policy have made great strides in proposing ways to improve the knowledge base for decisions. However, people oversimplify when they argue that improved social knowledge will lead to better policy outcomes. Not all knowledge will inform better decision making. And not all good decision making will lead to better outcomes. This article addresses the issue from the orientation of research design and methodology--from the strategies researchers use to produce knowledge--rather than from the knowledge that is produced. In particular, my aim is to improve the chances that research outputs will indeed inform decision making, because they will be more policy relevant.

A semantic confusion needs to be addressed at the outset. I follow Majone's view that "evidence" is not the same as "information", although the two are often conflated. Majone writes that "evidence" is "information selected from the available stock and introduced at a specific point in the argument in order to persuade a particular audience of the truth or falsity of a statement" (1989:10). When the two words are conflated, we risk failing to distinguish problems of availability from problems of fit for purpose.

In New Zealand, the "availability" (or information) problem predominates, expressed as either a lack of New-Zealand-specific information, or a lack of ways to make better use of a flood of mostly international information. The common prescription is to set priorities to improve the stock of New Zealand information. However, the "fit-for-purpose" (or evidence) problem needs to be addressed as well. Because evidence is, in Majone's definition, selected from the available stock, the evidence problem is nested within the availability problem, but it is not solved simply by enlarging that stock. More specifically, an evidence problem is apparent when very little of the vast stock of available information makes its way into policy arguments (Shulock 1999), or when the information asserted in support of a policy decision is unpersuasive.

Policy decisions, without doubt, should be well informed. Information may be valuable and interesting, but it is not evidence until it serves to make a policy argument, perhaps supporting a problem definition, or explaining or justifying a policy response. Policy researchers, then, might ask how effectively they are producing information that supports good decision making. Policy-relevant research results from a good match: an effective blending of researchers' awareness of policy decision needs with decision makers' awareness of the knowledge base that bears on their decisions.

In this article I direct attention mainly to the issue of how to improve the policy relevance of research (where that is the espoused objective of the research in question). I focus on the front end of the evidence challenge by considering how social science researchers can better conceptualise studies to improve the policy relevance of their results. Research conceptualisation refers to the initial design or vision of a research project, more like an architect's concept drawing than a builder's blueprint (Hakim 2000:1). All successful conceptualisations match a "solution" to a client's "problem". In the policy research context, successful research starts with policy developers' and decision makers' ("users") knowledge needs, and hinges on how well those needs are specified. From the researcher's standpoint, the design challenge is how best to provide users with a solution they are happy with.

Research conceptualisation is key to achieving policy relevance. The architectural metaphor readily evokes the core challenge. Good architects have, in addition to technical skill and materials knowledge, a design sense that allows them to avoid bad ideas and, more often than not, to provide a design concept for a structure or space that satisfies their commissioners. Similarly, policy-relevant research--relevant by design, not by accident or serendipity--is underpinned by well-conceived approaches to generating research solutions. Put somewhat differently, policy relevance is not an automatic by-product of "good" research.

To keep the focus on the conceptualisation stage, I assume ideal research conditions. Under these conditions, researchers are fully able to contribute high-quality research to a well-functioning, realistic policy process within recognised constraints. Thus, I skirt challenges relating to research capacity and capability (technical skills, funding), to government preferences (strategic policy development, choice of desired outcomes), and to logistical or personal factors (awareness of research findings, willingness to consider unpalatable findings, maintaining reasonable mutual expectations between researcher and policy developer). What remains to focus on is the "reflective practitioner's" (Schon 1983) knack for seeing to the heart of what really needs to be known if that knowledge is to qualify as policy relevant.

In the next section I illustrate how research can fail to be policy relevant through shortcomings in the information content of research, and through failures in information fit and extent, even where information quality is high. This review of the problem paves the way for considering five methodological strategies to improve relevance. Each strategy responds to a slightly different challenge facing researchers, but they overlap and are mutually reinforcing. I conclude on a cautiously positive note, for there are no serious hindrances to more widespread uptake of the suggested practices (indeed, all are currently in practice to some degree).

THE "RELEVANCE" PROBLEM

Policy-relevant research presents what has been, what is and what is likely to be, in a specific social context, in order to inform policy decisions. Continuing with the ideal scenario, let us assume that users are satisfied when researchers provide (enough) high-quality information, in understandable terms, which they can use to make and justify robust policy choices. In this formulation there are two critical links: one concerns the information content, and the other its contribution to policy choice. Greenberg et al. (2000:367) offer a similar view on critical links in the context of the usefulness of policy experiments, highlighting the reliability of information, the connection to the right users and the actual use of the information in decision making. While the connections between available content and usefulness are recognised, weaknesses of each type are considered separately below.

Information Content Limitations: "What" Answered, Not "Why" and "How"

Many policy researchers focus on "what" questions and produce descriptive information. A recent discussion paper on knowledge needs from Statistics New Zealand (2003) provides a summary of policy questions organised into cross-cutting themes, such as population and security. Under the heading "Culture and Identity", for example, are a number of exploratory and descriptive questions, including:

* How do people living in New Zealand identify themselves and to what groups do they feel they belong? (This asks, in essence: what identifiers do people living in New Zealand apply to themselves?)

* What are different groups of New Zealanders' attitudes to "belonging" in New Zealand, and what is their experience of various aspects of life in New Zealand?

* What is the current status and health of the Māori language?

In general, "what" questions require a descriptive answer about a social phenomenon: What types of people are involved, and what are their characteristics? What knowledge, beliefs, values and attitudes do they hold? What is likely to happen? "Why" questions build on descriptive information to investigate the causes of, or reasons for, characteristics or patterns in a social phenomenon. They are directed toward understanding and explaining: Why do people think and...
