Research on evidence-based policymaking
Stand-alone research articles are listed below in reverse chronological order. Earlier academic work and helpful links are grouped below the articles:
More from the Campbell Collaboration: Here is an editorial in Campbell Systematic Reviews titled "Research—What is it good for? Absolutely nothing… unless it is used to inform policy and practice." The editorial emphasizes "the need to take the research one step further in ‘translating’ the research findings into forms which are discoverable, understandable and usable by decisionmakers." The authors emphasize Campbell's approach in using "evidence and gap maps to build the evidence architecture." The U.S. doesn't have a counterpart organization that develops these evidence and gap maps.
Here is a valuable report from The Pew Charitable Trusts that offers guidance on ways in which nongovernment stakeholders can support states in advancing evidence-based policymaking. The report describes lessons learned from Pew's Results First initiative, which has assisted states in institutionalizing and sustaining the use of evidence in policymaking. The initiative will end in 2023, with its resources being transferred to other organizations.
Howard White's recommendations (see March entry below) are supported, in part, by results from the following working paper from the National Bureau of Economic Research. The paper reports that evidence from randomized controlled trials in U.S. cities is more likely to be incorporated into policymaking when it is implemented as part of "pre-existing communications." See "Bottlenecks for Evidence Adoption."
The 2021 presidential address to the Association for Public Policy Analysis and Management, titled "Connecting the Dots: Turning Research Evidence into Evidence for Policymaking," was recently published in the Journal of Policy Analysis and Management. In the interest of promoting good policymaking, perhaps the journal should consider providing free access to the article, if only for a limited time.
Here is a report from a team at Mathematica, advocating for the use of Bayesian methods that incorporate prior evidence to assess the impact of a policy or programmatic change. Bayesian models are not particularly easy for non-statisticians to understand, but it can't hurt to try to make them more widely used and understood.
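To illustrate the general idea behind incorporating prior evidence (not the Mathematica team's specific model), here is a minimal sketch of a conjugate normal-normal Bayesian update: a prior effect estimate from earlier studies is combined with a new evaluation's estimate, weighted by their precisions. All numbers are hypothetical and purely illustrative.

```python
def posterior_normal(prior_mean, prior_sd, data_mean, data_sd):
    """Conjugate normal-normal update: combine a prior estimate of a
    program's effect with a new study's estimate, weighting each by its
    precision (1 / variance)."""
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = 1.0 / data_sd ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * data_mean)
    return post_mean, post_var ** 0.5

# Hypothetical prior: earlier studies suggest a +2-point effect (sd 3).
# Hypothetical new study: estimated effect +6 points, standard error 4.
mean, sd = posterior_normal(2.0, 3.0, 6.0, 4.0)
print(round(mean, 2), round(sd, 2))  # → 3.44 2.4
```

The posterior lands between the prior and the new estimate, closer to whichever is more precise, and with a smaller standard deviation than either input. This shrinkage toward prior evidence is what distinguishes the Bayesian approach from reporting the new study's estimate in isolation.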
Here is a short editorial from Howard White, the CEO of the Campbell Collaboration, on the collaboration's experience in "getting evidence into use." Among his recommendations, White suggests working with "user-commissioners" who "have a specific set of research questions they want answered, and a plan for using the findings." White also recommends using "evidence portals" and "policy-specific toolkits," both of which, unfortunately, are not used as widely in the U.S. as they are in the U.K. and elsewhere.
Here is an interesting description from the Office of the Assistant Secretary for Planning and Evaluation (ASPE) at the U.S. Department of Health and Human Services regarding its “Core Components Approaches to Building Evidence of Program Effectiveness.”
The idea is for researchers, evaluators and policymakers to consider “a complementary approach to building evidence of effectiveness” by focusing on core components of programs, meaning “the parts, features, attributes, or characteristics of a program that a range of research techniques show influence its success when implemented effectively.” Items at this link represent ASPE’s current work on core components approaches to building evidence of program effectiveness.
For podcast listeners, this interview with David Anderson, Vice President of Evidence-Based Policy at Arnold Ventures, describes his insights about building credible evidence in social policy.
This helpful Twitter thread provides a nice summary of highlights from a workshop at the 2021 Evidence and Implementation Summit on the competencies needed to support implementation and evidence use.
Studies published in 2021 in the field of applied economics indicate that government officials in two countries (Pakistan and Brazil) were receptive to training in policy-relevant quantitative research methods and to learning about findings from policy evaluation studies. Both studies found that high-quality evidence influenced policymakers' support for the related public policies. Links to the studies are in the parenthetical above.
In April 2021, the American Institutes for Research (AIR) published a detailed document that sets forth standards for economic evaluations of educational and social programs.
AIR states that in developing the standards, its goals were the following:
Provide guidance to researchers on the elements of a rigorous economic evaluation as they design and conduct program evaluations.
Direct researchers toward consistent practices that promote transparency and comparability across economic evaluations.
Develop standards that consumers of research can use to evaluate economic evaluations, including research proposals, study plans, and reported results.
A link to the document (PDF) is here.
The February 2021 issue of the Journal of Policy Analysis and Management featured an interesting series of articles in its Point/Counterpoint section, all relating to evidence-based policy and decision-making. Unfortunately, the articles are accessible only to subscribers and affiliates, limiting access for many of the decision-makers and policy researchers who are their target audience.
Here is a helpful, U.K.-based “Practical Guide for Establishing an Evidence Center” which, sadly, may not be so practical for the U.S.
From an October 5, 2020 blog at the academic journal Policy & Politics: A list of essential reading recommendations for academic courses that incorporate topics such as evidence-based policy, policy design, and behavioral public policy. Several of the recommended articles are freely available for download as of October 2020.
In September 2020, the academic journal Policy & Politics published a themed issue that provides a "state-of-the-art" overview of advances in behavioral public policy and administration. As of October 2020, the articles were available for free download.
In this April 2020 blog post from the U.K. (with a link to the longer research article), the author sets forth arguments for considering evidence-based policymaking as an ideal, rather than something that can be realized in practice.
This Spring 2020 article from the Stanford Social Innovation Review discusses reasons why evidence is infrequently used in philanthropic decision making. Many of these reasons also apply in the realm of policymaking.
Here is a recent (2020), non-theoretical discussion of the problems with identifying effective evidence-based policies, along with recommendations for addressing those problems, particularly in evidence-based registries for social programs: Gies, S. V., Health, E., & Stephenson, R. (2020). The evidence of effectiveness: Beyond the methodological standards. Justice Evaluation Journal. https://doi.org/10.1080/24751979.2020.1727296.
The coronavirus pandemic, Spring 2020
A 2020 review from U.K.-based researchers concludes that the global Covid-19 pandemic has established "the importance of scientific evidence, the synthesis of this evidence, and guidance for government [officials]." But even though "robust, internationally recognised standards for evidence generation, synthesis and guidance production are available," those standards "have rarely been formally adopted." That conclusion certainly applies in the U.S. as well. A summary of the review and links to the relevant research are here.
This September 2019 research article describes some of the issues involved with using so-called "behavioural insights" as a concept for understanding and approaching policymaking.
This fairly dense article from July 2019 develops the concept of an "evidence ecosystem." The paper provides helpful background information and conclusions regarding the production and use of systematic reviews, their underlying methods, and their use in assessing research evidence, including evidence with public policy implications.
Although the essay’s main focus is an overview of issues relating to research on reentry programs in criminal justice systems, it also includes a nice summary of the evidence hierarchy, arguing that studies purporting to produce policy-relevant evidence
"should be weighted based on where a study falls in a hierarchy of evidence: raw correlational analyses near the bottom, outranked by studies with rich control variables, then by studies using matched comparison groups, then by studies using natural experiments to avoid selection bias (e.g., studies using sound difference-in-difference, regression discontinuity, and instrumental variable designs), then randomized controlled trials (RCTs) at the top."
Much of the theoretical research in this area seems to have been generated in the U.K.:
You’ll find an embarrassment of riches at Professor Paul Cairney’s website. He is based in Scotland, but his research is applicable across countries. https://paulcairney.wordpress.com/
The writing in some of Professor Cairney’s published research can be a bit dense, but the theories and other ideas presented are worth the effort. See, for example:
Cairney, P., & Oliver, K. (2017). Evidence-based policymaking is not like evidence-based medicine, so how far should you go to bridge the divide between evidence and policy? Health Research Policy and Systems, 15(35), 35-46.
Here are other examples of academic work, also from Europe:
The Pew Charitable Trusts and the MacArthur Foundation assessed the prevalence of specific actions that states use to incorporate evidence into policymaking. In their assessment, they categorized each of the 50 states, with some surprising results. http://www.pewtrusts.org/~/media/assets/2017/01/how_states_engage_in_evidence_based_policymaking.pdf
The Policy Surveillance Program at Temple University is the first systematic effort (that we know of) to conduct formal legal mapping for policy surveillance. Its focus is on public health policy. http://lawatlas.org.
Contact us to share your resources: email@example.com