
Resources

Research on evidence-based policymaking
Note: Links are highlighted in dark orange.
 
Stand-alone research articles and posts are listed below in reverse chronological order. Earlier academic work and helpful links are grouped below the articles:
2024
  • Here is an interesting study that I missed from 2022, published in Prevention Science and titled "How Consistently Do 13 Clearinghouses Identify Social and Behavioral Development Programs as 'Evidence-Based'?" The short answer to that question is, unfortunately, not very consistently. The authors note that given the inconsistent standards applied across clearinghouses, the term "evidence-based" will "remain more of an aspiration than achievement in the social and behavioral sciences."
2023
  • The National Academies' Committee on Law and Justice held a panel discussion on November 8, 2023 "to solicit input from researchers on CrimeSolutions’ review process and communication of outcome." (I have mentioned CrimeSolutions.org on this website.) Here is a link to the videotaped portion of the public seminar, titled "CrimeSolutions: What Works, What Doesn’t, What’s Promising: Feedback from the Field."
 
Some panel participants argued that the program should simply be scrapped, even though it has been around for more than a decade and considerable time, effort, and public funding have been invested in it. One panelist also argued that CrimeSolutions is not a "trusted source" among academics, who, of course, are not the program's target audience.
 
  • Could Generative AI help you with your research? Here is a list I have compiled (without the help of Generative AI!) of research tools that use Generative AI and Large Language Models. It is current only as of December 2023.
  • Here is a unique approach to evidence synthesis that was posted online in August 2023. The article describes "a semi-automated approach to extract, aggregate, map, and analyze causal evidence from policy-relevant literature." I wouldn't say the article is an easy read, and I think the first half would have benefited from more concrete examples in the background section, less jargon, a focus on fewer related topics, and less use of the passive voice, which drags down the writing. However, the illustration the authors use is helpful, and the underlying ideas are interesting. The authors say their method for synthesizing evidence saves time compared with a standard meta-review, but the learning curve would be steep. Perhaps it would be worth the time?
  • Here is a review of a new book, Experimental Thinking: A Primer on Social Science Experiments, by political scientist James Druckman. Although the book will not be transformed into a thriller on Netflix anytime soon, it nevertheless appears to be an important contribution to the field, offering a wide-ranging discussion of how to properly think about and interpret experiments in the social sciences and emphasizing the importance of a comprehensive scientific process beyond experimental design alone. It would be a useful resource for college-level instructors teaching related topics.
     
  • Here is a report commissioned by the National Academies of Sciences, Engineering, and Medicine that discusses, in depth, how behavioral economics can be used as a framework for understanding the barriers that researchers and policymakers face in translating evidence into policy. In sum, the three main barriers are: "1) political actors need to know and value the evidence from behavioral economics; 2) they need to translate the correct insight into a new context; and 3) they need to act on the evidence by implementing the correct insight at scale."
 
  • This blog post from 80,000 Hours provides a helpful framework for assessing and comparing the effectiveness of different policy solutions. In particular, it introduces the concept of "expected value" and provides examples of how it can be used to compare policy effectiveness; a rough worked example follows below.
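If it helps to see the arithmetic, here is a minimal sketch in Python of an expected-value comparison. The policies, success probabilities, and impact figures are hypothetical numbers of my own, not taken from the 80,000 Hours post.

    # Hypothetical comparison of two policy options using expected value.
    # All figures are illustrative, not drawn from the 80,000 Hours post.
    policies = {
        # name: (probability the policy succeeds, impact if it succeeds)
        "Policy A": (0.8, 1_000),   # likely to work, modest impact
        "Policy B": (0.1, 20_000),  # a long shot, but with a large impact
    }

    for name, (p_success, impact) in policies.items():
        expected_value = p_success * impact
        print(f"{name}: expected value = {expected_value:,.0f}")

    # Policy B comes out ahead (2,000 vs. 800) despite its low probability
    # of success, which is the kind of comparison the expected-value
    # framing is meant to surface.
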
2022
November
  • Here is an interesting blog post by a government official at the Administration for Children and Families, describing how she has been pleasantly surprised to realize the extent to which social science research does inform policymaking at many levels.
  • The blog post mentions the 2007 book, "Using evidence: How research can inform public services." The book may be more than 15 years old now, but as you can see from the table of contents, its basic ideas are still useful and informative.
September
  • More from the Campbell Collaboration: Here is an editorial in Campbell Systematic Reviews titled "Research—What is it good for? Absolutely nothing… unless it is used to inform policy and practice." The editorial emphasizes "the need to take the research one step further in ‘translating’ the research findings into forms which are discoverable, understandable and usable by decisionmakers." The authors highlight Campbell's approach of using "evidence and gap maps to build the evidence architecture." The U.S. doesn't have a counterpart organization that develops these evidence and gap maps.
  • Here is a valuable report from The Pew Charitable Trusts that offers guidance on ways in which nongovernment stakeholders can support states in advancing evidence-based policymaking. The report describes lessons learned from Pew's Results First initiative, which has assisted states in institutionalizing and sustaining the use of evidence in policymaking. The initiative will end in 2023, with its resources being transferred to other organizations. 
 
June
  • Howard White's recommendations (see March entry below) are supported, in part, by results from the following working paper from the National Bureau of Economic Research. The paper reports that evidence from randomized controlled trials in U.S. cities is more likely to be incorporated into policymaking when it is implemented as part of "pre-existing communications." See "Bottlenecks for Evidence Adoption."
May
  • The Association for Public Policy Analysis and Management's 2021 presidential address, titled "Connecting the Dots: Turning Research Evidence into Evidence for Policymaking," was recently published in the Journal of Policy Analysis and Management. In the interest of promoting good policymaking, perhaps the journal should consider providing free access to the article, if only for a limited time.
 
April
  • Here is a report from a team at Mathematica, advocating for the use of Bayesian methods that incorporate prior evidence to assess the impact of a policy or programmatic change. Bayesian models are not particularly easy for non-statisticians to understand, but it can't hurt to try to make them more widely used and understood; a simplified sketch of the basic idea follows below.
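For the curious, here is a deliberately simplified sketch in Python of the core idea: prior evidence about a program's effect is combined with a new study's estimate, with each weighted by how precise it is. The effect sizes and standard errors are invented for illustration and are not taken from the Mathematica report.

    # Minimal normal-normal Bayesian update: combine prior evidence about a
    # program's effect with a new study's estimate. All numbers are
    # hypothetical and used only for illustration.
    def bayesian_update(prior_mean, prior_se, study_mean, study_se):
        prior_precision = 1.0 / prior_se ** 2
        study_precision = 1.0 / study_se ** 2
        posterior_precision = prior_precision + study_precision
        posterior_mean = (prior_precision * prior_mean
                          + study_precision * study_mean) / posterior_precision
        posterior_se = posterior_precision ** -0.5
        return posterior_mean, posterior_se

    # Prior evidence suggests a small positive effect; the new study finds a
    # larger one. The posterior lands between the two, weighted by precision.
    mean, se = bayesian_update(prior_mean=0.05, prior_se=0.10,
                               study_mean=0.20, study_se=0.08)
    print(f"Posterior effect estimate: {mean:.3f} (SE {se:.3f})")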
 
March
  • Here is a short editorial from Howard White, the CEO of the Campbell Collaboration, on the collaboration's experience in "getting evidence into use." Among his recommendations, White suggests working with "user-commissioners" who "have a specific set of research questions they want answered, and a plan for using the findings." White also recommends using "evidence portals" and "policy-specific toolkits," neither of which, unfortunately, is used as widely in the U.S. as in the U.K. and elsewhere.
2021
  • For podcast listeners, this interview with David Anderson, Vice President of Evidence-Based Policy at Arnold Ventures, covers his insights on building credible evidence in social policy.
  • This helpful Twitter thread provides a nice summary of highlights from a workshop at the 2021 Evidence and Implementation Summit on the competencies needed to support implementation and evidence use.
  • Two studies published in 2021 in the field of applied economics indicate that government officials in two countries (Pakistan and Brazil) were receptive to training in policy-relevant quantitative research methods and to receiving and learning about findings from policy evaluation studies. Both studies found that policymakers were influenced by high-quality evidence in support of related public policies. Links to the studies are in the previous parenthetical.
 
  • In April 2021, the American Institutes for Research (AIR) published a detailed document that sets forth standards for economic evaluations of educational and social programs. 
 
AIR states that in developing the standards, its goals were the following:
  • Provide guidance to researchers on the elements of a rigorous economic evaluation as they design and conduct program evaluations.

  • Direct researchers toward consistent practices that promote transparency and comparability across economic evaluations.

  • Develop standards that consumers of research can use to evaluate economic evaluations, including research proposals, study plans, and reported results.

        A link to the document PDF is here.
  • The February 2021 issue of the Journal of Policy Analysis and Management had an interesting series of articles in its Point/Counterpoint section, all of which relate to evidence-based policies and decision-making, and all of which are accessible only to subscribers and affiliates, thereby (unfortunately) limiting access for many of the decision-makers and policy researchers who are the target audience for the articles.
 
2020
 
  • From an October 5, 2020 blog at the academic journal Policy & Politics: A list of essential reading recommendations for academic courses that incorporate topics such as evidence-based policy, policy design, and behavioral public policy. Several of the recommended articles are freely available for download as of October 2020.
 
  • In September 2020, the academic journal Policy & Politics published a themed issue that provides a "state-of-the-art" overview of advances in behavioral public policy and administration; as of October 2020, the articles were available for free download.
  • Here is an interesting description (dated August 2020) from the Office of the Assistant Secretary for Planning and Evaluation (ASPE) at the U.S. Department of Health and Human Services regarding its “Core Components Approaches to Building Evidence of Program Effectiveness.”

The idea is for researchers, evaluators and policymakers to consider “a complementary approach to building evidence of effectiveness” by focusing on core components of programs, meaning “the parts, features, attributes, or characteristics of a program that a range of research techniques show influence its success when implemented effectively.” Items at this link represent ASPE’s current work on core components approaches to building evidence of program effectiveness.

 
  • In this April 2020 blog post from the U.K. (with a link to the longer research article), the author sets forth arguments for considering evidence-based policymaking as an ideal rather than something that can be realized in practice.
 
  • This Spring 2020 article from the Stanford Social Innovation Review discusses reasons why evidence is infrequently used in philanthropic decision making. Many of these reasons also apply in the realm of policymaking.
 
  • Here is a recent (2020), non-theoretical discussion of the problems with identifying effective evidence-based policies and recommendations for addressing those problems, particularly in evidence-based registries for social programs: Gies, S. V., Health, E., & Stephenson, R. (2020). The evidence of effectiveness: Beyond the methodological standards. Justice Evaluation Journal. https://doi.org/10.1080/24751979.2020.1727296
 
The coronavirus pandemic, Spring 2020
  • Professor Paul Cairney applies a template for understanding the difficulties of the policymaking process to the extreme example of how governments may address the pandemic. He has posted both short and long versions of his analysis (see link to more of his and related work, below).
  • A 2020 review from U.K.-based researchers concludes that the global Covid-19 pandemic has established "the importance of scientific evidence, the synthesis of this evidence, and guidance for government [officials]." But even though "robust, internationally recognised standards for evidence generation, synthesis and guidance production are available," those standards "have rarely been formally adopted." That conclusion certainly applies in a U.S. context as well. A summary of the review and links to the relevant research are here.
2019
  • This September 2019 research article describes some of the issues involved with using so-called "behavioural insights" as a concept for understanding and approaching policymaking.
  • This fairly dense article from July 2019 develops the concept of an "evidence ecosystem." The paper provides helpful background information and conclusions regarding the production and use of systematic reviews, their underlying methods, and their use in assessing research evidence, including evidence with public policy implications.
  • Here is an essay from 2019 published in the Journal of Policy Analysis and Management, titled “’Evidence-Based Policy’ Should Reflect a Hierarchy of Evidence.”
 
Although the essay’s main focus is an overview of issues relating to research on reentry programs in criminal justice systems, it also includes a nice summary of the hierarchy, arguing that studies purporting to produce policy-relevant evidence
 
"should be weighted based on where a study falls in a hierarchy of evidence: raw correlational analyses near the bottom, outranked by studies with rich control variables, then by studies using matched comparison groups, then by studies using natural experiments to avoid selection bias (e.g., studies using sound difference-in-difference, regression discontinuity, and instrumental variable designs), then randomized controlled trials (RCTs) at the top."
Other resources
  • Much of the theoretical research in this area seems to have been generated in the U.K.:

 

You’ll find an embarrassment of riches at Professor Paul Cairney’s website. He is based in Scotland, but his research is applicable across countries. https://paulcairney.wordpress.com/

 

The writing in some of Professor Cairney’s published research can be a bit dense, but the theories and other ideas presented are worth the effort. See the examples below:

 

Cairney, P. (2018). Three habits of successful policy entrepreneurs. Policy & Politics, 46(2), 199-215.

Cairney, P., & Oliver, K. (2017). Evidence-based policymaking is not like evidence-based medicine, so how far should you go to bridge the divide between evidence and policy? Health Research Policy and Systems, 15(35), 35-46.

 

Weible, C. M., & Cairney, P. (2018). Practical lessons from policy theories. Policy & Politics, 46(2), 183-197.

 

  • Here are other examples of academic work, also from Europe:

 

Oliver, K., Lorenc, T., & Innvær, S. (2014). New directions in evidence-based policy research: a critical analysis of the literature. Health Research Policy and Systems, 12(34).

 

Reed, M. S., Bryce, R., & Machen, R. (2018). Pathways to policy impact: A new approach for planning and evidencing research impact. Evidence & Policy, 14(3), 431-458.

 

  • The Policy Surveillance Program at Temple University is the first systematic effort (that we know of) to conduct formal legal mapping for policy surveillance. Its focus is on public health policy. http://lawatlas.org

Contact us to share your resources: info@duddonresearch.org
