2022 Federal Index
Common Evidence Standards / What Works Designations
Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY22 (example: What Works Clearinghouses)?
Score
8
Millennium Challenge Corporation
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- For each investment, MCC’s Economic Analysis Division undertakes a constraints analysis to determine the binding constraints to economic growth in a country. To determine the individual projects in which MCC will invest in a given sector, MCC’s Economic Analysis Division combines root cause analysis with a cost-benefit analysis. The results of these analyses allow MCC to determine which investments will yield the greatest development impact and return on MCC’s investment. Every investment also has its own set of indicators as well as standard agency-wide sector indicators for monitoring during the life cycle of the investment and an evaluation plan for determining the results and impact of a given investment. MCC’s Policy for Monitoring and Evaluation details its evidence-based research and evaluation framework. According to the policy, each completed evaluation requires a summary of findings, now called the Evaluation Brief, to summarize the key components, results, and lessons learned from the evaluation. Evidence from previous MCC programming is considered during the development of new programs. The policy further states: “monitoring and evaluation evidence and processes should be of the highest practical quality. They should be as rigorous as practical and affordable. Evidence and practices should be impartial. The expertise and independence of evaluators and monitoring managers should result in credible evidence. Evaluation methods should be selected that best match the evaluation questions to be answered. Indicators should be limited in number to include the most crucial indicators. Both successes and failures must be reported.”
6.2 Did the agency have a common evidence framework for funding decisions?
- The corporation uses a rigorous evidence framework to make every decision along the investment chain, from country partner eligibility to sector selection to project choices. It uses evidence-based selection criteria, generated by independent, objective third parties, to select countries for grant awards. To be eligible for selection, World-Bank-designated low- and lower-middle-income countries must first be assessed against a collection of twenty independent third-party indicators that objectively measure their policy performance in the areas of economic freedom, investing in people, and ruling justly. An in-depth description of the country selection procedure can be found in the annual report.
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- Millennium Challenge Corporation is a leader in the production of evidence on the results of its international development programs. As a data-driven agency, it invests in evidence-generating activities, such as due diligence surveys, willingness to pay surveys, and independent evaluations. It has more room to lead, however, in the accessibility and usability of its evidence. Since 2013, MCC has shared the data, documentation, and analysis underlying its independent evaluations. In terms of accessibility of evaluation materials, users have noted that MCC’s central evaluation and data repository, the Evaluation Catalog, is hard to navigate.
- Recognizing that transparency is not enough to achieve accountability and learning, MCC developed the MCC Evidence Platform. The Evidence Platform offers first-of-its-kind access to studies and data and encourages the use of MCC’s vast library of evidence. The corporation invites researchers, from students to experienced professionals, to use the data and documentation on the platform to reproduce and build upon MCC’s evidence base and thereby drive development effectiveness for, and beyond, MCC.
- The MCC Evidence Platform shares three types of resources:
- Studies: Users may search by study to find all the related data and documentation associated with each study. Study types include independent evaluations, monitoring, constraints analysis, willingness to pay, due diligence, and country-led studies.
- Documentation: Users may search by specific documentation associated with MCC-funded studies. This documentation is shared as specific knowledge product types, including design reports, baseline reports, interim analysis reports, final analysis reports, MCC learning documents, evaluation-based cost-benefit analyses, and questionnaires.
- Data Packages: Users may search by specific data packages associated with MCC-funded studies. Data package types include round (baseline, interim, or final), public, and restricted access.
- The MCC Evidence Platform encourages the use of MCC’s data, documentation, and analysis as global public goods to support mutual accountability for the agency and its country partners and to encourage learning from measured results. The platform includes information about the level of rigor, research methodology, and population effects for every evaluation.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- As described above, the new MCC Evidence Platform was intentionally designed and launched with utilization as a primary goal. The platform specifically encourages users to take MCC learning and evidence and apply and reproduce it for new learning. The platform also aims to share new learning based on published MCC evidence. As part of this comprehensive approach, Evaluation Briefs continue to be a cornerstone of promoting utilization across audience groups. Enhanced utilization of MCC’s vast evidence base and learning was a key impetus behind the creation and expansion of the Evaluation Briefs and Star Reports. A push to ensure sector-level evidence use has led to a renewed emphasis on the Principles into Practice series, with recent reports on the transport, education, and water and sanitation sectors.
- The corporation has also enhanced its in-country evaluation dissemination events to further results sharing and evidence building, with additional products in local languages and targeted stakeholder learning and dissemination strategies.
Score
10
U.S. Department of Education
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- ED has an agency-wide framework for impact evaluations that is based on ratings of studies’ internal validity. ED evidence-building activities are designed to meet the highest standards of internal validity (typically randomized controlled trials) when causality must be established for policy development or program evaluation purposes. When random assignment is not feasible, rigorous quasi-experiments are conducted. The framework was developed and is maintained by IES’s What Works Clearinghouse (WWC). The standards are maintained on the WWC website, along with a stylized representation of the standards and information about how ED reports findings from research and evaluations that meet them.
- Since 2002, ED—as part of its compliance with the Information Quality Act and OMB guidance—has required that all “research and evaluation information products documenting cause and effect relationships or evidence of effectiveness should meet quality standards that will be developed as part of the What Works Clearinghouse” (see Information Quality Guidelines).
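To make the framework concrete, its core hierarchy can be expressed as a simple decision rule. The sketch below is a simplified illustration, not the WWC’s actual review logic: the published group-design standards weigh additional criteria (for example, outcome measurement and confounding factors), and the function name and inputs are assumptions for illustration.

```python
def wwc_group_design_rating(design: str,
                            low_attrition: bool = False,
                            baseline_equivalence: bool = False) -> str:
    """Simplified sketch of WWC ratings for group-design studies."""
    # A well-executed randomized controlled trial with low sample attrition
    # can earn the highest rating.
    if design == "RCT" and low_attrition:
        return "Meets WWC Standards Without Reservations"
    # High-attrition RCTs and quasi-experimental designs (QEDs) must show
    # baseline equivalence of the analytic groups to earn the middle rating.
    if design in ("RCT", "QED") and baseline_equivalence:
        return "Meets WWC Standards With Reservations"
    return "Does Not Meet WWC Standards"

print(wwc_group_design_rating("RCT", low_attrition=True))
print(wwc_group_design_rating("QED", baseline_equivalence=True))
```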
6.2 Did the agency have a common evidence framework for funding decisions?
- The Department of Education employs the same evidence standards in all discretionary grant competitions that use evidence to direct funds to applicants proposing to implement projects that have evidence of effectiveness and/or to build new evidence through evaluation. Those standards, as outlined in the Education Department General Administrative Regulations (EDGAR), build on ED’s WWC research design standards.
6.3 Did the agency have a clearinghouse(s) or user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- ED’s What Works Clearinghouse (WWC) identifies studies that provide valid and statistically significant evidence of effectiveness of a given practice, product, program, or policy (referred to as “interventions”), and disseminates summary information and reports on the WWC website.
- The WWC has published more than 611 Intervention Reports, which synthesize evidence from multiple studies about the efficacy of specific products, programs, and policies. Wherever possible, Intervention Reports also identify key characteristics of the analytic sample used in the study or studies on which the Reports are based.
- The WWC has published nearly 30 Practice Guides, which synthesize across products, programs, and policies to surface generalizable practices that can transform classroom practice and improve student outcomes.
- Finally, the WWC has completed nearly 12,000 single study reviews. Each is available in a searchable database.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- ED has several technical assistance programs designed to promote the use of evidence-based practices, most notably IES’s Regional Educational Laboratory Program and the Office of Elementary and Secondary Education’s Comprehensive Center Program. Both programs use research on evidence-based practices generated by the What Works Clearinghouse and other ED-funded Research and Development Centers to inform their work. RELs also conduct applied research and offer research-focused training, coaching, and technical support on behalf of their state and local stakeholders. Their work is reflected in the Department’s Strategic Plan.
- Often, those practices are highlighted in WWC Practice Guides, which are based on syntheses (most recently, meta-analyses) of existing research and augmented by the experience of practitioners. These guides are designed to address challenges in classrooms and schools.
- To ensure continuous improvement of the kind of technical assistance work undertaken by the RELs and Comprehensive Centers, ED has invested in both independent evaluation and grant-funded research. Additionally, IES has awarded two grants to study and promote knowledge utilization in education, including the Center for Research Use in Education and the National Center for Research in Policy and Practice. In June 2020, IES released a report on How States and Districts Support Evidence Use in School Improvement, which may be of value to technical assistance providers and SEA and LEA staff in improving the adoption and implementation of evidence-based practice.
- Finally, ED developed revised evidence definitions and related selection criteria for competitive programs that align with ESSA to streamline and clarify provisions for grantees. These revised definitions align with ED’s suggested criteria for states’ implementation of ESSA’s four evidence levels, included in ED’s non-regulatory guidance, Using Evidence to Strengthen Education Investments. ED also developed a fact sheet to support internal and external stakeholders in understanding the revised evidence definitions. This document has been shared with internal and external stakeholders through multiple methods, including the Office of Elementary and Secondary Education ESSA technical assistance page for grantees.
Score
6
U.S. Agency for International Development
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The agency has developed a draft agency-level evidence framework to clarify evidence standards for different decisions, including those related to funding. The draft was published for voluntary comment and is being further updated. Once finalized, it will be published as a reference guide for USAID staff and will continue to be refined and updated as needed.
- The agency’s evidence standards are embedded within its policies and include requirements for the use of evidence in strategic planning, project design, activity design, program monitoring, and evaluation. Its Scientific Research Policy sets out quality standards for research across the agency. Its Program Cycle Policy requires the use of evidence and data to assess the development context, challenges, potential solutions, and opportunities in all of its country strategies. As part of the grant awards process, Grand Challenges, such as the Water and Energy for Food Grand Challenge and the Securing Water for Food Grand Challenge, collaborate with innovators to set ambitious results targets and make eligibility for subsequent funding contingent on demonstrated evidence of hitting those targets. Other programs, such as Development Innovation Ventures (DIV), use evaluation criteria based on evidence of causal impact, cost effectiveness, and pathways to scale and financial sustainability (see grant solicitation DIV Annual Program Statement). As one of USAID’s flagship open innovation programs, DIV helps to find, test, and scale innovative solutions to any global development challenge from anyone, anywhere. By backing proven innovations, supported by rigorous evidence and ongoing monitoring, DIV has improved millions of lives at a fraction of the usual cost. According to research announced in October 2020 and led by Dr. Michael Kremer, the Nobel Prize-winning economist who serves as DIV’s scientific director, a subset of grants from DIV’s early portfolio covering 2010-2012 produced $17 in social benefit for every dollar spent by USAID. Further, in its December 2019 report Evidence-Based Policymaking: Selected Agencies Coordinate Activities, but Could Enhance Collaboration, the Government Accountability Office examined how USAID and other agencies apply leading practices for collaboration when building and assessing evidence.
6.2 Did the agency have a common evidence framework for funding decisions?
- The U.S. Agency for International Development has a draft agency-level evidence framework to clarify evidence definitions, principles, and approaches for different decisions, including those related to funding. The framework has been posted for review and comment by external stakeholders.
- In addition, there are specific types of programs at the sub-agency level that do use evidence frameworks or standards to make funding decisions. For example, DIV uses a tiered funding approach to find, test, and scale evidence-based innovations (see the sketch after this list). Its grants include stage 1 for piloting (up to $200,000), stage 2 for testing and positioning for scale (up to $1,500,000), stage 3 for transitioning to scale (up to $15,000,000), and evidence generation awards (up to $1,500,000) for research to determine the causal impact of interventions that have already scaled. For stage 2 grants in particular, DIV requires evidence of impact that is causal and rigorous: the grantee must either have rigorous evidence of causal impact or conduct a rigorous evaluation of causal impact during the award. These stages are also common across other USAID-sponsored Challenge and Grand Challenge programs, such as the Mujer Prospera Challenge or the Creating Hope in Conflict Humanitarian Grand Challenge.
- Evaluation criteria for DIV funding are based on its three core principles, as further outlined in its annual grant solicitation (DIV Annual Program Statement): (1) evidence of impact, (2) cost effectiveness, and (3) potential for scale and financial sustainability. Expectations vary by stage, but every awardee must report against a set of pre-negotiated key performance indicators.
- In support of Grand Challenges programs, the Exploratory Programs and Innovation Competitions team has developed a sector-agnostic results framework and is developing a cost effectiveness analysis framework to improve the rigor and evidence-based programming for current and future Grand Challenges.
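To make the tiered structure above concrete, the sketch below encodes the stage caps and evidence expectations as data. It is a minimal illustrative model under stated assumptions, not DIV’s actual review process; the type, field names, and screening rule are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DivStage:
    name: str
    cap_usd: int                 # maximum award size for the stage
    needs_causal_evidence: bool  # stage expects rigorous causal evidence
                                 # (or a rigorous evaluation during the award)

# Stage caps as summarized above; the boolean flags are illustrative assumptions.
DIV_STAGES = {
    1: DivStage("Pilot", 200_000, False),
    2: DivStage("Test and position for scale", 1_500_000, True),
    3: DivStage("Transition to scale", 15_000_000, True),
}
EVIDENCE_GENERATION = DivStage("Evidence generation", 1_500_000, True)

def within_stage_limits(stage: DivStage, requested_usd: int,
                        causal_evidence_held_or_planned: bool) -> bool:
    """Illustrative screen: is a request within the cap and evidence expectations?"""
    if requested_usd > stage.cap_usd:
        return False
    return causal_evidence_held_or_planned or not stage.needs_causal_evidence

# Example: a $1M stage 2 request backed by a planned rigorous impact evaluation.
print(within_stage_limits(DIV_STAGES[2], 1_000_000, True))  # True
```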
6.3 Did the agency have a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- An agency-wide repository for development information (including evaluation reports and other studies) is available to the public at the Development Experience Clearinghouse (DEC). In addition, USAID uses the International Initiative for Impact Evaluation (3ie) database of impact evaluations relevant to development topics (including over 4,500 entries to date), knowledge gap maps, and systematic reviews that pull the most rigorous evidence and data from across international development donors. 3ie also houses a collection of institutional policies and reports that examine findings from its database of impact evaluations on overarching policy questions to help policymakers and development practitioners improve development impact through better evidence.
- The Agency Programs and Functions policy designates technical bureaus responsible for being the repository for the latest information in the sectors they oversee; prioritizing evidence needs and taking actions to build evidence; and disseminating that evidence throughout the agency for those sectors. Several USAID bureaus and sectors have created user friendly tools to disseminate information on evidence-based solutions. These include, but are not limited to:
- Climatelinks: A global knowledge portal for climate change and development practitioners;
- Educationlinks: A portal for sharing innovations and lessons learned on implementation of the USAID Education Policy;
- Natural Resources Management and Development Portal;
- Urbanlinks: USAID’s sharing platform for resources on sustainable urban development.
- Finally, USAID recently applied natural language processing text analysis to analyze unstructured data from the previous ten years of evaluation reports published by USAID and identify countries that used specific language and terminology related to racial and ethnic equity. This review included 1,208 evaluation reports and 2,525 final contractor/grantee reports that were available on USAID’s public DEC and converted to machine-readable format. To develop an algorithm to find the most relevant information, the team consulted with experts from across the agency working on inclusive development and diversity, equity, inclusion, and accessibility issues to develop a lexicon of terms that, together with other factors, were tested and found to identify relevant documents.
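The core of such a lexicon-based screen can be sketched in a few lines. The sketch below assumes a small hypothetical lexicon and a directory of plain-text reports; USAID’s actual lexicon, weighting factors, and thresholds were developed with subject matter experts and are not reproduced here.

```python
import re
from pathlib import Path

# Hypothetical starter lexicon; the real one was built with agency experts.
LEXICON = [
    "racial equity",
    "ethnic equity",
    "inclusive development",
    "marginalized",
    "underrepresented",
]

# One case-insensitive, word-boundary pattern per term.
PATTERNS = [re.compile(rf"\b{re.escape(term)}\b", re.IGNORECASE) for term in LEXICON]

def score_document(text: str) -> int:
    """Count total lexicon-term occurrences in one document."""
    return sum(len(pattern.findall(text)) for pattern in PATTERNS)

def find_relevant(corpus_dir: str, threshold: int = 3) -> list[tuple[str, int]]:
    """Return (filename, score) for reports at or above the relevance threshold."""
    results = []
    for path in Path(corpus_dir).glob("*.txt"):
        score = score_document(path.read_text(encoding="utf-8", errors="ignore"))
        if score >= threshold:
            results.append((path.name, score))
    return sorted(results, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    for name, score in find_relevant("evaluation_reports"):
        print(f"{name}: {score} lexicon hits")
```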
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- The agency’s approach to collaborating, learning, and adapting (CLA) helps ensure that evidence from evaluation of USAID programming is shared with and used by staff, partners, and stakeholders in the field. The agency requires a dissemination plan and post-evaluation action plan for each evaluation, and USAID field staff are encouraged to co-create evaluation action plans with key stakeholders based on evaluation evidence. It collects examples through the annual CLA Case Competition, which recognizes implementers, stakeholders, and USAID staff for their work generating and sharing technical evidence and learning from monitoring and evaluation. It is another way that the agency encourages evidence-based practices among its stakeholders.
- The agency also periodically holds large learning events with partners and others in the development community around evidence including, but not limited to, evaluation summits, engagement around the Agency Learning Agenda, and Moving the Needle. These gatherings are designed to build interest in USAID’s evidence, build capacity around applying that evidence and learning, and elicit evidence and learning contributions.
- The agency created and led the Million Lives Collective coalition, with more than thirty partners, which has identified more than 100 social entrepreneurs who each have at least a million customers, in order to share what this successful cohort has learned and better describe how USAID funding can assist more social entrepreneurs to grow successfully and rapidly. This unique learning platform brings donors, funders, governments, and the entrepreneurial community to the table together to learn and iterate on successful approaches.
- Additionally, USAID recently published the Evaluations at USAID dashboard, which provides evidence of evaluation use by missions, as well as opportunities for peer learning.
Score
7
AmeriCorps
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- AmeriCorps uses the same standard scientific research methods and designs for all of its studies and evaluations, following the model used by clearinghouses like the Department of Education’s What Works Clearinghouse, the Department of Labor’s Clearinghouse for Labor Evaluation and Research, and the Department of Health and Human Services’ Home Visiting Evidence of Effectiveness project.
- AmeriCorps’ evidence-building approach utilizes the full range of social science designs and methods. Application of these strategies across all AmeriCorps program models does not neatly align with the three-tiered evidence framework as defined and implemented by the Social Innovation Fund and other federal tiered-evidence initiatives.
6.2 Did the agency have a common evidence framework for funding decisions?
- AmeriCorps has a common evidence framework for funding decisions in the Senior Corps and AmeriCorps State and National programs. This framework, which is articulated in the AmeriCorps State and National program notice of funding, includes the following evidence levels: pre-preliminary, preliminary, moderate, and strong.
- The tiered evidence framework is defined by the research designs that qualify studies for each level and has been embedded in the AmeriCorps State and National (ASN) competitive grant-making process for several years. For context, in FY22, funding appropriated to ASN programs amounted to 54% of AmeriCorps’ enacted operating budget. Furthermore, 64% of FY22 ASN competitively awarded funds were invested in interventions with moderate and strong evidence. Investment in education interventions (50% of the ASN portfolio) with moderate (14%) or strong (56%) evidence amounted to 70% of allocated funds in this focus area.
6.3 Did the agency have a clearinghouse(s) or a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The AmeriCorps Evidence Exchange is a virtual repository of reports and resources intended to help AmeriCorps grantees and other interested stakeholders find information about evidence- and research-based national service programs. Examples of the types of resources available in the Evidence Exchange include research briefs that describe the core components of effective interventions such as those in the areas of education, economic opportunity, and health.
- The Office of Research & Evaluation also creates campaigns and derivative products to distill complex report findings and increase their utility for practitioners (for example, this brief on a study about the health benefits of Senior Corps). It categorizes reports according to their research design so that users can easily search for experimental, quasi-experimental, or nonexperimental studies and studies that qualify for strong, moderate, or preliminary evidence levels.
- The Office of Research & Evaluation awarded a new, multi-year contract to Mathematica in FY22 to support and advance the agency’s knowledge translation and dissemination goals so that multiple stakeholder groups can access the agency’s body of evidence and more effectively address local community needs.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- AmeriCorps launched and promoted its web-based interactive tool for supporting the implementation and replication of evidence-based interventions: Scaling Checklists: Assessing Your Level of Evidence and Readiness (SCALER). This resource was promoted by the Department of Education and AmeriCorps through webinars and in-person trainings. The agency makes a range of resources available to the field to encourage the use of evidence-based practices.
- AmeriCorps maintained its agency-wide approach to promoting the use of evidence-based practices and employed a variety of strategies including evidence briefs, broad-based support to national service organizations, and targeted technical assistance to grantees. First, ORE has created campaigns and derivative products to distill complex report findings and increase their utility for practitioners. Second, AmeriCorps has created user-friendly research briefs that describe the core components of effective interventions in the areas of education, economic opportunity, and health. These briefs are designed to help grantees (and potential grantees) adopt evidence-based approaches. Third, ORE funds a contractor to provide AmeriCorps grantees with evaluation capacity-building support; staff are also available to state commissions to address their evaluation questions and to make resources (e.g., research briefs summarizing effective interventions, online evaluation planning and reporting curricula) available to them and the general public. Fourth, AmeriCorps funds and participates in grantee conferences that include specific sessions on how to incorporate evidence and data into national service programs. Fifth, as part of its State and National FY20 application process, AmeriCorps provided technical assistance to grantees on using evidence-based practices through webinars and calls. AmeriCorps Seniors continues to encourage and support the use of evidence-based programs, as identified by the U.S. Department of Health and Human Services’ (HHS) Administration for Community Living. In FY22 AmeriCorps Seniors began exploring and supporting evidence-based service-to-work models.
Score
8
U.S. Department of Labor
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The Department of Labor’s Clearinghouse for Labor Evaluation and Research (CLEAR) maintains evidence guidelines that describe quality standards for different types of studies. These standards are applied to all independent evaluations, including all third-party evaluations of DOL programs determined eligible for CLEAR’s evidence reviews across different topic areas. Requests for proposals also indicate that these CLEAR standards should be applied to all Chief Evaluation Office evaluations when considering which designs are the most rigorous and appropriate to answer specific research questions.
- In addition, the DOL Evaluation Policy identifies principles and standards for evaluation planning and dissemination. The Department of Labor collaborates with other agencies (the Department of Health and Human Services, the Department of Education’s Institute of Education Sciences, the National Science Foundation, and the Corporation for National and Community Service) to develop technological procedures to link and share reviews across clearinghouses.
6.2 Did the agency have a common evidence framework for funding decisions?
- The department uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices and to review studies to assess the strength of their causal evidence or to do a structured evidence review in a particular topic area or time frame to help inform agencies about what strategies appear promising and where gaps exist.
6.3 Did the agency have a clearinghouse(s) or user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The department’s CLEAR is an online evidence clearinghouse whose goal is to make research on labor topics more broadly accessible to practitioners, policymakers, researchers, and the public so that it can inform their decisions about labor policies and programs. This clearinghouse identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation studies, and causal impact studies. For causal impact studies, it assesses the strength of the design and methodology in studies that look at the effectiveness of particular policies and programs. Its study summaries and icons, found in each topic area, can help users quickly and easily understand what studies found and how much confidence to have in the results. Its search tool allows users to find studies based on target population, including race and other demographic characteristics.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- The Department of Labor promotes the use of evidence-based practices to encourage or implement program evaluation, foundational fact-finding, performance measurement, and policy analysis in a variety of ways. For example, the Chief Evaluation Office provides regular briefings for DOL national and regional staff on interim and final results of studies; trainings, research roundtables, and single briefings from external experts on methodological topics and new labor-related research findings through the Chief Evaluation Office Seminar Series; and a monthly research roundup on a variety of labor-related topics for DOL staff, called Labor Research News. The office’s Data Analytics Unit also offers agencies support to pilot sophisticated analyses of existing internal or external data. For the public, the office provides regular updates as well as a quarterly newsletter called Building the Evidence Base. It supports trainings for workforce agencies and the public on how to access user-friendly results on a topic across thousands of studies in DOL’s clearinghouse, CLEAR. It provides public information on how the department is building evidence by maintaining the DOL Evidence Hub. And it supports the dissemination of evidence-based standards and program evaluations, for example through meta-analyses of career pathways impact evaluations and the user-friendly Career Trajectories and Occupational Transitions Dashboard, or through dissemination, in collaboration with the Employment and Training Administration (ETA), of the office’s Reemployment Services and Eligibility Assessment (RESEA) evidence-building and program implementation study, which helps states apply evaluation findings to improve the RESEA program.
- Department of Labor agencies also support the use of evidence-based practices. For example, the International Labor Affairs Bureau (ILAB) has an evaluation newsletter and maintains a public website sharing evaluation reports. ETA maintains a user-friendly database, Workforce System Strategies, that highlights the use of evidence-based interventions as foundational fact-finding, and a community of practice, the Evaluation and Research Hub, to support replication. Workforce System Strategies is a comprehensive database of more than 1,500 profiles that summarize a wide range of findings from reports, studies, and technical assistance tools to guides that support program administration and improvement. The Evaluation and Research Hub is a community of practice created to support evidence- and evaluation-capacity-building efforts within state workforce development programs. In another effort to promote evidence-based practices, ETA has supported an applied data analytics program offered through the Coleridge Initiative for multiple teams from state workforce agencies.
Score
8
Administration for Children and Families (HHS)
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The Administration for Children and Families has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education (ED) and the National Science Foundation (NSF). The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify for potential grantees and others ACF’s expectations for different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
- While ACF does not have a common evidence framework across all funding decisions, certain programs such as those listed below do use a common evidence framework for funding decisions:
- The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices.
- The Personal Responsibility Education Program (PREP) competitive grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected forty-four models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a clearinghouse(s) or a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions have disseminated and promoted evidence-based interventions by rating the quality of evaluation studies and presenting results in a user-friendly, searchable format. Current evidence reviews include (1) Home Visiting Evidence of Effectiveness, which provides an assessment of the evidence of effectiveness for early childhood home visiting models that serve families with pregnant women and children from birth to kindergarten entry; (2) the Pathways to Work Evidence Clearinghouse, a user-friendly website that reports on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent rigorous evaluations of the projects” and allows users to search for interventions based upon characteristics of the clients served by the intervention; (3) ACF’s Title IV-E Prevention Services Clearinghouse, whose easily accessible and searchable website allows users to find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review; and (4) Child Care & Early Education Research Connections, which promotes high-quality research in child care and early education to support policymaking. Its associated website provides research and data resources for researchers, policymakers, practitioners, and others.
- Additionally, most ACF research and evaluation projects produce and widely disseminate short briefs, tip sheets, or infographics that capture high-level findings from the studies and make information about program services, participants, and implementation more accessible to policymakers, practitioners, and others invested in the outcomes of the research or evaluation. For example, the Pathways for Advancing Careers and Education (PACE) project released a series of nine short briefs to accompany the implementation and early impact reports that were released for each of the nine PACE evaluation sites.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- The Administration for Children and Families Evaluation Policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers, practitioners, and the communities that ACF serves, and that the Office of Planning, Research, and Evaluation (OPRE) and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders by disseminating evidence from ACF-sponsored and other good-quality evaluations. Research contracts initiated by OPRE include a standard clause requiring contractors to develop a dissemination plan during early project planning to identify key takeaways, target audiences, and strategies for most effectively reaching those audiences. OPRE’s dissemination strategy is also supported by a commitment to plain language; OPRE works with its research partners to ensure that evaluation findings and other evidence are clearly communicated. Additionally, it has a robust dissemination function that includes the OPRE website as well as a blog, an e-newsletter, and a social media presence on Facebook, Twitter, Instagram, and LinkedIn.
- The Office of Planning, Research, and Evaluation hosts an annual “Evaluation and Monitoring 101” training to help ACF staff better understand how to design, conduct, and use findings from program evaluation and performance monitoring, ultimately building the capacity of agency staff and program offices to use evaluation research and data analysis to improve agency operations.
- The office biennially hosts two major conferences, the Research and Evaluation Conference on Self-Sufficiency and the National Research Conference on Early Childhood to share research findings with researchers and with program administrators and policymakers at all levels. It also convenes the Network of Infant and Toddler Researchers, which brings together applied researchers with policymakers and technical assistance providers to encourage research-informed practice and practice-informed research, and the Child Care and Early Education Policy Research Consortium, which brings together researchers, policymakers, and practitioners to discuss what is being learned from research that can help inform policy decisions for ACF, states, territories, localities, and grantees and to consider the next steps in early care and education research.
- The Children’s Bureau sponsors the recurring National Child Welfare Evaluation Summit to bring together partners from child welfare systems and the research community to strengthen the use of data and evaluation in child welfare; disseminate information about effective and promising prevention and child welfare services, programs, and policies; and promote the use of data and evaluation to support sound decision-making and improved practice in state and local child welfare systems.
- The Administration for Children and Families also sponsors additional resources:
- research centers that advance research and translate findings to inform practice, including the Tribal Early Childhood Research Center, the Center for Research on Hispanic Children & Families, and the African American Child and Family Research Center; and
- resource websites, including the Child Welfare Information Gateway, Child Welfare Capacity Building Collaborative, and a forthcoming website for Healthy Marriage and Responsible Fatherhood grantees to support grantee access to program-relevant research and evidence.
Score
5
Substance Abuse and Mental Health Services Administration
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The SAMHSA Strategic Plan FY2019-FY2023 (pp. 20-23) outlines five priority areas to carry out the vision and mission of SAMHSA, including priority 4: improving data collection, analysis, dissemination, and program and policy evaluation. This priority includes three objectives: (1) to develop consistent data collection strategies to identify and track mental health and substance use needs across the nation; (2) to ensure that all SAMHSA programs are evaluated in a robust, timely, and high-quality manner; and (3) to promote access to and use of the nation’s substance use and mental health data and conduct program and policy evaluations and use the results to advance the adoption of evidence-based policies, programs, and practices.
- SAMHSA has informally incorporated qualitative data into its framework through the feedback received by the project officers and through annual narrative reports submitted by grantees. It is in regular communication with grantees and the state/community programs regarding opportunities and challenges. It is beginning to develop a more formal process in FY22 for incorporating qualitative feedback into its evaluation process.
6.2 Did the agency have a common evidence framework for funding decisions?
- Universal language about using EBPs is included in SAMHSA’s funding opportunity announcements (FOAs), also known as notices of funding opportunity (NOFOs). This language includes acknowledgement that “EBPs have not been developed for all populations and/or service settings,” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population.
- Specifically, the language states that applicants should:
(1) document that the EBPs chosen are appropriate for intended outcomes;
(2) explain how the practice meets SAMHSA’s goals for the grant program;
(3) describe any modifications or adaptations needed for the practice to meet the goals of the project;
(4) explain why the EBP was selected;
(5) justify the use of multiple EBPs if applicable; and
(6) discuss training needs or plans to ensure successful implementation.
Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
- SAMHSA developed a manual, Developing a Competitive SAMHSA Grant Application, that explains the information applicants will likely need for each section of the grant application. It has two sections devoted to evidence-based practices (p. 8, p. 26), covering (1) a description of the EBPs applicants plan to implement, (2) specific information about any modifications applicants plan to make to the EBPs and a justification for making them, and (3) how applicants plan to monitor the implementation of the EBPs. In addition, if applicants plan to implement services or practices that are not evidence based, they must show that these services/practices are effective.
6.3 Did the agency have a clearinghouse(s) or user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The Evidence-Based Practices Resource Center (EBPRC) provides communities, clinicians, policymakers, and others with the information and tools to incorporate evidence-based practices into their communities or clinical settings. It contains a collection of scientifically based resources for a broad range of audiences, including treatment improvement protocols, toolkits, resource guides, clinical practice guidelines, and other science-based resources. The retooled EBPRC neither accepts open submissions from outside program developers nor rates individual programs.
- Because SAMHSA recognizes that one size does not fit all, grantees are encouraged to consider the EBPs listed on the SAMHSA EBPRC website but must provide information on the EBPs they plan to implement. Their description should reference why each EBP is appropriate for the problem area addressed by the grant as well as the specific population(s) of focus. Grantees are also asked for specific information about any modifications they plan to make to the EBPs and a justification for making them, as well as how they will monitor implementation to ensure that the EBPs are implemented according to EBP guidelines.
- Recognizing that communities currently face unprecedented challenges with access to mental health and substance use services, as well as behavioral health workforce challenges, SAMHSA strives to be flexible, understanding that if notices of funding opportunity are too prescriptive, it risks losing applicants without the capacity to implement certain EBPs. One innovation designed to encourage more diverse and historically marginalized populations to apply for grants is SAMHSA’s DIPS model, which helps connect community-based organizations to funders and advocates at all levels (e.g., federal and philanthropic organizations, stakeholders, and communities).
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- SAMHSA’s EBPRC aims to provide communities, clinicians, policymakers, and others in the field with the information and tools they need to incorporate EBPs into their communities or clinical settings. It contains a collection of science-based resources, including treatment improvement protocols, toolkits, resource guides, and clinical practice guidelines for a broad range of audiences. As of June 2022, it includes 155 items, including 15 data reports, 23 toolkits, 6 fact sheets, and 91 practice guides.
- The Mental Health Technology Transfer Center (MHTTC) Network engages with organizations and treatment practitioners involved in the delivery of mental health services to strengthen their capacity to deliver effective evidence-based practices to individuals, including the full continuum of services spanning mental illness prevention, treatment, and recovery support. Under the State Targeted Technical Assistance grant, the Opioid Response Network was created to support efforts to address opioid use disorder prevention, treatment, and recovery, and to provide education and training at the local level in evidence-based practices.
- The Knowledge Application Program supports the professional development of behavioral health workers and provides information and resources on best practices. Specifically, this program provides substance use treatment professionals with publications that contain information on best treatment practices.
- The Substance Abuse and Mental Health Services Administration (SAMHSA) promotes the utilization of evidence-based practices. Within grant applications, it encourages innovation. For example, the FY20-21 Substance Use Prevention and Treatment Block Grant Application includes the following language: “There is increased interest in having a better understanding of the evidence that supports the delivery of medical and specialty care including mental/substance use disorder services. Over the past several years, SAMHSA has collaborated with CMS, HRSA, SMAs, state mental/substance use disorder authorities, legislators, and others regarding the evidence of various mental and substance misuse prevention, treatment, and recovery support services.”
- States and other purchasers are requesting information on evidence-based practices or other procedures that result in better health outcomes for individuals and the general population. While the emphasis on evidence-based practices will continue, there is a need to develop and create new interventions and technologies and, in turn, to establish the evidence. The Substance Abuse and Mental Health Services Administration supports states’ use of the block grants for this purpose. The National Quality Forum and the Institute of Medicine recommend that evidence play a critical role in designing health benefits for individuals enrolled in commercial insurance, Medicaid, and Medicare. To respond to these inquiries and recommendations, SAMHSA has undertaken several activities. Its EBPRC assesses the research evaluating an intervention’s impact on outcomes and provides information on forty-three resources to facilitate the effective dissemination and implementation of the program. The EBPRC provides the information and tools needed to incorporate evidence-based practices into communities or clinical settings.
Score
5
U.S. Dept. of Housing & Urban Development
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The Department of Housing and Urban Development’s Program Evaluation Policy defines standards that prioritize rigorous methods for research and evaluation, covering impact evaluations, implementation and process evaluations, descriptive studies, outcome evaluations, formative evaluations, and both qualitative and quantitative approaches. It also provides for timely dissemination of such evidence to stakeholders. The department updated the Program Evaluation Policy in August 2021 to address the rigorous inclusion of qualitative evidence and considerations related to equity.
6.2 Did the agency have a common evidence framework for funding decisions?
- The Department of Housing and Urban Development does not have a common evidence framework for funding decisions. It seeks to employ tiered evidence in funding decisions by embedding implementation and impact evaluations in funding requests for program initiatives, including major program demonstrations that employ random assignment methods. These include the Moving To Work Expansion Demonstration, the Rental Assistance Demonstration, the Rent Reform Demonstration, the Family Self-Sufficiency Demonstration, the Housing Counseling Demonstration, and the Family Options Demonstration, described above. Such trials provide robust evidence to inform scale-up funding decisions.
- The department extended its standardized data collection and reporting framework, Standards for Success, to additional discretionary grant programs in FY19. The framework consists of a repository of data elements that participating programs use in grant reporting, creating common definitions and measures across programs for greater analysis and coordination of services.
6.3 Did the agency have a clearinghouse(s) or a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The Department of Housing and Urban Development provides resources and assistance to support community partners in evidence-based practice through the HUD Exchange web portal and through Community Compass technical assistance. The Office of Policy Development and Research (PD&R) provides the public, policymakers, and practitioners with evidence of what works through the Regulatory Barriers Clearinghouse and HUD USER, which is a portal and web store for program evaluations, case studies, and policy analysis and research. The evaluations of major programs and demonstrations provide rigorous evidence about effect sizes and variations in effects between key subgroups. Research available on HUD USER supports greater equity in housing and community development policy, including HUD’s foundational research to measure the extent of housing discrimination and experimental demonstrations and other studies assessing how best to increase economic opportunity for disadvantaged and underserved populations. Additionally, HUD USER contains pages dedicated to HUD’s most important research areas, including research on family homelessness, deregulation in public housing, and supportive services for older adults.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- The Department of Housing and Urban Development provides resources and assistance to support community partners in evidence-based practice through the HUD Exchange web portal and through technical assistance. The Office of PD&R provides the public, policymakers, and practitioners with evidence of what works primarily through HUD USER, a portal and web store for program evaluations, case studies, and policy analysis and research; the Regulatory Barriers Clearinghouse; and through initiatives such as Sustainable Construction Methods in Indian Country and the Consumer’s Guide to Energy-Efficient and Healthy Homes. This content is designed to provide current policy information, elevate effective practices, and synthesize data and other evidence in accessible formats such as Evidence Matters. Through these resources, researchers and practitioners can see the full breadth of work on a given topic (e.g., rigorous established evidence, case studies of what has worked in the field, and new innovations currently being explored) to inform their work.
- The Office of PD&R has increased emphasis on generating interim reports during long-term impact evaluations. Such interim reports provide practitioners with early findings about implementation practice and outcomes that can inform their own program designs. An example is the Interim Report from HUD’s Supportive Services Demonstration, published in 2020.
- Community Compass technical assistance for urban, rural, and tribal partners is designed to facilitate understanding of community and housing development issues in a way that cuts across program silos. It supports them in evaluation, evidence building, integrating knowledge management principles, and sharing practices.
Score
5
Administration for Community Living (HHS)
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The Administration for Community Living defines evidence-based programs on its website. Its National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR) has two principal frameworks that guide and inform the generation of new knowledge and products. The stages of research framework is used to guide, inform, and track the creation of new knowledge that in turn becomes part of the larger disability evidence base. The stages of development framework is used to guide, inform, and track the development of new products and technologies. Both frameworks are codified in federal regulations (the stages of research framework at 45 CFR 1330.4 and the stages of development framework at 45 CFR 1330.5) and are described on the NIDILRR Frameworks page of the ACL website.
6.2 Did the agency have a common evidence framework for funding decisions?
- The Older Americans Act requires the use of evidence-based programming in Title III-D-funded activities: Disease Prevention and Health Promotion Services. In response, ACL developed a definition of evidence-based programs and created a website containing links to a range of resources for evidence-based programs. This is a common evidence framework used for activities funded by the Older Americans Act. For programs that are not legislatively required to use evidence-based models, ACL, through its funding process, requires all programs to provide clear justification and evidence (where available) that proposed projects will achieve their stated outcomes. In 2018 ACL developed a tool to help a small number of program officers assess grantee progress toward the stated goals of their grants. Using the tool, program officers have instituted corrective actions or required underperforming grantees to relinquish grant funds. The agency is developing similar tools for several other grant programs with the intention of rolling out new guidance for program officers in 2023.
6.3 Did the agency have a clearinghouse(s) or a user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The Administration for Community Living does not have a common evidence repository that applies across the entire agency. It publishes intervention summaries of aging and disability evidence-based programs and practices. It funds the Evidence-Based Program Review Council to identify new community programs that meet the criteria established by the Administration for Community Living/Administration on Aging for evidence-based programs funded through the OAA Title III-D. The Model Systems Knowledge Translation Center has worked with NIDILRR’s model systems grantees to develop and publish a variety of evidence-based factsheets about living with spinal cord injury, traumatic brain injury, or burn injury. The ACL Living Well demonstration program requires grantees to use evidence-based and innovative strategies to (1) improve access to and quality of community services; (2) reduce and mitigate abuse and neglect; and (3) support empowerment, self-determination, and self-advocacy.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- To drive improvements in outcomes for older adults and individuals with disabilities, ACL works through its resource centers to help grantees use evidence. For example, with funding from ACL, the National Council on Aging, in collaboration with the Evidence-Based Leadership Council, led an innovative vetting process to increase the number of programs available to ACL’s aging network that meet the Title III-D evidence-based criteria. This process resulted in adding six new health promotion programs and three new falls prevention programs. The Alzheimer’s Disease Supportive Services Program funds competitive grants to expand the availability of evidence-based services that support persons with Alzheimer’s disease and related dementias and their family caregivers. Extensive evaluation of the National Chronic Disease Self-Management Education and Falls Prevention database helped generate important insights for potential new ACL applicants in preparing their applications using data-driven estimation procedures for participant and completion targets. In addition, ACL funded several grants, such as the Lifespan Respite Care Program: State Program Enhancement Grants and the Disability and Rehabilitation Research Projects Program: Chronic Disease Management for People with Traumatic Brain Injury, which are designed in part to develop an evidence base for respite care and related services and to help people with traumatic brain injury and their health care providers use effective chronic disease management practices. Moreover, NIDILRR provides the Rehabilitation Measures Database, an online knowledge translation resource that offers succinct, evidence-based summaries of instruments relevant to rehabilitation populations, including concise descriptions of each instrument’s psychometric properties (reliability, validity, and sensitivity), instructions for administering and scoring, and a representative bibliography with citations.