2022 Federal Index


Use of Evidence in Noncompetitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY22 (examples: evidence-based funding set-asides, requirements to invest funds in evidence-based activities, and pay for success provisions)?

Score
10
Millennium Challenge Corporation
  • MCC does not administer non-competitive grant programs (relative score for criteria #8 applied).
Score
7
U.S. Department of Education
9.1 What were the agency’s five largest non-competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five non-competitive grant programs? (e.g., Are evidence-based interventions/practices required or suggested? Is evidence a significant requirement?)
  • Section 1003 of ESSA requires states to set aside at least 7% of their Title I, Part A funds for a range of activities to help school districts improve low-performing schools. School districts and individual schools are required to create action plans that include “evidence-based” interventions that demonstrate strong, moderate, or promising levels of evidence.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • ESEA requires a National Assessment of Title I (Improving the Academic Achievement of the Disadvantaged). In addition, Title I grants require state education agencies to report on school performance, including for those schools identified for comprehensive or targeted support and improvement.
  • Federal law (ESEA) requires states receiving funds from 21st Century Community Learning Centers to “evaluate the effectiveness of programs and activities” that are carried out with federal funds (section 4203(a)(14)), and it requires local recipients of those funds to conduct periodic evaluations in conjunction with the state evaluation (section 4205(b)).
  • The Office of Special Education Programs (OSEP), which implements IDEA grants to states, administers an accountability system that puts greater emphasis on results through the use of Results Driven Accountability.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Section 4108 of ESEA authorizes school districts to invest “safe and healthy students” funds in Pay for Success initiatives. Section 1424 of ESEA authorizes school districts to invest their Title I, Part D funds (Prevention and Intervention Programs for Children and Youth Who Are Neglected, Delinquent, or At-Risk) in Pay for Success initiatives; under section 1415 of the same program, a state agency may use funds for Pay for Success initiatives.
9.5 What are the agency’s 1-2 strongest examples of how non-competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • States and school districts are implementing the requirements in Title I of the ESEA regarding the use of evidence-based interventions in school improvement plans. Some states are providing training or practice guides to help schools and districts identify evidence-based practices.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • In 2016, ED released non-regulatory guidance to provide state educational agencies, local educational agencies (LEAs), schools, educators, and partner organizations with information to assist them in selecting and using “evidence-based” activities, strategies, and interventions, as defined by ESSA, including carrying out evaluations to “examine and reflect” on how interventions are working. However, the guidance does not specify that federal non-competitive funds can be used to conduct such evaluations.
Score
7
U.S. Agency for International Development
  • USAID does not administer noncompetitive grant programs (relative score for criteria #8 applied).
Score
3
AmeriCorps
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY22, the five largest noncompetitive grant programs were:
    1. AmeriCorps state formula grants program: $156,000,000; eligible grantees: states;
    2. AmeriCorps NCCC: $34,500,000; eligible grantees: nonprofit organizations;
    3. AmeriCorps VISTA: $100,000,000; eligible grantees: nonprofit organizations, state, tribal, and local governments, institutions of higher education;
    4. Senior Corps Foster Grandparents: $122,000,000; eligible grantees: nonprofit organization, local governments;
    5. Senior Corps Senior Companion Program: $54,000,000; eligible grantees: nonprofit organizations, local governments.
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • The AmeriCorps VISTA, NCCC, and Seniors programs (Foster Grandparents, Senior Companions, and RSVP) are distinguished from the AmeriCorps State and National (ASN) model in three important ways. First, with the exception of RSVP (6% of the agency’s FY22 enacted operating budget), these are not competitive grant programs as defined in their authorizing legislation. Second, these programs have the member and volunteer experiences and outcomes as a primary focus. Third, the AmeriCorps VISTA and NCCC programs are more directly managed by AmeriCorps staff.
  • Because the programmatic emphasis of these AmeriCorps programs is human development across the life span, with participation in national service as the intervention for influencing positive developmental outcomes for members and volunteers, AmeriCorps’s approach to building evidence for these programs differs from a tiered evidence framework. AmeriCorps’s Office of Research and Evaluation (ORE) has a robust research agenda for building evidence for these programs. Broadly speaking, the research strategy relies primarily on survey research conducted by the U.S. Census Bureau, AmeriCorps, or other organizations (e.g., the Urban Institute). This approach relies on research designs that capture data over time, mapped to time spent participating in national service and to follow-up periods. When feasible, AmeriCorps evaluations have used comparison groups of similar volunteers to determine the relative influence of AmeriCorps versus volunteering in general.
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)
  • AmeriCorps cites studies that have contributed to the country’s and the agency’s body of evidence for national service and volunteering.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • AmeriCorps administers only five noncompetitive grant programs, as described above.
9.5 What are the agency’s 1-2 strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • AmeriCorps does not prohibit the use of formula dollars for evaluation, but each state commission may have its own guidelines. Further, formula grantees awarded more than $500,000 must perform evaluations using their grant funds.
Score
7
U.S. Department of Labor
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY22, the five largest noncompetitive grant programs at DOL were administered by the Employment and Training Administration (ETA):
    1. Unemployment Insurance State Administration: $2,591,816,000; eligible grantees: city, county, and/or state governments.
    2. Dislocated Worker Employment and Training formula grants: $1,075,553,000; eligible grantees: city, county, and/or state governments.
    3. Youth Activities: $933,130,000; eligible grantees: city, county, and/or state governments.
    4. Adult Employment and Training Activities: $870,649,000; eligible grantees: city, county, and/or state governments.
    5. Employment Security Grants to States: $675,052,000; eligible grantees: city, county, and/or state governments.
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • A signature feature of WIOA (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
  • Evidence and performance provisions of WIOA (1) increased the amount of WIOA funds states can set aside and distribute directly from 5-10% to 15% and authorized them to invest these funds in pay for performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in pay for performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in pay for performance initiatives; and (4) authorized states and local workforce investment boards to award pay for performance contracts to intermediaries, community-based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • Section 116(e) of WIOA and CFR 668.220 describe how the state, in coordination with local workforce boards and the state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these state programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level performance within, and high-level outcomes from, the workforce development system.
  • The Employment and Training Administration sponsors WorkforceGPS, a community point of access that supports workforce development professionals in their use of evaluations to improve state and local workforce systems. Professionals and leaders can access a variety of resources and tools, including an Evaluation Peer Learning Cohort to help them improve their research and evaluation capacities. WorkforceGPS includes links to resources on evaluation readiness assessment, evaluation design, and performance data, all focused on improving the public workforce system. As of FY21, eighteen state teams, consisting of the WIOA core partners, had voluntarily participated in the Evaluation Peer Learning Cohort technical assistance activities to gauge and build capacity for research and evaluation.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
  • Reemployment Services and Eligibility Assessments (RESEA) funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130,000,000 to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating and a demonstrated capacity to improve outcomes for participants. This percentage increases in subsequent years until after FY26, when states must use no less than 50% of such grant funds for such interventions (see the sketch below). Training and Employment Guidance Letter No. 05-21 and Unemployment Insurance Program Letter No. 10-22 describe the evaluation and evidence-based expectations for FY22 through FY27. These expectations stipulate the use of existing evidence, the building of future evidence, and the use of funding for interventions with high or moderate causal ratings.
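  • As a rough illustration of this phase-in, the following minimal sketch computes the minimum evidence-based share of a grant for a given fiscal year. It is grounded only in the two anchor points stated above (at least 25% beginning in FY23 and at least 50% after FY26); the function name, the flat 25% floor shown for FY24-FY26, and the sample grant amount are illustrative assumptions, since the interim-year increases are set by the guidance letters rather than reproduced here.

```python
# Illustrative sketch of the RESEA evidence-based funding phase-in.
# Grounded points from the text: at least 25% beginning in FY23 and at
# least 50% after FY26. The interim-year increases are defined in DOL
# guidance (TEGL No. 05-21, UIPL No. 10-22) and are not reproduced here,
# so FY24-FY26 conservatively return the 25% floor (a lower bound).

def min_evidence_based_share(fiscal_year: int) -> float:
    """Minimum share of RESEA grant funds that must fund interventions
    with a high or moderate causal evidence rating (lower bound)."""
    if fiscal_year < 2023:
        return 0.0   # the percentage requirement begins in FY23
    if fiscal_year <= 2026:
        return 0.25  # floor; actual interim floors rise year over year
    return 0.50      # after FY26

# Hypothetical example: a state with a $4,000,000 RESEA grant in FY27
# must direct at least $2,000,000 to evidence-rated interventions.
grant = 4_000_000
print(f"FY27 minimum: ${grant * min_evidence_based_share(2027):,.0f}")
```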
9.5 What are the agency’s one or two strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Institutional Analysis of American Job Centers: The goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs) and to identify variations in service delivery, organizational structure, and administration across AJCs.
  • Career Pathways Descriptive and Analytical Study: The purpose of this project was to build evidence about the implementation and effectiveness of career pathways approaches to education and training. In 2018 the Chief Evaluation Office partnered with the ETA to conduct the Career Pathways Descriptive and Analytical project, which included a portfolio of three studies: a meta-analysis of the impacts of career pathways program approaches, a longitudinal career trajectories and occupational transitions study, and an exploratory machine learning study. Researchers used data from four large nationally representative longitudinal surveys, as well as licensed data on occupational transitions from online career profiles, to examine workers’ career paths and wages. One of the final products was an interactive Career Trajectories and Occupational Transitions Dashboard.
  • Analysis of Employer Performance Measurement Approaches: The goal of the study was to examine the appropriateness, reliability, and validity of proposed measures of effectiveness in serving employers, as required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and the dissemination of findings through a report, as well as research and topical briefs. Although the authors did not find an overwhelming case for adopting either one measure or several measures, adopting more than one measure offers the advantage of capturing more aspects of performance and may make results more actionable for the different Title I, II, III, and IV programs. Alternatively, a single measure has the advantage of clarity on how state performance is assessed and fewer resources devoted to record keeping.
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity building efforts?
  • The Employment and Training Administration’s RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. The administration released specific evaluation guidance to help states understand how to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
  • Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WIOA and CFR 668.220 require states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and require states to evaluate the effectiveness of their WIOA programs in an annual progress report that includes updates on (1) current or planned evaluation and related research projects, including the methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports, with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely visits for federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the necessary performance monitoring and evaluations. States are also required to describe their approach to conducting evaluations in state plans submitted to ETA and partner agencies.
Score
7
Administration for Children and Families (HHS)
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • The Family First Prevention Services Act (FFPSA; Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, enables states to use federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. The act requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds. Attached to the FY20 Appropriations Act was the Family First Transition Act (P.L. 116-94), which provided grantees with additional time and resources to implement the requirements of FFPSA.
  • Most of ACF’s noncompetitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds.
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence? (e.g., requiring grantees to participate in evaluations)?
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF: The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.” Since FY17, Congress has designated 0.33% of the TANF Block Grant for related research, evaluation, and technical assistance. The Administration for Children and Families has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. It conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. Recent and ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search assistance, and employment coaching. These are all program approaches used by state and county TANF grantees to meet their employment goals. The administration widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • The TANF Data Innovation (TDI) project, launched by ACF in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states. It is now supporting a TANF agency pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care and Development Block Grant Program: While the Child Care and Development Block Grant (CCDBG) Act does not allocate funding for states to independently build evidence, the act allows up to 0.5% of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and demonstration activities and to conduct periodic, external, independent evaluations of the Child Care and Development Fund (CCDF) program with respect to increasing access to child care services and improving the quality and safety of child care services. Health and Human Services must then disseminate the key findings of these evaluations widely and on a timely basis. In recent years, appropriations acts have also authorized the use of up to 0.5% of child care entitlement funds for this purpose. ACF manages this ongoing research portfolio to build evidence for the CCDBG program, conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where they are facing challenges, innovating, or carrying out demonstration projects. Major ongoing and recent projects include the National Survey of Early Care and Education; an assessment of evidence on ratings in quality rating and improvement systems; and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: The Administration for Children and Families administers several foster care and related child welfare grant programs that do not provide funding authority for states to conduct independent evidence-building activities. Some of these programs have set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the allocation for the John H. Chafee Foster Care Program for Successful Transition to Adulthood for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research and evaluation portfolio on the Title IV-E foster care grant program and related grant programs. It conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for preventing maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. For instance, the Supporting Evidence Building in Child Welfare project is intended to increase the number of evidence-supported interventions grantees can use to serve the child welfare population. Other child welfare research and evaluation efforts include the National Survey of Child and Adolescent Well-Being; Building Capacity to Evaluate Child Welfare Community Collaborations to Strengthen and Preserve Families; Building Capacity to Evaluate Interventions for YARH; and the Child Welfare Study to Enhance Equity with Data.
  • The Administration for Children and Families has begun work on conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous, random assignment evaluation of four programs funded under the Chafee program completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded digital marketing grants to test digital marketing approaches and partnerships to reach parents who could benefit from child support services and create or improve two-way digital communication and engagement with parents.
  • The ACF child support enforcement research portfolio is multifaceted. A variety of research and evaluation components are administered to understand more about cost and program effectiveness. Research and evaluation within the portfolio have consisted of (1) supporting large multi-state demonstrations that include random assignment evaluations (described in criteria question 7.4), (2) funding a supplement to the Census Bureau’s Current Population Survey, and (3) supporting the research activities of other government programs and agencies by conducting matches of their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
  • States applying for funding from ACF’s CBCAP grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: emerging and evidence-informed; promising; supported; and well supported. Programs determined to fall within specific program parameters are considered evidence-informed or evidence-based practices, as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed toward evidence-based and evidence-informed practices. Similarly, the State Personal Responsibility Education Program (State PREP) awards grants to state agencies to educate young people on both abstinence and contraception to prevent pregnancy and sexually transmitted infections. These State PREP projects replicate effective evidence-based program models or substantially incorporate elements of effective programs.
9.5 What are the agency’s 1-2 strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Section 413 of the Social Security Act, which gives HHS primary responsibility for building evidence about the TANF program, also directs HHS to develop “a database (which shall be referred to as the ‘What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work’) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects.” In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to the questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use, such as subsidized employment, coaching, career pathways, and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families project and the Next Generation of Enhanced Employment Strategies project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to child care and development block grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA), which promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect. This includes efforts to improve the evaluation capacity of the states and communities to assess progress of their programs and collaborative networks in enhancing the safety and wellbeing of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and reporting and evaluation costs for establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7% of their CCDF awards (“quality funds”) in years 1-2 after the 2014 CCDBG enactment, 8% in years 3-4, and 9% in year 5 and beyond on activities to improve the quality of child care services provided in the state, including the following (a worked sketch of this schedule appears after the list):
    • 1B: supporting the training and professional development of the child care workforce through . . . incorporating the effective use of data to guide program improvement;
    • 3: developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C);
    • 7: evaluating and assessing the quality and effectiveness of child care programs and services offered in the state, including evaluating how such programs positively impact children.
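  • As a worked illustration of the quality set-aside schedule above, here is a minimal sketch that computes a state’s minimum quality spend from its CCDF award and the number of years since the 2014 enactment; the helper name and the sample award are hypothetical.

```python
# Worked example of the CCDBG quality set-aside schedule described above:
# at least 7% of the CCDF award in years 1-2 after the 2014 enactment,
# 8% in years 3-4, and 9% in year 5 and beyond. Helper name and sample
# inputs are illustrative, not drawn from the statute.

def min_quality_spend(award: float, years_since_enactment: int) -> float:
    if years_since_enactment <= 2:
        rate = 0.07
    elif years_since_enactment <= 4:
        rate = 0.08
    else:
        rate = 0.09
    return award * rate

# A state with a $100,000,000 CCDF award in year 5 must spend at least
# $9,000,000 of it on quality-improvement activities.
print(f"${min_quality_spend(100_000_000, 5):,.0f}")  # -> $9,000,000
```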
  • The Administration for Children and Families requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. It released a program instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
  • It also provides evaluation technical assistance for grantees.
Score
7
Substance Abuse and Mental Health Services Administration
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • Allocation of community Mental Health Block Grant (MHBG) funds is based on a congressionally mandated formula. States are required to use at least 10% of their MHBG funds to support evidence-based practices (EBPs) that address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset. States are also required to use 5% of their total allocation of MHBG funds to support and provide crisis services. SAMHSA also encourages states to use MHBG funds to implement evidence-based practices for both adults and children.
  • For the first-episode psychosis 10% set-aside in the MHBG, states are directed to use EBPs. SAMHSA provides guidance to states based on evidence; its recommendation is to develop a state first-episode psychosis program based on the coordinated specialty care (CSC) model, as evaluated by the National Institute of Mental Health. For example, the first-episode psychosis program OnTrackNY is an evaluated model that is recommended based on evidence of success.
  • Similarly, the Substance Abuse Prevention and Treatment Block Grant (SABG) is a formula grant program that provides states flexibility to identify and deliver substance use-related services to meet their state-specific needs while also ensuring attention to critical prevention-focused public health issues. The authorizing legislation and implementing regulations for the SABG program include a maintenance-of-effort requirement and specific funding set-asides, including a 20% set-aside for primary prevention and a 5% set-aside for early intervention services for HIV for designated states. Through the SABG, states should “identify, implement, and evaluate evidence-based programs, practices, and policies that have the ability to reduce substance use and improve health and well-being in all communities.”
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • Information on how to use funds for data collection and evaluation is covered in the block grant application. Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process. In FY21, SAMHSA updated its application manual to include a section on developing goals and measurable objectives (see p. 38). Specifically, the document states, “To be able to effectively evaluate your project, it is critical that you develop realistic goals and measurable objectives. This chapter will provide information on developing goals and measurable objectives. It will also provide examples of well-written goals and measurable objectives.”
  • Grantees in noncompetitive grant programs are required to submit quantitative data to SAMHSA using reporting systems associated with their grant. For example, state mental health agencies receive noncompetitive grants and compile and report annual data collected from SAMHSA’s Community MHBG. More information on the Uniform Reporting System can be found online. In this way, noncompetitive grant programs not only allow the sharing of data for research and evaluation, but also allow grantees to explore data from other state grantees.
  • In the FY20-21 Block Grant Application, SAMHSA asks states to base their administrative operations and service delivery on principles of continuous quality improvement/total quality management (CQI/TQM). These processes should identify and track critical outcomes and seventy-two performance measures, based on valid and reliable data, consistent with the National Behavioral Health Quality Framework, that describe the health and functioning of the mental health and addiction systems. The CQI processes should continuously measure the effectiveness of services and supports and ensure that they continue to reflect this evidence of effectiveness. The state’s CQI process should also track programmatic improvements using stakeholder input, including from the general population and from individuals in treatment and recovery and their families. In addition, the CQI plan should include a description of the process for responding to emergencies, critical incidents, complaints, and grievances.
  • In FY22, SAMHSA focused its resources on an examination of evidence and the collection of data related to both the SABG and the MHBG.
  • For SABG, SAMHSA’s Center for Substance Abuse Treatment is engaging in a multipronged approach to evaluate the program, guided by the December 2020 U.S. Government Accountability Office report Substance Use Disorder: Reliable Data Needed for Substance Abuse Prevention and Treatment Block Grant Program. Based on the Accountability Office’s recommendations, SAMHSA has initiated an assessment of the quality of grantees’ self-reported data. This includes conducting quantitative and qualitative analysis to understand reliability issues associated with grantees’ self-reported data and barriers to data collection and to identify potential alternative data sources and methodological approaches to address data gaps. This effort is expected to result in a set of recommendations in August 2022 for implementing changes to the SABG program’s data collection efforts to improve the consistency and relevance of the data collected.
  • For MHBG, SAMHSA’s Center for Mental Health Services has organized a series of state panel discussions to examine specific aspects of its data collection. The first panel was held in April 2022 to examine the burden and utility of each national outcome measure collected. The session was titled Mental Health Block Grant Uniform Reporting System—Gaps, Challenges, Strengths, and Opportunities. A survey was provided to states and the results shared with the expert panel during a three-hour discussion with conclusions and recommendations designed to increase the data quality and utility of the Block Grant program. Additional panel discussions are scheduled for the summer and fall of 2022 with the results used to inform future block grant funding.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Nearly all SAMHSA grants are competitively awarded. SAMHSA has only four noncompetitive grants, which are included above.
9.5 What are the agency’s one or two strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In 2016, SAMHSA, in partnership with the National Institute of Mental Health and the Office of the Assistant Secretary for Planning and Evaluation, initiated a three-year evaluation study of the CSC model programs funded through the MHBG 10% set-aside to ascertain the effectiveness of these programs. The study results, covering services provided by thirty-six diverse programs, indicated that the evidence-based CSC programs led to statistically significant improvements in the health and well-being of individuals who participate in them, including reductions in hospitalization (-79%), emergency room visits (-71%), criminal justice involvement (-41%), suicide attempts (-66%), and homelessness (-35%).
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity building efforts?
  • Information on how to use funds for data collection and evaluation is covered in the Block Grant Application. Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process.
Score
4
U.S. Department of Housing and Urban Development
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • None of HUD’s largest noncompetitive grants define and prioritize evidence in their allocation of formula funds.
  • Although the funding formulas are prescribed in statute, evaluation-based interventions are central to each program. The department used evidence from a 2015 Administrative Fee Study of the costs that high-performing public housing authorities (PHAs) incur in administering a Housing Choice Voucher (HCV) program to revise its approach to providing administrative fees that incentivize PHAs to improve outcomes in leasing and housing mobility. It has also used the results of its Landlord Task Force to provide guidance to PHAs on working effectively with landlords and to propose policy and fee changes to ensure strong landlord participation in the new Emergency Housing Voucher Program funded through the American Rescue Plan. In allocating $5,000,000,000 in emergency housing voucher funding to PHAs, HUD developed an allocation formula that considered (among other factors) evidence of PHA capacity to implement the program effectively and quickly.
  • HUD’s funding of public housing is being radically shifted through the evidence-based Rental Assistance Demonstration (RAD), which enables PHAs to access private capital to address the $26,000,000,000 backlog of capital needs funding. Based on RAD’s demonstrated success, for FY20 HUD proposed to transfer $95,000,000,000 from the operating fund and capital fund to the Tenant-Based Rental Assistance fund to support RAD conversions. For FY21, HUD proposed to remove the cap on the number of public housing developments to be converted to Section 8 contracts. It is beginning to evaluate RAD’s impacts on children, and it is also conducting a Rent Reform Demonstration and a Moving to Work demonstration to test the efficiencies of changing rent rules and the effects on tenant outcomes.
  • Public Housing Formula Grants are awarded through a determination of modernization and accrual needs, calculated from data submitted by PHAs in the Inventory Management System/Public Housing Information Center (IMS/PIC). Public housing authorities are required to annually update and verify their data submissions to the IMS/PIC. The selection of recipients for HCVs is also largely need-based.
  • Applicants for the Community Development Block Grant entitlement program and the HOME Investment Partnerships program must submit a consolidated plan that identifies goals for the program, which HUD will later use to evaluate the performance of each grantee. The plan must also outline the project’s plans for community engagement and include defined outcome measures for each activity to be undertaken.
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • Evidence building is central to HUD’s funding approach through the use of prospective program demonstrations. These include the Public Housing Operating Fund’s RAD, the Public Housing Capital Grants’ Rent Reform Demonstration, and the Housing Choice Voucher program’s Moving to Work demonstration grants. As Congress moved to expand Moving to Work flexibilities to additional PHAs, HUD sought authority to randomly assign cohorts of PHAs in order to rigorously test specific program innovations.
  • Program funds are provided to operate demonstrations through the HCV account, Tenant-Based Rental Assistance. These include the Tribal HUD-VA Supportive Housing (Tribal HUD-VASH) demonstration, which provides permanent supportive housing to Native American veterans, and the Family Self-Sufficiency/Family Unification Program demonstration, which tests the effect of providing vouchers to at-risk young adults who are aging out of foster care.
  • Applicants for Public Housing Formula Grants are required to submit performance evaluations on all open grants to HUD as a condition of funding, and annual submissions from grantees must include a five-year plan outlining the targeted goals of each project.
  • Housing choice voucher recipients must complete and maintain accurate accounts and records for each program in a manner that enables HUD to complete an audit. These records must collect data such as physical unit inspections and financial statements, as well as data on the income, racial, ethnic, gender, and disability status of program applicants and participants.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
  • Allocation of HUD-VASH vouchers is based in part on the administrative performance of housing agencies as measured by their past utilization of HUD-VASH vouchers in HUD’s Voucher Management System [Notice PIH-2019-15 (HA)]. This performance information helps ensure that eligible recipients are actually able to lease units with the vouchers that HUD funds. The HUD-VASH Exit Study documented that 87,864 VASH vouchers were in circulation in April 2017, contributing substantially to the 47% decline in the number of homeless veterans since 2010.
9.5 What are the agency’s 1-2 strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • To address a severe backlog of capital needs funding for the nation’s public housing stock, the Rental Assistance Demonstration was authorized in 2011 to convert properties to project-based Section 8 contracts and thereby attract an infusion of private capital. The 2019 final report on the RAD evaluation showed that conversions successfully raised $12,600,000,000 in funding, an average of $121,747 per unit, to improve physical quality and stabilize project finances. Based on the program’s successes, the limit on the number of public housing conversions was increased to 455,000 units in 2018, nearly half of the stock, and HUD has proposed eliminating the cap. Additionally, HUD extended the conversion opportunity to legacy multifamily programs through RAD 2.
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity building efforts?
  • Communities receiving HUD block grant funding through Community Development Block Grants, HOME block grants, and other programs are required to consult local stakeholders, conduct housing needs assessments, and develop needs-driven consolidated plans to guide their activities. They then provide Consolidated Annual Performance and Evaluation Reports to document progress toward their consolidated plan goals in a way that supports continued community involvement in evaluating program efforts.
  • The Department of Housing and Urban Development’s Community Development Block Grant program, which provides formula grants to entitlement jurisdictions, increases local evaluation capacity. Specifically, federal regulations (24 CFR 570.200) authorize CDBG recipients (including city and state governments) to use up to 20% of their CDBG allocations for administration and planning costs, which may include evaluation capacity-building efforts and evaluations of their CDBG-funded interventions (as defined in 570.205 and 570.206).
  • Program guidance for Public Housing Formula Grants (capital expenses) includes extensive guidance on eligible program costs, separated by each phase of project development. In the planning and development stage, HUD allows grant funding to be used for studies (market studies and surveys) necessary for development. As included in “soft costs,” HUD allows grant recipients to use program funds to improve PHA management and to improve the involvement of residents and stakeholders in PHA activities.
  • HOME Investment Partnerships grantees are allowed to use program funding for general management and oversight, which includes activities such as developing compliance and performance-tracking systems, preparing documents and reports for submission to HUD, evaluating program performance in relation to stated goals, and providing technical assistance to personnel tasked with managing such evaluations.
Score
4
Administration for Community Living (HHS)
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • Authorizing legislation for ACL’s largest noncompetitive grant programs requires consideration of evidence-based programming as a requirement of funding. The Developmental Disabilities Assistance and Bill of Rights Act of 2000 allows for the withholding of funding if “(1) the council or agency has failed to comply substantially with any of the provisions required by section 124 to be included in the State plan, particularly provisions required by paragraphs (4)(A) and (5)(B)(vii) of section 124(c), or with any of the provisions required by section 125(b)(3); or (2) the Council or agency has failed to comply substantially with any regulations of the Secretary that are applicable.” As a condition of funding, noncompetitive grantees are required to “determine the extent to which each goal of the Council was achieved for that year” and report that information to ACL.
  • States that receive Older Americans Act Home and Community-Based Supportive Services Title III-D funds are required to spend those funds on evidence-based programs to improve health and well-being and reduce disease and injury. In order to receive funding, states must utilize programs that meet ACL’s definition of evidence-based programs or are defined as evidence-based by another HHS operating division. Under the Older Americans Act, caregiver support programs are required to track and report on their use of evidence-based caregiver support services.
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • The Administration for Community Living’s Nutrition Services provides grants for innovations in nutrition programs and services. These research projects must have the potential for broad implementation and demonstrate potential to improve the quality, effectiveness, and outcomes of nutrition service programs by documenting and proving the effectiveness of these interventions and innovations. They must also target services to underserved older adults with greatest social and economic need and individuals at risk for institutional placement to permit such individuals to remain in home and community-based settings. Consistent with its focus on identifying new ways to efficiently improve direct service programs, ACL is using its 1% nutrition authority to fund $3,500,000 in nutrition innovations and to test ways to modernize how meals are provided to a changing senior population. One promising demonstration currently being carried out by the Georgia State University Research Foundation is the Double Blind Randomized Control Trial on the Effect of Evidence-Based Suicide Intervention Training on the Home-Delivered and Congregate Nutrition Program through the Atlanta Regional Commission. This demonstration has drawn widespread attention for its effort to train volunteers who deliver home-delivered meals to recognize and report indicators of suicidal intent and other mental health issues so that they can be addressed.
  • Under Home and Community-Based Services, FY12 Congressional appropriations included an evidence-based requirement for the first time. Older Americans Act Title III-D funding may be used only for programs and activities demonstrated to be evidence based. The National Council on Aging maintains a tool to search for evidence-based programs that are approved for funding through OAA Title III-D.
  • The agency’s Caregiver Support Services builds evidence in a number of areas. These include a national survey of caregivers of older adult clients, gathering and reporting best practices regarding grandparents raising grandchildren, adapting and scaling evidence-based programs for children and older adults with disabilities through the RESILIENCE Rehabilitation Research and Training Center, and other similar efforts.
  • State Councils on Developmental Disabilities design five-year state plans that address new ways of improving service delivery. To implement the state plans, councils work with different groups in many ways, including funding projects to show new ways that people with disabilities can work, play, and learn and seeking information from the public and from state and national sources.
  • State Protection and Advocacy Systems encompass multiple avenues of protection and advocacy including specialization in individuals with developmental disabilities, assistive technology, voting accessibility, individuals with traumatic brain injury, and technical assistance. The Developmental Disabilities Assistance and Bill of Rights Act of 2000 requires Administration on Intellectual and Developmental Disabilities grantees to report annually on progress achieved through advocacy, capacity building, and systemic change activities.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
  • The 2020 reauthorization of the Older Americans Act requires that assistive technology programs are “aligned with evidence-based practice;” that person-centered, trauma informed programs “incorporate evidence-based practices based on knowledge about the role of trauma in trauma victims’ lives;” and that a newly authorized Research, Demonstration, and Evaluation Center for the Aging Network increases “the repository of information on evidence based programs and interventions available to the aging network, which information shall be applicable to existing programs and interventions, and help in the development of new evidence-based programs and interventions.”
9.5 What are the agency’s  one or two strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Since 2017, ACL has awarded Innovations in Nutrition grants to forty-three organizations to develop and expand evidence-based approaches to enhancing the quality and effectiveness of nutrition programming. It is currently overseeing ten grantees carrying out innovative projects intended to enhance the quality, effectiveness, and outcomes of nutrition services programs provided by the National Aging Services Network; these grants total $2,218,419 for FY22. In FY22, ACL also awarded a total of $1,448,797 in funding for the first year of three five-year research grants. Through this grant program, ACL aims to identify innovative and promising practices that can be scaled across the country and to increase the use of evidence-informed practices within nutrition programs.
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • All funding opportunity announcements published by ACL include language about generating and reporting evidence of progress toward the specific goals set for the funds. Grantee manuals include information about the importance of and requirements for evaluation. The National Ombudsman Resource Center, funded by ACL, provides self-evaluation materials for long-term care ombudsman programs funded under Title VII of the Older Americans Act.