2022 Federal Index


Administration for Children and Families (HHS)

Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY22?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer or equivalent (example: Evidence Act 313)?
  • The deputy assistant secretary for planning, research, and evaluation serves as the ACF chief evaluation officer and oversees the Office of Planning, Research, and Evaluation (OPRE), which supports evaluation and other learning activities across the agency. In FY22 the deputy assistant secretary oversaw a research and evaluation budget of approximately $156,000,000. The office has 76 federal staff positions; OPRE staff are experts in research and evaluation methods and data analysis as well as ACF programs, policies, and the populations they serve. In June 2022, the director of the Division of Evidence, Evaluation, and Data Policy at the HHS Office of the Assistant Secretary for Planning and Evaluation was named the chief evaluation officer of HHS, responsible for overseeing the HHS Evidence and Evaluation Council, which advises the HHS chief evaluation officer on implementation of Title I of the Evidence Act.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s chief data officer or equivalent [example: Evidence Act 202(e)]?
  • In September 2021, HHS named a chief data officer within the HHS Office of the Chief Information Officer. The HHS chief data officer chairs the HHS Data Governance Board, the body responsible for advising the HHS chief data officer and implementing Title II Evidence Act activities across HHS. In 2020 the director of ACF’s Division of Data and Improvement was designated as ACF’s primary representative to the Data Governance Board and to the HHS Data Council, an HHS advisory body responsible for advising the Data Governance Board and the HHS Evidence and Evaluation Council.
  • The Division of Data and Improvement (DDI) was established by ACF in 2016 to provide federal leadership and resources to improve the quality, use, and sharing of ACF data. The director reports to the deputy assistant secretary for planning, research, and evaluation and oversees work to assist ACF programs in responsibly managing and using data to improve the effectiveness, efficiency, and equity of human services programs. The division has twelve federal staff positions and an FY22 budget of approximately $7,500,000 (not including salaries).
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support Evidence Act implementation and improve the agency’s major programs?
  • Since September 2019, ACF’s deputy assistant secretary for planning, research, and evaluation has served as the primary ACF representative to HHS’s Leadership Council and Evidence and Evaluation Council. The deputy assistant secretary also oversees the director of the Division of Data and Improvement, who serves as the primary ACF representative to the HHS Data Governance Board and Data Council. Together, these are the HHS bodies responsible for implementing Evidence Act activities across HHS. These cross-agency councils meet regularly to discuss agency-specific needs and experiences and to collaboratively develop guidance for department-wide action.
  • Within ACF, the 2016 reorganization that created the Division of Data and Improvement endowed ACF’s deputy assistant secretary for planning, research, and evaluation with oversight of the agency’s strategic planning; performance measurement and management; research and evaluation; statistical policy and program analysis; synthesis and dissemination of research and evaluation findings; data quality, usefulness, and sharing; and application of emerging technologies to improve the effectiveness of programs and service delivery. ACF reviews program office performance measures and associated data at least three times per year coincident with the budget process; OPRE has traditionally worked with ACF program offices to develop research plans on an annual basis and has worked to integrate the development of program-specific learning agendas into this process. In addition, OPRE holds both regular and ad hoc meetings with ACF program offices to discuss research and evaluation findings, as well as other data topics.
Score
10
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence building plan) and did it publicly release the findings of all completed program evaluations in FY22?

2.1 Did the agency have an agency-wide evaluation policy? (Example: Evidence Act 313(d))
  • The Administration for Children and Families Evaluation Policy affirms the agency’s commitment to conducting evaluations and using evidence from evaluations to inform policy and practice. Through the policy, ACF seeks to promote rigor, relevance, transparency, independence, and ethics in the conduct of evaluations. ACF established the evaluation policy in 2012 and updated it in 2021, publishing the updated version, which includes a focus on equity throughout all five principles, in the Federal Register on November 11, 2021. In late 2019, ACF released a short video about the policy’s five principles and how it uses them to guide its work.
  • As ACF’s primary representative to the HHS Evidence and Evaluation Council, the ACF deputy assistant secretary for planning, research, and evaluation co-chaired the HHS Evaluation Policy Subcommittee, the body responsible for developing an HHS-wide evaluation policy, which was released in 2021.
2.2 Did the agency have an agency-wide evaluation plan [example: Evidence Act 312(b)]?
  • In accordance with guidance from the U.S. Office of Management and Budget (OMB), ACF contributes to the HHS-wide evaluation plan. The Office of Planning, Research, and Evaluation also annually identifies questions relevant to the programs and policies of ACF and develops an annual research and evaluation spending plan. This plan focuses on activities that OPRE plans to conduct during the following fiscal year.
2.3 Did the agency have a learning agenda (evidence building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda (example: Evidence Act 312)?
  • In accordance with OMB guidance, HHS developed an agency-wide evidence-building plan. To develop this document, HHS asked each sub-agency to submit examples of its priority research questions, potential data sources, anticipated approaches, challenges and mitigation strategies, and strategies for actively engaging those affected by the work. The Administration for Children and Families drew from its existing program-specific learning agendas and research plans to contribute priority research questions, and its learning activities appear under Strategic Goal 3 (Strengthen Social Well-Being, Equity, and Economic Resilience) in the HHS evidence-building plan.
  • In 2020, ACF released a research and evaluation agenda describing research and evaluation activities and plans in nine ACF program areas with substantial research and evaluation portfolios: adolescent pregnancy prevention and sexual risk avoidance, child care, child support enforcement, child welfare, Head Start, health profession opportunity grants, healthy marriage and responsible fatherhood, home visiting, and welfare and family self-sufficiency.
  • In addition to fulfilling requirements of the Evidence Act, ACF has supported and continues to support systematic learning and active engagement activities across the agency.
2.4 Did the agency publicly release all completed program evaluations?
  • The Administration for Children and Families Evaluation Policy requires that “ACF will release evaluation results regardless of findings. Evaluation reports will present comprehensive findings, including favorable, unfavorable, and null findings. ACF will release evaluation results timely–usually within two months of a report’s completion.” ACF has publicly released the findings of all completed evaluations to date. In 2021, OPRE released over 220 research publications. These publications are publicly available on the OPRE website.
2.5 Did the agency conduct an Evidence Capacity Assessment that addressed the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts [Example: Evidence Act 3115, subchapter II (c)(3)(9)]?
  • In accordance with OMB guidance, ACF contributed to an HHS-wide capacity assessment, which was conducted in early 2022.
  • Additionally, OPRE launched the ACF Evidence Capacity Support project in 2020. This project supports ACF’s efforts to build and strengthen programmatic and operational evidence capacity, including learning agenda development and the development of other foundational evidence through administrative data analysis. To operationalize “evidence capacity” and guide engagement at the ACF level, the project developed a research-based conceptual framework that will be publicly available in late 2022.
  • Given the centrality of data capacity to evidence capacity, ACF partnered with the HHS Office of the Chief Data Officer to develop and pilot test a tool for conducting an HHS-wide data capacity assessment, consistent with Title II Evidence Act requirements. To specifically support modernizing ACF’s data governance and related capacity, ACF launched the Data Governance Consulting and Support project, which is providing information gathering, analysis, consultation, and technical support to ACF and its partners to strengthen data governance practices within ACF offices and between ACF and its partners at the federal, state, local, and tribal levels.
  • The Administration for Children and Families also continues to support the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts as follows:
    • Quality: The Administration for Children and Families’ Evaluation Policy states that ACF is committed to using the most rigorous methods that are appropriate to the evaluation questions and the populations with whom research is being conducted and feasible within budget and other constraints. Rigor is necessary not only for impact evaluations but also for implementation/process evaluations, descriptive studies, outcome evaluations, and formative evaluations, and in both qualitative and quantitative approaches.
    • Methods: The Administration for Children and Families uses a range of evaluation methods. It conducts impact evaluations as well as implementation and process evaluations, cost analyses and cost-benefit analyses, descriptive and exploratory studies, research syntheses, and more. It also develops and uses methods that are appropriate for studying diverse populations, taking into account historical and cultural factors and planning data collection with disaggregation and subgroup analyses in mind. ACF is committed to learning about and using the most scientifically advanced approaches to determining the effectiveness and efficiency of ACF programs; to this end, OPRE annually organizes meetings of scientists and research experts to discuss critical topics in social science research methodology and how innovative methodologies can be applied to policy-relevant questions.
    • Effectiveness: Its evaluation policy states that ACF will conduct relevant research and disseminate findings in ways that are accessible and useful to policymakers, practitioners, and the diverse populations that ACF programs serve. The Office of Planning, Research, and Evaluation engages in ongoing collaboration with ACF program office staff and leadership to interpret research and evaluation findings and to identify their implications for programmatic and policy decisions such as ACF regulations and funding opportunity announcements. For example, when ACF’s Office of Head Start significantly revised its program performance standards (the regulations that define the standards and minimum requirements for Head Start services), the revisions drew from decades of OPRE research and the recommendations of the OPRE-led Secretary’s Advisory Committee on Head Start Research and Evaluation. Similarly, ACF’s Office of Child Care drew from research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of funds dedicated to improving the quality of programs, and other information to inform the regulations accompanying the reauthorization of the Child Care and Development Block Grant.
    • Independence: The Administration for Children and Families’ Evaluation Policy states that independence and objectivity are core principles of evaluation. Agency and program leadership, program staff, service providers, populations and communities studied, and others should participate actively in setting evaluation priorities, identifying evaluation questions, and assessing the implications of findings. However, it is important to insulate evaluation functions from undue influence and from both the appearance and the reality of bias. To promote objectivity, ACF protects independence in the design, execution, analysis, and reporting of evaluations. To this end, ACF will conduct evaluations through the competitive award of grants and contracts to external experts who are free from conflicts of interest.
  • The deputy assistant secretary for planning, research, and evaluation reports directly to the assistant secretary for children and families, serves as ACF’s chief evaluation officer, has authority to approve the design of evaluation projects and analysis plans, and has authority to approve, release and disseminate evaluation reports.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
Score
4
Resources

Did the agency invest at least 1% of program funds in evaluations in FY22 (examples: impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; and rigorous evaluations, including random assignments)?

3.1 ____ invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY22 budget.
  • The Administration for Children and Families invested approximately $156,000,000 in evaluations, evaluation technical assistance, and evaluation capacity building, representing approximately 0.2% of the agency’s approximately $69,600,000,000 FY22 enacted budget.
  • The Office of Planning, Research, and Evaluation has provided training to its staff on procurement vehicles that specifically support small businesses with a race equity lens (e.g., minority-owned small businesses, Latino-owned small businesses, and tribally owned Native American concerns).
3.2 Did the agency have a budget for evaluation and how much was it (were there any changes in this budget from the previous fiscal year)?
  • In FY22, ACF had an evaluation budget of approximately $156,000,000, a $25,000,000 decrease from FY21. The decrease is primarily due to COVID-19 funding received in FY20 and FY21 but not FY22. In addition, in FY22, Congress granted ACF the authority to extend the availability of a portion of its research and evaluation funds provided in the omnibus appropriation, allowing for more efficient administration of those dollars.
  • For context, a substantial portion of OPRE’s annual budget comes from program offices designating certain funds for research, so the amount varies from year to year. This fluctuation can be driven by changes in program office budgets or evaluation needs. OPRE’s funding also changes because it sometimes receives funds from outside of ACF (e.g., SSA in FY19, the Office of the Assistant Secretary for Health in other years). As with program office funds, the amount and frequency of these funds are driven by specific evaluation needs.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
Score
7
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY22?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • Every four years, HHS updates its Strategic Plan, which describes its work to address complex, multifaceted, and evolving health and human services issues. The Administration for Children and Families was an active participant in the development of the FY22-FY26 HHS Strategic Plan, which includes several ACF-specific objectives. It regularly reports on progress associated with the current objectives as part of the FY21 HHS Annual Performance Plan/Report, including the twelve total performance measures from ACF programs that support this plan. These performance measures primarily support Goal 3: “strengthen social well-being, equity, and economic resilience.” ACF supports objective 3.1 (provide effective and innovative pathways leading to equitable economic success for all individuals and families), objective 3.2 (strengthen early childhood development and expand opportunities to help children and youth thrive equitably within their families and communities), and objective 3.4 (increase safeguards to empower families and communities to prevent and respond to neglect, abuse, and violence while supporting those who have experienced trauma or violence) by reporting annual performance measures. It is also an active participant in the HHS Strategic Review process, which is an annual assessment of progress on the subset of twelve performance measures it reports on as part of the HHS Strategic Plan.
  • The Administration for Children and Families launched its strategic plan in early 2022. It incorporates five high-level strategic goals: (1) advance equity by reducing structural barriers including racism and other forms of discrimination that prevent economic and social well-being (goal 1 is intended to be an explicit part of each of the other four goals); (2) take a preventative and proactive approach to ensuring child, youth, family, and individual well-being; (3) use whole-family, community-based strategies to increase financial stability and economic mobility; (4) support communities and families responding to acute needs and facilitate recovery from a range of crises and emergency situations; and (5) enable and promote innovation within ACF to improve the lives of children, youth, families, and individuals. These goals cut across all ACF programs and populations.
  • There will be five pilots, one per goal, that will allow ACF to implement and test ideas over 2022. For instance, the strategic goal 1 pilot is intended to center and integrate the perspectives and experiences of program participants in the design, management, evaluation, and decision-making of ACF programs and operations. The pilot will be focused on formalizing a process of consulting with communities that experience racism and other forms of discrimination to listen and build trust. Similarly, the strategic goal 4 pilot will support the objective of fostering resiliency among ACF’s customers to aid them in weathering and recovering from emergencies such as the current pandemic. This pilot will assess how grantees and communities are integrating services into their programming to respond to children’s and parents’ social/emotional challenges. Staff from across ACF offices will work together to lead and execute these five pilots, and they will drive the design and intended outcomes. Following this initial set of pilots, ACF will continue to focus on action-oriented projects to yield meaningful progress on the ground.
  • In April 2021, the assistant secretary for ACF announced the launch of an ambitious agency-wide equity agenda and named the associate commissioner of the Administration on Children, Youth and Families as lead for the implementation of the Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. ACF is advancing equity across four priority work areas: (1) the internal ACF workforce, (2) data, (3) programmatic and policy change, and (4) procurement and grant making. To steer this effort, in May 2021 ACF founded the Equity Advisory Group, made up of leadership from every ACF program office. As of May 2022, every ACF program office has created a strategic plan for how it will advance equity.
  • To communicate its progress on these efforts, ACF created a web page describing its equity-related activities since the launch of this agenda, which include holding community roundtables to explore the experiences of African American and Black individuals and families in accessing ACF programs and defining and articulating ACF’s plan through an information memorandum.
4.2 Did the agency use data/evidence to improve outcomes and return on investment?
  • The Office of Planning, Research, and Evaluation currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how to best integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn Innovate Improve (LI2) model, a systematic evidence-informed approach to program improvement that has since informed targeted technical assistance efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
  • Administration for Children and Families’ programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants and Healthy Marriage and Responsible Fatherhood) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes.
  • ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review, a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
Score
7
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data consistent with strong privacy protections to improve (or help other entities improve) outcomes, cost-effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY22 (examples: model data-sharing agreements or data-licensing agreements, data tagging and documentation, data standardization, open data policies, and data use policies)?

5.1 Did the agency have a strategic data plan, including an open data policy? [example: Evidence Act 202(c), Strategic Information Resources Plan]?
  • The Administration for Children and Families’ Interoperability Action Plan was established in 2017 to formalize its vision for effective and efficient data sharing. Under this plan ACF and its program offices will develop and implement a Data Sharing First strategy that starts with the assumption that data sharing is in the public interest. The plan states that ACF will encourage and promote data sharing broadly, constrained only when required by law or when there are strong countervailing considerations.
5.2 Did the agency have an updated comprehensive data inventory? (Example: Evidence Act 3511)
  • In 2020, ACF released a Compendium of ACF Administrative and Survey Data Resources. The compendium documents administrative and survey data collected by ACF that could be used for evidence building purposes. It includes summaries of twelve major ACF administrative data sources and seven surveys. Each summary includes an overview, basic content, available documentation, available data sets, restrictions on use, capacity to link to other data sources, and examples of prior research. It is a joint product of ACF’s Office of Planning, Research, and Evaluation and HHS’s Office of the Assistant Secretary for Planning and Evaluation.
  • In addition, in 2019 OPRE compiled the descriptions and locations of hundreds of its archived datasets that are currently available for secondary analysis and made this information available on a single web page. The office continues to regularly update this website with current archiving information. It regularly archives research and evaluation data for secondary analysis, consistent with the ACF Evaluation Policy, which promotes rigor, relevance, transparency, independence, and ethics in the conduct of evaluation and research. This consolidated web page serves as a one-stop resource that makes it easier for potential users to find and use the data that OPRE archives for secondary analysis.
  • In 2020 ACF launched the Data Governance Consulting and Support project, which is providing information gathering, analysis, consultation, and technical support to ACF and its partners to strengthen data governance practices within ACF offices, and between ACF and its partners at the federal, state, local, and tribal levels. Initial work is focusing on data asset tracking and metadata management, among other topics.
5.3 Did the agency promote data access or data linkage for evaluation, evidence-building, or program improvement [examples: model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c)]?
  • The Administration for Children and Families has multiple efforts underway to promote and support the use of documented data for research and improvement, including making numerous administrative and survey datasets publicly available for secondary use and actively promoting the archiving of research and evaluation data for secondary use. These data are machine readable, downloadable, and de-identified as appropriate for each data set. For example, individual-level data for research are held in secure restricted use formats, while public use data sets are made available online. To make it easier to find these resources, ACF released a Compendium of ACF Administrative and Survey Data and consolidated information on archived research and evaluation data on the OPRE website.
  • Many data sources that may be useful for data linkage for building evidence on human services programs reside outside of ACF.
  • In 2020, OPRE released the Compendium of Administrative Data Sources for Self-Sufficiency Research, describing promising administrative data sources that may be linked to evaluation data in order to assess long-term outcomes of economic and social interventions. It includes national, federal, and state sources covering a range of topical areas. In addition, in October 2021 OPRE released A Guide for Using Administrative Data to Examine Long-Term Outcomes in Program Evaluation, a resource to assist program evaluation project teams—including funders, sponsors, and evaluation research partners—in assessing the feasibility and potential value of examining long-term outcomes using administrative data. It describes common steps involved in linking evaluation data and administrative data, as well as how to assess the quality of the linked study and administrative data (a toy illustration of such a linkage and a simple quality check appears after this list). While it is primarily targeted to research audiences seeking to access administrative data to assess long-term outcomes once an evaluation has been completed, it is also useful for designing research and evaluations up front in order to enable such analysis at a later date. Both publications were produced under contract by MDRC as a part of OPRE’s Assessing Options to Evaluate Long-Term Outcomes Using Administrative Data project.
  • The Office of Planning, Research, and Evaluation has also released multiple publications to assist states, localities, and research teams in negotiating the privacy and confidentiality requirements of linking and accessing data for research, evaluation, and/or operational and program improvement purposes. For example, in October 2021 OPRE released an updated Confidentiality Toolkit, which contains information on how to responsibly share personally identifiable information collected by human services and related programs to improve program outcomes.  It discusses key federal privacy requirements, strategies to resolve challenges, and information technology security. It also includes documents used to facilitate record sharing and links to helpful resources. Building on the toolkit, OPRE released a Case Study Report on Iowa’s Integrated Data System for Decision-Making (I2D2) in May 2022. This report is the first in a planned series of publications that highlight innovative and unique state and local data sharing initiatives that are functional while protecting data privacy and confidentiality, consistent with the federal- and state-level legal requirements. These reports focus on the privacy and confidentiality challenges that states and localities face—and how they can be overcome—and provide model “tools” and resources (e.g., data sharing agreements) in downloadable and editable formats. Similarly, in June 2022 OPRE released the Sharing and Accessing Administrative Data: Promising Practices and Lessons Learned from the Child Maltreatment Incidence Data Linkages Project, which highlights promising practices for sharing and accessing data and discusses lessons learned related to four key activities essential to sharing and accessing data: (1) developing agreements for data sharing and use; (2) protecting the data’s security, confidentiality, and privacy; (3) securing institutional review board (IRB) and other approvals; and (4) accessing the data.
  • Additionally, ACF is actively exploring how enhancing and scaling innovative data linkage practices can improve understanding of the populations served by ACF and build evidence on human services programs more broadly. For instance, the Child Maltreatment Incidence Data Linkages (CMI Data Linkages) project is examining the feasibility of leveraging administrative data linkages to better understand child maltreatment incidence and related risk and protective factors. Similarly, the Child and Caregiver Outcomes Using Linked Data project, a partnership between OPRE and ASPE, is working with states to enhance capacity to examine outcomes for children and parents who are involved in state child welfare systems and who may have behavioral health issues. The Office of Planning, Research, and Evaluation will shortly release a publication documenting Florida and Kentucky’s projects to link the Medicaid records of parents with the records of their children from the child welfare system and produce de-identified linked files for research use. This publication examines the practical aspects of creating such data linkages, including the language and interpretations of relevant state laws, and can be used as a guide for other states seeking to conduct the same linkages. In 2023 the project will make de-identified state-level datasets available to researchers through a restricted-use data archive. Also, in August 2021, OPRE published a brief presenting findings from the 2019 TANF Data Innovation Needs Assessment. This survey of state TANF agencies was designed to understand state strengths and challenges in linking and analyzing administrative data for program improvement. Findings from the needs assessment informed technical assistance provided to states through ACF’s TANF Data Collaborative. Information from the brief may be helpful to states, policymakers, and other funders in supporting states’ efforts to link data for evidence building.
  • The Administration for Children and Families actively promotes archiving of research and evaluation data for secondary use. Research contracts initiated by OPRE include a standard clause requiring contractors to make data and analyses supported through federal funds available to other researchers and to establish procedures and parameters for all aspects of data and information collection necessary to support archiving information and data collected under the contract. Many datasets from past ACF projects are stored in archives including the ACF-funded National Data Archive on Child Abuse and Neglect, the ICPSR Child and Family Data Archive, and the ICPSR data archive more broadly. Grants for secondary analysis of ACF/OPRE data have been funded by OPRE; examples in recent years include secondary analysis of strengthening families datasets and early care and education datasets. In 2019 ACF awarded Career Pathways Secondary Data Analysis Grants to stimulate and fund secondary analysis of data collected through the Pathways for Advancing Careers and Education (PACE) Study, HPOG Impact Study, and HPOG National Implementation Evaluation on questions relevant to career pathways programs’ goals and objectives. Information on all archived datasets that are currently available for secondary analysis is available on OPRE’s website. In 2022, OPRE developed a learning agenda to support ACF’s data archiving activities. Activities will document key lessons learned and identify a conceptual model to help federal project officers plan for data archiving. The resulting brief and toolkit will disseminate best practices for making data from federally funded research studies available for secondary use, as well as identify areas for future learning and growth.
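To make the linkage and quality-assessment steps described in the guide above more concrete, the following is a minimal, hypothetical sketch (in Python, using pandas) of a deterministic link between an evaluation sample and a de-identified administrative file, followed by a simple match-rate check. The file contents, column names, and outcome variable are illustrative assumptions, not details drawn from the OPRE guide or any ACF dataset.

```python
# Minimal, illustrative sketch: link evaluation records to administrative records
# on a shared de-identified ID, then compute a simple match rate as a quality check.
# All data, column names, and values here are hypothetical.
import pandas as pd

# Hypothetical study sample collected for an evaluation.
study = pd.DataFrame(
    {"person_id": ["A1", "A2", "A3", "A4"], "treatment_group": [1, 0, 1, 0]}
)

# Hypothetical de-identified administrative records with a long-term outcome.
admin = pd.DataFrame(
    {"person_id": ["A1", "A2", "A4"], "employed_year5": [1, 0, 1]}
)

# Deterministic link on the shared identifier; a left join keeps the full study sample.
linked = study.merge(admin, on="person_id", how="left")

# Basic quality diagnostic: the share of study records that found an administrative match.
match_rate = linked["employed_year5"].notna().mean()
print(linked)
print(f"Match rate: {match_rate:.0%}")  # 75% in this toy example
```

In practice, analysts would typically also examine whether match rates differ across subgroups, since differential linkage can bias estimates of long-term outcomes.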
5.4 Did the agency have policies and procedures to secure data and protect personal, confidential information (example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)?
  • The Administration for Children and Families receives privacy and security guidance from both the ACF and HHS Offices of the Chief Information Officer. Between these two offices, there are several policies and practices in place to ensure that all ACF data are protected. The HHS policies govern departmental policies and procedures broadly, and ACF issues more specific policies and procedures as needed to govern ACF-specific data. This includes a process by which systems are evaluated and receive an authorization to operate. Teams in both offices collectively respond to all incidents and ensure they are handled appropriately. The requirements are supported by auditing mechanisms and a privacy and security training program.
  • In October 2021 OPRE released an updated Confidentiality Toolkit, which contains information on how to responsibly share personally identifiable information collected by human services and related programs to improve program outcomes. It discusses key federal privacy requirements, strategies to resolve challenges, and information technology security. It also includes documents used to facilitate record sharing and links to helpful resources. Building on the toolkit, OPRE released a Case Study Report in May 2022. This report is the first in a planned series of publications that highlight innovative and unique state and local data sharing initiatives that are functional while protecting data privacy and confidentiality, consistent with the federal- and state-level legal requirements. These publications focus on the privacy and confidentiality challenges that states and localities face—and how they can be overcome—and provide model “tools” and resources (e.g., data sharing agreements) in downloadable and editable formats. These publications were issued under the ACF Responsibly Sharing Confidential Data: Tools and Recommendations project, launched in 2020. The project is also exploring creating and maintaining a compendium of existing privacy and confidentiality laws for use by ACF staff.
  • The Administration for Children and Families also takes appropriate measures to safeguard the privacy and confidentiality of individuals contributing data for research throughout the archiving process, consistent with its core principle of ethics. Research data may be made available as public use files when the data would not likely lead to harm or to the re-identification of an individual, or through restricted access. Restricted access files are de-identified and made available to approved researchers through secure transmission and download, virtual data enclaves, physical data enclaves, or restricted online analysis.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • The Administration for Children and Families undertakes many program-specific efforts to support state, local, and tribal efforts to use human services data while protecting privacy and confidentiality. For example, ACF’s TANF Data Innovation Project supports innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. This work includes transforming and documenting state-reported data to facilitate research use and establishing a data governance process that enables research requests and grants secure remote data access to approved researchers, including those from states and other ACF grantees.
  • Similarly, in 2020 OPRE awarded Human Services Interoperability Demonstration Grants to Georgia State University and Kentucky’s Department of Medicaid Services. These grants are intended to expand data sharing efforts by state, local, and tribal governments to improve human services program delivery and to identify novel data sharing approaches that can be replicated in other jurisdictions. For example, Georgia State University achieved interoperability between units of Georgia’s Division of Family and Children Services using an approach already implemented in the state’s school system: an open-source, cloud-based hashing solution that matches and links records. Georgia’s solution performs identity matching without requiring a social security number, both within units of the Division of Family and Children Services and across sister agencies; a minimal, hypothetical sketch of this kind of hash-based matching appears after this list. The tool is open source and available for reuse. In December 2021 OPRE awarded a second round of Interoperability Demonstration grants, focused on using an HL7 FHIR-based approach to achieving interoperability with human services programs. Reusable tools developed through these grants will be made available via the HL7 Human and Social Services Workgroup, which OPRE established in December 2021. Also, in 2019, in partnership with ASPE, OPRE began a project to support states in linking Medicaid and child welfare data at the parent-child level to support outcomes research.
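The keyed-hash matching approach referenced above can be illustrated with a short, hypothetical sketch: each agency normalizes a few identifying fields, hashes them locally with a shared secret, and shares only the hashes, so records can be linked without exchanging a social security number or other raw identifiers. The field names, normalization rules, and shared salt below are illustrative assumptions, not details of the Georgia State University solution or of any HL7 FHIR specification.

```python
# Minimal, illustrative sketch of hash-based record matching without an SSN.
# Field names, normalization, and the shared secret are hypothetical.
import hashlib
import hmac

SHARED_SECRET = b"example-salt-agreed-by-both-agencies"  # hypothetical keyed salt

def match_key(first_name: str, last_name: str, dob: str) -> str:
    """Build a keyed hash from normalized identifying fields (no SSN required)."""
    normalized = "|".join(part.strip().lower() for part in (first_name, last_name, dob))
    return hmac.new(SHARED_SECRET, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

# Each agency hashes its own records locally and shares only the hashed keys.
child_welfare_records = {match_key("Ana", "Lopez", "2015-03-02"): "CW-001"}
school_records = {match_key("ana", "lopez ", "2015-03-02"): "SCH-417"}

# Records are linked when their keyed hashes agree.
links = {
    cw_id: school_records[key]
    for key, cw_id in child_welfare_records.items()
    if key in school_records
}
print(links)  # {'CW-001': 'SCH-417'}
```

Using a keyed hash (HMAC) rather than a bare hash means that a party without the shared secret cannot reproduce the keys by guessing names and birth dates, which is one common design choice in privacy-preserving matching.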
Score
8
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY22 (example: What Works Clearinghouses)?

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • The Administration for Children and Families has established a common evidence framework adapted for the human services context from the framework for education research developed by the U.S. Department of Education (ED) and the National Science Foundation (NSF). The ACF framework, which includes the six types of studies delineated in the ED/NSF framework, aims to (1) inform ACF’s investments in research and evaluation and (2) clarify for potential grantees and others ACF’s expectations for different types of studies.
6.2 Did the agency have a common evidence framework for funding decisions?
  • While ACF does not have a common evidence framework across all funding decisions, certain programs, such as those listed below, do use a common evidence framework for funding decisions:
    • The Family First Prevention Services Act (FFPSA) enables states to use funds for certain evidence-based services. In April 2019, ACF published the Prevention Services Clearinghouse Handbook of Standards and Procedures, which provides a detailed description of the standards used to identify and review programs and services in order to rate programs and services as promising, supported, and well-supported practices.
    • The PREP Competitive Grants were funded to replicate effective, evidence-based program models or substantially incorporate elements of projects that have been proven to delay sexual activity, increase condom or contraceptive use for sexually active youth, and/or reduce pregnancy among youth. Through a systematic evidence review, HHS selected forty-four models that grantees could use, depending on the needs and age of the target population of each funded project.
6.3 Did the agency have a clearinghouse(s) or a user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • ACF sponsors several user-friendly tools that disseminate and promote evidence-based interventions. Several evidence reviews of human services interventions do so by rating the quality of evaluation studies and presenting results in a user-friendly, searchable format. Current evidence reviews include (1) Home Visiting Evidence of Effectiveness, which provides an assessment of the evidence of effectiveness for early childhood home visiting models that serve families with pregnant women and children from birth to kindergarten entry; (2) the Pathways to Work Evidence Clearinghouse, a user-friendly website that reports on “projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent rigorous evaluations of the projects” and allows users to search for interventions based upon characteristics of the clients served by the intervention; (3) ACF’s Title IV-E Prevention Services Clearinghouse, whose easily accessible and searchable website allows users to find information about mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services designated as “promising,” “supported,” and “well-supported” practices by an independent systematic review; and (4) Child Care & Early Education Research Connections, which promotes high-quality research in child care and early education to support policymaking. Its associated website provides research and data resources for researchers, policymakers, practitioners, and others.
  • Additionally, most ACF research and evaluation projects produce and widely disseminate short briefs, tip sheets, or infographics that capture high-level findings from the studies and make information about program services, participants, and implementation more accessible to policymakers, practitioners, and others invested in the outcomes of the research or evaluation. For example, the PACE project released a series of nine short briefs to accompany the implementation and early impact reports that were released for each of the nine PACE evaluation sites.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • The Administration for Children and Families Evaluation Policy states that it is important for evaluators to disseminate research findings in ways that are accessible and useful to policymakers, practitioners, and the communities that ACF serves. It also states that OPRE and program offices will work in partnership to inform potential applicants, program providers, administrators, policymakers, and funders by disseminating evidence from ACF-sponsored and other good-quality evaluations. Research contracts initiated by OPRE include a standard clause requiring contractors to develop a dissemination plan during early project planning to identify key takeaways, target audiences, and strategies for most effectively reaching the target audiences. The dissemination strategy adopted by OPRE is also supported by a commitment to plain language; OPRE works with its research partners to ensure that evaluation findings and other evidence are clearly communicated. Additionally, OPRE maintains a robust dissemination function that includes the OPRE website and blog, an e-newsletter, and a social media presence on Facebook, Twitter, Instagram, and LinkedIn.
  • The Office of Planning, Research, and Evaluation hosts an annual “Evaluation and Monitoring 101” training for ACF staff to help agency staff better understand how to design, conduct, and use findings from program evaluation and performance monitoring, ultimately building the capacity of agency staff and program offices to use evaluation research and data analysis to improve agency operations.
  • The office biennially hosts two major conferences, the Research and Evaluation Conference on Self-Sufficiency and the National Research Conference on Early Childhood to share research findings with researchers and with program administrators and policymakers at all levels. It also convenes the Network of Infant and Toddler Researchers, which brings together applied researchers with policymakers and technical assistance providers to encourage research-informed practice and practice-informed research, and the Child Care and Early Education Policy Research Consortium, which brings together researchers, policymakers, and practitioners to discuss what is being learned from research that can help inform policy decisions for ACF, states, territories, localities, and grantees and to consider the next steps in early care and education research.
  • The Children’s Bureau sponsors the recurring National Child Welfare Evaluation Summit to bring together partners from child welfare systems and the research community to strengthen the use of data and evaluation in child welfare; disseminate information about effective and promising prevention and child welfare services, programs, and policies; and promote the use of data and evaluation to support sound decision-making and improved practice in state and local child welfare systems.
  • The Administration for Children and Families also sponsors additional resources in support of these efforts.
Score
7
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY22 (examples: prizes and challenges, behavioral science trials, innovation labs/accelerators, performance partnership pilots, and demonstration projects or waivers with rigorous evaluation requirements)?

7.1 Did the agency have staff dedicated to leading its innovation efforts to improve the impact of its programs?
  • In late 2019, ACF implemented a customer experience initiative to enhance its delivery and administration of human services. This initiative focuses on ways to improve the experiences of both grantees and ACF employees. In 2020, ACF named a chief experience officer to lead these efforts. To date, the chief experience officer has led efforts to understand and improve upon the experiences of ACF grantees receiving funding from multiple HHS operating divisions, evaluate and address the challenges that organizations face in applying for competitive grants, and develop an internal tool for ACF teams to assess and improve upon their capability to provide excellent technical assistance to ACF grantees. In 2021, ACF launched an innovation incubator initiative that began with a series of three human-centered design trainings offered to ACF employees to equip them with the skills and resources to identify problems, brainstorm ideas for improvement, and pilot solutions using an empathetic “people-first” mindset. Through this initiative, ACF staff received training in creating, evaluating, awarding, and managing contracts that use human-centered design services. Participating staff also have access to the ACF Innovators community, a shared platform that supports interoffice idea generation and collaboration. While the customer experience initiative is funded on a by-request basis, it received approximately $1,000,000 for FY22 to support this work.
  • ACF is also wrapping up a project, Human-Centered Design for Human Services, to explore the application of human-centered design across its service delivery programs at the federal, state, and local levels. Three state/local agencies received training and technical assistance in using human-centered design and used the approach to address selected challenges. The project developed a theory of change and instruments to assess implementation, supporting future rigorous evaluations of human-centered design in human services programs.
7.2 Did the agency have initiatives to promote innovation to improve the impact of its programs?
  • The Administration for Children and Families’ mission to foster health and well-being by providing federal leadership, partnership and resources for the compassionate and effective delivery of human services is undergirded by six values: dedication, professionalism, integrity, stewardship, respect, and excellence. Its emphasis on excellence, exemplified by innovations and solutions that are anchored in available evidence, build knowledge, and transcend boundaries, drives the agency’s support for innovation across programs and practices.
    • ACF’s customer experience initiative is supporting the development of innovative practices for more efficient and responsive agency operations, improving how ACF understands and meets the needs of grantees and improving its capacity for service delivery. For example, ACF, in partnership with HRSA, convened a gathering of grantees who receive Head Start grants from ACF and federally qualified health center grants from HRSA to create opportunities for grantees to learn from one another and share best practices. Additionally, ACF helped a grantee analyze its data across both Head Start and health center programs to make operational improvements to its program.
    • A Racial Equity Impact Analysis Tool supports ACF’s commitment to advance equity and helps ACF to have a consistent approach to identifying barriers and gaps in the development of new policies and practices, as well as the review of existing ones. Possible uses of this tool include assessing proposed legislation as part of the annual A-19 budget and legislative development process; evaluating disparity impact statements in Notices of Funding Opportunities; and performing an assessment of the pilot for strategic goal 1 of the ACF Strategic Plan, advancing equity.
    • The 2022 Innovation Challenge sought the best ways to advance ACF’s mission by allowing staff to submit ideas for how ACF could solve issues it is facing. Staff voted on the ideas, which will move forward to a panel of judges. Ten winning teams will then receive support, hands-on training, and coaching for twelve weeks as they develop and pilot their ideas.
    • The HHS Equity Research Agenda, one of the tasks supporting Presidential Executive Order 13985 on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government, will help guide HHS efforts to (1) identify methods for disparities/equity research that are appropriately community centered and strengths based and do not pathologize populations that are marginalized and (2) establish ways to appropriately generate new evidence and build on existing evidence to identify and reduce disparities and understand the opportunity and impact of HHS programs on underserved/under-resourced communities. The Administration for Children and Families has publicly described its commitment to equity and published related research and resources on its website. This includes more than five projects with an explicit focus on questions and activities related to advancing equity in human services (e.g., Child Welfare Study to Enhance Equity with Data; African American Child and Family Research Center; Race Equity for Fatherhood, Relationship, and Marriage Programs to Empower Black Families; and Contextual Analysis and Methods of Participant Engagement).
    • The ACFx Customer Experience Survey in 2022 will seek responses from the authorized official for all active ACF discretionary grants with the goal of understanding customer experience. The survey will cover applying for funding, post-award training/orientation, guidance received from the ACF program office, technical assistance received, and experience with reporting and requirements. Specifically, this survey will help give teams working with grant recipients actionable information on where to target improvements, such as identifying whether improvement efforts directed at first-time grant recipients would be more impactful than efforts directed at a particular phase of the grant life cycle.
  • The Administration for Children and Families also administers select grant programs – through innovation projects, demonstration projects, and waivers to existing program requirements – that are designed to both implement and evaluate innovative interventions, either as part of an ACF-sponsored evaluation or through an individual evaluation that accompanies implementation of the innovation.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
Score
7
Use of Evidence in Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY22 (examples: tiered-evidence frameworks, evidence-based funding set-asides, priority preference points or other preference scoring for evidence, and pay for success provisions)?

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
  • In FY22, the five largest competitive grant programs were:
    1. Head Start: $11,000,000,000; eligible applicants: public or private nonprofit organizations, including community-based and faith-based organizations, or for-profit agencies;
    2. Unaccompanied Children Services: $8,000,000,000; eligible applicants: private nonprofit and for-profit agencies;
    3. Refugee Support Services: $307,000,000; eligible applicants: states (via formula), private nonprofit, and for-profit agencies;
    4. Preschool Development Grants: $290,000,000; eligible applicants: states;
    5. Healthy Marriage Promotion and Responsible Fatherhood Grants: $149,000,000; eligible applicants: states, local governments, tribal entities, and community-based organizations, both for-profit and nonprofit, including faith-based.
8.2 Did the agency use evidence of effectiveness to allocate funds in five largest competitive grant programs (e.g., were evidence-based interventions/practices required or suggested and was evidence a significant requirement)?
  • The Head Start Designation Renewal System (DRS) determines whether Head Start/Early Head Start grantees are delivering high-quality comprehensive services to the children and families they serve. These determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on the Classroom Assessment Scoring System (CLASS), an observation-based measure of the quality of teacher-child interactions. Grantees the DRS deems to be underperforming are denied automatic renewal of their grant and must apply for funding renewal through a standard open competition process. Under the most recent announcement language, grantees re-competing for Head Start funds must include in their record of past performance a description of any violations, such as deficiencies, areas of noncompliance, and/or audit findings (p. 28). Applicants may describe the actions they have taken to address these violations. According to Head Start policy, in competitions to replace or potentially replace a current grantee, the responsible HHS official will give priority to applicants that have demonstrated capacity in providing effective, comprehensive, and well-coordinated early childhood education and development services and programs.
8.3 Did the agency use its five largest competitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • The Administration for Children and Families’ template (see p. 14 in Attachment C) for competitive grant announcements includes standard language that funding opportunity announcement drafters may select to require grantees to either (1) collect performance management data that contribute to continuous quality improvement and are tied to the project’s logic model or (2) conduct a rigorous evaluation for which applicants must propose an appropriate design specifying research questions, measurement, and analysis.
  • As a condition of award, Head Start grantees are required to participate fully in ACF-sponsored evaluations if selected to do so. As such, ACF has an ongoing research portfolio that is building evidence in Head Start. Research sponsored through Head Start funding over the past decade has provided valuable information not only to guide program improvement in Head Start itself, but also to guide the field of early childhood programming and early childhood development. Dozens of Head Start programs have collaborated with researchers in making significant contributions in terms of program innovation and evaluation, as well as the use of systematic data collection, analysis, and interpretation in program operations.
  • The Administration for Children and Families’ 2020 HMRF grants established required evidence activities by scope of grantee services. For example, grantees providing large-scope services (requesting funding between $1,000,000 and $1,500,000) “must propose a rigorous impact evaluation (i.e., randomized controlled trial (RCT) or high-quality, quasi-experimental design (QED) study) . . . and must allocate at least 15%, but no more than 20%, of their total annual funding for evaluation” (an illustrative calculation of this set-aside appears after this list). Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities and conduct a local evaluation or participate in a federally led evaluation or research effort. ACF has an ongoing research portfolio building evidence related to Strengthening Families, Healthy Marriage, and Responsible Fatherhood, and has conducted randomized controlled trials with grantees in each funding round of these grants.
  • ACF reviewed performance data from the 2015 cohort of HMRF grantees (using the nFORM system) to set priorities, interests, and expectations for HMRF grants that were awarded in 2020. For example, because nFORM data indicated that organizations were more likely to meet enrollment targets and engage participants when they focused on implementing one program model, ACF’s 2020 funding opportunity announcement, which led to 113 HMRF grant awards in September 2020, mentioned specific interest in grantee projects “that implement only one specific program model designed for one specific youth service population.”
  • In its award decisions, ACF gave “preference to those applicants that were awarded a Healthy Marriage or Responsible Fatherhood grant between 2015 and 2019, and that (a) [were] confirmed by ACF to have met all qualification requirements under Section IV.2, The Project Description, Approach, Organizational Capacity of this FOA; and (b) [were] confirmed by ACF to have received an acceptable rating on their semi-annual grant monitoring statements during years three and four of the project period. [ACF gave] particular consideration to applicants that: (1) designed and successfully implemented, through to end of 2019, an impact evaluation of their program model, and that the impact evaluation was a fair impact test of their program model and that was not terminated prior to analysis; or (2) successfully participated in a federally-led impact evaluation”.
  • The Administration for Children and Families also evaluated HMRF grant applicants based upon their capacity to conduct a local impact evaluation and their proposed approach (for applicants required or electing to conduct local evaluations); their ability to provide a reasonable rationale and/or research base for the program model(s) and curriculum(a) proposed; and their inclusion of a continuous quality improvement plan clearly describing the organizational commitment to data-driven approaches for identifying areas to improve program performance, testing potential improvements, and cultivating a culture and environment of learning and improvement, among other things. Further, the compliance and performance reviews (CAPstone) entail a thorough review of each grantee’s performance. The Office of Family Assistance (OFA) sends a formal set of questions about grantee performance that the grant program specialists and technical assistance providers answer ahead of time; OFA, OPRE, and the technical assistance provider then convene meetings where each grantee’s performance is discussed at length using nFORM data and the answers to those questions.
  • The 2003 Reauthorization of the Runaway and Homeless Youth Act called for a study of long-term outcomes for youth served through the Transitional Living Program. In response, ACF is sponsoring a study that will capture data from youth at program entry and at intermediate and longer-term follow-up points after program exit and will assess outcomes related to housing, education, and employment. Additionally, ACF is sponsoring a process evaluation of the 2016 Transitional Living Program Special Population Demonstration Project.
  • The Administration for Children and Families manages the Runaway and Homeless Youth Training and Technical Assistance Center (RHYTTAC), the national entity that provides resources and direct assistance to RHY grantees and other youth-serving organizations eligible to receive RHY funds. This training and technical assistance center disseminates information about and supports grantee implementation of high-quality, evidence-informed, and evidence-based practices. In the most recent RHYTTAC grant award, applicants were evaluated based on their strategy for tracking RHY grantee uptake and implementation of evidence-based or evidence-informed strategies. Additionally, as described in the FY22 Transitional Living Program funding opportunity announcement, successful applicants must train all staff and volunteers on evidence-informed practices and provide case management services that include the development of service and treatment plans employing evidence-informed strategies.
  • The Administration for Children and Families also evaluates Unaccompanied Children Services, Preschool Development Grants, and RHY grant applicants based upon their proposed program performance evaluation plan, how their data will contribute to continuous quality improvement, and their demonstrated experience with comparable program evaluation, among other factors.
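As referenced above, the following minimal sketch (in Python, purely illustrative and not part of any ACF system) shows the arithmetic behind the 2020 HMRF evaluation set-aside for large-scope grantees: an award between $1,000,000 and $1,500,000 must reserve between 15% and 20% of total annual funding for its rigorous impact evaluation. The function name and sample award amount are hypothetical.

```python
# Illustrative only: the 15%-20% evaluation set-aside for large-scope 2020 HMRF grantees.
def hmrf_evaluation_budget_range(annual_award: float) -> tuple[float, float]:
    """Return the (minimum, maximum) evaluation budget for a large-scope HMRF award."""
    if not 1_000_000 <= annual_award <= 1_500_000:
        raise ValueError("Large-scope HMRF awards range from $1,000,000 to $1,500,000.")
    return annual_award * 0.15, annual_award * 0.20

low, high = hmrf_evaluation_budget_range(1_200_000)  # hypothetical award amount
print(f"Evaluation budget must fall between ${low:,.0f} and ${high:,.0f}")
# -> Evaluation budget must fall between $180,000 and $240,000
```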
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • ACF’s Personal Responsibility Education Program includes three individual discretionary grant programs that fund programs exhibiting evidence of effectiveness, innovative adaptations of evidence-based programs, and promising practices that teach youth about abstinence and contraception to prevent pregnancy and sexually transmitted infections.
  • To receive funding through ACF’s Sexual Risk Avoidance Education program, applicants must cite evidence published in a peer-reviewed journal and/or a randomized controlled trial or quasi-experimental design to support their chosen interventions or models.
8.5 What are the agency’s 1-2 strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • As mentioned above, ACF is conducting a multipronged evaluation of the Health Profession Opportunity Grants (HPOG) Program. Findings from the first cohort of HPOG grants influenced the funding opportunity announcement for the second round of funding (HPOG 2.0). ACF used findings from the impact evaluation of the first cohort of HPOG grants to provide insights to the field about which HPOG program components are associated with stronger participant outcomes. For example, based on the finding that many participants engaged in short-term training for low-wage, entry-level jobs, the HPOG 2.0 funding opportunity announcement more carefully defined the career pathways framework, described specific strategies for helping participants progress along a career pathway, and identified and defined key HPOG education and training components. Applicants were required to more clearly describe how their program would support career pathways for participants. Based on an analysis indicating limited collaboration with health care employers, the HPOG 2.0 funding opportunity announcement required applicants to demonstrate the use of labor market information, consult with local employers, and describe their plans for employer engagement. The announcement also placed more emphasis on providing basic skills education and assessing barriers, based on the finding that many programs were screening out applicants with low levels of basic literacy, reading, and numeracy skills in order to serve clients who were most prepared to benefit.
  • The Administration for Children and Families’ Personal Responsibility Education Innovative Strategies Program grantees must conduct independent evaluations of their innovative strategies for the prevention of teen pregnancy, births, and sexually transmitted infections, supported by ACF training and technical assistance. These rigorous evaluations are designed to meet the HHS Teen Pregnancy Prevention Evidence-Based Standards and are expected to generate lessons learned so that others can benefit from these strategies and innovative approaches.
  • In 2019, ACF awarded two child welfare discretionary grants to build knowledge of what works. Regional Partnership Grants to Increase the Well-Being of, and to Improve the Permanency Outcomes for, Children and Families Affected By Opioids and Other Substance Abuse aim to build evidence on the effectiveness of targeted approaches that improve outcomes for children and families affected by opioids and other substance use disorders. To this end, grantees will evaluate their local program; select and report on performance indicators that align with proposed program strategies and activities; and participate in a national cross-site evaluation that will describe outcomes for children, adults, and families enrolled in regional partnership grant projects as well as the outcomes of the partnerships. Grants for Community Collaboratives to Strengthen and Preserve Families will support the development, implementation, and evaluation of primary prevention strategies to improve the safety, stability, and well-being of all families through a continuum of community-based services and supports. Projects will include both process and outcome evaluations.
8.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity building efforts?
  • The template for ACF’s competitive grant announcements includes standard language instructing grantees to conduct evaluation efforts. Program offices may use this template to require grantees to collect performance data or conduct a rigorous evaluation. Applicants are instructed to include third-party evaluation contracts in their proposed budget justifications.
  • ACF’s 2020 HMRF grants established required evidence activities by scope of grantee services. For example, grantees providing large-scope services (requesting funding between $1,000,000 and $1,500,000) “must propose a rigorous impact evaluation (i.e., randomized controlled trial (RCT) or high-quality, quasi-experimental design (QED) study) . . . and must allocate at least 15%, but no more than 20%, of their total annual funding for evaluation”. Regardless of their scope of services, all 2020 HMRF grantees must plan for and carry out continuous quality improvement activities and conduct a local evaluation or participate in a federally led evaluation or research effort.
  • ACF’s 2018 Preschool Development Grants funding announcement notes that “it is intended that States or territories will use a percentage of the total amount of their [renewal] grant award during years two through four to conduct the proposed process, cost, and outcome evaluations, and to implement a data collection system that will allow them to collect, house, and use data on the populations served, the implementation of services, the cost of providing services, and coordination across service partners.”
  • ACF’s rules (section 1351.15) allow RHY grant awards to be used for data collection and analysis.
  • Regional Partnership Grants require a minimum of 20% of grant funds to be spent on evaluation elements. To support the evaluation capacity of RPG grantees, ACF has provided technical assistance for data collection, performance measurement, and continuous quality improvement; implementation of the cross-site evaluation; and knowledge dissemination. It has also provided group technical assistance via webinars and presentations.
  • Grants for Community Collaboratives to Strengthen and Preserve Families (CCSPF) (p. 7) require a minimum of 10% of grant funds to be used on data collection and evaluation activities (an illustrative calculation of the RPG and CCSPF minimums appears after this list). ACF has supported the evaluation capacity of CCSPF grantees by providing technical assistance for developing research questions, methodologies, and process and outcome measures; implementing grantee-designed evaluations and continuous quality improvement activities; analyzing evaluation data; disseminating findings; and supporting data use in project and organizational decision-making processes.
  • ACF also provides evaluation technical assistance to grantees:
    • to support grantees participating in federal evaluations (e.g., projects supporting grantees from Health Profession Opportunity Grants 2.0 and Tribal Health Profession Opportunity Grants 2.0); and
    • to support grantees conducting their own local evaluations (e.g., projects supporting Healthy Marriage and Responsible Fatherhood grantees, PREP grantees, and YARH grantees).
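As noted in the bullets above, Regional Partnership Grants require at least 20% of grant funds to be spent on evaluation elements, and CCSPF grants require at least 10% for data collection and evaluation activities. The short sketch below is illustrative only; the award amounts are hypothetical and the names are not part of any ACF system.

```python
# Illustrative only: minimum evaluation spending implied by the set-asides described above.
MIN_EVALUATION_SHARE = {"RPG": 0.20, "CCSPF": 0.10}

def minimum_evaluation_spend(program: str, total_award: float) -> float:
    """Return the smallest dollar amount of an award that must go to evaluation activities."""
    return total_award * MIN_EVALUATION_SHARE[program]

# Hypothetical award amounts, for illustration only.
for program, award in [("RPG", 2_500_000), ("CCSPF", 800_000)]:
    print(f"{program}: at least ${minimum_evaluation_spend(program, award):,.0f} for evaluation")
```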
Score
7
Use of Evidence in Noncompetitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY22 (examples: evidence-based funding set-asides, requirements to invest funds in evidence-based activities, and pay for success provisions)?

9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • The Family First Prevention Services Act (FFPSA; Division E, Title VII of the Bipartisan Budget Act of 2018), funded under the Foster Care budget, enables states to use federal funds available under parts B and E of Title IV of the Social Security Act to provide enhanced support to children and families and prevent foster care placements through the provision of evidence-based mental health and substance abuse prevention and treatment services, in-home parent skill-based programs, and kinship navigator services. The act requires an independent systematic review of evidence to designate programs and services as “promising,” “supported,” and “well-supported” practices. Only interventions designated as evidence-based will be eligible for federal funds. Attached to the FY20 Appropriations Act was the Family First Transition Act (P.L. 116-94), which provided grantees with additional time and resources to implement the requirements of FFPSA.
  • Most of ACF’s noncompetitive grant programs are large block grants without the legislative authority to use evidence of effectiveness to allocate funds.
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • TANF Grant Program: The TANF statute gives HHS responsibility for building evidence about the TANF program: “Evaluation of the Impacts of TANF: The Secretary shall conduct research on the effect of State programs funded under this part and any other State program funded with qualified State expenditures on employment, self-sufficiency, child well-being, unmarried births, marriage, poverty, economic mobility, and other factors as determined by the Secretary.”  Since FY17, Congress has designated 0.33% of the TANF Block Grant for related research, evaluation, and technical assistance.  The Administration for Children and Families has a long-standing and ongoing research portfolio in service of building evidence for the TANF Grant Program. It conducts research and evaluation projects in collaboration with TANF grantees, typically in areas where TANF grantees are facing challenges, innovating, or carrying out demonstration projects. Recent and ongoing work includes building evidence around career pathways training programs, subsidized employment approaches, job search assistance, and employment coaching. These are all program approaches used by state and county TANF grantees to meet their employment goals. The administration widely disseminates information from its research and evaluation activities to TANF grantees and provides extensive training and technical assistance.
  • The TANF Data Innovation (TDI) project, launched by ACF in 2017, supports the innovation and improved effectiveness of state TANF programs by enhancing the use of data from TANF and related human services programs. In 2019, the TANF Data Collaborative (TDC), an initiative of the TDI project, conducted a needs assessment survey of all states. It is now supporting a TANF agency Pilot program with eight pilot sites. To support state and local efforts and build strategic partnerships, pilot agencies are receiving funding and intensive training and technical assistance.
  • Child Care Development Block Grant Program: While the Child Care Development Block Grant Act does not allocate funding for states to independently build evidence, the act allows for up to 0.5% of CCDBG funding for a fiscal year to be reserved for HHS to conduct research and demonstration activities and to conduct periodic, external, independent evaluations of the CCDF program with respect to increasing access to child care services and improving the quality and safety of child care services. Health and Human Services must then disseminate the key findings of these evaluations widely and on a timely basis. In recent years, appropriations acts have also authorized the use of up to 0.5% of child care entitlement funds for this purpose. ACF manages this ongoing research portfolio to build evidence for the CCDBG Program, conducting research and evaluation projects in collaboration with CCDBG grantees, typically in areas where they are facing challenges, innovating, or carrying out demonstration projects. Major ongoing and recent projects include the National Survey of Early Care and Education, an assessment of evidence on ratings in quality rating and improvement systems, and several research partnerships between CCDF lead agencies and researchers. ACF widely disseminates information from its research and evaluation activities to CCDF grantees and provides extensive training and technical assistance.
  • Foster Care and Related Child Welfare Grant Programs: The Administration for Children and Families administers several foster care and related child welfare grant programs that do not include funding authority for states to conduct independent evidence-building activities. Some of these programs have set-asides for federal research; the Foster Care Independence Act of 1999, for instance, sets aside 1.5% of the allocation for the John H. Chafee Foster Care Program for Successful Transition to Adulthood for evaluations of promising independent living programs.
  • As such, ACF has an ongoing research and evaluation portfolio on the Title IV-E foster care grant program and related grant programs. It conducts research and evaluation in collaboration with child welfare grantees, typically focusing on areas in which grantees are facing challenges, innovating, or conducting demonstrations. Examples include strategies for prevention of maltreatment, meeting service needs, and improving outcomes for children who come to the attention of child welfare. For instance, the Supporting Evidence Building in Child Welfare project is intended to increase the number of evidence-supported interventions grantees can use to serve the child welfare population. Other child welfare research and evaluation efforts include the National Survey of Child and Adolescent Well-being, Building Capacity to Evaluate Child Welfare Community Collaborations to Strengthen and Preserve Families, Building Capacity to Evaluate Interventions for YARH, and the Child Welfare Study to Enhance Equity with Data.
  • The Administration for Children and Families has begun work on conducting formative evaluations of independent living programs of potential national significance in preparation for possible future summative evaluations. This work builds on the multi-site evaluation of foster youth programs, a rigorous, random assignment evaluation of four programs funded under the Chafee program completed in 2011.
  • Also, ACF’s Community-Based Child Abuse Prevention (CBCAP) formula grants, with a focus on supporting community-based approaches to prevent child abuse and neglect, are intended to inform the use of other child welfare funds more broadly.
  • Child Support Enforcement Research and Evaluation Grant Program: Section 1115 of the Social Security Act provides unique authority for research and evaluation grants to child support enforcement grantees to “improve the financial well-being of children or otherwise improve the operation of the child support program.” For instance, ACF awarded digital marketing grants to test digital marketing approaches and partnerships to reach parents who could benefit from child support services and create or improve two-way digital communication and engagement with parents.
  • The ACF child support enforcement research portfolio is multifaceted, with a variety of research and evaluation components administered to better understand cost and program effectiveness. Research and evaluation within the portfolio have consisted of (1) supporting large multi-state demonstrations that include random assignment evaluations (described in criteria question 7.4), (2) funding a supplement to the Census Bureau’s Current Population Survey, and (3) supporting research activities of other government programs and agencies by conducting matches of their research samples to the National Directory of New Hires (NDNH).
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
  • States applying for funding from ACF’s CBCAP grant program must “demonstrate an emphasis on promoting the increased use and high quality implementation of evidence-based and evidence-informed programs and practices.” The Children’s Bureau defines evidence-based and evidence-informed programs and practices along a continuum with four categories: emerging and evidence-informed; promising; supported; and well supported. Programs determined to fall within specific program parameters will be considered evidence-informed or evidence-based practices, as opposed to programs that have not been evaluated using any set criteria. ACF monitors progress on the percentage of program funds directed toward evidence-based and evidence-informed practices. Similarly, the State Personal Responsibility Education Program (State PREP) awards grants to state agencies to educate young people on both abstinence and contraception to prevent pregnancy and sexually transmitted infections. These State PREP projects replicate effective evidence-based program models or substantially incorporate elements of effective programs.
9.5 What are the agency’s 1-2 strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In Section 413 of the Social Security Act, where Congress gives HHS primary responsibility for building evidence about the TANF program, Congress also commissions HHS to develop “a database (which shall be referred to as the ‘What Works Clearinghouse of Proven and Promising Projects to Move Welfare Recipients into Work’) of the projects that used a proven approach or a promising approach in moving welfare recipients into work, based on independent, rigorous evaluations of the projects”. In April 2020, ACF officially launched the Pathways to Work Evidence Clearinghouse, a user-friendly website that shares the results of the systematic review and provides web-based tools and products to help state and local TANF administrators, policymakers, researchers, and the general public make sense of the results and better understand how this evidence might apply to questions and contexts that matter to them.
  • Additionally, ACF has continued to produce findings from numerous randomized controlled trials providing evidence on strategies that TANF agencies can use, such as subsidized employment, coaching, career pathways, and job search strategies. Ongoing ACF efforts to build evidence for what works for TANF recipients and other low-income individuals include the Building Evidence on Employment Strategies for Low-Income Families project and the Next Generation of Enhanced Employment Strategies project; these projects are evaluating the effectiveness of innovative programs designed to boost employment and earnings among low-income individuals.
  • ACF’s Office of Child Care drew on research and evaluation findings related to eligibility redetermination, continuity of subsidy use, use of dollars to improve the quality of programs, and more to inform regulations related to child care and development block grant reauthorization.
9.6 Did the agency provide guidance which makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Community-Based Child Abuse Prevention programs are authorized as part of the Child Abuse Prevention and Treatment Act (CAPTA), which promotes the use of evidence-based and evidence-informed programs and practices that effectively strengthen families and prevent child abuse and neglect. This includes efforts to improve the evaluation capacity of the states and communities to assess progress of their programs and collaborative networks in enhancing the safety and wellbeing of children and families. The 2020 Program Instruction for the CBCAP grant program states that CBCAP funds made available to states must be used for financing, planning, community mobilization, collaboration, assessment, information and referral, startup, training and technical assistance, information management and reporting, and reporting and evaluation costs for establishing, operating, or expanding community-based and prevention-focused programs and activities designed to strengthen and support families and prevent child abuse and neglect, among other things.
  • The Child Care and Development Block Grant Act of 2014 requires states to spend not less than 7%, 8%, and 9% of their CCDF awards (“quality funds”), respectively, in years 1-2, 3-4, and 5 and beyond after the 2014 CCDBG enactment, on activities to improve the quality of child care services provided in the state, including the following (an illustrative calculation of these minimums appears at the end of this response):
    • 1B: supporting the training and professional development of the child care workforce through . . . incorporating the effective use of data to guide program improvement;
    • 3: developing, implementing, or enhancing a quality rating system for child care providers and services, which may support and assess the quality of child care providers in the State (A) and be designed to improve the quality of different types of child care providers (C);
    • 7: evaluating and assessing the quality and effectiveness of child care programs and services offered in the state, including evaluating how such programs positively impact children.
  • The Administration for Children and Families requires all CCDF lead agencies to annually report on how their CCDF quality funds were expended, including the activities funded and the measures used by states and territories to evaluate progress in improving the quality of child care programs and services. It released a program instruction for state and territorial lead agencies to provide guidance on reporting the authorized activities for the use of quality funds.
  • ACF also provides evaluation technical assistance to grantees.
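As referenced in the CCDBG bullet above, the following minimal sketch (illustrative only; the award amount is hypothetical and the function is not part of any ACF system) applies the quality-spending floors from the 2014 act: at least 7% of a state's CCDF award in years 1-2 after enactment, 8% in years 3-4, and 9% in year 5 and beyond.

```python
# Illustrative only: minimum CCDF "quality funds" under the 2014 CCDBG Act floors.
def minimum_quality_funds(ccdf_award: float, years_since_enactment: int) -> float:
    """Return the minimum amount of a CCDF award reserved for quality-improvement activities."""
    if years_since_enactment <= 2:
        share = 0.07
    elif years_since_enactment <= 4:
        share = 0.08
    else:
        share = 0.09
    return ccdf_award * share

# A hypothetical $100,000,000 award in year 6 requires at least $9,000,000 for quality activities.
print(f"${minimum_quality_funds(100_000_000, 6):,.0f}")
```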
Score
6
Repurpose for Results

In FY22, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes (examples: requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; and using rigorous evaluation results to shift funds away from a program)?

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • The Family First Prevention Services Act of 2018 allows federal matching funds only for evidence-based prevention services offered by states, thereby incentivizing states to shift their spending from non-evidence-based approaches.
  • For ACF’s Child and Family Services Reviews of state child welfare systems, states determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. ACF supports the states with technical assistance and monitors implementation of their plans. States must successfully complete their plans to avoid financial penalties for nonconformance.
  • The ACF Head Start program significantly expanded its accountability provisions with the establishment of five-year Head Start grant service periods and the DRS. The DRS was designed to determine whether Head Start and Early Head Start programs are providing high-quality comprehensive services to the children and families in their communities. Where they are not, grantees are denied automatic renewal of their grant and must apply for funding renewal through an open competition process. Those determinations are based on seven conditions, one of which looks at how Head Start classrooms within programs perform on CLASS, an observation-based measure of the quality of teacher-child interactions. Data from ACF’s Head Start FACES and Quality Features, Dosage, Thresholds and Child Outcomes (Q-DOT) study were used to craft the regulations that created the DRS and informed key decisions in its implementation. This included where to set minimum thresholds for average CLASS scores, the number of classrooms within programs to be sampled to ensure stable program-level estimates on CLASS, and the number of cycles of CLASS observations to conduct. At the time the DRS notification letters were sent out to grantees in 2011, there were 1,421 non-tribal active grants, and of these, 453 (32%) were required to re-compete.
  • Findings from the evaluation of the first round of the HPOG grant program influenced the funding opportunity announcement for the second round of HPOG funding. Namely, the scoring criteria used to select HPOG 2.0 grantees incorporated knowledge gained about challenges experienced in the HPOG 1.0 grant program. For example, based on those challenges, applicants were asked to clearly demonstrate an unmet need in their service area for the proposed education and training activities and to verify that need with local employers. Applicants were also required to provide projections for the number of individuals expected to begin and complete basic skills education. Grantees must submit semiannual and annual progress reports to ACF to show their progress in meeting these projections. If they have trouble doing so, grantees are provided with technical assistance to support improvement or are put on a corrective action plan so that ACF can more closely monitor their steps toward improvement.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • In an effort to create operational efficiencies and increase grantee capacity for mission-related activities, ACF implemented a process in 2019 in which the grants management office completes annual risk modeling of grantee financial administrative datasets, which helps identify organizations that would benefit from targeted technical assistance. The grants management office provides technical assistance to these grantees to improve their financial management and help direct resources toward effective service delivery.
  • As mentioned in 10.1, states reviewed by a Child and Family Services Review and determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity. The Administration for Children and Families supports states with technical assistance and monitors implementation of their plans. It also provides broad programmatic technical assistance to support grantees in improving their service delivery, including the Child Welfare Capacity Building Collaborative. The collaborative is designed to help public child welfare agencies, tribes, and courts enhance and mobilize the human and organizational assets necessary to meet federal standards and requirements; improve child welfare practice and administration; and achieve safety, permanency, and well-being outcomes for children, youth, and families. The administration also sponsors the Child Welfare Information Gateway, a platform connecting child welfare, adoption, and related professionals as well as the public to information, resources, and tools covering topics on child welfare, child abuse and neglect, out-of-home care, adoption, and more.
  • Healthy Marriage and Responsible Fatherhood grantees are required to enter performance data in the nFORM management information system, and ACF uses that data to closely monitor grantee performance toward preestablished performance targets. The administration identifies grantees who are not on track to meet key targets on a regular basis and provides targeted technical assistance through its technical assistance contracts to help them overcome obstacles and improve performance in a data-driven manner. It also relies heavily on nFORM data in its official annual CAPstone review of HMRF grantee performance. In addition, technical assistance providers work closely with HMRF grantees that are conducting local evaluations to ensure that they are implemented successfully. Grantees conducting local impact evaluations are required to demonstrate readiness prior to receiving approval to launch data collection for the impact study.