2022 Federal Index
Performance Management / Continuous Improvement
Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures, and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY22?
Score
7
Millennium Challenge Corporation
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- All MCC investments have a singular objective: to reduce poverty through economic growth. Each program is thus measured by its ability to meet this goal. Additionally, MCC leadership has named three strategic priorities for the agency: climate-smart investments, inclusion and gender, and private sector engagement. The agency published a Climate Change Strategy and Inclusion and Gender Strategy in support of these new agency priorities.
- In an additional effort to track and aggregate evidence across its entire portfolio, MCC has implemented a common indicators structure across the seven sectors in which it invests: energy; land and property rights; education; water, sanitation, and irrigation; health; roads and transport infrastructure; and agriculture. In all MCC countries, projects in these sectors capture evidence across a common set of indicators to allow MCC to build an agency-wide evidence base around its investments.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The Millennium Challenge Corporation is committed to using high-quality data and evidence to drive its strategic planning and program decisions. The M&E plans for all programs and tables of key performance indicators for all projects are available online by compact and threshold program and by sector, for use by both partner countries and the general public. Prior to investment, MCC performs a cost-benefit analysis to assess the potential impact of each project and estimates an economic rate of return (ERR). It uses a 10% ERR hurdle to prioritize and fund the projects with the greatest opportunity to maximize impact (a simplified sketch of this hurdle-rate calculation follows this list). It then recalculates ERRs at investment closeout, drawing on monitoring data (among other data and evidence) to test original assumptions and assess the cost effectiveness of MCC programs. To complete the evidence loop, MCC now includes evaluation-based cost-benefit analysis as part of its independent final evaluations: the evaluators reanalyze the MCC-produced ERR and associated project assumptions five or more years after investment close to understand if and how the benefits actually accrued. These evaluation-based ERRs add to the evidence base by contributing to a better understanding of the long-term effects and sustainable impact of MCC’s programs.
- In addition, MCC produces periodic reports that capture the results of its learning efforts in specific sectors and translate that learning into actionable evidence for future programming. Once MCC has a critical mass of evaluations in a given sector, it distills portfolio-wide learning from that sector in the form of Principles into Practice reports.
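The 10% hurdle described above amounts to solving for the discount rate at which a project’s projected net present value is zero and checking it against the threshold. The following is a minimal sketch of that calculation, assuming hypothetical cash flows; it is illustrative only, not an actual MCC cost-benefit model.

```python
# Minimal sketch of MCC-style ERR screening against a 10% hurdle.
# Cash flows are hypothetical: year-0 investment cost, then annual net benefits.

def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value of annual cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def economic_rate_of_return(cash_flows: list[float],
                            lo: float = 0.0, hi: float = 1.0,
                            tol: float = 1e-6) -> float:
    """Find the rate where NPV = 0 by bisection (assumes one sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid, cash_flows) > 0:
            lo = mid  # project still positive at this rate; true ERR is higher
        else:
            hi = mid
    return (lo + hi) / 2.0

# Hypothetical project: $100M up-front cost, $18M annual benefits for 15 years.
flows = [-100.0] + [18.0] * 15
err = economic_rate_of_return(flows)

HURDLE = 0.10  # MCC's 10% ERR investment hurdle
print(f"ERR = {err:.1%} -> {'passes' if err >= HURDLE else 'fails'} the hurdle")
```

With these illustrative flows the ERR comes out near 16%, so the hypothetical project would clear the hurdle.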
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- To strengthen its credibility on results, transparency, learning, and accountability, MCC continues to implement and expand its reporting system. The Star Report and its associated quarterly business process capture key information throughout the compact and threshold program lifecycle, providing a framework for results and improving MCC’s ability to promote and disseminate learning and evidence. For each compact and threshold program, evidence is collected on performance indicators, evaluation results, partnerships, sustainability efforts, and learning, among other elements. Critically, this information is available in a single report after each program ends: each country’s Star Report is published roughly seven months after its program’s completion.
- Continual learning and improvement are central to MCC’s operating model. The corporation monitors progress toward compact and threshold program results quarterly, using performance indicators specified in the M&E plan for each country’s investments. The M&E plans specify indicators at all levels (process, output, and outcome) so that progress toward final results can be tracked. Every quarter, each partner country submits an indicator tracking table that shows actual performance for each indicator relative to the baseline established before the activity began and the performance targets set in the M&E plan (a simplified sketch of this table structure follows this list). Key performance indicators and their accompanying data by country are updated every quarter and published online. MCC management and the relevant country team review these data in a formal Quarterly Performance Review meeting to assess whether results are being achieved and to integrate this information into project management and implementation decisions.
- The Millennium Challenge Corporation also produces and publishes semiannual updates to an interactive sector-level learning product: Sector Results and Learning pages. These interactive web pages promote learning and inform program design by consolidating the latest monitoring data, independent evaluation results, and lessons from the key sectors in which MCC invests. Critically, this information is now publicly available in one place for the first time, and an interactive learning database allows practitioners to efficiently retrieve past learning to inform new programs. Sector Results and Learning pages have been published for all six common sectors on which MCC reports: water, sanitation, and hygiene; transportation; agriculture and irrigation; education; energy; and land.
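The quarterly indicator tracking table described above pairs each indicator with its pre-activity baseline and M&E-plan target. The sketch below models that structure with hypothetical indicator names and values (not actual MCC data), computing the share of each baseline-to-target gap closed so far.

```python
# Minimal sketch of a quarterly indicator tracking table: each indicator
# carries a pre-activity baseline, an M&E-plan target, and the latest
# quarterly actual. Names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    baseline: float  # value before the activity began
    target: float    # end-of-program target from the M&E plan
    actual: float    # latest reported quarterly value

    def gap_closed(self) -> float:
        """Share of the baseline-to-target gap closed so far."""
        return (self.actual - self.baseline) / (self.target - self.baseline)

tracking_table = [
    Indicator("Households with a grid connection", 12_000, 40_000, 23_500),
    Indicator("Kilometers of road rehabilitated", 0, 250, 95),
]

for ind in tracking_table:
    print(f"{ind.name}: {ind.gap_closed():.0%} of target gap closed")
```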
Score
8
U.S. Department of Education
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- Current (FY22-FY26) and past strategic plans, including the department’s goals, strategic objectives, implementation steps, and performance objectives, can be found on the department’s website, which also contains the department’s annual performance reports (most recent fiscal year) and annual performance plans (upcoming fiscal year).
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The Grants Policy Office within OPEPD works with offices across ED to ensure alignment with the Secretary’s priorities, including evidence-based practices. The Grants Policy Office looks at where ED and the field can continuously improve by building stronger evidence, making decisions based on a clear understanding of the available evidence, and disseminating evidence to decision-makers. Specific activities include strengthening the connection between the Secretary’s policies and grant implementation from design through evaluation; supporting a culture of evidence-based practices; providing guidance to grant-making offices on how to integrate evidence into program design; and identifying opportunities where ED and the field can improve by building, understanding, and using evidence. The Grants Policy Office collaborates with offices across the Department on a variety of activities, including reviews of efforts used to determine grantee performance.
- The Department of Education is focused on efforts to disaggregate outcomes by race and other demographics and to communicate those results to internal and external stakeholders. For example, in FY21, OCDO launched the Education Stabilization Fund (ESF) Transparency Portal at covid-relief-data.ed.gov, allowing ED to track performance, hold grantees accountable, and provide transparency to taxpayers and oversight bodies. The portal includes annual performance report data from CARES Act, Coronavirus Response and Relief Supplemental Appropriations (CRRSA) Act, and ARP Act grantees, allowing ED and the public to monitor support for students and teachers and track grantee progress. The portal displays key data from the annual performance reports, summarizing how the funds were used by states and districts. Data are disaggregated to the extent possible. For example, the Elementary and Secondary School Emergency Relief (ESSER) form asks for counts of students who participated in various activities to support learning recovery or acceleration for subpopulations disproportionately impacted by the COVID-19 pandemic. Categories include students with one or more disabilities, low-income students, English language learners, students in foster care, migratory students, students experiencing homelessness, and five race/ethnicity categories. As of July 2022, the portal includes data from the reporting period ending April 2022.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- As part of the department’s performance improvement efforts, senior career and political leadership convene in quarterly performance review (QPR) meetings. As part of the QPR process, the performance improvement officer leads senior career and political officials in a review of ED’s progress toward its two-year agency priority goals and four-year strategic goals. In each QPR, assembled leadership reviews metrics that are “below target,” brainstorms potential solutions, and celebrates progress toward achieving goals that are “on track” for the current fiscal year.
- Since FY19, the department has conducted after-action reviews after each discretionary grant competition cycle to reflect on successes of the year as well as opportunities for improvement. The reviews resulted in process updates for FY21. In addition, the department updated an optional internal tool to inform policy deliberations and progress on the Secretary’s policy priorities, including the use of evidence and data.
Score
10
U.S. Agency for International Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- USAID partners with the U.S. Department of State to jointly develop and implement clear strategic goals, strategic objectives, and performance goals, which are articulated in the FY22-26 U.S. Department of State-USAID Joint Strategic Plan. The joint strategic plan incorporates issues of racial equity, diversity, and inclusion into the planning process. It includes five goals and nineteen objectives, including modernizing information technology and leveraging data to inform decision-making and support mission delivery. The plan also includes a section on evidence building, and USAID’s and the Department of State’s respective learning agendas are included in the annex.
- The agency measures progress toward its own strategic goals, strategic objectives, and performance goals using data from across the agency, including annual performance plans and reports completed by operating units, and uses that information to report on performance externally through the Annual Performance Plan/Annual Performance Report and the Agency Financial Report.
- To aggregate and track performance in key sectors, USAID works with the U.S. Department of State to develop and manage more than 100 standard foreign assistance indicators that have common definitions and defined collection methods. Once finalized, illustrative indicator data are published on a publicly available website known as Dollars to Results. Finally, USAID reports on agency priority goal and cross-agency priority goal progress on www.performance.gov.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- Most of USAID’s innovation and co-created programs, and those carried out in partnership, follow a data-driven pay-for-results model in which milestones are approved by all parties and payments are made when milestones are achieved. This means that, for some programs, if a milestone is unmet, funds may be reapplied to an innovation or intervention that is achieving results. This rapid, iterative performance model means that USAID more quickly understands what is not working and can move resources away from it and toward what is working.
- Prizes, Grand Challenges, and DIV can also be constructed to use pay-for-results approaches that have the potential to inform future USAID programming. Prizes set a results target, incentivize innovators to hit that target, and pay only after assessors determine that the result has been achieved. A recent competition, the Intelligent Forecasting Competition, incentivized competitors to use data from health care facilities in Côte d’Ivoire to develop intelligent forecasting methods for family planning commodities and to see whether those models outperformed traditional pen-and-paper forecasts. They did. Insights from the prize-winning model are now being tested in a grant to implement intelligent forecasting methods in Côte d’Ivoire’s health facilities. If evidence from the field suggests that intelligent forecasting methods outperform historical forecasts, this approach will be mainstreamed in USAID’s global health commodities procurements. Most challenges, Grand Challenges, and DIV grants are fixed amount awards, a federal grant instrument tailor-made for pay-for-results approaches: payment is tied to milestones achieved, which emphasizes performance (not just compliance) and reduces some administrative burden for all parties (a simplified sketch of this milestone logic follows this list). In addition, development impact bonds, such as Instiglio’s Village Enterprise Development Impact Bond (supported by DIV), create approaches in which USAID pays only for outcomes, not inputs or attempts. The agency believes this model will pave the way for much of USAID’s work to be aligned with a pay-for-results approach. The agency is also piloting the use of the impact per dollar of cash transfers as a minimum standard of cost effectiveness for applicable program designs. Most innovations funded at USAID have a clear cost-per-impact ratio.
- Additionally, USAID missions develop Country Development Cooperation Strategies (CDCSs) with clear goals and objectives and a Performance Management Plan that identifies expected results, performance indicators to measure those results, plans for data collection and analysis, and regular reviews of performance measures so that data and evidence can be used to adapt programs for improved outcomes. The agency also promotes data-informed operational performance management to ensure that it achieves its development objectives and aligns resources with priorities, and it uses its Management Operations Council to conduct an annual strategic review of progress toward the strategic objectives in its strategic plan.
- To improve linkages and break down silos, USAID continues to develop and pilot the Development Information Solution, an enterprise-wide management information system that will enable it to collect, manage, and visualize performance data across units, along with budget and procurement information, and thereby manage and execute programming more efficiently. The agency is currently deploying the performance management module worldwide, with almost half of its operating units using the system.
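The fixed amount award structure described earlier in this list ties each disbursement to a verified milestone, with funds from unmet milestones available for redirection. The sketch below illustrates that logic with hypothetical milestones and payment amounts; it is a simplified model, not USAID’s actual award mechanics.

```python
# Minimal sketch of pay-for-results disbursement under a fixed amount award:
# a payment is released only when assessors verify its milestone, and funds
# tied to unmet milestones remain available to redirect. All values are
# hypothetical.

from dataclasses import dataclass

@dataclass
class Milestone:
    description: str
    payment: float   # fixed amount released on verification
    verified: bool   # set by independent assessors, not the implementer

award = [
    Milestone("Enroll 500 participants", 50_000, verified=True),
    Milestone("Deliver training to 400 participants", 75_000, verified=True),
    Milestone("Show a 20% income gain at follow-up", 100_000, verified=False),
]

disbursed = sum(m.payment for m in award if m.verified)
redirectable = sum(m.payment for m in award if not m.verified)
print(f"Disbursed: ${disbursed:,.0f}; available to redirect: ${redirectable:,.0f}")
```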
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- The agency’s Program Cycle policy (ADS 201.3.2.18) requires that missions conduct at least one portfolio review per year that focuses on progress toward strategy-level results. Missions must also conduct a mid-course stocktaking at least once during the course of implementing their CDCS, which typically spans five years.
- The agency developed an approach, called collaborating, learning, and adapting (CLA), to explicitly ensure adaptation through learning. It is incorporated into USAID’s Program Cycle guidance: “Strategic collaboration, continuous learning, and adaptive management link together all components of the Program Cycle.” Through CLA, USAID ensures that its programming is coordinated with others, grounded in a strong evidence base, and iteratively adapted to remain relevant throughout implementation.
- In its 2022 capacity assessment, USAID was rated advanced in four elements of adaptive management: (1) scheduled processes for systematic review of evidence; (2) the extent to which agency staff work with stakeholders to identify successes, challenges, and related issues; (3) the frequency with which staff raise and document decisions; and (4) the ability of teams and operating units to implement decisions for changes in programming. An advanced level of maturity means that processes are often in place for systematic review and that findings from evidence-generation activities are used in programmatic decisions. Pause-and-reflect opportunities are often hosted for staff and partners. Operating units often work with partners to identify successes, challenges, and subjects that warrant further exploration. Where findings and conclusions are raised, they are often aligned to specific programmatic and operational decisions, and decisions are often documented. At this maturity level, planned actions are often tracked and implemented, and operating units often use data to inform decisions on maintaining, adapting, or discontinuing current approaches, taking action to adapt strategies, projects, or activities as appropriate.
- The chief data officer’s team maintains an internal dashboard, shared with the evaluation officer and statistical official, to help track progress against milestones on an ongoing basis. This helps ensure that data needs are met and that intended results are achieved.
- Beyond its programming, USAID has two senior bodies that oversee enterprise risk management and meet regularly to improve the accountability and effectiveness of USAID programs and operations through holistic risk management. The agency tracks progress toward strategic goals and annual performance goals during data-driven reviews at Management Operations Council meetings. Through input from the Management Operations Council, an annual agency-wide customer service survey, and other analysis, USAID regularly identifies opportunities for operational improvements at all levels of the agency as part of its operational learning agenda as well as the agency-wide learning agenda. The initial set of learning questions in the agency Learning Agenda includes four questions that focus on operational aspects of the agency’s work, influencing everything from internal policy to design and procurement processes, program measurement, and staff training. It also includes key operational questions to support continuous learning and program improvement.
Score
7
AmeriCorps
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- AmeriCorps’ FY22-26 strategic plan (goals and objectives) was approved by OMB and includes performance indicators. The agency has developed an internal-facing implementation tracker to facilitate the assessment of progress toward its goals and objectives.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The agency continued to invest in targeted return on investment analyses in FY22. An overview of findings from eight completed analyses is publicly available. The FY22 investment in assessing agency performance totaled $438,657.44.
- The agency’s chief risk officer also conducted regular risk assessments to ensure that proper internal controls were in place. Its Risk Management Council continued to meet regularly to review these data and adjust processes and practices accordingly to improve organizational outcomes.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- To support AmeriCorps’ continuous improvement cycles, the agency is leveraging two technology tools: an internal-facing tracker for agency priorities and a strategic plan dashboard for agency data needs. These two complementary tools are designed for consistent agency-wide reporting on progress toward goals/objectives.
Score
7
U.S. Department of Labor
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- The Department of Labor’s Performance Management Center (PMC) leads the development of the department’s four-year Strategic Plan (FY 2018-2022) and Annual Performance Report.
- Using a performance reporting and dashboard system linked to component agencies’ annual operating plans, PMC coordinates the deputy secretary’s quarterly reviews of each agency’s program performance to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual strategic reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- Using a performance and budget system linked to component agencies’ annual operating plans, PMC coordinates quarterly reviews of each agency’s program performance to analyze progress and identify opportunities for performance improvements. Learning agendas, updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office, include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual strategic reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
- In March 2022, DOL held the agency’s second Summer Data Equity Challenge, awarding $30,000 to researchers studying the impact of DOL policies and programs on traditionally underserved communities. Awardees will use data to find gaps in DOL’s knowledge and ideally propose practical solutions to fill those gaps and reduce disparities in outcomes.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- The Department of Labor’s performance reporting and dashboard system supports the deputy secretary’s quarterly reviews of each agency’s program performance to analyze progress and identify opportunities for performance improvements. These performance reviews connect to DOL’s broader performance and evaluation activities. Last year, the Office of the Chief Information Officer developed a new dashboard, the CXO Dashboard, reserved for agency leadership, which provides instant access to key administrative data and enables interactive, data-driven assessments of performance.
- The department leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as its PMC Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
Score
7
Administration for Children and Families (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- Every four years, HHS updates its Strategic Plan, which describes its work to address complex, multifaceted, and evolving health and human services issues. The Administration for Children and Families was an active participant in the development of the FY22-FY26 HHS Strategic Plan, which includes several ACF-specific objectives. ACF regularly reports on progress toward the current objectives as part of the FY21 HHS Annual Performance Plan/Report, including the twelve performance measures from ACF programs that support the plan. These performance measures primarily support Goal 3: “strengthen social well-being, equity, and economic resilience.” ACF supports objective 3.1 (provide effective and innovative pathways leading to equitable economic success for all individuals and families), objective 3.2 (strengthen early childhood development and expand opportunities to help children and youth thrive equitably within their families and communities), and objective 3.4 (increase safeguards to empower families and communities to prevent and respond to neglect, abuse, and violence while supporting those who have experienced trauma or violence) by reporting annual performance measures. ACF also participates in the HHS Strategic Review process, an annual assessment of progress on the subset of twelve performance measures it reports as part of the HHS Strategic Plan.
- The Administration for Children and Families launched its strategic plan in early 2022. It incorporates five high-level strategic goals: (1) advance equity by reducing structural barriers including racism and other forms of discrimination that prevent economic and social well-being (goal 1 is intended to be an explicit part of each of the other four goals); (2) take a preventative and proactive approach to ensuring child, youth, family, and individual well-being; (3) use whole-family, community-based strategies to increase financial stability and economic mobility; (4) support communities and families responding to acute needs and facilitate recovery from a range of crises and emergency situations; and (5) enable and promote innovation within ACF to improve the lives of children, youth, families, and individuals. These goals cut across all ACF programs and populations.
- ACF will run five pilots, one per goal, to implement and test ideas over 2022. For instance, the strategic goal 1 pilot is intended to center and integrate the perspectives and experiences of program participants in the design, management, evaluation, and decision-making of ACF programs and operations; it will focus on formalizing a process of consulting with communities that experience racism and other forms of discrimination in order to listen and build trust. Similarly, the strategic goal 4 pilot will support the objective of fostering resiliency among ACF’s customers to help them weather and recover from emergencies, such as the current pandemic; it will assess how grantees and communities are integrating services into their programming to respond to children’s and parents’ social and emotional challenges. Staff from across ACF offices will work together to lead and execute these five pilots and will drive their design and intended outcomes. Following this initial set of pilots, ACF will continue to focus on action-oriented projects that yield meaningful progress on the ground.
- In April 2021, the assistant secretary for ACF announced the launch of an ambitious agency-wide equity agenda and named the associate commissioner of the Administration on Children, Youth and Families as lead for implementing the Executive Order on Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. ACF is advancing equity across four priority work areas: (1) the internal ACF workforce, (2) data, (3) programmatic and policy change, and (4) procurement and grant making. To steer this effort, in May 2021 ACF founded the Equity Advisory Group, made up of leadership from every ACF program office. As of May 2022, every ACF program office has created a strategic plan for how it will advance equity.
- To communicate its progress on these efforts, ACF created a web page describing its equity-related activities since the launch of this agenda, which include holding community roundtables to explore the experiences of African American and Black individuals and families in accessing ACF programs and defining and articulating ACF’s plan through an information memorandum.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The Office of Planning, Research, and Evaluation currently reviews all ACF funding opportunity announcements and advises program offices, in accordance with their respective legislative authorities, on how to best integrate evidence into program design. Similarly, program offices have applied ACF research to inform their program administration. For example, ACF developed the Learn Innovate Improve (LI2) model, a systematic evidence-informed approach to program improvement that has since informed targeted technical assistance efforts for the TANF program and the evaluation requirement for the child support demonstration grants.
- Administration for Children and Families’ programs also regularly analyze and use data to improve performance. For example, two ACF programs (Health Profession Opportunity Grants and Healthy Marriage and Responsible Fatherhood) have developed advanced web-based management information systems (PAGES and nFORM, respectively) that are used to track grantee progress, produce real-time reports so that grantees can use their data to adapt their programs, and record grantee and participant data for research and evaluation purposes.
- ACF also uses the nFORM data to conduct the HMRF Compliance Assessment and Performance (CAPstone) Grantee Review, a process by which federal staff and technical assistance providers assess grantee progress toward and achievement in meeting programmatic, data, evaluation, and implementation goals. The results of the CAPstone process guide federal directives and future technical assistance.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- Administration for Children and Families’ program areas take tailored approaches to continuous improvement and rapid learning, taking into account cultural factors wherever appropriate. For example,
- ACF provides continuous quality improvement (CQI) resources specifically for PREP grantees, SRAE grantees, child welfare agencies, and home visiting grantees;
- ACF provides CQI training and technical assistance for Tribal Home Visiting grantees, Tribal TANF-Child Welfare Coordination grantees, TANF, PREP grantees, SRAE grantees, and HMRF grantees;
- ACF is exploring how child care and Head Start programs can institutionalize CQI using a Breakthrough Series Collaborative approach;
- ACF has launched two learning collaboratives: (1) the Engaging Fathers and Paternal Relatives: A Continuous Quality Improvement Approach in the Child Welfare System and (2) the Tribal Home Visiting Institute, to implement and evaluate an adapted version of the Breakthrough Series Collaborative;
- ACF has developed the Learn Innovate Improve (LI2) model, which has been used with TANF programs, and is using rapid-cycle evaluation methods to help Responsible Fatherhood (RF) and Healthy Marriage and Relationship Education grantees identify critical implementation challenges and test promising practices to address them.
- ACF also conducts Child and Family Services Reviews to ensure that state child welfare systems conform to federal child welfare requirements; to gauge the experiences of children, youth, and families receiving state child welfare services; and to assist states in enhancing their capacity to help children and families achieve positive outcomes. The reviews are structured to help states identify strengths and areas needing improvement within their agencies and programs. States determined not to have achieved substantial conformity in all the areas assessed must develop and implement a Program Improvement Plan addressing the areas of nonconformity.
Score
9
Substance Abuse and Mental Health Services Administration
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- The agency has a strategic plan that was developed by the prior presidential administration and is currently developing its next strategic plan. Its Strategic Plan for FY19 through FY23 outlines five priority areas with goals and measurable objectives to carry out its vision and mission. For each priority area, an overarching goal and a series of measurable objectives are described, followed by examples of key performance and outcome measures SAMHSA will use to track progress. As appropriate, SAMHSA plans to align with and support the goals and objectives outlined in the HHS Strategic Plan.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The Office of Evaluation, in partnership with SAMHSA program centers and in collaboration with program staff, oversees the identification of a set of performance indicators to monitor its programs and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations. The SAMHSA Performance Accountability and Reporting System (SPARS) serves as the mechanism for collecting performance data from agency grantees. Program center staff examine data entered in SPARS on a regular, real-time basis to manage grant programs and improve outcomes. The data in SPARS are available as .csv files, via reports, or through data visualizations.
- In FY22, SAMHSA developed a list of proposed enhancements to SPARS, the system through which staff and grantees view demographic data, to give internal and external stakeholders the ability to examine discretionary grant data (such as demographics and changes in stable housing, education, and employment) in real time and to compare clients by characteristics such as race, ethnicity, gender, and age over time. Each year, SAMHSA produces SPARS-informed program profiles to examine a program’s performance. These profiles include outcomes disaggregated by race and other demographics as well as changes in behavior associated with time in the grant program (a sketch of this kind of disaggregation follows this list). Data from these profiles are shared with grantees through a SPARS newsletter and through SAMHSA Stats e-blasts.
- The Evidence and Evaluation Board will use data and evidence to advise SAMHSA on ways to improve outcomes and returns on its investment. As stated in the board’s charter, the board’s role is to:
- create and maintain an inventory of all agency evaluations, past, current, and future;
- assist in developing the criteria the agency will use to define “significant” evaluations for the purpose of prioritization, apply those criteria to the evaluation inventory, discuss the implications, and make any needed adjustments or revisions to the criteria;
- review other evaluation activities undertaken through other planning to determine whether they are consistent with the maturity of the program, the research questions, and the degree of independence necessary to conduct a rigorous evaluation to the fullest extent possible;
- consistently match the type of evaluation activity with program maturity, complexity, and research goals;
- consistently determine the degree of independence of evaluation activities for different types of programs;
- incorporate these practices and considerations into the contract planning process;
- consistently collect and disseminate meaningful and critical findings to SAMHSA colleagues and to the behavioral health and scientific fields;
- examine evidence discovered through evaluations and ensure that evaluation findings are shared with both internal and external stakeholders;
- incorporate those findings, as appropriate, into discussions of SAMHSA’s future activities and grants;
- review information on grantee challenges, innovations, and successes reported back to the board by government project officers as a component of evidence; and
- develop a “learning agenda” to identify priorities for future evaluation activities.
- Through these functions, the Evidence and Evaluation Board will establish and foster a culture of evaluation and evidence information stewardship and use with the intent of maximizing the value of evaluation data for decision-making, accountability, and the public good.
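The SPARS program profiles described earlier in this list disaggregate client outcomes by demographics and over time. The sketch below shows that kind of analysis on a hypothetical flat-file export with columns race_ethnicity, reporting_period, and a 0/1 stable_housing outcome; the file name and schema are assumptions, not the real SPARS format.

```python
# Minimal sketch of disaggregating a client outcome by demographics and
# reporting period, in the spirit of a SPARS-informed program profile.
# The file name and column names are assumptions, not the real SPARS schema.

import pandas as pd

clients = pd.read_csv("spars_export.csv")  # hypothetical grant-data export

# Share of clients with stable housing (coded 0/1), by race/ethnicity
# and reporting period, arranged with one column per period.
profile = (
    clients.groupby(["race_ethnicity", "reporting_period"])["stable_housing"]
    .mean()
    .unstack("reporting_period")
)
print(profile.round(2))
```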
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- The Evidence and Evaluation Board is one of the primary agency-wide mechanisms for identifying promising practices, problem areas, possible causal factors, and opportunities for improvement. The inclusion and active participation of senior center/office and agency leadership on the board facilitates broad, rapid, agency-wide dissemination of best practices. The board has designated staff who, along with the chief data officer, the evaluation officer, and CBHSQ staff, work with vice chairs and others to identify promising practices, problem areas, causal factors, and opportunities for improvement across SAMHSA’s centers and offices. For example, the results of the Results for America Report will be shared during the November meeting to consider areas for improvement in FY23.
- In addition, SAMHSA is dedicated to continuous improvement in addressing behavioral health equity. To that end, a disparity impact statement (DIS) is required of SAMHSA grant recipients. The statement is intended to help both the grantee and SAMHSA gain a greater understanding of how funding is being used to reduce behavioral health disparities across the nation, in alignment with Executive Order 13985 (Advancing Racial Equity and Support for Underserved Communities through the Federal Government). The DIS helps establish expectations around tackling disparities, articulate how to address social determinants of health, and develop and implement a quality improvement plan to reduce identified disparities. The DIS has been adopted by some sister HHS operating divisions and is under consideration for adoption by others. In addition to the DIS, SAMHSA is developing internal equity dashboards to establish a baseline and track progress in this important area.
- Beginning in April 2020 and extending through FY22 and into FY23, CBHSQ’s Office of Evaluation has offered weekly technical assistance and training on data analysis, performance management, and evaluation. These one-hour sessions offer opportunities for SAMHSA program center staff and CBHSQ to share challenges and opportunities faced by grantees and strategize solutions. These sessions also offer an opportunity for cross-center collaboration and process improvement as project officers share and learn from those managing programs in other centers. These cross-center meetings allow CBHSQ to learn about challenges in the field, technological challenges using SPARS, and opportunities to make the system more user friendly. The project officers often share grantee questions and concerns for discussion and joint problem solving. SAMHSA collects these questions to include in FAQ documents.
- In FY22, SAMHSA began organizing data parties designed to examine SAMHSA data for problem solving, to share diverse perspectives, and to promote opportunities for SAMHSA to discuss ways to improve data collection, data quality, and data use. The first data party included nearly eighty SAMHSA staff discussing data related to SAMHSA’s Minority AIDS Initiative grant programs, including aggregate client-level data. The second focused on how SAMHSA can use its data to address health disparities and on the importance of the DIS. The most recent data party focused on data quality, the impact of missing data on SAMHSA’s ability to draw conclusions from the data, and the importance of clearly stating the limitations of methodology and data analysis.
- SAMHSA organized several listening sessions, embedded within topical summits, to inform agency work. For example, community members and individuals with lived experience provided critical insight during listening sessions at the two most recent harm reduction summits, one focused on the needs of tribal communities and the second a more general National Harm Reduction Summit. SAMHSA worked with tribal leaders, the Indian Health Service, and the National Indian Health Board to develop the National Tribal Behavioral Health Agenda. Data derived from the listening sessions central to the recent Recovery Summit were used to inform the Office of Recovery.
Score
10
U.S. Department of Housing and Urban Development
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- The Department of Housing and Urban Development’s FY22-26 Strategic Plan defines strategic objectives, priority outcome goals, and program metrics supporting each objective. Progress on program metrics is tracked through Annual Performance Plans.
- In 2019, HUD expanded the Standards for Success data collection and reporting framework for discretionary grant programs to cover Resident Opportunities and Self-Sufficiency Service Coordinator (ROSS) grants, Multifamily Housing Service Coordinator grants, and Multifamily Housing Budget-Based Service Coordinator Sites. The framework supports better outcomes by providing a more standardized performance measurement framework, better alignment with departmental strategies, and more granular reporting to support analytics.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The department uses data and evidence extensively to improve outcomes and return on investment, primarily through PD&R’s investments in data collection, program demonstrations and evaluations, and research guided by a multi-year learning agenda. The department also makes extensive use of outcome-oriented performance metrics in the Annual Performance Plan, and senior staff oversee and monitor key outcomes and initiatives through quarterly performance management meetings; both will be supported by a new CFO performance management module under development.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- Senior staff support continuous improvement and the oversight and monitoring of key outcomes and initiatives through quarterly performance management meetings. These processes are supported by significant ongoing investments in evidence building, as documented in Annual Performance Plans and the iterative process of developing the Agency Learning Agenda, as well as by the chief financial officer’s development of a new performance management module. Monitoring and analysis based on administrative data complement structured evaluation and program demonstrations.
- The Office of PD&R also hosts ongoing knowledge collaboratives designed to support continuous learning and improve performance. Examples include a Data Knowledge Collaborative, a Randomized Controlled Trial (RCT) Knowledge Collaborative, and a Knowledge Collaborative on Equity in Evaluation, as well as a new interoffice user group that shares information and tools for using statistical software effectively. The Knowledge Collaborative on Equity in Evaluation revised HUD’s Evaluation Policy to incorporate considerations of equity throughout.
Score
8
Administration for Community Living (HHS)
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- As part of the HHS Annual Performance Plan and Report, ACL reports on the following two HHS Agency Priority Goals: (1) to increase the success rate of the Protection and Advocacy Program’s individual or systemic advocacy, thereby advancing the right of individuals with developmental disabilities to receive appropriate community-based services, resulting in community integration and independence, and to have other rights enforced, retained, restored, and/or expanded; and (2) to improve the dementia capability of long-term support systems to create dementia-friendly, livable communities (ACL as lead agency). Outcome measures are available, by program, in ACL’s annual Congressional Budget Justification and include measures of program efficiency. Annual reports to Congress are submitted by ACL’s Administration on Disability, Administration on Aging, and National Institute on Disability, Independent Living, and Rehabilitation Research (NIDILRR). In addition, ACL contributes to other department-wide reports to Congress, such as the HHS Report to Congress on Minority Health.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- The Administration for Community Living employs a moderate approach to analyzing evidence for ways to improve return on investment, one that addresses multiple parts of the agency. In FY20, as part of its ongoing effort to ensure that agency funds are used effectively, ACL funded a three-year contract, focused on ACL’s Administration on Aging, to identify approaches to measure how and to what extent parts of the Aging Network leverage Older Americans Act funds to increase their available resources, as well as how the Aging Network uses resources to measure and improve the quality of services available and provided. This evaluation is ongoing. As part of its new employment research agenda, NIDILRR conducts research to continue developing return-on-investment models that vocational rehabilitation agencies can use to optimize the services they provide. In addition, in January 2021 ACL announced a new phase of the Innovative Technology Solutions for Social Care Referrals challenge competition. It also recently published the results of a study measuring the economic value of volunteerism for Older Americans Act programs.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- As part of ACL’s performance strategy and learning approach, OPE staff present performance data to ACL leadership several times a year. Performance data are reviewed by ACL leadership as part of the budget justification process that informs program funding decisions. OPE staff also conduct annual meetings with ACL staff to report performance measure data and results and to discuss methods for incorporating performance and evaluation findings into funding and operational decision-making. As part of annual evaluation planning, OPE staff consult with ACL center directors to identify evaluation priorities and review proposed evaluation approaches, ensuring that the evaluation questions identified will provide information useful for program improvement. Two projects started in late 2020 with the goal of improving agency performance: a study of how the services provided by ACL grantees influence the social determinants of health and an evaluation of how ACL supports grantee use of the evidence-based programs required under Title IIID of the Older Americans Act. Results from the social determinants of health study will be posted in FY23. In 2021, ACL began using the National Standards for Culturally and Linguistically Appropriate Services (CLAS) in Health and Health Care to inform its evaluation framework. Specifically, ACL funded a project to explore the extent to which its grantees employ CLAS standards in their service delivery processes, particularly their responsiveness to cultural practices, language and communication needs, LGBTQ+ needs, and health literacy. The report from this study will be posted in FY23. ACL also funded a study examining the use and financial value of volunteers in its programs; in addition to a final report, ACL developed an effective practice guide to help grantees use volunteers effectively.