2022 Federal Index
U.S. Department of Labor
9
Leadership
Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY22?
1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s Evaluation Officer or equivalent (example: Evidence Act 313)?
- The chief evaluation officer serves as the DOL evaluation officer and oversees DOL’s Chief Evaluation Office, housed within the Office of the Assistant Secretary for Policy. The office coordinates department-wide evaluations and includes staff and leadership who interpret research and evaluation findings and identify their implications for programmatic and policy decisions.
- The Chief Evaluation Office includes nine full-time staff plus a small number of contractors and one or two detailees. This staff is augmented by research and evaluation units in other DOL agencies, such as the Employment and Training Administration (ETA), which has six full-time employees dedicated to research and evaluation activities. The chief evaluation officer coordinates extensively with these units on the development of a learning agenda, the management of studies, and the dissemination of results.
- In FY22, the Chief Evaluation Office received a direct appropriation of $8,280,000; it may also receive up to 0.75% of funds from statutorily specified program accounts, at the discretion of the Secretary.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s chief data officer or equivalent [example: Evidence Act 202(e)]?
- Building on existing efforts initiated before the OPEN Government Data Act, the Secretary of Labor released Secretary’s Order (02-2019) directing the department to create a chief data officer position and a data governance board to help realize the strategic value in data, as well as to establish, coordinate, and manage policy, processes, and standards for data management. The chief data officer chairs DOL’s data governance body and leads data governance efforts; open data efforts; and associated efforts to collect, manage, and utilize data in a manner that best supports its use to inform program administration and foster data-informed decision-making and policymaking.
- The department has arranged for two permanent staff to support governance and open data efforts as well as compliance with the Evidence Act, the Federal Data Strategy, and DOL’s data governance goals.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support Evidence Act implementation and improve the agency’s major programs?
- Through a Secretary’s Order, DOL has created a structure that coordinates and leverages the important roles within the organization to accomplish objectives like those in the Evidence Act. The Secretary’s Order mandates collaboration among the chief data officer, chief performance officer, chief evaluation officer, chief information officer, and chief statistical officer. This has allowed DOL’s evidence officials to coordinate more closely through both regular and ad hoc meetings. For example, in FY19, the evidence officials reviewed DOL agency learning agendas and Evidence Act reports.
- The Secretary’s Order mandates a collaborative approach to reviewing information technology infrastructure and data asset accessibility; developing modern solutions for managing, disseminating, and generating data; coordinating statistical functions; supporting evaluation, research, and evidence generation; and supporting all aspects of performance management, including assurances that data are fit for purpose.
- The department continues to leverage current governance structures. (For example, the chief evaluation officer continues to play a role in forming DOL agencies’ annual budget requests, recommending how evidence is included in grant competitions, and providing technical assistance to department leadership to ensure that evidence informs policy design.) A number of mechanisms facilitate this process. The chief evaluation officer traditionally participates in quarterly performance meetings with DOL leadership and the Performance Management Center (PMC). The chief evaluation officer reviews agency operating plans and works with agencies and the PMC to coordinate performance targets, measures, and evaluation findings. Quarterly meetings are held with agency leadership and staff as part of the Learning Agenda process, and meetings are held as needed to strategize around addressing new priorities or legislative requirements.
10
Evaluation & Research
Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence building plan) and did it publicly release the findings of all completed program evaluations in FY22?
2.1 Did the agency have an agency-wide evaluation policy [example: Evidence Act 313(d)]?
- The Department of Labor has an Evaluation Policy that formalizes the principles that govern all program evaluations in the department, including methodological rigor, independence, transparency, ethics, and relevance. The policy represents a commitment to using evidence from evaluations to inform policy and practice. It states that “evaluations should be designed to address DOL’s diverse programs, customers, and stakeholders; and DOL should encourage diversity among those carrying out the evaluations.”
2.2 Did the agency have an agency-wide evaluation plan [example: Evidence Act 312(b)]?
- The Chief Evaluation Office develops, implements, and publicly releases evidence building plans and assessments and annual evaluation plans. These plans are based on the agency learning agendas as well as the department’s Strategic Plan priorities, statutory requirements for evaluations, and priorities of the Secretary of Labor and the presidential administration. The evaluation plan includes the studies the office intends to undertake in the next year using set-aside dollars. Appropriations language requires the chief evaluation officer to submit a plan to the U.S. Senate and House Committees on Appropriations outlining the evaluations that will be carried out using dollars transferred to the office. The DOL evaluation plan serves that purpose. The Chief Evaluation Office also works with agencies to undertake evaluations and evidence building strategies to answer other questions of interest identified in learning agendas but not undertaken directly by the Chief Evaluation Office.
2.3 Did the agency have a learning agenda (evidence building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including, but not limited to, the general public, state and local governments, and researchers/academics in the development of that agenda (example: Evidence Act 312)?
- The department’s evidence building plans and assessments outline the process for internal and external stakeholder engagement. Specifically, the Chief Evaluation Office has made explicit outreach efforts with state and local workforce agencies as well as academic scholars, including outreach to historically Black colleges and universities and Hispanic-serving institutions.
- The department publishes multi-year evidence building plans (learning agendas) publicly. Further, in May 2022, the Chief Evaluation Office hosted a public event introducing the office as well as providing an opportunity for attendees to learn about upcoming research activities funded by DOL, including how individuals and organizations can engage with the office and provide input into future research priorities. The evaluation officer provided an overview of the office’s mission and activities, and staff provided an overview of DOL’s new strategic planning documents, the FY22-23 Evaluation Plan and FY22-26 Evidence Building Plan.
2.4 Did the agency publicly release all completed program evaluations?
- All DOL program evaluation reports and findings funded by the Chief Evaluation Office are publicly released and posted on the completed reports section of the website of the Office of the Assistant Secretary for Policy. Department agencies, such as ETA, also post and release their own research and evaluation reports. Some program evaluations include data and results disaggregated by such characteristics as race, ethnicity, and gender. The department’s website also provides accessible summaries and downloadable one-pagers on each study. Its research development and review process includes internal and external working groups and reviews.
- The Chief Evaluation Office publishes a quarterly newsletter and sends email campaigns on large relevant evaluations and other opportunities for academics and researchers; public events are also published on the website.
2.5 Did the agency conduct an Evidence Capacity Assessment that addressed the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts [example: Evidence Act 315, subchapter II (c)(3)(9)]?
- The Chief Evaluation Office sponsored an assessment of DOL’s baseline capacity to produce and use evidence, with the aim of helping the department and its agencies identify key next steps to improve evidence capacity. It developed technical requirements and contracted with the American Institutes for Research/IMPAQ International, LLC (the research team) to design and conduct this independent third party assessment, which included the sixteen DOL agencies in the department’s Strategic Plan. The assessment reflects data collected through a survey of targeted DOL staff, focus groups with selected DOL staff, and a review of selected evidence documents. The capacity assessment is publicly available on DOL’s website.
- The department’s Evaluation Policy touches on its commitment to high-quality, methodologically rigorous research through funding independent research activities. Further, Chief Evaluation Office staff have expertise in research and evaluation methods as well as in DOL programs and policies and the populations they serve. For the majority of evaluation projects, the office also employs technical working groups whose members have deep technical and subject matter expertise. The office leveraged the FY20 learning agenda process to create an interim capacity assessment, per Evidence Act requirements, and has conducted a more detailed assessment of individual agencies’ capacity, as well as DOL’s overall capacity in these areas, to be published in 2022.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
- The department employs a full range of evaluation methods to answer key research questions of interest, including impact evaluations when appropriate. Among DOL’s active portfolio of approximately fifty projects, study types range from rigorous evidence syntheses to implementation studies to quasi-experimental outcome studies and impact studies. Examples of DOL studies with a random-assignment component include an evaluation of a Job Corps demonstration pilot, the Cascades Job Corps College and Career Academy, and the Ready-to-Work Partnership Grant evaluation. An example of a multi-arm randomized controlled trial is the Reemployment Services and Eligibility Assessments evaluation, which assessed a range of strategies to reduce unemployment insurance duration and improve employment as well as wage outcomes.
4
Resources
Did the agency invest at least 1% of program funds in evaluations in FY22 (examples: impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; and rigorous evaluations, including random assignments)?
3.1 ____ invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY22 budget.
- The Department of Labor invested $23,180,000 in evaluations, evaluation technical assistance, and evaluation capacity building, representing 0.16% of the agency’s $14,200,000,000 discretionary budget in FY22.
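As a quick arithmetic check of the 0.16% figure, here is a minimal sketch in Python using only the two dollar amounts reported above:

```python
# Share of the FY22 discretionary budget invested in evaluation,
# computed from the figures reported above.
evaluation_investment = 23_180_000       # dollars
discretionary_budget = 14_200_000_000    # dollars

share_pct = evaluation_investment / discretionary_budget * 100
print(f"{share_pct:.2f}%")  # prints 0.16%
```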
3.2 Did the agency have a budget for evaluation and how much was it (were there any changes in this budget from the previous fiscal year)?
- The Chief Evaluation Office receives $8,280,000 and then may receive up to 0.75% from statutorily specified program accounts, based on the discretion of the Secretary of Labor. In relative terms, between FY21 and FY22, this budget decreased by 0.14%. The Chief Evaluation Office also collaborates with DOL program offices and other federal agencies on additional evaluations being carried out by other offices and/or supported by funds appropriated to other agencies or programs. The office oversaw approximately $9,940,000 in evaluation and evidence building activities in FY19 and approximately $21,000,000 in FY18 as compared to an estimated $40,000,000 in evaluation funding in FY17.
- This amount represents only the dollars that are directly appropriated or transferred to the Chief Evaluation Office. Additionally, many DOL evaluations and research studies are supported by funds appropriated to DOL programs and/or are carried out by other offices within DOL. In some programs, such as the America’s Promise grant evaluation and the Reentry grant evaluation, evaluation set-asides exceed 1% (2.9% and 2.8%, respectively, for these programs).
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
- Grantees and programs that participate in DOL evaluations receive technical assistance related to evaluation activities and implementation through contracted support from evaluators and resources, such as tools, guides, and webinars available on the Evaluation and Research Hub (EvalHub). Additional dissemination and technical assistance resources, including the Research and Evaluation Notes, Evaluation Peer Learning Cohort, and the Evaluation Toolkit: Key Elements for State Workforce Agencies, have been developed. The Chief Evaluation Office partners with DOL agencies like ETA to help states and local areas build evaluation capacity to meet the program evaluation requirements for the Workforce Innovation and Opportunity Act and Reemployment Services and Eligibility Assessment (RESEA). For example, it oversees contracted support through the RESEA program evaluation technical assistance project, which has supported analyses, webinars, tools, and templates to help states understand, build, and use evidence. The project’s technical assistance for states includes a webinar and webcast series and evidence resources posted online to the RESEA community of practice, and each webinar has been viewed by the field between 2,500 and 4,600 times. Additional RESEA evaluation technical assistance products are being developed and will be posted on the Chief Evaluation Office’s website, CLEAR, and in the RESEA community of practice.
- Another notable example is the Chief Evaluation Office’s contracted evaluation-related technical assistance focused on community colleges and Strengthening Community College grantees. In February and March 2022, in collaboration with ETA, it delivered a series of three roundtables that featured research on how community colleges can measure equity in their programs and improve employment-related outcomes for historically marginalized groups. All three rounds of the Strengthening Community College grants require a third party evaluation, and the most recent rounds of grants include a unique approach to provide additional evaluation funding to grantees that propose to conduct rigorous impact, outcomes, or behavioral impact studies. The office oversees contractor technical assistance to grantees to support the development of high-quality evaluation plans.
7
Performance Management / Continuous Improvement
Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY22?
4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
- The Department of Labor’s PMC leads the development of the department’s four-year Strategic Plan (FY 2018-2022) and Annual Performance Report.
- Using a performance and budget system linked to component agencies’ annual operating plans, PMC coordinates quarterly reviews of each agency’s program performance by the deputy secretary to analyze progress and identify opportunities for performance improvements. Learning agendas updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual strategic reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
- Using a performance and budget system linked to component agencies’ annual operating plans, PMC coordinates quarterly reviews of each agency’s program performance to analyze progress and identify opportunities for performance improvements. Learning agendas updated annually by DOL agencies in collaboration with DOL’s Chief Evaluation Office include program performance themes and priorities for analysis needed to refine performance measures and identify strategies for improving performance. The annual Strategic Reviews with leadership include specific discussions about improving performance and findings from recent evaluations that suggest opportunities for improvement.
- In March 2022, DOL held the agency’s second Summer Data Equity Challenge, awarding $30,000 to researchers studying the impact of DOL policies and programs on traditionally underserved communities. Awardees will use data to find gaps in DOL’s knowledge and ideally propose practical solutions to fill those gaps and reduce disparities in outcomes.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
- The Department of Labor’s performance reporting and dashboard system supports quarterly reviews of each agency’s program performance by the deputy secretary to analyze progress and identify opportunities for performance improvements. These performance reviews connect to DOL’s broader performance and evaluation activities. Last year, its Office of the Chief Information Officer developed a new dashboard, the CXO Dashboard, for use only by agency leadership; it supports interactive assessment of progress on performance by providing instant access to key administrative data that enable data-driven decisions.
- The department leverages a variety of continuous learning tools, including the learning agenda approach to conceptualize and make progress on substantive learning goals for the agency, as well as its PMC Continuous Process Improvement (CPI) Program, which supports agencies in efforts to gain operational efficiencies and improve performance. The program directs customized process improvement projects throughout the department and grows the cadre of CPI practitioners through Lean Six Sigma training.
8
Data
Did the agency collect, analyze, share, and use high-quality administrative and survey data consistent with strong privacy protections to improve (or help other entities improve) outcomes, cost effectiveness, and/or the performance of federal, state, local, and other service providers’ programs in FY22 (examples: model data-sharing agreements or data-licensing agreements, data tagging and documentation, data standardization, open data policies, and data use policies)?
5.1 Did the agency have a strategic data plan, including an open data policy [example: Evidence Act 202(c), Strategic Information Resources Plan]?
- The department’s Office of Data Governance led the launch of its data strategy, published in 2022, which includes the following strategic areas: ensuring data quality, building and maintaining data talent, integrating data into existing agency management and planning systems to create a practical and realizable path forward, and expanding the data capabilities for producing sophisticated analytics. This data strategy also includes five data principles and details about public and partner engagement in the development of the plan (p. 4). In addition, DOL has open data assets aimed at developers and researchers who desire data-as-a-service through application programming interfaces hosted by both the Office of Public Affairs and the Bureau of Labor Statistics (BLS). Each of these has clear documentation; is consistent with the open data policy; and offers transparent, repeatable, machine-readable access to data on an as-needed basis.
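For illustration only, the sketch below shows the kind of machine-readable, as-needed access such interfaces provide, using the BLS public timeseries API; the API version, endpoint, and series ID (CUUR0000SA0, the CPI-U all-items index) are assumptions for this example rather than details drawn from DOL’s data strategy.

```python
import requests

# Illustrative query against the BLS Public Data API (v1 requires no key).
# CUUR0000SA0 is the CPI-U "All items" series; both the endpoint version
# and the series ID are assumptions for this sketch.
URL = "https://api.bls.gov/publicAPI/v1/timeseries/data/CUUR0000SA0"

resp = requests.get(URL, timeout=30)
resp.raise_for_status()
payload = resp.json()

# The response nests observations under Results -> series -> data.
for series in payload["Results"]["series"]:
    for obs in series["data"][:5]:  # print the five most recent observations
        print(obs["year"], obs["periodName"], obs["value"])
```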
5.2 Did the agency have an updated comprehensive data inventory (example: Evidence Act 3511)?
- The department has conducted extensive inventories over the last ten years, in part to support common activities such as information technology modernization, White House Office of Management and Budget (OMB) data calls, and the general goal of transparency through data sharing. These form the current basis of DOL’s planning and administration. Some sections of the Evidence Act have led to a different federal posture with respect to data, such as the requirement for data to be open by default and considered shareable unless there is a legal requirement not to do so or the release of such data would pose a disclosure risk. Led by the chief data officer and the DOL Data Board, the department is currently reevaluating its inventories and its public data offerings in light of this very specific requirement and revisiting this issue among all its programs. Because this is a critical prerequisite to developing open data plans, as well as data governance and data strategy frameworks, the agency launched a website housing an updated inventory in FY22.
5.3 Did the agency promote data access or data linkage for evaluation, evidence building, or program improvement [examples: model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c)]?
- The department has multiple restricted use access systems that exceed what would be possible with simple open data efforts. The Bureau of Labor Statistics has a confidential researcher access program, offering access under appropriate conditions to sensitive data. Similarly, the Chief Evaluation Office is launching a restricted use access program for evaluation study partners to leverage sensitive data in a consistent manner to help make evidence generation more efficient.
- The department’s Chief Evaluation Office, Employment and Training Administration, and Veterans Employment and Training Service have worked with the U.S. Department of Health and Human Services (HHS) to develop a secure mechanism for obtaining and analyzing earnings data from the National Directory of New Hires. Since FY20, DOL has entered into interagency data sharing agreements with HHS and obtained data to support ten job training and employment program evaluations.
- Since FY20, the department has continued to expand efforts to improve the quality of and access to data for evaluation and performance analysis through the Data Analytics Unit in the Chief Evaluation Office and through new pilots beginning in the BLS to access and exchange state labor market and earnings data for statistical and evaluation purposes.
5.4 Did the agency have policies and procedures to secure data and protect personal confidential information (example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)?
- The Department of Labor has a shared services approach to data security. In addition, the privacy provisions for BLS and ETA are explicit and publicly available online.
- The department has consistently sought to make as much data as possible available to the public regarding its activities. Examples include its Public Enforcement Database, which makes available records of activity from the worker protection agencies, and the Office of Labor-Management Standards’ online public disclosure room.
5.5 Did the agency provide assistance to city, county, and/or state governments, and/or other grantees on accessing the agency’s datasets while protecting privacy?
- The State Wage Interchange System is a mechanism through which states can exchange wage data with other states in order to satisfy performance-related reporting requirements under the Workforce Innovation and Opportunity Act (WIOA), as well as for other permitted purposes specified in the agreement. The State Wage Interchange System agreement includes DOL’s Adult, Dislocated Worker, and Youth programs (Title I) and Employment Service program (Title III); the Department of Education’s Adult Education and Family Literacy Act program (Title II) and programs authorized under the Carl D. Perkins Career and Technical Education Act of 2006 (as amended); and the Vocational Rehabilitation program (Title IV). These departments have established agreements with all fifty states, the District of Columbia, and Puerto Rico.
- The Employment and Training Administration continues to fund and provide technical assistance to states under the Workforce Data Quality Initiative to link earnings and workforce data with education data in support of state program administration and evaluation. These grants support the development and expansion of longitudinal databases and enhance their ability to share performance data with stakeholders. The databases include information on programs that provide training and employment services and obtain similar information in the service delivery process.
- The Employment and Training Administration is also working to assess the completeness of self-reported demographic data to inform both agency-level equity priorities and future technical assistance efforts for states and grantees to improve the completeness and quality of this information. It has incorporated into funding opportunity announcements the requirement to make any data on credentials transparent and accessible through use of open linked data formats.
- In addition, ETA is working with the department’s Office of the Chief Information Officer to complete a new case management system, known as the Grants Performance Management System, for its national and discretionary grantees. In addition to supporting case management by grantees, this new system supports these grantees in meeting WIOA-mandated performance collection and reporting requirements and enables automation to ensure that programs can continue to meet updated WIOA requirements. As programs onboard to the Grants Performance Management System, the administration continues to integrate it with the Workforce Investment Performance System to seamlessly calculate and report WIOA primary indicators of performance and other calculations in programs’ quarterly performance reports.
- The department is currently developing a new application programming interface (version 3) that will expand its open data offerings, extend its capabilities, and offer a suite of user-friendly tools.
8
Common Evidence Standards / What Works Designations
Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY22 (example: What Works Clearinghouses)?
6.1 Did the agency have a common evidence framework for research and evaluation purposes?
- The Department of Labor’s common evidence framework is its Clearinghouse for Labor Evaluation and Research (CLEAR) evidence guidelines, which describe quality standards for different types of studies. These standards are applied to all independent evaluations, including all third party evaluations of DOL programs determined eligible for CLEAR’s evidence reviews across different topic areas. Requests for proposals also indicate that these CLEAR standards should be applied to all Chief Evaluation Office evaluations when considering which designs are the most rigorous and appropriate to answer specific research questions.
- In addition, the DOL Evaluation Policy identifies principles and standards for evaluation planning and dissemination. The Department of Labor collaborates with other agencies (the Department of Health and Human Services, the Department of Education’s Institute of Education Sciences, the National Science Foundation, and the Corporation for National and Community Service) to develop technological procedures to link and share reviews across clearinghouses.
6.2 Did the agency have a common evidence framework for funding decisions?
- The department uses the CLEAR evidence guidelines and standards to make decisions about discretionary program grants awarded using evidence-informed or evidence-based criteria. The published guidelines and standards are used to identify evidence-based programs and practices and to review studies to assess the strength of their causal evidence or to do a structured evidence review in a particular topic area or time frame to help inform agencies about what strategies appear promising and where gaps exist.
6.3 Did the agency have a clearinghouse(s) or user-friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
- The department’s CLEAR is an online evidence clearinghouse whose goal is to make research on labor topics more broadly accessible to practitioners, policymakers, researchers, and the public so that it can inform their decisions about labor policies and programs. This clearinghouse identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation studies, and causal impact studies. For causal impact studies, it assesses the strength of the design and methodology in studies that look at the effectiveness of particular policies and programs. Its study summaries and icons, found in each topic area, can help users quickly and easily understand what studies found and how much confidence to have in the results. Its search tool allows users to find studies based on target population, including race and other demographic characteristics.
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
- The Department of Labor promotes the use of evidence-based practices to encourage or implement program evaluation, foundation fact-finding, performance measurement, and policy analysis in a variety of ways. For example, the Chief Evaluation Office provides regular briefings for DOL national and regional staff on interim and final results of studies; trainings, research roundtables, and single briefings from external experts on methodological topics and new labor-related research findings through the Chief Evaluation Office Seminar Series; and a monthly research roundup on a variety of labor-related topics for DOL staff, called Labor Research News. The office’s Data Analytics Unit also offers agencies support to pilot sophisticated analyses of existing internal or external data. For the public, the office provides regular updates as well as a quarterly newsletter called Building the Evidence Base; supports trainings for workforce agencies and the public on how to access user-friendly results on a topic across thousands of studies in DOL’s clearinghouse, CLEAR; provides public information on how the department is building evidence by maintaining the DOL Evidence Hub; and supports the dissemination of evidence-based standards and program evaluations, for example, through meta-analyses of career pathways impact evaluations and a user-friendly Career Trajectories and Occupational Transitions Dashboard, or, in collaboration with ETA, through the office’s RESEA evidence-building and program implementation study, which helps states apply evaluation findings to improve the RESEA program.
- Department of Labor agencies also support the use of evidence-based practices. For example, the International Labor Affairs Bureau (ILAB) has an evaluation newsletter and maintains a public website sharing evaluation reports. The Employment and Training Administration maintains a user-friendly database, Workforce System Strategies, that highlights the use of evidence-based interventions as foundational fact finding, and a community of practice, the Evaluation and Research Hub, that supports replication. Workforce System Strategies is a comprehensive database of more than 1,500 profiles that summarize a wide range of resources, from reports and studies to technical assistance tools and guides, that support program administration and improvement. The Evaluation and Research Hub is a community of practice created to support evidence and evaluation-capacity building efforts within state workforce development programs. In another effort to promote evidence-based practices, ETA has supported an applied data analytics program offered through the Coleridge Initiative for multiple teams from state workforce agencies.
6
Innovation
Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY22 (examples: prizes and challenges, behavioral science trials, innovation labs/accelerators, performance partnership pilots, and demonstration projects or waivers with rigorous evaluation requirements)?
7.1 Did the agency have staff dedicated to leading innovation efforts to improve the impact of its programs?
- The Department of Labor’s chief innovation officer is responsible for efforts to use innovative technologies, partnerships, and practices to accelerate the department’s mission. The chief innovation officer reports to the deputy secretary and also serves as the senior advisor for delivery for the department.
- The department’s chief data officer and Chief Evaluation Office Data Analytics team developed a secure data analysis platform accessible to all DOL staff, preloaded with common statistical packages and offering the capability to access and merge various administrative data for analysis. The department supports staff in running web-based A/B tests and other behaviorally informed trials through a shared service, the Granicus GovDelivery communications tool, which includes free technical support. This tool enhances the department’s ability to communicate with the public, such as through targeted email campaigns, and to adjust these communications, informed by testing and data, to increase engagement on relevant topics (a simplified analysis of such a test is sketched below). The Chief Evaluation Office has also developed toolkits and detailed resources for staff to effectively design behaviorally informed tests, shared on its new Behavioral Interventions website.
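To make concrete what analyzing such a behaviorally informed email test can involve, here is a minimal sketch of a two-proportion z-test; the click counts are invented for illustration and are not DOL data.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical results of an email A/B test: two subject lines,
# each sent to 10,000 recipients. Counts are invented for illustration.
n_a, clicks_a = 10_000, 820   # variant A
n_b, clicks_b = 10_000, 910   # variant B

p_a, p_b = clicks_a / n_a, clicks_b / n_b
p_pool = (clicks_a + clicks_b) / (n_a + n_b)          # pooled click rate
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))          # two-sided test

print(f"lift = {p_b - p_a:.4f}, z = {z:.2f}, p = {p_value:.4f}")
```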
7.2 Did the agency have initiatives to promote innovation to improve the impact of its programs?
- The Chief Evaluation Office uses a variety of communication tools to share rigorous research results, lessons learned, promising practices, and other implications of its research. These include internal briefings from independent contractors and researchers, a brown bag series (Chief Evaluation Office Seminar Series) that features evidence-based promising practices and results shared by DOL staff for DOL staff, and an external expert seminar series featuring new findings or innovations in relevant areas of work. Chief Evaluation Office staff consistently use research findings in the development of new research, and DOL agencies use these findings to design and guide new discretionary grant programs, to refine performance measures for grantees, and to make decisions on compliance and enforcement practices.
- The department is strongly committed to promoting innovation in its policies and practices. For example, the Employment and Training Administration’s competitive funding routinely supports innovative programming, since grantees typically bundle various program services and components to best meet the needs of the people they serve in their local contexts. A particularly good example of this innovation is found in the administration’s high-priority area of apprenticeships. In FY19, ETA issued nearly $184,000,000 in Scaling Apprenticeship Through Sector-Based Strategies grants to public-private partnerships for expanding apprenticeships in health care, information technology, and other industries. In FY20, ETA awarded nearly $100,000,000 in Apprenticeship: Closing the Skills Gap grants. The Chief Evaluation Office’s Apprenticeship Evidence Building Portfolio project includes implementation evaluations of both of these grant programs. Additionally, the office oversees a contractor-led evaluation of the American Apprenticeship Initiative, which since 2016 has provided $175,000,000 in grants to forty-five grantees across the nation. In July 2022, ETA issued more than $121,000,000 in Apprenticeship Building America grants to strengthen registered apprenticeship programs.
- In addition, the Chief Evaluation Office’s Behavioral Insights team works with a number of DOL agencies on a continuous basis to identify and assess the feasibility of conducting studies where insights from behavioral science can be used to improve the performance and outcomes of DOL programs. The Wage and Hour Division’s Transformation Team is one such example where continuous improvement efforts are driving innovation; this work has identified potential areas where behavioral interventions and trials may inform program improvement. The office is also working across agencies, including the Wage and Hour Division, Employment and Training Administration, Women’s Bureau, Veterans Employment and Training Service, Office of Federal Contract Compliance Programs, and International Labor Affairs Bureau, to identify other areas where behavioral science can improve program performance and outcomes.
- The Department of Labor has also built capacity for staff innovation through the Performance Management Center’s CPI Program, an agency-wide opportunity that trains and certifies agency staff on Lean Six Sigma methodologies through real-time execution of DOL process improvement projects. The program includes classroom sessions that prepare participants for Lean Six Sigma Black Belt certification examinations, including the American Society for Quality’s exam as well as DOL’s own certification.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
- Through the annual Learning Agenda process, DOL systematically identifies gaps in the use of evidence. Innovation is about filling known gaps via dissemination, further research, or the generation of quick-turnaround assessments such as those offered to the department by the Chief Evaluation Office’s Behavioral Insights Program.
- The department typically couples innovation with rigorous evaluation to learn from experiments. For example, DOL is participating in the Performance Partnership Pilots (P3) for innovative service delivery for disconnected youth. This program not only includes waivers and the blending and braiding of federal funds but also gives bonus points in application reviews for proposing “high tier” evaluations. The Department of Labor is the lead agency for the evaluation of P3. A final report is available on the Chief Evaluation Office’s completed studies website. In 2021, the Chief Evaluation Office partnered with the Social Security Administration, the Office of Disability Employment Policy, and funded contractor Mathematica to support the ongoing evaluation of the Retaining Employment and Talent After Injury/Illness Network (RETAIN) demonstration projects. The office’s contract supports enrollment data collection and the random assignment of study participants for phase two of the RETAIN demonstration. This impact evaluation aims to assess the effectiveness of intervention strategies from RETAIN demonstration projects operating in Kansas, Kentucky, Minnesota, Ohio, and Vermont.
- The department routinely uses Job Corps’ demonstration authority to test and evaluate innovative and promising models to improve outcomes for youth. For example, the Chief Evaluation Office is overseeing a rigorous impact evaluation to examine the effectiveness of one of these pilots, the Cascades Job Corps College and Career Academy, in collaboration with ETA.
10
Use of Evidence in Competitive Grant Programs
Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY22 (examples: tiered-evidence frameworks, evidence-based funding set-asides, priority preference points or other preference scoring for evidence, and pay for success provisions)?
8.1 What were the agency’s five largest competitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
- In FY22, the five largest competitive grant programs and their appropriation amounts were:
- Senior Community Service Employment Program: approximately $401,000,000 in continuation funds; eligible applicants are non-profit organizations, federal agencies, and tribal organizations.
- Apprenticeship Building America Grants: approximately $122,000,000; eligible applicants are nonprofits, labor organizations, public and state institutions of higher education, and county governments.
- National Farmworker Jobs Program: approximately $95,000,000 in continuation funds; eligible applicants are entities with an understanding of the problems of eligible migrant and seasonal farmworkers.
- YouthBuild: approximately $90,000,000; eligible applicants are public and private nonprofit agencies.
- Indian and Native American Program Employment and Training Grants: approximately $72,000,000; eligible applicants generally include federally recognized Indian tribes, tribal organizations as defined in 25 U.S.C. 450b, Alaska-Native-controlled organizations, Native-Hawaiian-controlled organizations, and Indian-controlled organizations as defined at 20 CFR 684.130.
- During the summer of 2021, ETA held a series of stakeholder listening sessions focused on grant equity in an effort to establish a baseline understanding of potential barriers to greater equity in the mix of grant applicants, peer reviewers, awardees, and communities served. This information will help inform future grant making decisions.
8.2 Did the agency use evidence of effectiveness to allocate funds in its five largest competitive grant programs (e.g., were evidence-based interventions/practices required or suggested and was evidence a significant requirement)?
- YouthBuild applicants are awarded points based on past performance; these metrics are viewed as important to demonstrating successful career outcomes for youth. As a pre-apprenticeship program that prepares young people for the construction industry and other in-demand industries, YouthBuild supports the evidence-based national strategy of apprenticeship.
- Other competitive grant programs that score applications for past performance and use of evidence-informed strategies are the Senior Community Service Employment Program and the National Farmworker Jobs Program.
8.3 Did the agency use its five largest competitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
- All five of DOL’s largest grant programs may be involved in evaluations designed by the Chief Evaluation Office and the relevant DOL agencies. In each case DOL requires or encourages (through language in the funding announcement and proposal review criteria) grantees to use evidence-based models or strategies in grant interventions and/or to participate in an evaluation, especially to test new interventions that theory or research suggest are promising.
- The department has recently launched a multi-year implementation study of the Senior Community Service Employment Program as well as other workforce programs for older workers to build the evidence base on these programs and identify future research options. There are options for more rigorous evaluations in the contract as appropriate.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant program (besides its five largest grant programs)?
- The Department of Labor includes requirements of demonstrated effectiveness in the allocation of funds, as well as commitments to building new evidence in order to receive funds; the two carry equal weight, given that many DOL-funded programs lack a sufficient body of evidence to fund only approaches that are already evidence-based. For example, among recent ETA competitive grant programs, this has involved requiring: (1) demonstration of an approach as being evidence-based or promising for receipt of funds (e.g., the Reentry Employment Opportunities Funding Opportunity Announcement) or for potential to receive additional funds (e.g., TechHire); (2) an independent third party local or grantee evaluation with priority incentives for rigorous designs (e.g., tiered funding, scoring priorities, and bonus scoring for evidence-based interventions or multi-site rigorous tests); or (3) full participation in an evaluation as well as rigorous grantee (or local) evaluations (e.g., American Apprenticeship Initiative, America’s Promise Job-Driven Grant Program, and the Strengthening Community College Training Grants). Additionally, applicants for ILAB’s competitive funding opportunities are required to conduct and/or participate in evaluations as a condition of award. The department is also conducting an evaluation of the Pathway Home grant program, which builds on what was learned from the Linking Employment Activities Pre-release (LEAP) program evaluation. This evaluation will build knowledge about the grant models and include the development of a feasibility and design options paper for implementation and impact evaluations.
8.5 What are the agency’s one or two strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
- In 2015, DOL funded an evaluation of the 36-month LEAP program that included an implementation study of LEAP pilot programs providing jail-based American Job Center (AJC) services to individuals preparing to reenter society after time in jail. The findings of the evaluation identified many promising practices for offering both pre- and post-release services and were published in 2018 (see the Final Report and Issue Brief Compendium). Starting in 2020, DOL has funded the Pathway Home Pilot Projects and an accompanying evaluation that builds on lessons learned from the LEAP program by providing workforce services to incarcerated individuals pre- and post-release. For example, the Pathway Home grant requirement that participants keep the same caseworker pre- and post-release was suggested as a promising practice in the LEAP implementation study.
- The department funded a national evaluation of the Trade Adjustment Assistance Community College and Career Training grant program, which was a $1,900,000,000 initiative consisting of four rounds of grants, from 2011 to 2018. The grants were awarded to institutions of higher education (mainly community colleges) to build their capacity to provide workforce education and training programs. The implementation study assessed the grantees’ implementation of strategies to better connect and integrate education and workforce systems, address employer needs, and transform training programs and services to adult learners. The synthesis identified key implementation and impact findings based on a review of evaluation reports completed by grantees’ third party evaluators. The outcomes study examined the training, employment, earnings, and self-sufficiency outcomes of nearly 2,800 participants from nine grants in Round 4. Findings from these studies provide evidence-based practices and insights that are being applied to the three rounds of Strengthening Community College Training Grants Funding Opportunity Announcements in 2021 and 2022, as well as future DOL investments.
8.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
- The Department of Labor has a formal Evaluation Policy. Guidance on using funds to conduct and/or participate in program evaluations and/or to strengthen evaluation capacity building efforts can be found in each grant funding opportunity; this use of funds is a condition of many grants. The Special Program Requirements section of the respective grant funding opportunity notifies grantees of this responsibility. Generally, this section states that, “as a condition of grant award, grantees are required to participate in an evaluation,” if one is undertaken by DOL. The evaluation may include an implementation assessment across grantees, an impact and/or outcomes analysis of all or selected sites within or across grantees, and a benefit/cost analysis or assessment of return on investment. Conducting an impact analysis could involve random assignment [i.e., random assignment of eligible participants into a treatment group that would receive program services or enhanced program services or into control group(s) that would receive no program services or program services that are not enhanced]; a simplified sketch of such an assignment appears below.
- The department may require applicants to collect data elements to aid the evaluation. As a part of the evaluation, as a condition of award, grantees must agree to (1) make records on participants, employers, and funding available to the evaluation contractor; (2) provide access to program operating personnel, participants, and operational and financial records, and any other pertinent documents to calculate program costs and benefits; (3) in the case of an impact analysis, facilitate the assignment by lottery of participants to program services (including the possibility of increased recruitment of potential participants); and (4) follow evaluation procedures as specified by the evaluation contractor under the direction of DOL, including after the period of operation. After award, grantees will receive detailed guidance on ETA’s evaluation methodology, including requirements for data collection. Grantees will receive technical assistance to support their participation in these activities.
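As a simplified illustration of the lottery described above, the sketch below randomly assigns eligible participants to a treatment or control group; in practice the procedure is specified by the evaluation contractor under DOL’s direction, and every name and parameter here is hypothetical.

```python
import random

def assign_by_lottery(participant_ids, treatment_share=0.5, seed=2022):
    """Assign eligible participants to treatment or control by lottery.

    A fixed seed makes the draw reproducible and auditable; the function
    name, parameters, and 50/50 split are illustrative assumptions.
    """
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)                        # random order is the lottery
    cut = round(len(ids) * treatment_share)
    return {"treatment": ids[:cut], "control": ids[cut:]}

groups = assign_by_lottery(f"P{i:03d}" for i in range(100))
print(len(groups["treatment"]), len(groups["control"]))  # 50 50
```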
7
Use of Evidence in Noncompetitive Grant Programs
Did the agency use evidence of effectiveness when allocating funds from its non-competitive grant programs in FY22 (examples: evidence-based funding set-asides, requirements to invest funds in evidence-based activities, and pay for success provisions)?
9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
- In FY22, the five largest noncompetitive grant programs at DOL, all administered by ETA, were:
- Unemployment Insurance State Administration: $2,591,816,000; eligible grantees: city, county, and/or state governments.
- Dislocated Worker Employment and Training formula grants: $1,075,553,000; eligible grantees: city, county, and/or state governments.
- Youth Activities: $933,130,000; eligible grantees: city, county, and/or state governments.
- Adult Employment and Training Activities: $870,649,000; eligible grantees: city, county, and/or state governments.
- Employment Security Grants to States: $675,052,000; eligible grantees: city, county, and/or state governments.
9.2 Did the agency use evidence of effectiveness to allocate funds in the largest five noncompetitive grant programs (e.g., were evidence-based interventions/practices required or suggested and was evidence a significant requirement)?
- A signature feature of WIOA (Pub. L. 113-128) is its focus on the use of data and evidence to improve services and outcomes, particularly in provisions related to states’ role in conducting evaluations and research, as well as in requirements regarding data collection, performance standards, and state planning. Conducting evaluations is a required statewide activity, but there are additional requirements regarding coordination (with other state agencies and federal evaluations under WIOA), dissemination, and provision of data and other information for federal evaluations.
- Evidence and performance provisions of WIOA (1) increased the amount of WIOA funds states can set aside and distribute directly from between 5 and 10% to 15% and authorized them to invest these funds in pay for performance initiatives; (2) authorized states to invest their own workforce development funds, as well as non-federal resources, in pay for performance initiatives; (3) authorized local workforce investment boards to invest up to 10% of their WIOA funds in pay for performance initiatives; and (4) authorized states and local workforce investment boards to award pay for performance contracts to intermediaries, community based organizations, and community colleges.
9.3 Did the agency use its five largest non-competitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
- Section 116(e) of WIOA and 20 CFR 682.220 describe how the state, in coordination with local workforce boards and state agencies that administer the programs, shall conduct ongoing evaluations of activities carried out in the state under these core programs. These evaluations are intended to promote, establish, implement, and utilize methods for continuously improving core program activities in order to achieve high-level performance within, and high-level outcomes from, the workforce development system.
- The Employment and Training Administration sponsors WorkforceGPS, which is a community point of access to support workforce development professionals in their use of evaluations to improve state and local workforce systems. Professionals and leaders can access a variety of resources and tools, including an Evaluation Peer Learning Cohort to help them improve their research and evaluation capacities. WorkforceGPS includes links to resources on evaluation readiness assessment, evaluation design, and performance data, all focused on improving the public workforce system. As of FY21, eighteen state teams, consisting of the WIOA core partners, have voluntarily participated in the Evaluation Peer Learning Cohort technical assistance activities to gauge and build capacity for research and evaluation.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other noncompetitive grant programs (besides its five largest grant programs)?
- Reemployment Services and Eligibility Assessments funds must be used for interventions or service delivery strategies demonstrated to reduce the average number of weeks of unemployment insurance a participant receives by improving employment outcomes. The law provides for a phased implementation of the new program requirements over several years. In FY19, DOL awarded $130,000,000 to states to conduct RESEA programs that met these evidence of effectiveness requirements. Beginning in FY23, states must also use no less than 25% of RESEA grant funds for interventions with a high or moderate causal evidence rating showing a demonstrated capacity to improve outcomes for participants. This percentage increases in subsequent years until after FY26, when states must use no less than 50% of such grant funds for such interventions. Training and Employment Guidance Letter No. 05-21 and Unemployment Insurance Program Letter No. 10-22 describe the evaluation and evidence-based expectations for FY22 through FY27. These expectations stipulate the use of existing evidence, the building of future evidence, and the use of funding for interventions with high or moderate causal ratings.
9.5 What are the agency’s one or two strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
- Institutional Analysis of American Job Centers: The goal of the evaluation was to understand and systematically document the institutional characteristics of American Job Centers (AJCs) and to identify variations in service delivery, organizational structure, and administration across AJCs.
- Career Pathways Descriptive and Analytical Study: The purpose of this project was to build evidence about the implementation and effectiveness of career pathways approaches to education and training. In 2018, the Chief Evaluation Office partnered with the ETA to conduct the Career Pathways Descriptive and Analytical project, which included a portfolio of three studies: a meta-analysis of the impacts of career pathways program approaches, a longitudinal study of career trajectories and occupational transitions, and an exploratory machine learning study. Researchers used data from four large nationally representative longitudinal surveys, as well as licensed data on occupational transitions from online career profiles, to examine workers’ career paths and wages. One of the final products was an interactive Career Trajectories and Occupational Transitions Dashboard.
- Analysis of Employer Performance Measurement Approaches: The goal of the study was to examine the appropriateness, reliability, and validity of proposed measures of effectiveness in serving employers, as required under WIOA. It included knowledge development to understand and document the state of the field, an analysis and comparative assessment of measurement approaches and metrics, and dissemination of findings through a report as well as research and topical briefs. Although the authors did not find an overwhelming case for adopting either a single measure or several measures, they noted that adopting more than one measure captures more aspects of performance and may make results more actionable for the different Title I, II, III, and IV programs, while a single measure offers clarity about how state performance is assessed and requires fewer resources for record keeping.
9.6 Did the agency provide guidance that makes clear that city, county, and state governments and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
- The Employment and Training Administration’s RESEA grantees may use up to 10% of their grant funds for evaluations of their programs. The administration released specific evaluation guidance to help states understand how to conduct evaluations of their RESEA interventions with these grant funds. The goal of the agency guidance, along with the evaluation technical assistance being provided to states and their partners, is to build states’ capacity to understand, use, and build evidence.
- Section 116 of WIOA establishes performance accountability indicators and performance reporting requirements to assess the effectiveness of states and local areas in achieving positive outcomes for individuals served by the workforce development system’s core programs. Section 116(e) of WIOA and 20 CFR 682.220 require states to “employ the most rigorous analytical and statistical methods that are reasonably feasible, such as the use of control groups” and to evaluate the effectiveness of their WIOA programs in an annual progress report that includes updates on (1) current or planned evaluation and related research projects, including the methodologies used; (2) efforts to coordinate the development of evaluation and research projects with WIOA core programs, other state agencies, and local boards; (3) a list of completed evaluation and related reports, with publicly accessible links to such reports; (4) efforts to provide data, survey responses, and timely site visits for federal evaluations; and (5) any continuous improvement strategies utilizing results from studies and evidence-based practices evaluated. States are permitted to use WIOA grant funds to perform the necessary performance monitoring and evaluations, and they are required to describe their approach to conducting evaluations in the state plans they submit to ETA and partner agencies.
Repurpose for Results
In FY22, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes (examples: requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; and using rigorous evaluation results to shift funds away from a program)?
10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
- Many of the ETA’s competitive grant programs require consideration of past performance. For example, prospective YouthBuild applicants are selected in part on the basis of their past performance and demonstrated effectiveness in achieving critical outcomes for youth.
- Reforming Job Corps provides an example of such efforts to repurpose resources based upon rigorous analysis of available data. As part of this reform effort, DOL’s FY20 budget request proposed ending the Department of Agriculture’s involvement in the program and unifying responsibility within DOL: workforce development is not a core Department of Agriculture role, and the twenty-five centers it operates are overrepresented in the lowest-performing cohort of centers.
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
- The Department’s ETA staff review and provide feedback on the WIOA state plans and two-year modifications, including feedback and technical assistance on planned evaluations. To supplement this feedback, ETA sponsors the Evaluation and Research Hub on WorkforceGPS, a community point of access where workforce development professionals and leaders can find resources and tools to support their use of evaluations to improve state and local workforce systems, including a peer learning cohort community to help improve their research and evaluation capacities and links to resources on evaluation readiness assessment, evaluation design, and performance data. Additionally, ETA provides technical assistance to states that fail to achieve negotiated levels of performance under WIOA, followed by sanctions if performance does not improve.