2022 Federal Index


Substance Abuse and Mental Health Services Administration

Score
9
Leadership

Did the agency have senior staff members with the authority, staff, and budget to build and use evidence to inform the agency’s major policy and program decisions in FY22?

1.1 Did the agency have a senior leader with the budget and staff to serve as the agency’s evaluation officer or equivalent (example: Evidence Act 313)?
  • The director of the SAMHSA Center for Behavioral Health Statistics and Quality (CBHSQ), Office of Evaluation, serves as the agency’s evaluation lead, with key evaluation staff housed in this division. The Office of Evaluation provides centralized planning and management of SAMHSA’s program evaluations in partnership with the originating center’s program. This office has led the agency in developing several resources and activities in support of the Foundations for Evidence-Based Policymaking Act (Evidence Act), including a Performance Monitoring Policy and Procedures document, an Evaluation Plan for FY 2023 (SAMHSA Report of Ongoing and Planned Evaluations for Fiscal Year 2023), and the SAMHSA Evidence and Evaluation Board, including its charter and confirmation of voting members. The Evidence and Evaluation Board serves as the agency’s principal evaluation and evidence forum, managing its evaluation portfolio and its evaluation and evidence data and acting as a strategic asset that supports SAMHSA in meeting its mission and agency priorities, including implementation of the Evidence Act.
1.2 Did the agency have a senior leader with the budget and staff to serve as the agency’s chief data officer or equivalent [example: Evidence Act 202(e)]?
  • The CBHSQ director serves as the chief data officer for SAMHSA, as articulated in the agency’s Evaluation of SAMHSA Programs and Policies. As director of CBHSQ, the chief data officer has a center budget that includes evidence-building activities. The CBHSQ director oversees the survey and surveillance datasets managed by SAMHSA and serves as a critical member of the Evidence and Evaluation Board.
1.3 Did the agency have a governance structure to coordinate the activities of its evaluation officer, chief data officer, statistical officer, performance improvement officer, and other related officials in order to support Evidence Act implementation and improve the agency’s major programs?
  • The SAMHSA Evidence and Evaluation Board (Section 4.11 of the SAMHSA Evaluation Policy and Procedure document) coordinates activities of the evaluation officer, chief data officer, statistical officer, and performance improvement officers (in all centers and offices) and provides a structured environment to pursue alignment with the framework offered by the Evidence Act. The board meets every other month. Although the board is facilitated by the evaluation officer and the chief data officer, the vice chair position is a rotating one previously served by the director of the Office of Behavioral Health Equity, the Legislative Office, and the Center for Substance Abuse Treatment. The charter and the notes from each meeting are shared through SAMHSA’s intranet.
  • The SAMHSA Evidence and Evaluation Board serves as the mechanism to both generate and disseminate knowledge and best practices relative to the requirements of the Evidence Act. For example, in FY22, the board enabled SAMHSA to work collaboratively across the agency to propose, refine, and approve the definition of “significant.” A similar process will be used to develop a set of standardized evaluation questions to be considered for all evaluation proposals, including an examination of work related to behavioral health workforce diversity and support for recovery, such as assessing the number of individuals with lived experience who are trained and hired.
  • Through the coordination of the Evidence and Evaluation Board, SAMHSA will improve operations in several ways, including developing a bank of evaluation questions and evaluation templates and a repository of past evaluations and evidence-building activities. The agenda for each board meeting includes at least one item devoted to conveying best practices on a selected evaluation topic. Participating in the board, sharing each center’s and office’s activities, cultivating collaboration, and leveraging existing resources across centers and offices also improves operations by reducing redundancy of effort.
Score
10
Evaluation & Research

Did the agency have an evaluation policy, evaluation plan, and learning agenda (evidence building plan) and did it publicly release the findings of all completed program evaluations in FY22?

2.1 Did the agency have an agency-wide evaluation policy [example: Evidence Act 313(d)]?
  • In FY22, SAMHSA developed and approved an Evaluation of SAMHSA Programs Policies and Procedures document, which incorporates guidance provided by the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act). Recognizing the need to formalize a systematic approach to planning, managing, and overseeing programmatic and policy evaluation activities within SAMHSA, the document provides guidance to imbue core principles of consistency, quality, and rigor into all SAMHSA evaluations of programs and policies while ensuring that they are conducted in an ethical manner and that the dignity, respect, rights, and privacy of all participants are zealously safeguarded. Completed significant evaluations will be posted publicly (https://www.samhsa.gov/data/program-evaluations/evaluation-reports).
  • In accordance with this policy document and the Evidence Act, SAMHSA created an agency-wide Evidence and Evaluation Board. Building on the Evaluation of SAMHSA Programs Policies and Procedures document, the board drafted a SAMHSA FY23 Evaluation Plan that includes ongoing and planned evaluations for FY23. An Evaluation Plan will be drafted annually, and the Evaluation of SAMHSA Programs Policies and Procedures document will be reviewed every two years. A Learning Agenda is under development by the Evidence and Evaluation Board and will be reviewed annually and updated as needed.
  • In addition to the Policies and Procedures document, during this fiscal year SAMHSA dedicated staff and resources from all offices and centers to update its disparity impact statement (DIS). The statement is required of discretionary grant programs and is designed to support greater diversity, equity, and inclusion among those affected by SAMHSA grants by raising awareness of, and the intention to include, populations that are underrepresented or that experience health disparities. The revised DIS template for grantees was implemented in early FY23.
2.2 Did the agency have an agency-wide evaluation plan [example: Evidence Act 312(b)]?
  • As part of the Evidence Act, agencies within the U.S. Department of Health and Human Services (HHS) submitted a plan that lists and describes the specific evaluation activities the agency plans to undertake in the fiscal year following the year in which the evaluation plan is submitted (referred to as the HHS Evaluation Plan). The HHS Evaluation Plan and Evidence Building Plan are organized based on priority areas drawn from HHS’s departmental priorities, proposed strategic plan goals, and proposed agency priority goals. The Substance Abuse and Mental Health Services Administration contributed to both the HHS Evaluation Plan and the Evidence Building Plan and plays an active role in HHS monthly meetings.
2.3 Did the agency have a learning agenda (evidence building plan) and did the learning agenda describe the agency’s process for engaging stakeholders including but not limited to the general public, state and local governments, and researchers/academics in the development of that agenda (example: Evidence Act 312)?
  • As an agency within HHS and an active participant in the HHS Evidence and Evaluation Policy Council, SAMHSA contributed to the HHS learning agenda, participated in monthly meetings, and helped develop the Evidence Building Plan. One example of a SAMHSA contribution to the HHS strategy was its support for evidence building for the first HHS priority area: protect and strengthen equitable access to high-quality and affordable health care. During the COVID-19 pandemic, to avoid placing additional burden on state and local governments and representatives of non-governmental research, HHS engaged a range of stakeholders with varied expertise across the department, using existing communication channels and bodies such as the HHS Evidence and Evaluation Council.
  • Through the Evidence and Evaluation Board, SAMHSA is working on an agency-specific evidence building plan that will include ongoing and proposed evaluations, performance monitoring for discretionary and block grants, foundational fact finding (through discretionary grant program profiles), and policy and evidence-based practices (through the SAMHSA Policy Lab). The engagement strategy SAMHSA will use is still under development but will likely include input from SAMHSA’s National Advisory Councils and partnerships with SAMHSA regional offices. Internal stakeholders will be engaged through the Evidence and Evaluation Board as well as through cross-agency activities conducted by CBHSQ, such as data parties (activities designed to examine SAMHSA data for problem solving and sharing of diverse perspectives and to promote opportunities for SAMHSA to discuss ways to improve data collection, data quality, and data use) and individual outreach to key internal informants and champions.
  • Similar to the HHS plan, SAMHSA’s evidence building plan and learning agenda will include the agency’s five priority areas: overdose prevention; enhancing access to suicide prevention and crisis care; promoting resilience and emotional health for children, youth, and families; integrating behavioral and physical health care; and strengthening the behavioral health workforce. The cross-SAMHSA areas include equity, trauma-informed approaches, and commitment to data and evidence. The evidence building plan will also include conclusions found through activities legislatively mandated and process evaluations, such as the triennial report required for the Projects for Assistance in Transition from Homelessness (PATH).
2.4 Did the agency publicly release all completed program evaluations?
  • SAMHSA has the website architecture in place to post approved and cleared program evaluation reports, which are currently undergoing Section 508 compliance conversion. The Program Evaluations page provides access to Evaluation Reports, along with Evaluation Policies (directly linked to Evidence Act requirements), Ongoing Evaluations, and Evidence-Based Resources pages. SAMHSA is working to populate these pages with an archive of previous evaluations for purposes of transparency and to post current and future evaluation results as they are completed.
  • Publicly available evaluations analyze data by race, ethnicity, and gender, among other elements such as social determinants of health (e.g., stable housing and employment). SAMHSA strives to share program data whenever possible to promote continuous quality improvement. For example, SAMHSA’s PATH funds services for people with serious mental illness experiencing homelessness; annual data may be found online. Similarly, comparative state mental health data from block grants can be found in the SAMHSA Uniform Reporting System output tables.
  • SAMHSA shared evaluation and performance measurement data on all programs in its publicly available FY23 Congressional Justification.
  • SAMHSA is in the process of sharing several evaluations either in full or through a spotlight. For example, SAMHSA’s evaluation report for PATH has been posted to Evaluation Reports. The Strategic Prevention Framework-Prescription Drugs (SPF-Rx) program evaluation has been approved for public release and will be posted to Evaluation Reports upon completion of the 508 compliance process.
  • SAMHSA has many ongoing evaluations with results not yet available for release. Once these program evaluation reports and related materials have been finalized and cleared and have successfully completed the 508 process, they will be posted to Evaluation Reports. SAMHSA has also developed a process to not only publicly share program evaluations but also to ensure that future evaluations include a discussion of the dissemination plan during the early stages of development.
  • As a foundational fact-finding activity rather than an evaluation, CBHSQ, in partnership with SAMHSA centers, developed annual project profiles for discretionary grants that cover a set of performance indicators (such as client demographics, changes in social determinants of health, and pre/post changes in substance use) used to track and monitor performance.
  • In FY22, SAMHSA created a cross-center workgroup to systematically share data collected through discretionary grant programs. The workgroup developed a strategy for sharing data and evidence with both internal and external stakeholders, including newsletters highlighting topics (such as the Minority AIDS Initiative and women’s health month) and a SAMHSA Stats e-blast listserv that shares Government Performance and Results Act data monthly with over 80,000 registered users. Data were also shared on SAMHSA’s blog.
2.5 Did the agency conduct an Evidence Capacity Assessment that addressed the coverage, quality, methods, effectiveness, and independence of the agency’s evaluation, research, and analysis efforts [example: Evidence Act 315, subchapter II (c)(3)(9)]?
  • As part of the HHS Evidence and Evaluation Council, all agencies within the department conducted an internal capacity assessment; the SAMHSA assessment was included in the HHS report. In FY22, SAMHSA created an Evidence and Evaluation Board composed of the directors of each of SAMHSA’s centers and offices as well as the chief data officer, evaluation officer, and statistician. This board will examine agency capacity, quality, and evaluation efforts. For FY22, the board served as the lead body for assessing evidence capacity and took the lead in ensuring that conclusions and recommendations from evaluations and evidence activities are included in discussions regarding future funding activities.
2.6 Did the agency use rigorous evaluation methods, including random assignment studies, for research and evaluation purposes?
  • Instead of applying one strategy for all evaluations, SAMHSA employs a variety of models, including performance monitoring and formative, process, and summative evaluations, using primarily quantitative data and mixed methods when appropriate and available. These principles are articulated in the Evaluation Policies and Procedures document SAMHSA developed in FY22, which provides guidance to imbue core principles of consistency, quality, and rigor into all SAMHSA evaluations of programs and policies while ensuring that evaluations are conducted in an ethical manner and that the dignity, respect, rights, and privacy of all participants are zealously safeguarded. The document is reviewed every two years by the Evidence and Evaluation Board and updated as needed.
  • SAMHSA strives for a balance between the need to collect data and the desire to minimize grantee burden. For example, in FY21, an evaluation of SAMHSA’s Naloxone Education and Distribution Program used a mixed methods approach, examining qualitative data from key informant interviews and focus groups coupled with SAMHSA’s discretionary grant data collected through the SAMHSA Performance Accountability and Reporting System (SPARS). Another example is the final report for SAMHSA’s SPF-Rx program, which drew on several sources of primary and secondary quantitative data (for example, from SAMHSA and the Centers for Disease Control and Prevention) combined with interviews, all in response to three primary evaluation questions. This evaluation used a quasi-experimental (difference-in-differences) design and external administrative data to compare grant-funded areas to comparison counties.
  • In addition, recognizing that one size does not fit all, SAMHSA has developed a draft evaluation plan that includes a dissemination strategy for each of its current evaluation projects. The plan is still under review by the Evidence and Evaluation Board. As part of this work, all proposed and ongoing evaluations will be required to share the evaluation model and data to be included in the evaluation work. These evaluations will be encouraged to consider a mixed methods approach and to employ the most rigorous methods possible. Evaluation work must also consider how the findings will be shared with internal and external stakeholders (e.g., full report shared on SAMHSA’s website or a spotlight highlighting key findings).
  • The Substance Abuse and Mental Health Services Administration is partnering with the National Institute on Drug Abuse to support the HEALing Communities Study, which is a research initiative that intends to enhance the evidence base for opioid treatment options. Launched in 2019, this study aims to test the integration of prevention, overdose treatment, and medication-based treatment in select diverse communities hard hit by the opioid crisis. This comprehensive treatment model will be tested in a coordinated array of settings, including primary care, emergency departments, and other community settings. Findings will establish best practices for integrating prevention and treatment strategies that can be replicated by communities nationwide.
  • SAMHSA has also supported the National Study on Mental Health, which intends to provide national estimates of mental health and substance use disorders (SUDs) among U.S. adults aged eighteen to sixty-five. For the first time, this study will include adults living in households across the U.S. as well as in prisons, jails, state psychiatric hospitals, and homeless shelters. Data will be available in 2023.
Score
10
Resources

Did the agency invest at least 1% of program funds in evaluations in FY22 (examples: impact studies; implementation studies; rapid cycle evaluations; evaluation technical assistance; and rigorous evaluations, including random assignments)?

3.1  ___ invested $____ on evaluations, evaluation technical assistance, and evaluation capacity-building, representing __% of the agency’s $___ billion FY22 budget.
  • The Substance Abuse and Mental Health Services Administration invested $131,800,000 in evaluations, evaluation technical assistance, and evaluation capacity building, representing 2.01 percent of the agency’s $6,547,100,000 FY22 budget.
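The reported share can be verified directly from the two dollar figures above; the snippet below is only an arithmetic check, not part of the agency's methodology:

```python
# Verify the evaluation-investment share reported in 3.1.
evaluation_investment = 131_800_000    # FY22 evaluation spending ($)
agency_budget = 6_547_100_000          # FY22 agency budget ($)

share = evaluation_investment / agency_budget * 100
print(f"{share:.2f}%")  # → 2.01%
```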
3.2 Did the agency have a budget for evaluation and how much was it (were there any changes in this budget from the previous fiscal year)?
  • SAMHSA’s evaluation budget remained the same from the previous fiscal year.
  • Its FY22 evaluation budget was $133,600,000, which comes from Section 241 of the Public Health Service Act evaluation funds issued through annual appropriations for Section 1935(b) activities.
  • Within the Consolidated Appropriations Act of 2022, P.L. 117-103, SAMHSA is authorized to allocate up to 5% of the amounts appropriated for data collection and program evaluation under Sec. 1920 and for technical assistance, national database, data collection, and program evaluations under Sec. 1921.
3.3 Did the agency provide financial and other resources to help city, county, and state governments or other grantees build their evaluation capacity (including technical assistance funds for data and evidence capacity building)?
  • SAMHSA aims to provide communities, clinicians, policymakers, and others in the field with the information and tools they need to incorporate evidence-based practices into their communities or clinical settings. A series of evidence-based practice guides on the Evidence-Based Practices Resource Center website provides resources and guidance on capacity building and evaluation across the spectrum of SAMHSA’s mission. A recent example (posted September 2022) is Addressing Burnout in the Behavioral Health Workforce through Organizational Strategies. (Burnout is a complex issue resulting from chronic workplace stress that encompasses exhaustion, depersonalization, and reduced personal accomplishment; this guide highlights organization-level interventions to prevent and reduce burnout among behavioral health workers.)
  • The Peer Recovery Center of Excellence (Peer Recovery CoE) advances recovery supports and services for people with substance use disorders and their families. The Peer Recovery CoE website indicates it provides training and technical assistance to build and elevate an equitable peer workforce to deliver Peer Recovery Support Services by supporting peer integration and workforce development, Recovery Community Organization capacity building, and evidence-based practice dissemination.
  • The National Training and Technical Assistance Center for Child, Youth and Family Mental Health provides states, tribes, and communities with training and technical assistance on children’s behavioral health, with a focus on systems of care. Its training and technical assistance activities for clinical best practices, wraparound services, and workforce development focus on evaluation, fidelity assessment, and quality assurance, among nine other topics.
  • All SAMHSA grantees (discretionary and formula-based) may designate set-aside funds for data collection, data reporting into SAMHSA’s Performance Accountability and Reporting System (SPARS), and individual evaluation activities. To support grantees in reporting timely and accurate data to SAMHSA, SPARS provides access to online data entry, reporting, technical assistance, and training resources. Multiple resources are available to state and community grantees, including a resource library with general and center-specific information on data collection, performance monitoring, disparity impact statements, and trauma-informed care, to name just a few. Access to these resources requires that grantees log in to the password-protected website.
  • In addition to the technical assistance and set-aside funds, SAMHSA offers webinars to provide detailed support for organizations unfamiliar with its grant application process. In FY22, two webinars were held to highlight valuable information from the manual available to applicants.
  • In FY22, SAMHSA completed a multiyear national evaluation of its Technology Transfer Centers program. This program comprises three networks: Addiction Technology Transfer Centers, Mental Health Technology Transfer Centers, and Prevention Technology Transfer Centers. Each network consists of a National Coordinator Center, ten regional centers, an American Indian and Alaska Native focused center, and a Hispanic/Latino focused center. This report will be used to improve services and capacity.
  • In FY22, SAMHSA also began developing diversity, equity, and inclusion benchmarks that might be included in notices of funding opportunities to encourage applications from diverse organizations and those that address health disparities. These benchmarks are still in the planning stages, but one component would be to better understand where SAMHSA might focus its marketing efforts to attract applications from a more diverse array of organizations. This work began in FY22 and will be implemented in FY23. As part of this process, a logic model has been developed, and SAMHSA is considering an evaluation in year two to better understand the impact of these efforts and to identify areas for future growth and focus.
Score
9
Performance Management / Continuous Improvement

Did the agency implement a performance management system with outcome-focused goals and aligned program objectives and measures and did it frequently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other dimensions of performance in FY22?

4.1 Did the agency have a strategic plan with outcome goals, program objectives (if different), outcome measures, and program measures (if different)?
  • The agency has a strategic plan that was developed under the prior presidential administration and is currently developing its next strategic plan. Its Strategic Plan for FY19 through FY23 outlines five priority areas with goals and measurable objectives to carry out its vision and mission. For each priority area, an overarching goal and a series of measurable objectives are described, followed by examples of key performance and outcome measures SAMHSA will use to track progress. As appropriate, SAMHSA plans to align with and support the goals and objectives outlined in the HHS Strategic Plan.
4.2 Does the agency use data/evidence to improve outcomes and return on investment?
  • The Office of Evaluation, in partnership with SAMHSA program centers and in collaboration with program staff, oversees the identification of a set of performance indicators to monitor its programs and the development of periodic program profiles for use in agency planning, program change, and reporting to departmental and external organizations. The SAMHSA Performance Accountability and Reporting System (SPARS) serves as the mechanism for collecting performance data from agency grantees. Program center staff examine data entered in SPARS on a regular, real-time basis to manage grant programs and improve outcomes. The data in SPARS are available as .csv files, via reports, or through data visualization.
  • In FY22, SAMHSA developed a list of proposed enhancements to SPARS, the system through which SAMHSA staff and grantees may view demographic data, to allow internal and external stakeholders to examine discretionary grant data (such as demographics and changes in stable housing, education, and employment) on a real-time basis and to compare clients by characteristics such as race, ethnicity, gender, and age over time. On an annual basis, SAMHSA produces SPARS-informed program profiles to examine a program’s performance. These profiles include outcomes disaggregated by race and other demographics as well as changes in behavior associated with time in the grant program. Data from these profiles are shared with grantees through a SPARS newsletter and through SAMHSA Stats e-blasts.
  • The Evidence and Evaluation Board will use data and evidence to advise SAMHSA on ways to improve outcomes and returns on its investment. As stated in its charter, the board’s role is to:
    – create and maintain an inventory of all agency evaluations, past, current, and future;
    – assist in developing the criteria the agency will use to define “significant” evaluations for the purpose of prioritization, apply these criteria to the evaluation inventory, discuss implications, and make any needed adjustments or revisions to the criteria;
    – review other evaluation activities undertaken through other planning to determine whether these activities are consistent with the maturity of the program, research questions, and degree of independence necessary to conduct a rigorous evaluation to the fullest extent possible;
    – consistently match the type of evaluation activity with program maturity, complexity, and research goals;
    – consistently determine the degree of independence of evaluation activities for different types of programs;
    – incorporate these practices and considerations into the contract planning process;
    – consistently collect and disseminate meaningful and critical findings to SAMHSA’s colleagues and to the behavioral health and scientific fields;
    – be responsible for examining evidence discovered through evaluations and ensure that evaluation findings are shared with both internal and external stakeholders;
    – incorporate these findings, as appropriate, into discussions regarding SAMHSA’s future activities and grants;
    – review information on grantee challenges, innovations, and successes reported back to the board by government project officers as a component of evidence; and
    – develop a “learning agenda” to identify priorities for future evaluation activities.
  • Through these functions, the Evidence and Evaluation Board will establish and foster a culture of evaluation and evidence information stewardship and use with the intent of maximizing the value of evaluation data for decision-making, accountability, and the public good.
4.3 Did the agency have continuous improvement or learning cycle processes to identify promising practices, problem areas, possible causal factors, and opportunities for improvement (examples: stat meetings, data analytics, data visualization tools, or other tools that improve performance)?
  • The Evidence and Evaluation Board is one of the primary agency-wide mechanisms for identifying promising practices, problem areas, possible causal factors, and opportunities for improvement. The inclusion and active participation of senior center/office and agency leadership on the board facilitates broad, rapid, agency-wide dissemination of best practices. The board has designated staff who, along with the chief data officer, the evaluation officer, and CBHSQ staff, work with vice chairs and others to identify promising practices, problem areas, causal factors, and opportunities for improvement across the different centers and offices within SAMHSA. For example, findings from the Results for America report will be shared during the November meeting to identify areas for improvement in FY23.
  • In addition, SAMHSA is dedicated to continuous improvement in addressing behavioral health equity. To that end, a disparity impact statement (DIS) is required of SAMHSA grant recipients. This statement is intended to help both the grantee and SAMHSA gain a greater understanding of how funding is being used to reduce behavioral health disparities across the nation, in alignment with Executive Order 13985 (Advancing Racial Equity and Support for Underserved Communities Through the Federal Government). The DIS helps establish expectations around tackling disparities, articulate how to address social determinants of health, and develop and implement a quality improvement plan to reduce identified disparities. The DIS has been adopted by some sister HHS operating divisions and is under consideration for adoption by others. In addition to the DIS, SAMHSA is developing internal equity dashboards to establish a baseline and track progress in this important area.
  • Beginning in April 2020 and extending through FY22 and into FY23, CBHSQ’s Office of Evaluation has offered weekly technical assistance and training on data analysis, performance management, and evaluation. These one-hour sessions offer opportunities for SAMHSA program center staff and CBHSQ to share challenges and opportunities faced by grantees and strategize solutions. These sessions also offer an opportunity for cross-center collaboration and process improvement as project officers share and learn from those managing programs in other centers. These cross-center meetings allow CBHSQ to learn about challenges in the field, technological challenges using SPARS, and opportunities to make the system more user friendly. The project officers often share grantee questions and concerns for discussion and joint problem solving. SAMHSA collects these questions to include in FAQ documents.
  • In FY22, SAMHSA began organizing data parties designed to examine SAMHSA data for problem solving and sharing of diverse perspectives and to promote opportunities for SAMHSA to discuss ways to improve data collection, data quality, and data use. The first data party included nearly eighty SAMHSA staff discussing data related to SAMHSA’s Minority AIDS Initiative grant programs, including aggregate client-level data. The second focused on how SAMHSA can use its data to address health disparities and on the importance of the disparity impact statement. The most recent data party focused on data quality, the impact of missing data on SAMHSA’s ability to draw conclusions from the data, and the importance of clearly stating limitations of methodology and data analysis.
  • SAMHSA organized several listening sessions, embedded within topical summits, to inform agency work. For example, community members and individuals with lived experience provided critical insight during the listening sessions at the two most recent harm reduction summits, one with a focus on the needs of tribal communities and the second as a more general National Harm Reduction summit. SAMHSA worked with tribal leaders, the Indian Health Service, and the National Indian Health Board to develop the National Tribal Behavioral Health Agenda. Data derived from the listening sessions central to the recent Recovery Summit were used to inform the Office of Recovery.
Score
6
Data

Did the agency collect, analyze, share, and use high-quality administrative and survey data consistent with strong privacy protections to improve (or help other entities improve) outcomes, cost effectiveness, and/or the performance of federal, state, local, and other service providers programs in FY22 (examples: model data-sharing agreements or data-licensing agreements, data tagging and documentation, data standardization, open data policies, and data use policies)?

5.1 Did the agency have a strategic data plan, including an open data policy [example: Evidence Act 202(c), Strategic Information Resources Plan]?
  • The current SAMHSA Strategic Plan includes priority 4: improving data collection, analysis, dissemination, and program and policy evaluation. The next strategic plan is currently under development and is expected to include the prioritization of equity, trauma-informed approaches, and a commitment to data and evidence across all policies and programs.
  • In addition to its strategic plan, SAMHSA is developing a SAMHSA Data Plan. Development of the plan includes collecting input from fourteen listening sessions with internal and external stakeholders and users of SAMHSA data. Through the work of the newly created position of chief diversity officer, equity, diversity, and inclusion principles will infuse the full scope of the data cycle (collection, analysis, and use in evidence-informed practices).
  • SAMHSA partners with the National Center for Health Statistics to offer individuals access to restricted use data for research and evaluation purposes. This is a carefully controlled process designed to ensure that data and the individuals who provide the data are protected.
5.2 Did the agency have an updated comprehensive data inventory (example: Evidence Act 3511)?
5.3 Did the agency promote data access or data linkage for evaluation, evidence building, or program improvement [examples: model data-sharing agreements or data-licensing agreements; data tagging and documentation; data standardization; downloadable machine-readable, de-identified tagged data; Evidence Act 3520(c)]?
  • The Center for Behavioral Health Statistics and Quality oversees data collection initiatives and provides publicly available datasets so that as much data as possible can be shared with researchers and other stakeholders while preserving client confidentiality and privacy.
  • The center recently developed a data transfer agreement for uniform and protected sharing of data, which CBHSQ began implementing in FY22. Additionally, as the main center within SAMHSA that collects, stewards, and disseminates data, CBHSQ is central to the process of developing a short-term and long-term Strategic Data Plan. SAMHSA is also working with HHS on a department-wide data strategy, including a data maturity model and policies for data governance and sharing.
  • In FY21, CBHSQ built internal technical capacity for data collections and began the process of modernizing them. For example, the National Survey of Substance Abuse Treatment Services (N-SSATS) and the National Mental Health Services Survey (N-MHSS) have been combined into the National Substance Use and Mental Health Services Survey (NSUMHSS) to decrease burden and duplication of responses. The Substance Abuse and Mental Health Data Archive (SAMHDA) contains substance use disorder and mental illness research data from CBHSQ’s seven data collections, available for restricted and public use. To promote access to and use of SAMHSA’s substance abuse and mental health data, SAMHDA provides public use data files and documentation for download, as well as online analysis tools to support a better understanding of this critical area of public health.
  • In addition, SAMHSA partners with the National Center for Health Statistics (NCHS), which operates the Research Data Center (RDC), to make restricted use data available to researchers. For access to these data, researchers must submit a research proposal outlining the need for their use. In FY21, many of the procedures for the application process moved in-house from NCHS, and a CBHSQ RDC website was created.
  • In FY22, SAMHSA engaged in rigorous activities to update the DIS, a secretarial priority from the HHS Action Plan to Reduce Racial and Ethnic Health Disparities (2011). The current objective is to “assess and heighten the impact of all HHS policies, programs, processes, and resource decisions to reduce health disparities. In support of this objective, HHS leadership will assure that . . . program grantees, as applicable, will be required to submit health disparity impact statements as part of their grant applications.” The secretarial priority focuses on underserved racial and ethnic minority populations (e.g., Black/African American, Hispanic/Latino, Asian American, Native Hawaiian and Pacific Islander, and American Indian/Alaska Native). The Office of Behavioral Health Equity also includes LGBTQI+ populations as underserved disparity-vulnerable groups.
  • Through SPARS, grantees and SAMHSA program staff monitor the performance of grantees and, when performance is below targets, provide technical assistance and support. This allows SAMHSA to support communities during the grant process. Staff at SAMHSA meet with grantees regularly to discuss progress and to examine data entered in SPARS, thus ensuring timely submission of data. To quickly resolve issues as they arise, SPARS contractors and SAMHSA staff meet weekly and work closely together.
5.4 Did the agency have policies and procedures to secure data and protect personal confidential information (example: differential privacy; secure, multiparty computation; homomorphic encryption; or developing audit trails)?
  • The Substance Abuse and Mental Health Services Administration shares data in three ways: (1) on its website (SAMHDA); (2) through the restricted use data program; and (3) through a data use agreement. Policies and procedures to secure SAMHSA’s data and protect personal confidential information mirror those of sister operating divisions.
  • Data Use Agreement: In FY22, CBHSQ updated its data use agreement to enable data to be used internally by SAMHSA staff interested in gaining access to data sets as well as by external stakeholders, such as contractors and partnering federal agencies. Signed documents and data use agreements are held by SAMHSA’s confidentiality officer to ensure that all procedures, including confidentiality training, are followed.
  • SAMHSA is piloting the revised data use agreement with two of its data systems; the current data use agreement is not available on the SAMHSA website while in pilot testing. This updated agreement manages personal confidential information by limiting its release. Direct identifying information of respondents is rarely released outside the agency, except to CBHSQ data collection contractors that use these data to conduct survey operations. Any sharing of this information must be conducted under a written agreement or contract, which must be reviewed by the confidentiality officer and then approved by the director.
  • In all instances, CBHSQ must satisfy the requirements of federal law prior to any release. When combined, data on unique characteristics can identify a respondent. These detailed data are released only to agents. Agents can be other federal agencies, state or local governments, university researchers, private businesses, or CBHSQ contractors, as part of CBHSQ’s restricted use data program. There is an application process, with approved agents required to implement and adhere to security procedures to protect the data from unauthorized disclosure or access.
  • Once the application has been approved, confidentiality training must be completed and signed off, documenting that the researcher has read and will follow the RDC disclosure review policies and procedures. The confidentiality training, confidentiality forms, and disclosure manual outline the policies and procedures required to protect the data and prevent the disclosure of confidential information. Both the principal investigator and the analyst must complete the confidentiality training and sign the confidentiality forms. The completed certificate and data user agreement forms must be uploaded with the application to be considered a complete package.
  • Agents are subject to unannounced or announced inspections of their facilities to assess compliance with CBHSQ data security requirements. More importantly, measures specific to the source and type of data are implemented to protect confidentiality of the data.
  • Micro-agglomeration, Substitution, Subsampling, and Calibration: The NSDUH survey has developed a statistical disclosure control technique called micro-agglomeration, substitution, subsampling, and calibration (MASSC) to protect the confidentiality of the data. This is a disclosure limitation methodology specifically developed for NSDUH to meet the requirements of the Confidential Information Protection and Statistical Efficiency Act. The goal of this technique is to control disclosure risks while minimizing the impact of the disclosure control measures on the quality of the data in a comprehensive and integrated manner. It has been successfully used to create NSDUH public use files since 1999.
  • Confidentiality Officer and Training: In addition to having a confidentiality officer within CBHSQ who ensures that staff complete training and sign a confidentiality statement, SAMHSA offers a certificate of confidentiality that protects grantees from legal requests for names or other information that would personally identify participants in the evaluation of a grant, project, or contract. CBHSQ trains all staff in good data stewardship, whether the data are covered by CIPSEA or the Privacy Act (5 U.S.C. 552a) and the Public Health Service Act [42 U.S.C. 290aa(n)].
  • The Center for Behavioral Health Statistics and Quality National Data Sets: Multiple means are used to protect data and ensure the protection of personally identifiable information including encryption and limiting access to data.
  • Discretionary Grant Data: SPARS is the data entry, technical assistance request, and training system through which grantees report performance data to SAMHSA. It serves as the data repository for the agency’s three centers: the Center for Substance Abuse Prevention, the Center for Mental Health Services (CMHS), and the Center for Substance Abuse Treatment. To safeguard confidentiality and privacy, the current data transfer agreement limits the use of grantee data to internal reports: data collected by SAMHSA grantees will not be shared with researchers or stakeholders beyond SAMHSA, and publications based on grantee data will not be permitted.
5.5 Did the agency provide assistance to city, county, and/or state governments and/or other grantees on accessing the agency’s datasets while protecting privacy?
  • The Substance Abuse and Mental Health Services Administration provides both public access and restricted use access to its datasets in a variety of ways, for example:
    • Data from CBHSQ’s various data collections are available (1) as prepublished estimates, (2) via online systems, and (3) as microdata files. A description of CBHSQ’s products can be found in the SAMHDA.
    • SAMHSA partners with NCHS to make restricted use data available through the RDC to allow researchers access to restricted use data. For access to the restricted use data, researchers must submit a research proposal outlining the need for these data. The proposal provides a framework for CBHSQ to identify potential disclosure risks and determine how the data will be used.
Score
5
Common Evidence Standards / What Works Designations

Did the agency use a common evidence framework, guidelines, or standards to inform its research and funding purposes; did that framework prioritize rigorous research and evaluation methods; and did the agency disseminate and promote the use of evidence-based interventions through a user-friendly tool in FY22 (example: What Works Clearinghouses)?

6.1 Did the agency have a common evidence framework for research and evaluation purposes?
  • The SAMHSA Strategic Plan FY2019-FY2023 (pp. 20-23) outlines five priority areas to carry out the vision and mission of SAMHSA, including priority 4: improving data collection, analysis, dissemination, and program and policy evaluation. This priority includes three objectives: (1) to develop consistent data collection strategies to identify and track mental health and substance use needs across the nation; (2) to ensure that all SAMHSA programs are evaluated in a robust, timely, and high-quality manner; and (3) to promote access to and use of the nation’s substance use and mental health data and conduct program and policy evaluations and use the results to advance the adoption of evidence-based policies, programs, and practices.
  • SAMHSA has informally incorporated qualitative data into its framework through feedback received by project officers and through annual narrative reports submitted by grantees. It is in regular communication with grantees and state/community programs regarding opportunities and challenges. In FY22 it began developing a more formal process for incorporating qualitative feedback into its evaluation process.
6.2 Did the agency have a common evidence framework for funding decisions?
  • Universal language about using EBPs is included in SAMHSA’s funding opportunity announcements (FOAs), also known as notices of funding opportunity (NOFOs). This language acknowledges that “EBPs have not been developed for all populations and/or service settings,” thus encouraging applicants to “provide other forms of evidence” that a proposed practice is appropriate for the intended population.
  • Specifically, the language states that applicants should:
    (1) document that the EBPs chosen are appropriate for intended outcomes;
    (2) explain how the practice meets SAMHSA’s goals for the grant program;
    (3) describe any modifications or adaptations needed for the practice to meet the goals of the project;
    (4) explain why the EBP was selected;
    (5) justify the use of multiple EBPs if applicable; and
    (6) discuss training needs or plans to ensure successful implementation.
    Lastly, the language includes resources the applicant can use to understand EBPs. Federal grants officers work in collaboration with the SAMHSA Office of Financial Resources to ensure that grantee funding announcements clearly describe the evidence standard necessary to meet funding requirements.
  • SAMHSA developed a manual, Developing a Competitive SAMHSA Grant Application, that explains the information applicants will likely need for each section of the grant application. It has two sections devoted to evidence-based practices (p. 8, p. 26), covering (1) a description of the EBPs applicants plan to implement, (2) specific information about any modifications applicants plan to make to the EBPs and a justification for making them, and (3) how applicants plan to monitor the implementation of the EBPs. In addition, if applicants plan to implement services or practices that are not evidence based, they must show that these services/practices are effective.
6.3 Did the agency have a clearinghouse(s) or user friendly tool that disseminated information on rigorously evaluated, evidence-based solutions (programs, interventions, practices, etc.) including information on what works where, for whom, and under what conditions?
  • The Evidence-Based Practices Resource Center (EBPRC) provides communities, clinicians, policymakers, and others with the information and tools to incorporate evidence-based practices into their communities or clinical settings. It contains a collection of scientifically based resources for a broad range of audiences, including treatment improvement protocols, toolkits, resource guides, clinical practice guidelines, and other science-based resources. The retooled EBPRC neither accepts open submissions from outside program developers nor rates individual programs.
  • Because SAMHSA recognizes that one size does not fit all, grantees are encouraged to consider the EBPs listed on the SAMHSA EBPRC website but must provide information on the EBPs they plan to implement. Their description should explain why each EBP is appropriate for the problem area addressed by the grant as well as for the specific population(s) of focus. They are also asked for specific information about any modifications they plan to make to the EBPs, a justification for making these modifications, and how they will monitor the implementation of the EBPs to ensure that they are implemented according to EBP guidelines.
  • Recognizing that communities currently face unprecedented challenges with access to mental health and substance use services as well as behavioral health workforce challenges, SAMHSA strives to be flexible, understanding that if notices of funding opportunities are too prescriptive, it risks losing applicants without the capacity to implement certain EBPs. An innovation designed to encourage more diverse and historically marginalized populations to apply for grants is SAMHSA’s Diversity Inclusion Project Showcase (DIPS). The DIPS model helps connect community-based organizations to funders and advocates at all levels (e.g., federal and philanthropic organizations, stakeholders, and communities).
6.4 Did the agency promote the utilization of evidence-based practices in the field to encourage implementation, replication, and application of evaluation findings and other evidence?
  • SAMHSA’s EBPRC aims to provide communities, clinicians, policymakers, and others in the field with the information and tools they need to incorporate EBPs into their communities or clinical settings. It contains a collection of science-based resources, including treatment improvement protocols, toolkits, resource guides, and clinical practice guidelines for a broad range of audiences. As of June 2022, it includes 155 items, including 15 data reports, 23 toolkits, 6 fact sheets, and 91 practice guides.
  • The Mental Health Technology Transfer Center (MHTTC) Network engages with organizations and treatment practitioners involved in the delivery of mental health services to strengthen their capacity to deliver effective evidence-based practices to individuals, including the full continuum of services spanning mental illness prevention, treatment, and recovery support. Under the State Targeted Technical Assistance grant, the Opioid Response Network was created to support efforts to address opioid use disorder prevention, treatment, and recovery, and to provide education and training at the local level in evidence-based practices.
  • The Knowledge Application Program supports the professional development of behavioral health workers and provides information and resources on best practices. Specifically, this program provides substance use treatment professionals with publications that contain information on best treatment practices.
  • The Substance Abuse and Mental Health Services Administration (SAMHSA) promotes the utilization of evidence-based practices. Within grant applications, it encourages innovation. For example, the FY20-21 Substance Use Prevention and Treatment Block Grant Application includes the following language: There is increased interest in having a better understanding of the evidence that supports the delivery of medical and specialty care including mental/substance use disorder services. Over the past several years, SAMHSA has collaborated with CMS, HRSA, SMAs, state mental/substance use disorder authorities, legislators, and others regarding the evidence of various mental and substance misuse prevention, treatment, and recovery support services.
  • States and other purchasers are requesting information on evidence-based practices or other procedures that result in better health outcomes for individuals and the general population. While the emphasis on evidence-based practices will continue, there is a need to develop and create new interventions and technologies and in turn to establish the evidence. The Substance Abuse and Mental Health Services Administration supports states’ use of the block grants for this purpose. The National Quality Forum and the Institute of Medicine recommend that evidence play a critical role in designing health benefits for individuals enrolled in commercial insurance, Medicaid, and Medicare. To respond to these inquiries and recommendations, SAMHSA has undertaken several activities. Its EBPRC assesses the research evaluating an intervention’s impact on outcomes and provides information on forty-three resources to facilitate the effective dissemination and implementation of the program. The EBPRC provides the information and tools needed to incorporate evidence-based practices into communities or clinical settings.
Score
4
Innovation

Did the agency have staff, policies, and processes in place that encouraged innovation to improve the impact of its programs in FY22 (examples: prizes and challenges, behavioral science trials, innovation labs/accelerators, performance partnership pilots, and demonstration projects or waivers with rigorous evaluation requirements)?

7.1 Did the agency have staff dedicated to leading its innovation efforts to improve the impact of its programs?
  • In FY22, the Center for Substance Abuse Prevention created an office focused on innovation, the Office of Prevention Innovation. Using dashboards and data sharing, this office plans to use data to improve its programs and increase capacity for data-driven decision-making. Within the Office of Evaluation, CBHSQ assigned a staff member to focus on aligning SAMHSA with the Evidence Act and advancing the goals of the Evidence and Evaluation Board. Although the Office of Prevention Innovation and the Office of Evaluation are housed in different centers, there will be regularly scheduled meetings and other opportunities for engagement.
7.2 Did the agency have initiatives to promote innovation to improve the impact of its programs?
  • Within SAMHSA grant programs, the agency encourages innovation from every state, territory, and community applicant. For example, the FY20-21 Substance Use Prevention and Treatment Block Grant application includes the following language:
  • “There is increased interest in having a better understanding of the evidence that supports the delivery of medical and specialty care including M/SUD services. Over the past several years, SAMHSA has collaborated with CMS, HRSA, SMAs, state M/SUD authorities, legislators, and others regarding the evidence of various mental and substance misuse prevention, treatment, and recovery support services. States and other purchasers are requesting information on evidence-based practices or other procedures that result in better health outcomes for individuals and the general population. While the emphasis on evidence-based practices will continue, there is a need to develop and create new interventions and technologies and in turn, to establish the evidence. SAMHSA supports states’ use of the block grants for this purpose. The National Quality Forum and the Institute of Medicine recommend that evidence play a critical role in designing health benefits for individuals enrolled in commercial insurance, Medicaid, and Medicare. To respond to these inquiries and recommendations, SAMHSA has undertaken several activities. SAMHSA’s Evidence Based Practices Resource Center assesses the research evaluating an intervention’s impact on outcomes and provides information on 43 available resources to facilitate the effective dissemination and implementation of the program. SAMHSA’s EBPRC provides the information & tools needed to incorporate evidence-based practices into communities or clinical settings.”
  • The National Mental Health and Substance Use Policy Laboratory (Policy Lab) leads the equitable coordination, analysis, development, and implementation of national policy to promote mental health, prevent substance misuse and addiction, provide treatment, and support recovery. The Policy Lab supports SAMHSA leadership in national policy efforts through partnership and coordination within SAMHSA; across federal agencies; and with existing, new, and emerging stakeholders and constituent groups. The Policy Lab comprises the following units: (1) Office of the Director; (2) Policy Analysis, Development, and Implementation Team (PADI Team); (3) Evidence-Based Practices Implementation and Dissemination Team (EBP team); (4) Executive Correspondence and Support Branch; and (5) Legislative Affairs Branch.
    • The PADI Team integrates policy and subject matter expertise throughout SAMHSA priority areas (including workgroups, grants and budget review, and information sharing) and collaboration with SAMHSA’s centers and offices, Federal partners, and stakeholders, which ensures a feedback loop, positive adaptation, and collaboration.
    • The EBP Team uses information from evaluations, literature reviews, expert panels, and other sources to inform identification and implementation of policy change; assists in the development of programs, technical assistance, evaluations, and dissemination of best practices/lessons learned; and tracks current EBPs and conducts literature reviews on high-priority topic areas and programs.
    • The Executive Correspondence and Support Branch serves as the agency liaison and action office for policy coordination including management of correspondence, advisory committees, and Freedom of Information Act activities across SAMHSA.
    • The Legislative Affairs Team advises the assistant secretary on legislative matters; provides leadership in the development of legislation; and serves as the primary contact within SAMHSA on all legislative activities.
  • Also in FY22, SAMHSA created the Recovery Innovation Challenge, a competition to identify innovative practices in behavioral health that advance recovery on the ground and in the community. The challenge supports the Office of Recovery, which advances the agency’s commitment to support recovery for all Americans and will serve as a national clearinghouse and resource for recovery services across mental health, substance use, and co-occurring domains, promoting recovery in partnership with recovery community leaders. Participants share details about innovative practices and models developed by individuals, groups, and organizations or within state systems in the decade since SAMHSA established its working definition of recovery, demonstrating how these innovations have expanded upon that definition and overcome challenges in incorporating recovery into services or systems. The challenge will offer up to ten awards from a purse of up to $400,000. At the time of this report, SAMHSA has received more than 300 applications.
  • In addition, SAMHSA created the Diversity Inclusion Project Showcase (DIPS), an initiative that aims to connect a more diverse pool of historically marginalized populations to grant funding opportunities by providing the chance to showcase their goals and populations served to federal and state leaders, as well as philanthropic partners. The DIPS model helps to connect community-based organizations to funders and advocates at all levels (e.g., federal and philanthropic organizations, stakeholders, and communities).
  • SAMHSA also sought innovative efforts to build its existing data sets. For example, SAMHSA, in partnership with other federal agencies, submitted a proposal to identify and prioritize research, innovation, and public health and funding gaps in substance use disorders and mental health disorders using machine learning techniques. If funded, this work would be completed in 24 months.
7.3 Did the agency evaluate its innovation efforts, including using rigorous methods?
  • Grantees report innovation and use of evidence-based practices in their reports as required by SAMHSA. Information from these reports is incorporated into planned rigorous evaluations or, when a formal evaluation has not been scheduled, used for performance management.
  • SAMHSA completed a rigorous evaluation of its innovative SPF-Rx program, which addresses prescription drug misuse, a critical public health problem in the United States. The Strategic Prevention Framework (SPF), which consists of five steps, is a data-driven, systemic public health planning approach to substance use prevention that is theory based and involves the implementation of evidence-based strategies. Grantees and subrecipients apply the overarching principles of cultural competence and sustainability throughout the dynamic SPF process.
  • Between 2017 and 2021, twenty-five state and tribal SPF-Rx grantees received funding to support high-need community efforts to implement evidence-based practices to prevent and reduce the misuse of prescription opioids. In FY21, findings from a completed evaluation informed SAMHSA of ways to better develop, assess, and manage its prescription drug misuse prevention programs. The cross-site evaluation collected quantitative and qualitative data from three instruments: an annual implementation instrument (130 items collecting data on organization type, funding levels, assessment of capacity building and sustainability, strategic planning, prevention and intervention programming, and ongoing local evaluations); grantee-level and community-level outcomes modules (two modules including data on prescription drug monitoring programs’ use and prescribing patterns, opioid overdose events, etc.); and grantee-level interviews (qualitative data collected at baseline and at the end of the evaluation). Results of this evaluation were used to inform the next generation of SPF-Rx grants.
  • The next evaluation of the SPF-Rx program has begun, with a drafted evaluation plan that includes examining secondary data on matched communities to compare outcome data.
Score
8
Use of Evidence in Competitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its competitive grant programs in FY22 (examples: tiered-evidence frameworks, evidence-based funding set-asides, priority preference points or other preference scoring for evidence, and pay for success provisions)?

8.1 What were the agency’s five largest competitive programs and their appropriations amount (and were city, county, and/or state governments eligible to receive funds from these programs)?
8.2 Did the agency use evidence of effectiveness to allocate funds in its five largest competitive grant programs (e.g., were evidence-based interventions/practices required or suggested and was evidence a significant requirement)?
  • In FY22, SAMHSA used the conclusions and recommendations from its 2021 Evaluation Report, Combating Opioid Misuse: Findings from the Evaluation Report of the Strategic Prevention Framework—Prescription Drug (SPF-Rx), based on a cross-site evaluation that involved the participation of all grantees, to inform the next round of competitive discretionary grants. The evaluation provided evidence of increased access to and utilization of prescription drug monitoring program data by professionals, as well as evidence that the grant program resulted in new community prevention coalitions and partnerships and an association with reductions in some of the negative consequences of prescription opioid misuse.
  • Substance Abuse and Mental Health Services Administration grants are intended to fund services or practices that have a demonstrated evidence base and that are appropriate to the population(s) of focus. As with all SAMHSA grants, the five largest competitive grant programs require applicants to include EBPs and activities that are backed by science. The allocation of funds is based on an application that includes a request for evidence of effective work in reducing substance use and mental health disorders. Two competitive grant programs with mechanisms that allocate funds toward evidence-based practices include:
    1. Certified Community Behavioral Health Clinic Expansion Grants require applicants to describe their proposed evidence-based service/practice. The grantee must describe how the EBP meets the needs of the population(s) of focus and the outcomes to be achieved. Grantees must also indicate how their practice might be modified and the reasons for such modifications.
    2. Project AWARE requires grantees to identify EBPs selected to be implemented in the required number (3) of local education agencies/communities. They must also describe how the EBPs selected are effective and appropriate for school-aged youth.
8.3 Did the agency use its five largest competitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • SAMHSA's draft evaluation plan includes evaluations of some of its largest competitive grant programs. These evaluations will inform and enable SAMHSA's evidence building. One mechanism for this is the grantmaking process: in some grants, SAMHSA includes additional terms and conditions stating that, depending on the funding opportunity and grant application, a grantee may be asked to participate in a cross-site evaluation.
  • All grant programs at SAMHSA are required to submit data on race, ethnicity, gender, and sexual orientation (among other demographic data). In addition, SAMHSA’s surveys collect national data in these areas, allowing SAMHSA’s Office of Behavioral Health Equity to utilize federal and community data to identify, monitor, and respond to behavioral health disparities.
8.4 Did the agency use evidence of effectiveness to allocate funds in any other competitive grant programs (besides its five largest grant programs)?
  • Nearly all the largest grant programs in FY21 were again at the top of the list for funding in FY22 (Project AWARE is in the top five for FY22). Specification of evidence-based practices, along with discussion of how each EBP chosen is appropriate for the population(s) of focus and the outcomes identified, typically comprises a substantial proportion of the evaluative criteria for grant applications [e.g., 25% for Project AWARE (see Section C)]. Grants other than the largest also require that proposed programs use EBPs and ensure that this requirement is met through substantial point values assigned in the course of objective peer reviews (e.g., the Infant and Early Childhood Mental Health Program). All grantees are required to submit National Outcome Measurement Systems (NOMs) data as well as narrative final reports outlining successes, challenges, and innovations. These reports and quantitative data are reviewed by dedicated government project officers (GPOs) and program leadership, and grantees are provided technical assistance and guidance on how to improve their work, when applicable.
  • In addition, SAMHSA grantees are required to submit data on race, ethnicity, gender, and sexual orientation (among other demographic data) as well as data on social determinants of health (such as access to stable housing, employment, and education status). SAMHSA's surveys collect national data in these areas, allowing its Office of Behavioral Health Equity to utilize federal and community data to identify, monitor, and respond to behavioral health disparities.
8.5 What are the agency’s one or two strongest examples of how competitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • Competitive grant programs are required to consider EBPs in their application and are referred to SAMHSA’s EBPRC for tools they need to achieve better outcomes based on what works. An additional example may be found in SAMHSA’s trauma and justice portfolio, which provided a comprehensive public health approach to addressing trauma and establishing a trauma-informed approach in health, behavioral health, human services, and related systems. The intent of this initiative was to reduce both the observable and less visible harmful effects of trauma and violence on children and youth, adults, families, and communities. As part of this initiative, the SPARS team presented the video series A Trauma-Informed Approach to Data Collection, with commentary from subject matter experts and clientele from the People Encouraging People program in Baltimore, MD. This series advised grantees and GPOs about using a trauma-informed approach to collecting client-level data.
  • Another example involves EBPs that State Opioid Response (SOR) grantees have implemented, focused on safe prescribing of naloxone and medication for opioid use disorder, to help support and build knowledge around the use of these EBPs. In FY22, SOR grantees are using funds for fentanyl test strips. This recent change under the Biden-Harris Administration will help build knowledge of the utility of these EBPs.
8.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity-building efforts?
  • Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process. For example, one funding announcement requires the applicant to “provide specific information about how you will collect the required data for this program and how the data will be utilized to manage, monitor and enhance the program.” In addition, up to 20% of the total grant award for the budget period may be used for data collection, performance measurement, and performance assessment expenses.
Score
7
Use of Evidence in Noncompetitive Grant Programs

Did the agency use evidence of effectiveness when allocating funds from its noncompetitive grant programs in FY22 (examples: evidence-based funding set-asides, requirements to invest funds in evidence-based activities, and pay for success provisions)?

9.1 What were the agency’s five largest noncompetitive programs and their appropriation amounts (and were city, county, and/or state governments eligible to receive funds from these programs)?
9.2 Did the agency use evidence of effectiveness to allocate funds in its five largest noncompetitive grant programs (e.g., are evidence-based interventions/practices required or suggested and is evidence a significant requirement)?
  • Allocation of community MHBG funds is based on a congressionally mandated formula. States are required to use at least 10% of the MHBG funds to support EBPs that address the needs of individuals with early serious mental illness, including psychotic disorders, regardless of the age of the individual at onset. States are also required to use 5% of their total allocation of MHBG funds to support and provide crisis services. SAMHSA also encourages states to use the MHBG funds to implement evidence-based practices for both adults and children.
  • For the first-episode psychosis 10% set-aside in the MHBG, states are directed to use EBPs. SAMHSA provides guidance to states based on evidence. Its recommendation is to develop a state first-episode psychosis program based on the coordinated specialty care model, as evaluated by the National Institute of Mental Health. For example, the first-episode psychosis program OnTrackNY is an evaluated model that is recommended based on evidence of success.
  • Similarly, the Substance Abuse and Prevention Block Grant is a formula grant program that provides states flexibility to identify and deliver substance use-related services to meet their state-specific needs while also ensuring attention to critical prevention-focused public health issues. The authorizing legislation and implementing regulations for the SABG program include a maintenance of effort requirement and specific funding set-asides, including a 20% set-aside for primary prevention and a 5% set-aside for early intervention services for HIV for designated states. Through the SABG, states should “identify, implement, and evaluate evidence-based programs, practices, and policies that have the ability to reduce substance use and improve health and well-being in all communities.”
9.3 Did the agency use its five largest noncompetitive grant programs to build evidence (e.g., requiring grantees to participate in evaluations)?
  • Information on how to use funds for data collection and evaluation is covered in the block grant application. Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process. In FY21, SAMHSA updated its application manual to include a section on developing goals and measurable objectives (see p. 38). Specifically, the document states, “To be able to effectively evaluate your project, it is critical that you develop realistic goals and measurable objectives. This chapter will provide information on developing goals and measurable objectives. It will also provide examples of well-written goals and measurable objectives.”
  • Grantees in noncompetitive grant programs are required to submit quantitative data to SAMHSA using reporting systems associated with their grant. For example, state mental health agencies receive noncompetitive grants and compile and report annual data collected from SAMHSA’s Community MHBG. More information on the Uniform Reporting System can be found online. In this way, noncompetitive grant programs not only allow the sharing of data for research and evaluation, but also allow grantees to explore data from other state grantees.
  • In the FY20-21 Block Grant Application, SAMHSA asks states to base their administrative operations and service delivery on principles of continuous quality improvement/total quality management (CQI/TQM). These processes should identify and track critical outcomes and seventy-two performance measures, based on valid and reliable data, consistent with the National Behavioral Health Quality Framework, which will describe the health and functioning of the mental health and addiction systems. The CQI processes should continuously measure the effectiveness of services and supports and ensure that they continue to reflect this evidence of effectiveness. The state’s CQI process should also track programmatic improvements using stakeholder input, including the general population and individuals in treatment and recovery and their families. In addition, the CQI plan should include a description of the process for responding to emergencies, critical incidents, complaints, and grievances.
  • In FY22, SAMHSA focused its resources on an examination of evidence and the collection of data related to both the SABG and the MHBG.
  • For SABG, SAMHSA’s Center for Substance Abuse Treatment is engaging in a multipronged approach to evaluate the program, guided by the December 2020 U.S. Government Accountability Office report Substance Use Disorder: Reliable Data Needed for Substance Abuse Prevention and Treatment Block Grant Program. Based on the Accountability Office’s recommendations, SAMHSA has initiated an assessment of the quality of grantees’ self-reported data. This includes conducting quantitative and qualitative analysis to understand reliability issues associated with grantees’ self-reported data and barriers to data collection and to identify potential alternative data sources and methodological approaches to address data gaps. This effort is expected to result in a set of recommendations in August 2022 for implementing changes to the SABG program’s data collection efforts to improve the consistency and relevance of the data collected.
  • For MHBG, SAMHSA’s Center for Mental Health Services has organized a series of state panel discussions to examine specific aspects of its data collection. The first panel was held in April 2022 to examine the burden and utility of each national outcome measure collected. The session was titled Mental Health Block Grant Uniform Reporting System—Gaps, Challenges, Strengths, and Opportunities. A survey was provided to states and the results shared with the expert panel during a three-hour discussion with conclusions and recommendations designed to increase the data quality and utility of the Block Grant program. Additional panel discussions are scheduled for the summer and fall of 2022 with the results used to inform future block grant funding.
9.4 Did the agency use evidence of effectiveness to allocate funds in any other non-competitive grant programs (besides its five largest grant programs)?
  • Nearly all SAMHSA grants are competitively awarded. SAMHSA has only four noncompetitive grants, which are included above.
9.5 What are the agency’s one or two strongest examples of how noncompetitive grant recipients achieved better outcomes and/or built knowledge of what works or what does not?
  • In 2016 SAMHSA, in partnership with the National Institute of Mental Health and Office of the Assistant Secretary for Planning and Evaluation, initiated a three-year evaluation study of the coordinated specialty care (CSC) model programs funded through the MHBG 10% set-aside to ascertain the effectiveness of these programs. The study results of services provided by thirty-six diverse programs indicated that the evidence-based CSC programs led to statistically significant improvements in the health and well-being of individuals who participate in them, including reductions in hospitalization (79%), emergency room visits (71%), criminal justice involvement (41%), suicide attempts (66%), and homelessness (35%).
9.6 Did the agency provide guidance that makes clear that city, county, and state government, and/or other grantees can or should use the funds they receive from these programs to conduct program evaluations and/or to strengthen their evaluation capacity building efforts?
  • Information on how to use funds for data collection and evaluation is covered in the Block Grant Application. Grantees are encouraged to allocate grant funds for data collection, data analysis, and program evaluation. Some grantees hire external evaluators using grant funds to assist them in the evaluation process.
Score
4
Repurpose for Results

In FY22, did the agency shift funds away from or within any practice, policy, or program that consistently failed to achieve desired outcomes (examples: requiring low-performing grantees to re-compete for funding; removing ineffective interventions from allowable use of grant funds; incentivizing or urging grant applicants to stop using ineffective practices in funding announcements; proposing the elimination of ineffective programs through annual budget requests; incentivizing well-designed trials to fill specific knowledge gaps; supporting low-performing grantees through mentoring, improvement plans, and other forms of assistance; and using rigorous evaluation results to shift funds away from a program)?

10.1 Did the agency have policy(ies) for determining when to shift funds away from grantees, practices, policies, interventions, and/or programs that consistently failed to achieve desired outcomes, and did the agency act on that policy?
  • As a matter of policy, SAMHSA uses the term “restricted status” to describe grant recipients that are financially unstable, have inadequate financial management systems, or are poor programmatic performers. Grants placed on restricted status require additional monitoring and have additional award conditions that must be met before funds can be drawn. SAMHSA adheres to HHS’s Grants Policy Statement, including the policy on suspension or termination, which states that, “if a recipient has failed to materially comply with the terms and conditions of award, the OPDIV [Grant-Awarding Operating Division] may suspend the grant, pending corrective action, or may terminate the grant for cause” (p. II-89).
10.2 Did the agency identify and provide support to agency programs or grantees that failed to achieve desired outcomes?
  • Due to the diversity of SAMHSA grantees and the variety of grants awarded, it is not possible to have a one-size-fits-all policy for grantees that fail to achieve desired outcomes. However, SAMHSA is dedicated to ensuring that grantees use federal funds as expected and that grantees adhere to reporting and data requirements. Following is a current example illustrating SAMHSA’s dedication to its grantees and its commitment to hold grantees accountable for federal dollars:
  • In FY22, a SAMHSA grantee was experiencing great difficulty submitting required documentation and was deemed not in compliance with program requirements. As a result of the noncompliance, the Division of Grants Management and the government program officer had multiple telephone conversations with the grantee to provide technical assistance, clarify expectations, and provide deadline extensions when possible. However, due to the seriousness of the problem and the lack of any significant progress or effort to submit the required documents, the Division of Grants Management and program officer outlined a Corrective Action Plan for the grantee. This plan included the identified delinquent items and concerns along with specific due dates for the grantee to submit the required documents to remain in compliance with program requirements.
  • The Corrective Action Plan stipulates that the grantee has two weeks to provide the required documents. If the grantee does not submit the delinquent items outlined in the Corrective Action Plan by the due date, then the Division of Grants Management and government program officer will recommend relinquishment or award termination. The grant will be terminated because the grantee failed to meet the terms and conditions outlined in the Notice of Funding Announcement and the Notice of Award and was unable to correct the deficiencies within a reasonable period despite multiple technical assistance attempts. After the grant has been relinquished or terminated, the Division of Grants Management and government program officer will recommend offsetting unused federal funds.
  • The Performance Accountability and Reporting System allows SAMHSA staff to regularly monitor discretionary grant status as well as to meet with grant program directors. If a grantee is falling behind or not meeting proposed targets, SAMHSA staff access the data in real time to provide the support or technical assistance needed to ensure that the grantee does not fail. Given SAMHSA’s mission of reducing the impact of substance use and mental illness on America’s communities, it is critical that struggling communities are identified early, with the goal of continuous quality improvement and support.