
Federal Register / Vol. 71, No. 49 / Tuesday, March 14, 2006 / Notices

MSC 7844, Bethesda, MD 20892. (301) 435–1119. mselmanoff@csr.nih.gov.

This notice is being published less than 15 days prior to the meeting due to the timing limitations imposed by the review and funding cycle.

Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Cellular/Molecular Responses in Dendritic Cells, Macrophages, and T Cells.
Date: March 27, 2006.
Time: 3 p.m. to 5 p.m.
Agenda: To review and evaluate grant applications and/or proposals.
Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892. (Telephone Conference Call).
Contact Person: Samuel C. Edwards, PhD, Scientific Review Administrator, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4200, MSC 7812, Bethesda, MD 20892. (301) 435–1152. edwardss@csr.nih.gov.

This notice is being published less than 15 days prior to the meeting due to the timing limitations imposed by the review and funding cycle.

Name of Committee: Center for Scientific Review Special Emphasis Panel; Atherosclerosis and Macrophages.
Date: April 11, 2006.
Time: 2 p.m. to 3 p.m.
Agenda: To review and evaluate grant applications.
Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892. (Telephone Conference Call).
Contact Person: Olga A. Tjurmina, PhD, Scientific Review Administrator, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4030B, MSC 7814, Bethesda, MD 20892. (301) 451–1375. ot3d@nih.gov.

Name of Committee: Center for Scientific Review Special Emphasis Panel; Atrial Fibrillation and Pacing.
Date: April 12, 2006.
Time: 2 p.m. to 3:30 p.m.
Agenda: To review and evaluate grant applications.
Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892. (Telephone Conference Call).
Contact Person: Olga A. Tjurmina, PhD, Scientific Review Administrator, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4030B, MSC 7814, Bethesda, MD 20892. (301) 451–1375. ot3d@nih.gov.

Name of Committee: Center for Scientific Review Special Emphasis Panel; Member Conflict: Heart Failure Gene Therapy.
Date: April 17, 2006.
Time: 1 p.m. to 3 p.m.
Agenda: To review and evaluate grant applications.
Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892. (Telephone Conference Call).
Contact Person: Rajiv Kumar, PhD, Scientific Review Administrator, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 4122, MSC 7802, Bethesda, MD 20892. (301) 435–1212. kumarra@csr.nih.gov.

Name of Committee: Center for Scientific Review Special Emphasis Panel; Health Services Organization and Delivery Member Conflict Special Emphasis Panel.
Date: April 18, 2006.
Time: 10:30 a.m. to 1:30 p.m.
Agenda: To review and evaluate grant applications.
Place: National Institutes of Health, 6701 Rockledge Drive, Bethesda, MD 20892. (Telephone Conference Call).
Contact Person: Gertrude K. McFarland, FAAN, RN, DNSC, Scientific Review Administrator, Center for Scientific Review, National Institutes of Health, 6701 Rockledge Drive, Room 3156, MSC 7770, Bethesda, MD 20892. (301) 435–1784. mcfarlag@csr.nih.gov.

(Catalogue of Federal Domestic Assistance Program Nos. 93.306, Comparative Medicine; 93.333, Clinical Research; 93.306, 93.333, 93.337, 93.393–93.396, 93.837–93.844, 93.846–93.878, 93.892, 93.893, National Institutes of Health, HHS)

Dated: March 6, 2006.
Anna Snouffer,
Acting Director, Office of Federal Advisory Committee Policy.
[FR Doc. 06–2400 Filed 3–13–06; 8:45 am]
BILLING CODE 4140–01–M

DEPARTMENT OF HEALTH AND HUMAN SERVICES

Substance Abuse and Mental Health Services Administration

Changes to the National Registry of Evidence-Based Programs and Practices (NREPP)

AGENCY: Substance Abuse and Mental Health Services Administration, HHS.

ACTION: Notice.

SUMMARY: The Substance Abuse and Mental Health Services Administration (SAMHSA) is committed to preventing the onset and reducing the progression of mental illness, substance abuse, and substance-related problems among all individuals, including youth. As part of this effort, SAMHSA has expanded and refined the agency's National Registry of Evidence-based Programs and Practices (NREPP) based on a systematic analysis and consideration of public comments received in response to a previous Federal Register notice (70 FR 50381, Aug. 26, 2005). This Federal Register notice summarizes SAMHSA's redesign of NREPP as a decision support tool for promoting greater adoption of evidence-based interventions within typical community-based settings, and provides an opportunity for interested parties to become familiar with the new system.

FOR FURTHER INFORMATION CONTACT: Kevin D. Hennessy, Ph.D., Science to Service Coordinator/SAMHSA, 1 Choke Cherry Road, Room 8–1017, Rockville, MD 20857, (240) 276–2234.

Charles G. Curie,
Administrator, SAMHSA.

Advancing Evidence-Based Practice Through Improved Decision Support Tools: Reconceptualizing NREPP

Introduction

The Substance Abuse and Mental Health Services Administration (SAMHSA) strives to provide communities with effective, high-quality, and cost-efficient prevention and treatment services for mental and substance use disorders. To meet this goal, SAMHSA recognizes the needs of a wide range of decisionmakers at the local, state, and national levels to have readily available and timely information about scientifically established interventions to prevent and/or treat these disorders.

SAMHSA, through its Science to Service Initiative, actively seeks to promote Federal collaboration (e.g., with the National Institutes of Health [NIH]) in translating research into practice. The ideal outcome of this Initiative is that individuals at risk for or directly experiencing mental and substance use disorders will be more likely to receive appropriate preventive or treatment services, and that these services will be the most effective and the highest quality that the field has to offer.

This report provides a summary of activities conducted during the past year to critically evaluate SAMHSA's recent activities and future plans for the National Registry of Evidence-based Programs and Practices (NREPP). It outlines the major themes that emerged from a formal public comment process and links this feedback to new review procedures and Web-based decision support tools that will enhance access to evidence-based knowledge for multiple audiences.

The report is presented in five sections:
• Section I briefly states the background of NREPP and SAMHSA's recent request for public comments.
• Section II discusses the analysis of comments that was conducted and presents the key recommendations for NREPP based on this analysis.
• Section III describes the new approach that SAMHSA is advancing for NREPP.
• Section IV presents the specific dimensions of the NREPP system in its new framework as a decision support tool.
• Section V describes future activities at SAMHSA to support NREPP.

I. Background: The National Registry of Evidence-Based Programs and Practices

The National Registry of Evidence-based Programs and Practices was designed to represent a key component of the Science to Service Initiative. It was intended to serve as a voluntary rating and classification system to identify programs and practices with a strong scientific evidence base. An important reason for developing NREPP was to reduce the significant time lag between the generation of scientific knowledge and its application within communities.[1] Quality treatment and prevention services depend on service providers' ability to access evidence-based scientific knowledge, standardized protocols, practice guidelines, and other practical resources.

[1] As cited by the Institute of Medicine (2001), studies have suggested it takes an average of 17 years for research evidence to diffuse to clinical practice. Source: Balas, E.A., & Boren, S.A. (2000). Managing clinical knowledge for health care improvement. In: J. Bemmel & A.T. McCray (Eds.), Yearbook of medical informatics 2000: Patient-centered systems. Stuttgart, Germany: Schattauer.

The precursor of NREPP, the National Registry of Effective Prevention Programs, was developed by SAMHSA's Center for Substance Abuse Prevention (CSAP) as a way to help professionals in the field become better consumers of substance abuse prevention programs. Through CSAP's Model Program Initiative, over 1,100 programs were reviewed, and more than 150 were designated as Model, Effective, or Promising Programs.

Over the past 2 years, SAMHSA convened a number of scientific panels to explore the expansion of the NREPP review system to include interventions in all domains of mental health and substance abuse prevention and treatment. In addition, SAMHSA committed itself to three guiding principles—transparency, timeliness, and accuracy of information—in the development of an evidence-based registry of programs and practices. During this process it was determined that, to provide the most transparent and accurate information to the public, evidence should be assessed at the level of outcomes targeted by an intervention, not at the more global level of interventions or programs. Based on this decision, SAMHSA's current NREPP contractor conducted a series of pilot studies to explore the validity and feasibility of applying an outcome-specific, 16-criteria evidence rating system to an expanded array of programs and practices. Through extensive dialogues with the prevention community, SAMHSA also explored ways to provide evidence-based reviews of population- and community-level interventions within NREPP.

In an effort to augment the information gained through these activities, SAMHSA solicited formal public comments through a notice posted in the Federal Register on August 26, 2005. The notice asked for responses to the agency's plans for NREPP, including (1) revisions to the scientific review process and review criteria; (2) the conveying of practical implementation information about NREPP programs and practices to those who might purchase, provide, or receive these interventions; and (3) the types of additional agency activities that may be needed to promote wider adoption of interventions on NREPP, as well as support innovative interventions seeking NREPP status. A brief summary of the public comments and key recommendations is presented in Section II. The complete analysis of the public responses is included in the Appendix to this report.

II. Public Responses to the Federal Register Notice

Senior staff at SAMHSA engaged in a comprehensive review of comments received in response to the Federal Register notice. Particular attention was directed to comments from prominent state and Federal stakeholders, including providers and policymakers, who stand to be the most affected by whatever system is ultimately implemented. Efforts were taken to balance SAMHSA's responsiveness to public feedback with the need to adhere to rigorous standards of scientific accuracy and to develop a system that will be fair and equitable to multiple stakeholder groups.

Recommendations for NREPP

In the more than 100 comments received as part of the public comment process, a number of recurring themes and recommendations were identified. While all specific and general recommendations for modification of the NREPP review process were carefully considered by SAMHSA, the following are those that were considered most essential to the development of an accurate, efficient, and equitable system that can meet the needs of multiple stakeholders:

• Limit the system to interventions that have demonstrated behavioral change outcomes. It is inherently appealing to the funders, providers, and consumers of prevention and treatment services to know that an intervention has a measurable effect on the actual behavior of participants. As researchers at the University of Washington recommended, "the system should be reserved for policies, programs, and system-level changes that have produced changes in actual drug use or mental health outcomes."

• Rereview all existing programs. There was near consensus among the respondents to the notice that existing programs with Model, Effective, and Promising designations from the old reviews should be rereviewed under the new system. The Committee for Children pointed out that "a 'grandfather' system may give the impression to users, right or wrong, that these interventions aren't as good as those that have undergone the new review process." One individual suggested that programs and practices needed to be rated "according to a consistent set of criteria" so that "the adoption of an intervention by a provider can be made with confidence."

• Train and utilize panels of reviewers with specific expertise related to the intervention(s) under review. Respondents to the notice noted that it would be important for the NREPP review process to utilize external reviewers with relevant scientific and practical expertise related to the intervention being assessed. In addition, the pool of available reviewers should broadly include community-level and individual-level prevention as well as treatment perspectives. In order to promote transparency of the review process, the reviewer training protocols should be available for review by the public (e.g., posted on the NREPP Web site).

• Provide more comprehensive and balanced descriptions of evidence-based practices, by emphasizing the important dimension of readiness for dissemination. The American Psychological Association (APA) Committee on Evidence-Based Practice recommended greater emphasis on the utility descriptors (i.e., those items describing materials and resources to support implementation), stating, "these are key outcomes for implementation and they are not adequately addressed in the description of NREPP provided to date. This underscores earlier concerns noted about the transition from efficacy to effectiveness." The APA committee noted that generalizability of programs listed on NREPP will remain an issue until this "gap between efficacy and effectiveness" is explicitly addressed under a revised review system.
• Avoid limiting flexibility and innovation; implement a system that is fair and inclusive of programs and practices with limited funding, and establish policies that seek to prevent the misuse of information contained on NREPP. The National Association for Children of Alcoholics voiced this concern: "It has been intrinsically unfair that only grants [referring to NIH-funded efforts] have been able to establish 'evidence' while many programs appear very effective—often more effective in some circumstances than NREPP approved programs, but have not had the Federal support or other major grant support to evaluate them. The SAMHSA grant programs continue to reinforce the designation of NREPP programs in order to qualify for funding, and the states tend to strengthen this 'stipulation' to local programs, who then drop good (non-NREPP) work they have been doing or purchase and manipulate NREPP programs that make the grant possible. This is not always in the best interest of the client population to be served."

• Recognize multiple "streams of evidence" (e.g., researcher, practitioner, and consumer) and the need to provide information to a variety of stakeholders in a decision support context. A number of comments suggested that NREPP should be more inclusive of the practitioner and consumer perspective on what defines evidence. For example, one commenter noted: "The narrowed interpretation of evidence-based practice by SAMHSA focuses almost solely on the research evidence to the exclusion of clinical expertise and patient values." Several comments noted that NREPP should be consistent with the Institute of Medicine's definition of evidence-based practice, which reflects multiple "streams of evidence" that include research, clinical, and patient perspectives.

• Provide a summary rating system that reflects the continuous nature of evidence quality. There was substantial disagreement among those responding to the notice concerning whether NREPP should include multiple categories of evidence quality. While a number of individuals and organizations argued for the use of categorical evidence ratings, there were many who suggested that NREPP should provide an average, numeric scale rating on specific evidence dimensions to better reflect the "continuous nature of evidence." This approach would allow the user of the system to determine what level of evidence strength is required for their particular application of an intervention.

• Recognize the importance of cultural diversity and provide complete descriptive information on the populations for which interventions have been developed and applied. Most comments reflected the knowledge that cultural factors can play an important role in determining the effectiveness of interventions. The Oregon Office of Mental Health and Addiction Services noted, "SAMHSA should focus considerable effort on identifying and listing practices useful and applicable for diverse populations and rural areas. Providers and stakeholders from these groups have repeatedly expressed the concern they will be left behind if no practices have been identified which fit the need of their area. We need to take particular care to ensure that their fear is not realized."

• In addition to estimating the effect size of intervention outcomes, NREPP should include additional descriptive information about the practical impacts of programs and practices. In general, comments suggested that effect size should not be used as an exclusionary criterion in NREPP. It was widely noted that effect size estimates for certain types of interventions (e.g., community-level or population-based) will tend to be of smaller magnitude, and that "professionals in the field have not reached consensus on how to use effect size." Researchers at the University of Washington suggested the inclusion of information about the reach of an intervention, when available, as complementary information to effect sizes. Several comments also suggested that effect size is often confused with the clinical significance of an intervention and its impact on participants.

• Acknowledge the need to develop additional mechanisms of Federal support for technical assistance and the development of a scientific evidence base within local prevention and treatment communities. Nearly one third of the comments directly addressed the need for SAMHSA to identify and/or provide additional technical assistance resources to communities to help them adapt and implement evidence-based practices. The Oregon Office of Mental Health and Addiction Services wrote, "The adoption of new practices by any entity is necessarily a complex and long-term process. Many providers will need technical support if adoption and implementation is to be accomplished effectively. Current resources are not adequate to meet this challenge."

In order to align NREPP with the important recommendations solicited through the public comment process, SAMHSA also recognized the importance of the following goals:

• Provide a user-friendly, searchable array of descriptive summary information as well as reviewer ratings of evidence quality.

• Provide an efficient and cost-effective system for the assessment and review of prospective programs and practices.

Section III, Streamlined Review Procedures, provides a complete description of the modified and streamlined review process that SAMHSA will adopt in conducting evidence-based evaluations of mental health and substance abuse interventions.

III. Streamlined Review Procedures

The number and range of NREPP reviews are likely to expand significantly under the new review system, requiring that SAMHSA develop an efficient and cost-effective review process. The streamlined review procedures, protocols, and training materials will be made available on the NREPP Web site for access by all interested individuals and organizations.

Reviews of interventions will be facilitated by doctoral-level Review Coordinators employed by the NREPP contractor. Each Review Coordinator will support two external reviewers who will assign numeric, criterion-based ratings on the dimensions of Strength of Evidence and Readiness for Dissemination. Review Coordinators will provide four important support and facilitative functions within the peer review process: (1) They will assess incoming applications for the thoroughness of documentation related to the intervention, including documentation of significant outcomes, and will convey summaries of this information to SAMHSA Center Directors for their use in prioritizing interventions for review; (2) they will serve as the primary liaison with the applicant to expedite the review of interventions; (3) they will collaborate with the NREPP applicant to draft the descriptive dimensions for the intervention summaries; and (4) they will provide summary materials and guidance to external reviewers to facilitate initial review and consensus discussions of intervention ratings.
Interventions Qualifying for Review

While NREPP will retain its open submission policy, the new review system emphasizes the important role of SAMHSA's Center Directors and their staff (in consultation with key stakeholders) in setting intervention review priorities that will identify the particular content areas, types of intervention approaches, populations, or even types of research designs that will qualify for review under NREPP. Under the streamlined review procedures, the sole requirement for potential inclusion in the NREPP review process is for an intervention to have demonstrated one or more significant behavioral change outcomes. Center-specific review priorities will be established and communicated to the field by posting them to the NREPP Web site at the beginning of each fiscal year.[2]

[2] Except for FY06, when priorities will be established and posted when the new system Web site is launched (i.e., within the third FY quarter).

Review of Existing NREPP Programs and Practices

It will be the prerogative of SAMHSA Center Directors to establish priorities for the review of interventions already on, and pending entry on, NREPP. As indicated above, these decisions may be linked to particular approaches, populations, or strategic objectives identified by SAMHSA as priority areas. Until reviews of existing NREPP programs and practices are completed and posted to the new NREPP Web site, the current listing on the SAMHSA Model Programs Web site will remain intact.

Notifications to Program/Practice Developers

Upon the completion of NREPP reviews, program/practice developers (or principal investigators of a research-based intervention) will be notified in writing within 2 weeks of the review results. A complete summary, highlighting information from each of the descriptive and rating dimensions, will be provided for review. Program/practice developers who disagree with the descriptive information or ratings contained in any of the dimensions will have an opportunity to discuss their concerns with the NREPP contractor during the 2-week period following receipt of the review outcome notification. These concerns must be expressed in writing to the contractor within this 2-week period. If no comments are received, the review is deemed completed, and the results may be posted to the NREPP Web site. If points of disagreement cannot be resolved by the end of this 2-week period, then written appeals for a rereview of the intervention may be considered on a case-by-case basis.

NREPP Technical Expert Panel

SAMHSA will organize one or more expert panels to perform periodic (e.g., annual) assessments of the evidence review system and recommend enhancements to the review procedures and/or standards for evidence-based science and practice. Panel membership will represent a balance of perspectives and expertise. The panels will be comprised of researchers with knowledge of evidence-based practices and initiatives, policymakers, program planners and funders, practitioners, and consumers.

The modified NREPP system embodies a commitment by SAMHSA and its Science to Service Initiative to broaden the appeal and utility of the system to multiple audiences. While maintaining the focus on the documented outcomes achieved through a program or practice, NREPP also is being developed as a user-friendly decision support tool to present information along multiple dimensions of evidence. Under the new system, interventions will not receive single, overall ratings as was the case with the previous NREPP (e.g., Model, Effective, or Promising). Instead, an array of information from multiple evidence dimensions will be provided to allow different user audiences to both identify (through Web-searchable means) and prioritize the factors that are important to them in assessing the relative strengths of different evidence-based approaches to prevention or treatment services.

Section IV presents in more detail the specific dimensions of descriptive information and ratings that NREPP will offer under this new framework.

IV. NREPP Decision Support Tool Dimensions

The NREPP system will support evidence-based decisionmaking by providing a wide array of information across multiple dimensions. Many of these are brief descriptive dimensions that will allow users to identify and search for key intervention attributes of interest. Descriptive dimensions would frequently include a brief, searchable keyword or attribute (e.g., "randomized control trial" under the Evaluation Design dimension) in addition to narrative text describing that dimension. Two dimensions, Strength of Evidence and Readiness for Dissemination, will consist of quantitative, criterion-based ratings by reviewers. These quantitative ratings will be accompanied by reviewer narratives summarizing the strengths and weaknesses of the intervention along each dimension.

Considerations for Using NREPP as a Decision Support Tool

It is essential for end-users to understand that the descriptive information and ratings provided by NREPP are only useful within a much broader context that incorporates a wide range of perspectives—including clinical, consumer, administrative, fiscal, organizational, and policy—into decisions regarding the identification, selection, and successful implementation of evidence-based services. In fact, an emerging body of literature on implementation science[3] suggests that a failure to carefully attend to this broader array of data and perspectives may well lead to disappointing or unsuccessful efforts to adopt evidence-based interventions. Because each NREPP user is likely to be seeking somewhat different information, and for varied purposes, it is unlikely that any single intervention included on NREPP will fulfill all of the specific requirements and unique circumstances of a given end-user. Appreciation of this basic premise of NREPP as a decision support tool to be utilized in a broader context will thus enable system users to make their own determinations regarding how best to assess and apply the information provided.

[3] Fixsen, D.L., Naoom, S.F., Blase, K.A., Friedman, R.M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Network (FMHI Publication #231). Rogers (1995). Diffusion of innovations (5th ed.). New York: The Free Press.

The NREPP decision support dimensions include:
• Descriptive Dimensions
• Strength of Evidence Dimension Ratings
• Readiness for Dissemination Dimension Ratings

A complete description of these dimensions is provided in the sections below.

Descriptive Dimensions

• Intervention Name and Summary: Provides a brief summary of the intervention, including title, description of conceptual or theoretical foundations, and overall goals. Hyperlinks to graphic logic model(s), when available, could be accessed from this part of the summary.

• Contact Information: Lists key contact information. Typically will include intervention developer's title(s), affiliation, mailing address, telephone and fax numbers, e-mail address, and Web site address.

• Outcome(s): A searchable listing of the behavioral outcomes that the intervention has targeted.
• Effects and Impact: Provides a description and quantification of the effects observed for each outcome. Includes information on the statistical significance of outcomes, the magnitude of changes reported (including effect size and measures of clinical significance, if available), and the typical duration of behavioral changes produced by the intervention.

• Relevant Populations and Settings: Identifies the populations and sample demographics that characterize existing evaluations. The settings in which different populations have been evaluated will be characterized along a dimension that ranges from highly controlled and selective (i.e., efficacy studies), to less controlled and more representative (i.e., effectiveness studies), to adoption in the most diverse and realistic public health and clinical settings (i.e., dissemination studies).[4]

• Costs: Provides a breakdown of intervention cost(s) per recipient/participant or per year, as appropriate, including capital costs, other direct costs (travel, etc.), and start-up costs such as staff training and development. A standardized template would be provided to applicants for estimating and summarizing the implementation and maintenance costs of an intervention.

• Adverse Effects: Reported with regard to type and number, amounts of change reported, type of data collection, analyses used, intervention and comparison group, and subgroups.

• Evaluation Design: Contains both a searchable index of specific experimental and quasi-experimental designs (e.g., pre-/posttest nonequivalent groups designs, regression-discontinuity designs, interrupted time series designs, etc.)[5] as well as a narrative description of the design (including intervention and comparison group descriptions) used to document intervention outcomes.

• Replication(s): Coded as "None," or will state the number of replications to date (only those that have been evaluated for outcomes). Replications will be additionally characterized as having been conducted in efficacy, effectiveness, or dissemination contexts.

• Proprietary or Public Domain Intervention: Typically will be one or the other, but proprietary components or instruments used as part of an intervention will be identified.

• Cultural Appropriateness: Coded as "Not Available" (N/A) if either no data or no implementation/training materials for particular culturally identified groups are available. When culture-specific data and/or implementation materials exist for one or more groups, the following two Yes/No questions will be provided for each group:
• Was the intervention developed with participation by members of the culturally identified group?
• Are intervention and training materials translated or adapted for members of the culturally identified group?

• Implementation History: Provides information relevant to the sustainability of interventions. Provides descriptive information on (1) the number of sites that have implemented the intervention; (2) how many of those have been evaluated for outcomes; (3) the longest continuous length of implementation (in years); (4) the average or modal length of implementation; and (5) the approximate number of individuals who have received or participated in the intervention.

[4] For more description of these types of studies and their role in supporting evidence-based services, see the report: Bridging science and service: A report by the National Advisory Mental Health Council's Clinical Treatment and Services Research Workgroup (http://www.nimh.nih.gov/publicat/nimhbridge.pdf).

[5] Campbell, D.T., & Stanley, J.C. (1966). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.

Strength of Evidence Dimension Ratings

Quantitative, reviewer-based ratings on this dimension will be provided within specific categories of research/evaluation design. In this manner, users can search and select within those categories of research designs that are most relevant to their particular standards of evidence-based knowledge. The categories of research design that are accepted within the NREPP system are described below.

Research Design

Quality of evidence for an intervention depends on the strength of adequately implemented research design controls, including comparison conditions for quasi-experimental and randomized experimental designs (individual studies). Aggregation (e.g., meta-analysis and systematic research reviews) and/or replication across well-designed series of quasi-experimental and randomized control studies provide the strongest evidence. The evidence pyramid presented below represents a typical hierarchy for classifying the strength of causal inferences that can be obtained by implementing various research designs with rigor.[6] Designs at the lowest level of the evidence pyramid (i.e., observational, pilot, or case studies), while acceptable as evidence in some knowledge development contexts, would not be included in the NREPP system.

[Figure: Evidence pyramid illustrating the hierarchy of research designs by strength of causal inference.]

[6] Biglan, A., Mrazek, P., Carnine, D.W., & Flay, B.R. (2003). The integration of research and practice in the prevention of youth problem behaviors. American Psychologist, 58, 433–440. Chambless, D.L., & Hollon, S. (1998). Defining empirically supported therapies. Journal of Consulting and Clinical Psychology, 66, 7–18. Gray, J.A. (1997). Evidence-based healthcare: How to make health policy and management decisions. New York: Churchill Livingstone.
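As a rough illustration of the hierarchy described above, the sketch below encodes one possible ordering of design categories and flags the level that would not qualify for NREPP review. The category labels, numeric ranks, and data layout are assumptions drawn from the narrative; the notice itself presents the hierarchy only as a figure and does not prescribe any implementation.

```python
# Illustrative sketch of the research-design hierarchy described in the
# "Research Design" discussion. Labels and rank values are assumptions;
# the notice presents this only as an evidence pyramid figure.

from dataclasses import dataclass

@dataclass(frozen=True)
class DesignCategory:
    name: str
    rank: int               # higher rank = stronger basis for causal inference
    accepted_by_nrepp: bool

EVIDENCE_HIERARCHY = [
    DesignCategory("Aggregation or replication across well-designed "
                   "quasi-experimental and randomized controlled studies",
                   rank=3, accepted_by_nrepp=True),
    DesignCategory("Individual randomized experimental studies",
                   rank=2, accepted_by_nrepp=True),
    DesignCategory("Individual quasi-experimental studies with comparison conditions",
                   rank=2, accepted_by_nrepp=True),
    DesignCategory("Observational, pilot, or case studies",
                   rank=1, accepted_by_nrepp=False),
]

def qualifying_categories():
    """Return the design categories the narrative says NREPP would accept."""
    return [c for c in EVIDENCE_HIERARCHY if c.accepted_by_nrepp]

if __name__ == "__main__":
    for category in sorted(EVIDENCE_HIERARCHY, key=lambda c: c.rank, reverse=True):
        status = "accepted" if category.accepted_by_nrepp else "not included in NREPP"
        print(f"rank {category.rank}: {category.name} ({status})")
```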
1. Reliability[7]

Outcome measures should have acceptable reliability to be interpretable. "Acceptable" here means reliability at a level that is conventionally accepted by experts in the field.[8]

0 = Absence of evidence of reliability, or evidence that some relevant types of reliability (e.g., test-retest, interrater, interitem) did not reach acceptable levels.
2 = All relevant types of reliability have been documented to be at acceptable levels in studies by the applicant.
4 = All relevant types of reliability have been documented to be at acceptable levels in studies by independent investigators.

[7] Each criterion would be rated on an ordinal scale ranging from 0 to 4. The endpoints and midpoints of the scale would be anchored to a narrative description of that rating. The remaining integer points of the scale (i.e., 1 and 3) would not be explicitly anchored, but could be used by reviewers to assign intermediate ratings at their discretion.

[8] Marshall, M., Lockwood, A., Bradley, C., Adams, C., Joy, C., & Fenton, M. (2000). Unpublished rating scales: A major source of bias in randomised controlled trials of treatments for schizophrenia. British Journal of Psychiatry, 176, 249–252.

2. Validity

Outcome measures should have acceptable validity to be interpretable. "Acceptable" here means validity at a level that is conventionally accepted by experts in the field.

0 = Absence of evidence of measure validity, or some evidence that the measure is not valid.
2 = Measure has face validity; absence of evidence that measure is not valid.
4 = Measure has one or more acceptable forms of criterion-related validity (correlation with appropriate, validated measures or objective criteria); OR, for objective measures of response, there are procedural checks to confirm data validity; absence of evidence that measure is not valid.

3. Intervention Fidelity

The "experimental" intervention implemented in a study should have fidelity to the intervention proposed by the applicant. Fidelity instruments with tested, acceptable psychometric properties (e.g., interrater reliability, validity as shown by positive association with outcomes) provide the highest level of evidence.

0 = Absence of evidence, or only narrative evidence that the applicant or provider believes the intervention was implemented with acceptable fidelity.
2 = There is evidence of acceptable fidelity in the form of judgment(s) by experts, systematic collection of data (e.g., dosage, time spent in training, adherence to guidelines or a manual), or a fidelity measure with unspecified or unknown psychometric properties.
4 = There is evidence of acceptable fidelity from a tested fidelity instrument shown to have reliability and validity.

4. Missing Data and Attrition

Study results can be biased by participant attrition and other forms of missing data. Statistical methods as supported by theory and research can be employed to control for missing data and attrition that would bias results, but studies with no attrition needing adjustment provide the strongest evidence that results are not biased.

0 = Missing data and attrition were taken into account inadequately, OR there was too much to control for bias.
2 = Missing data and attrition were taken into account by simple estimates of data and observations, or by demonstrations of similarity between remaining participants and those lost to attrition.
4 = Attrition was taken into account by more sophisticated methods that model missing data, observations, or participants; OR there was no attrition needing adjustment.

5. Potential Confounding Variables

Often variables other than the intervention may account for the reported outcomes. The degree to which confounds are accounted for affects the strength of causal inference.

0 = Confounding variables or factors were as likely to account for the outcome(s) reported as were hypothesized causes.
2 = One or more potential confounding variables or factors were not completely addressed, but the intervention appears more likely than these confounding factors to account for the outcome(s) reported.
4 = All known potential confounding variables appear to have been completely addressed in order to allow causal inference between intervention and outcome(s) reported.

6. Appropriateness of Analyses

Appropriate analysis is necessary to make an inference that an intervention caused reported outcomes.

0 = Analyses were not appropriate for inferring relationships between intervention and outcome, OR the sample size was inadequate.
2 = Some analyses may not have been appropriate for inferring relationships between intervention and outcome, OR the sample size may have been inadequate.
4 = Analyses were appropriate for inferring relationships between intervention and outcome. Sample size and power were adequate.

Readiness for Dissemination Dimension Ratings

1. Availability of Implementation Materials (e.g., Treatment Manuals, Brochures, Information for Administrators, etc.)

0 = Applicant has insufficient implementation materials.
2 = Applicant has provided a limited range of implementation materials, or a comprehensive range of materials of varying or limited quality.
4 = Applicant has provided a comprehensive range of standard implementation materials of apparent high quality.

2. Availability of Training and Support Resources

0 = Applicant has limited or no training and support resources.
2 = Applicant provides training and support resources that are partially adequate to support initial and ongoing implementation.
4 = Applicant provides training and support resources that are fully adequate to support initial and ongoing implementation (tested training curricula, mechanisms for ongoing supervision and consultation).

3. Quality Improvement (QI) Materials (e.g., Fidelity Measures, Outcome and Performance Measures, Manuals on How To Provide QI Feedback and Improve Practices)

0 = Applicant has limited or no materials.
2 = Applicant has materials that are partially adequate to support initial and ongoing implementation.
4 = Applicant provides resources that are fully adequate to support initial and ongoing implementation (tested fidelity and outcome measures, comprehensive and user-friendly QI materials).

Scoring the Strength of Evidence and Readiness for Dissemination Dimensions

The ratings for the decision support dimensions of Strength of Evidence and Readiness for Dissemination are calculated by averaging individual rating criteria that have been scored by reviewers according to a uniform five-point scale. For these two quantitative dimensions, the average score on each dimension (i.e., across criteria and reviewers), as well as the average score for each rating criterion (across reviewers), will be provided on the Web site for each outcome targeted by the intervention.[9]

[9] Note that it is unlikely that the Readiness for Dissemination dimension will vary by targeted outcome(s), insofar as the materials and resources are usually program specific as opposed to outcome specific.
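To make the averaging concrete, here is a minimal sketch of how the scores could be computed for a single targeted outcome. The criterion names, the two-reviewer setup, and the dictionary layout are illustrative assumptions for this example only; the notice describes the arithmetic but does not specify an implementation.

```python
# Illustrative sketch: averaging criterion ratings across reviewers for one
# targeted outcome. Scores use the 0-4 ordinal scale described above (1 and 3
# permitted as unanchored intermediate ratings). Criterion names and the data
# layout are hypothetical.

from statistics import mean

# Each reviewer assigns a 0-4 rating to every Strength of Evidence criterion.
reviewer_ratings = {
    "Reviewer A": {"reliability": 4, "validity": 2, "fidelity": 2,
                   "missing_data_attrition": 4, "confounding_variables": 2,
                   "appropriateness_of_analyses": 4},
    "Reviewer B": {"reliability": 4, "validity": 3, "fidelity": 2,
                   "missing_data_attrition": 2, "confounding_variables": 2,
                   "appropriateness_of_analyses": 4},
}

criteria = list(next(iter(reviewer_ratings.values())))

# Average score for each rating criterion (across reviewers).
criterion_averages = {
    criterion: mean(r[criterion] for r in reviewer_ratings.values())
    for criterion in criteria
}

# Average score for the dimension as a whole (across criteria and reviewers).
dimension_average = mean(criterion_averages.values())

for criterion, score in criterion_averages.items():
    print(f"{criterion}: {score:.1f}")
print(f"Strength of Evidence (dimension average): {dimension_average:.2f}")
```

The same averaging would apply to the Readiness for Dissemination criteria, with the caveat noted in footnote 9 that those ratings are unlikely to vary by targeted outcome.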
SAMHSA’s Center for Substance Abuse
Readiness for Dissemination Dimension V. Future Activities: Implementing and
Treatment (CSAT) and mental health
Ratings Sustaining a Streamlined NREPP
promotion and treatment interventions
1. Availability of Implementation SAMHSA plans to initiate reviews within the Center for Mental Health
Materials (e.g., Treatment Manuals, using the new NREPP review process Services (CMHS).
Brochures, Information for and procedures in summer 2006. The During the past 2 years, SAMHSA
Administrators, etc.) precise number and characteristics of reviewed existing evidence rating
new interventions that will be systems and developed and pilot-tested
0 = Applicant has insufficient prioritized for the first series of reviews
implementation materials. a revised approach to the rating of
have yet to be determined. SAMHSA specific outcomes achieved by programs
2 = Applicant has provided a limited
anticipates that many of the existing and practices. This development effort
range of implementation materials,
programs and practices currently listed led SAMHSA to propose 16 evidence
or a comprehensive range of
on the SAMHSA Model Programs Web rating criteria as well as a set of
materials of varying or limited
site will undergo an expedited set of proposed utility descriptors to describe
quality.
4 = Applicant has provided a reviews using the new system. the potential of a given intervention to
comrephensive range of standard Regardless, the current Model Programs be ‘‘transported’’ to real-world settings
implementation materials of Web site will remain intact until all and populations.
apparent high quality. relevant programs have been included Considering the prominence of
in a new Web site, http://www.national NREPP within its Science-to-Service
2. Availability of Training and Support registry.samhsa.gov initiative and the potential impact of
wwhite on PROD1PC65 with NOTICES

Resources NREPP on the research and provider


9 Note that it is unlikely that the Readiness for
0 = Applicant has limited or no training communities, SAMHSA announced a
Dissemination dimension will vary by targeted
and support resources. outcome(s), insofar as the materials and resources
formal request for public comments in
2 = Applicant provides training and are usually program specific as opposed to outcome the Federal Register on August 26, 2005
support resources that are partially specific. (70 FR 165, 50381–50390) with a 60-day

VerDate Aug<31>2005 19:18 Mar 13, 2006 Jkt 208001 PO 00000 Frm 00073 Fmt 4703 Sfmt 4703 E:\FR\FM\14MRN1.SGM 14MRN1
Considering the prominence of NREPP within its Science-to-Service initiative and the potential impact of NREPP on the research and provider communities, SAMHSA announced a formal request for public comments in the Federal Register on August 26, 2005 (70 FR 165, 50381–50390), with a 60-day public comment period ending October 26, 2005. The notice outlined in some detail the proposed review system, including scientific criteria for evidence reviews, the screening and triage of NREPP applications, and the identification by SAMHSA of priority review areas. The notice invited general as well as specific comments and included 11 questions soliciting targeted feedback. By request of the SAMHSA Project Officer, MANILA Consulting Group coded and analyzed the responses received in response to the 11 questions posted in the Federal Register notice. The results of the analysis are presented below.

Method

A total of 135 respondents submitted comments via e-mail, fax, and postal mail during the comment period. Of these 135 respondents, 109 (81%) answered at least some of the 11 questions posted in the Federal Register notice.

Respondents

The 135 respondents included 53 providers, 36 researchers, 4 consumers, 21 respondents with multiple roles, and 21 with unknown roles vis-à-vis NREPP. Respondents were labeled as having one or more of the following domains of interest: substance abuse prevention (N=68), substance abuse treatment (N=48), mental health promotion (N=22), and mental health treatment (N=20). The domain of interest was unknown for 33 respondents. The respondents represented 16 national organizations, 10 state organizations, and 14 local organizations; 90 were private citizens; and 5 were individuals with unknown affiliations. Fifty-one respondents (38%) were labeled "noteworthy" at the request of the SAMHSA Project Officer. Noteworthy respondents included those representing national or state governments or national organizations, and nationally known experts in substance abuse or mental health research or policy.

Twenty-six responses were judged by the four MANILA coders and the SAMHSA Project Officer to contain no information relevant to the 11 questions in the notice. These responses, labeled "unanalyzable" for the purposes of this report, could be categorized as follows:
• Mentioned topics related to SAMHSA but made no point relevant to the questions posted in the Federal Register notice (N=10);
• Mentioned only topics unrelated to SAMHSA or incoherent text (N=7);
• Asked general questions about NREPP and the Federal Register notice (N=4);
• Wanted to submit a program for NREPP review (N=4); and
• Responded to another Federal Register notice (N=1).

Procedure

Before coding began, responses were read to identify recurrent themes to include in the codebook (presented in Subpart A of this Appendix). Using this codebook, each submission was then assigned codes identifying respondent characteristics (name, location, domain of interest, affiliation/type of organization, functional role, and level of response) and the content or topical themes contained in the response. One pair of coders coded the respondent data, while another pair coded the content. Content coding was conducted by two doctoral-level psychologists with extensive training and experience in social science research and methodology.

Each response could be assigned multiple codes for content. Coders compared their initial code assignments for all responses, discussed reasons for their code assignments when there were discrepancies, and then decided upon final code assignments. In many cases, coders initially assigned different codes but upon discussion agreed that both coders' assignments were applicable. Coding assignments were ultimately unanimous for all text in all responses.

Results

The following discussion of key themes in the public comments is presented in order of the 11 questions from the Federal Register notice. Tables containing detailed frequencies of themes in the comments and other descriptive information are provided in Subpart B.

Comments Addressing Question 1

Question 1. "SAMHSA is seeking to establish an objective, transparent, efficient, and scientifically defensible process for identifying effective, evidence-based interventions to prevent and/or treat mental and substance use disorders. Is the proposed NREPP system—including the suggested provisions for screening and triage of applications, as well as potential appeals by applicants—likely to accomplish these goals?"

Respondents submitted a wide range of comments addressing Question 1. Highlights of these comments are presented below, organized by topic as follows:
1. Individual-Level Criteria
2. Population-, Policy-, and System-Level Criteria
3. Utility Descriptors
4. Exclusion From NREPP Due to Lack of Funding
5. Potential Impact on Minority Populations
6. Potential Impact on Innovation
7. Provider Factors
8. Other Agencies' Standards and Resources
9. Reliance on Intervention Developers To Submit Applications
10. Generalizability
11. Other Themes and Notable Comments

1. Individual-Level Criteria

Number of respondents: 24 (22%).
Recommendations made by respondents included adding cost feasibility as a 13th criterion (one respondent) and scoring all criteria equally (two respondents). Comments regarding specific criteria are presented in Subpart C.

2. Population-, Policy-, and System-Level Criteria

Number of respondents: 29 (27%).
Comments on specific criteria are presented in Subpart D. Highlights of comments on more general issues are presented below.

Differences in Evaluation Approaches for Individual-Level and Population-, Policy-, and System-Level Outcomes

Two respondents noted the proposed NREPP approach does not acknowledge key differences between evaluating individual-level outcomes and population-, policy-, and system-level outcomes. One of these respondents argued that NREPP is based on theories of change that operate only at the individual level of analysis, with the assumption that discrete causes lead to discrete effects, and therefore "many of the NREPP criteria appear to be insufficient or inappropriate for determining the validity of community-based interventions and their context-dependent effects."

Unclear What Interventions Are of Interest to NREPP

One organization, Community Anti-Drug Coalitions of America, recommended that SAMHSA present a clear, operational definition of the types of interventions it wants to include in NREPP.
13140 Federal Register / Vol. 71, No. 49 / Tuesday, March 14, 2006 / Notices

individual level as for the population, Practice, the National Association of most or all of those seen in particular
policy, and system levels. State Alcohol and Drug Abuse Directors, agencies: HIV positive patients, native
Community Anti-Drug Coalitions of Americans, adolescents, Hispanics, or
Add Attrition Criterion
America, and the California Association African Americans. Although it is
The same group of university of Alcohol and Drug Program unreasonable to expect all EBTs to be
researchers suggested adding attrition as Executives. The National Association tested with all populations, the external
a 13th criterion to the rating criteria for for Children of Alcoholics provided the validity of existing studies remains a
studies of population outcomes. They following comment: serious concern.’’ For these reasons,
noted, ‘‘Just as attention to attrition of individuals from conditions is essential in individual-level studies, attention to attrition of groups or communities from studies is essential in group-level studies. This is necessary in order to assess attrition as a possible threat to the validity of the claim that the population-, policy-, or system-level intervention produced observed outcomes.’’

Include Only Interventions That Change Behavior

It was recommended that NREPP only include interventions proven to change behavior. A group of university researchers noted:

As currently described, these outcomes refer to implementation of changes in policy or community service systems, not to changes in behavioral outcomes themselves. In fact, as currently described, the policy or system change would not be required to show any effects on behavior in order to be included in NREPP. This is a serious mistake. The NREPP system should be reserved for policies, programs, and system-level changes that have produced changes in actual drug use or mental health outcomes.

3. Utility Descriptors

Number of respondents: 15 (14%).

Only one respondent, the Committee for Children, recommended specific changes to the utility descriptors. Their comments are presented in Subpart E of this Appendix.

Seven other respondents recommended using utility descriptors in some way to score programs. The American Psychological Association (APA) Committee on Evidence-Based Practice recommended more emphasis on the utility descriptors ‘‘as these are key outcomes for implementation and they are not adequately addressed in the description of NREPP provided to date. This underscores earlier concerns noted about the transition from effectiveness to efficacy.’’

4. Exclusion From NREPP Due To Lack of Funding

Number of respondents: 28 (26%).

The possibility that NREPP will exclude programs due to lack of funding was a concern voiced by several organizations, including the National Association for Children of Alcoholics and the APA Committee on Evidence-Based Practice:

NREPP should establish differing criteria for projects that collected data with [National Institutes of Health] grant funds and projects that collected data with no or very small amounts of funds. It has been intrinsically unfair that only grants have been able to establish ‘‘evidence’’ while many programs appear very effective—often more effective in some circumstances than NREPP approved programs—but have not had the Federal support or other major grant support to evaluate them. The SAMHSA grant programs continue to reinforce the designation of NREPP programs in order to qualify for funding, and the states tend to strengthen this ‘stipulation’ to local programs, who then drop good (non-NREPP) work they have been doing or purchase and manipulate NREPP programs that make the grant possible. This is not always in the best interest of the client population to be served.

Another key concern was that funding for replication research is rarely available. Several respondents suggested that SAMHSA consider funding evaluation research, and many argued that the lack of funding resources could negatively impact minority populations or inhibit treatment innovation. The latter two themes were frequent enough to be coded and analyzed separately. Results are summarized in the following sections.

5. Potential Impact on Minority Populations

Number of respondents: 13 (12%).

Thirteen respondents noted that the proposed NREPP approach could negatively impact specific populations, including minority client populations. The Federation of Families for Children’s Mental Health suggested that NREPP would effectively promote certain practices ‘‘simply because the resources for promotion, training, evaluation are readily accessible * * * thus widening the expanse and disparities that currently exist.’’

Another frequently noted concern was that evidence-based practices are currently too narrowly defined, and thus as more funding sources begin to require evidence-based practices as a prerequisite for funding, some ethnic or racial minority organizations may be excluded from funding. One respondent also pointed to potential validity concerns, noting that ‘‘Very little clinical trial evidence is available for how to treat substance use disorders in specific populations who may constitute

Many respondents surmised that the widespread application of interventions developed in research contexts that might tend to limit the inclusion of minority and/or underserved populations could ultimately result in decreased cultural competence among service providers.

6. Potential Impact on Innovation

Number of respondents: 21 (19%).

Twenty-one respondents cited concerns that the proposed NREPP approach could hamper innovation. CAADPE noted that its main concerns were ‘‘the focus on the premise that treatment will improve if confined to interventions for which a certain type of research evidence is available’’ and ‘‘the issue of ‘branding,’ which could lead to some of our most innovative and effective small scale providers eliminated from funding considerations.’’

One respondent suggested that lists of evidence-based treatments could ‘‘ossify research and practice, and thus become self-fulfilling prophecies * * * stifling innovation and the validation of existing alternatives.’’ Several respondents observed that the potential for stifling innovation is even greater given that SAMHSA’s NREPP is not the only list of evidence-based practices used by funders.

The APA Practice Organization recommended that NREPP focus on ‘‘developing and promoting a range of more accessible and less stigmatized services that are responsive to consumers’ needs and preference, and offer more extensive care opportunities.’’

7. Provider Factors

Number of respondents: 22 (20%).

A number of respondents noted the proposed NREPP approach does not acknowledge provider effects on treatment outcomes. The APA Committee on Evidence-Based Practice wrote, ‘‘Relationship factors in a therapeutic process may be more important than specific interventions and may in fact be the largest determinant in psychotherapy outcome (see Lambert & Barley, 2002). How will NREPP address this concern and make this apparent to users?’’

Another respondent cited the Institute of Medicine’s definition of evidence-based practice as ‘‘the integration of the best research evidence with clinical expertise and client values,’’ noting that ‘‘The narrowed interpretation of evidence-based practice by SAMHSA focuses almost solely on the research evidence to the exclusion of clinical expertise and patient values.’’

Several respondents suggested that NREPP could place too much emphasis on highly prescriptive, manualized treatments. Counselors can become bored when they are not able to ‘‘tinker’’ with or adapt treatments. In addition, making minor modifications may actually make treatments more effective with different population groups.

8. Other Agencies’ Standards and Resources

Number of respondents: 27 (25%).

Nineteen respondents suggested that, in developing NREPP, SAMHSA should consult other agencies’ standards and resources related to evidence-based practices—for example, the standards published by the APA, American Society for Addiction Medicine, and the Society for Prevention Research. One respondent suggested consulting with National Institutes of Health scientists about approaches for aggregating evidence; another recommended including in NREPP model programs identified by other agencies. One respondent submitted a bibliography of references for assessing the rigor of qualitative research.

One respondent suggested that SAMHSA did not provide other institutions the opportunity to provide input on the development of NREPP prior to the request for public comments.

9. Reliance on Intervention Developers To Submit Applications

Number of respondents: 4 (4%).

Four respondents cited problems with NREPP’s reliance on intervention developers to submit applications, and suggested that literature reviews instead be used to identify programs eligible for NREPP. One private citizen wrote, ‘‘If no one applies on behalf of a treatment method, is that one ignored? Why not simply start with the literature and identify treatment methods with adequate evidence of efficacy?’’

Another respondent observed that requiring an application creates a bias toward programs with advocates ‘‘either ideologically or because of a vested interest in sales, visibility, and profits. An alternative is to select interventions for NREPP consideration solely by monitoring the peer-reviewed published literature, and including them regardless of whether or not the scientist responds or furthers the registration process.’’

The Society for Prevention Research suggested that SAMHSA convene a panel to periodically review available interventions that might not be submitted to NREPP because they ‘‘lack a champion.’’

10. Generalizability

Number of respondents: 48 (44%).

Many respondents discussed the issue of generalizability of evidence, especially the concern that interventions proven to work in clinical trials do not always work in real-world settings. Several respondents pointed out the potential conflict between implementing an intervention with fidelity and having to adapt it for the setting.

The APA Evidence-Based Practice Committee suggested that the proposed NREPP approach does not adequately distinguish between ‘‘efficacy’’ and ‘‘effectiveness,’’ and strongly recommended that SAMHSA look for ways to bridge the two.

The Associations of Addiction Services recommended paying more attention to how and where treatments are replicated: ‘‘The highest level of evidence should be successful replication of the approach in multiple community treatment settings. Experience with [the National Institute on Drug Abuse] Clinical Trials Network suggests that an approach that shows meaningful outcome improvements in the ‘noisy’ setting of a publicly funded community treatment program is truly an approach worth promoting.’’

A few respondents suggested that NREPP score interventions according to their readiness and amenability to application in real-world settings.

11. Other Themes and Notable Comments

Distinguishing Treatment and Prevention

Number of respondents: 7 (6%).

A few respondents called for evaluating treatment and prevention approaches differently. One respondent noted that some criteria appear to be more appropriate for treatment modalities than for preventive interventions, and recommended that SAMHSA ‘‘confer with research experts in those respective fields and separate out those criteria that are more relevant to only treatment or prevention.’’

Another respondent suggested that the criteria are more appropriate for prevention than treatment:

The criteria and selection for the peer review panels should be separate for prevention and treatment programs. The criteria and models are different and the panels should not be an across the board effort, but rather representative of prevention and treatment experts specific to the program being evaluated. The plan is based as the notice states on 1,100 prevention programs with little experience with treatment programs/practices.

Synthesizing Evidence

Three respondents suggested using meta-analysis to synthesize evidence for outcomes. One recommended SAMHSA consult with National Institutes of Health experts in this area.

Replications

The Teaching-Family Association recommended considering replications when evaluating evidence. The Society for Prevention Research wrote that it is unclear how replications would be used in the proposed NREPP, and suggested averaging ratings across studies.

Add Criteria

The National Student Assistance Association Scientific Advisory Board and one other respondent suggested adding a cultural competence criterion. The Society for Prevention Research recommended adding a criterion to assess the clarity of causal inference.

Range of Reviewer Perspectives

The APA Practice Association noted the importance of having a ‘‘large and broad’’ reviewer pool: ‘‘A small group of reviewers representing a limited range of perspectives and constituencies would have an undue impact on the entire system. We are pleased that a nominations process is envisioned.’’

Cost Effectiveness

One respondent called for incorporating program cost effectiveness into NREPP. In choosing what program to implement, end users often have to decide between diverse possibilities, such as attempting to pass a tax increase on beer or implementing additional classroom prevention curricula, each with competing claims about effectiveness. A cost-effectiveness framework may be the only way to compare these choices.

Comments Addressing Question 2

Question 2. ‘‘SAMHSA’s NREPP priorities are reflected in the agency’s matrix of program priority areas. How might SAMHSA engage interested stakeholders on a periodic basis in helping the agency determine intervention priority areas for review by NREPP?’’

Number of respondents: 16 (15%).

Respondents recommended a number of approaches to engage stakeholders:

• Conduct meetings, conferences, and seminars.
• Send and solicit information via e-mail or a Web site.
• Send informational notices via newsletters.
• Survey stakeholders.
• Work with the Addiction Technology Transfer Centers (ATTCs) to administer surveys.
• Consult the National Prevention Network and the Society for Prevention Research, which ‘‘have forged a close working relationship to foster the integration of science and practice and * * * would be very helpful in answering this question.’’

Comments Addressing Question 3

Question 3. ‘‘There has been considerable discussion in the scientific literature on how to use statistical significance and various measures of effect size in assessing the effectiveness of interventions based upon both single and multiple studies (Schmidt & Hunter, 1995; Rosenthal, 1996; Mason, Schott, Chapman, & Tu, 2000; Rutledge & Loh, 2004). How should SAMHSA use statistical significance and measures of effect size in NREPP? Note that SAMHSA would appreciate receiving citations for published materials elaborating upon responders’ suggestions in this area.’’

Statistical Significance

Number of respondents: 13 (12%).

A group of university researchers recommended that for programs to be included in NREPP, they should be required to provide statistically significant results on drug use and/or mental health outcomes using two-tailed tests of significance at p<.05. The APA Evidence-Based Practices Committee recommended further discussion and consideration by NREPP of the conceptual distinction between statistical and clinical significance.

The County of Los Angeles Department of Health Services urged SAMHSA ‘‘not to place undue preference only on programs that offer statistically significant results. Studies of innovative approaches and of emerging populations may not have sample sizes large enough to support sophisticated statistical analyses, yet may offer valuable qualitative information on effective approaches.’’

Effect Size

Number of respondents: 24 (22%).

Most of the respondents discussing effect size noted that interventions aimed at achieving population change were likely to have small effect sizes, even if they are very successful. Several respondents recommended combining effect size with reach. A group of researchers from a major university noted:

Effect sizes should be reported, but they should not be used as a criterion for inclusion or exclusion from NREPP. From a public health perspective, the impact of an intervention is a function of both its efficacy and its reach (Glasgow, Vogt, & Boles, 1999). An intervention with even a very modest effect size can have a substantial impact on public health if it reaches many people. Therefore, NREPP should report effect sizes for each statistically significant outcome reported and NREPP should also include and provide an assessment of the ‘‘reach’’ of that intervention. Specifically, the inclusion criteria for participation and the proportion of the recruited population that participated in the intervention study should be included in describing the likely ‘‘reach’’ of the program.

Three respondents noted that professionals in the field have not reached consensus on how to use effect size. One noted, ‘‘Effect sizes may vary with the difficulty of the prevention goal and the methodological rigor of the analysis. Applying standards for ‘weak,’ ‘moderate,’ ‘strong’ or other labels fails to take into account differences in results that may be attributable to differences in goals or methods.’’

One respondent suggested considering other indicators of clinical effectiveness, such as use of the RCI (reliable change index; Jacobson & Truax, 1984).

Other points made regarding effect size included the following:

• Between-group effect sizes assume a standard comparison condition, which is rare in nonmedical interventions. Meta-analyses with baseline-follow-up effect sizes or a ‘‘network approach’’ to effect sizes are ways to overcome this problem.
• Effect size is not the equivalent of client improvement and does not assess the significance of interventions for their clients.
• Effect size alone is not sufficient to evaluate and rate programs; cost-benefit information or other practical information are also needed.

Comments Addressing Question 4

Question 4. ‘‘SAMHSA’s proposal for NREPP would recognize as effective several categories of interventions, ranging from those with high-quality evidence and more replication to those with lower quality evidence and fewer replications. This would allow for the recognition of emerging as well as fully evidence-based interventions. Some view this as a desirable feature that reflects the continuous nature of evidence; provides important options for intervention recipients, providers, and funders when no or few fully evidence-based interventions are available; and helps promote continued innovation in the development of evidence-based interventions. Others have argued that several distinct categories will confuse NREPP users. Please comment on SAMHSA’s proposal in this area.’’

Number of respondents: 35 (32%).

Thirty-three respondents supported the use of multiple categories as outlined in Question 4; two respondents were opposed. Of those in favor of multiple categories, nine respondents wrote that this approach would reflect the process of emerging evidence and encourage knowledge sharing early in the process. The APA Evidence-Based Practice Committee argued that ‘‘Including all of these NREPP products is seen as a desirable feature that reflects the continuous nature of evidence. This may also be critical information for providing reasonable options for stakeholders when there are no or few evidence-based practices available.’’

The State Associations of Addiction Services pointed out that multiple categories would lessen the likelihood of misinterpreting information in NREPP, and the California Department of Alcohol and Drug Programs added that including multiple categories of intervention would give greater flexibility to programs using the list.

Of the two respondents against multiple categories, one suggested that a clear designation of effectiveness is needed if NREPP is to be useful to the field.

Additional Comments

One respondent argued that only two categories should be used, effective and emergent: ‘‘While distinctions such as whether a program has had independent replications as opposed to developer replications may be of interest to researchers, the majority of those responsible for choosing and implementing programs may find this level of detail to be confusing rather than particularly helpful or relevant.’’

A group of university researchers recommended assigning scores to several categories of evidence quality: theoretical foundation, design adequacy, measure adequacy, fidelity, and analysis adequacy.

Several other organizations suggested adding a category for programs not yet shown to be evidence-based, but recommended for further study. One noted that categories of effectiveness should be the same for individual-level and population-, policy-, or system-level outcomes.

One respondent proposed an approach in which SAMHSA would document the strength of evidence for each approach, and allow consumers to decide what is effective:

Various authorities have established different and sometimes conflicting standards for when there is enough evidence to constitute an EBT. Part of the problem here is drawing a discrete line (EBT or not) on what is actually a continuous dimension. * * * To inform and demystify the dichotomous and somewhat arbitrary decision as to which treatments are evidence-based and which are not, it is useful to have a compilation of the strength of evidence for (or against) different approaches. * * * Why not just stick to your main emphasis on documenting the strength of evidence for each approach, and let others decide where they want to draw the line for what they regard to be ‘‘effective.’’

Another respondent argued that providing information on replications and having six potential categorizations for evidence-based practices could be too technical and confusing for some. Most consumers will be most interested in whether there is some body of evidence that the program they are considering works.

One respondent, a private citizen, recommended that SAMHSA ask stakeholders what categories would be useful to them.

Comments Addressing Question 5

Question 5. ‘‘SAMHSA recognizes the importance of considering the extent to which interventions have been tested with diverse populations and in diverse settings. Therefore, the agency anticipates incorporating this information into the Web site descriptions of interventions listed on NREPP. This may allow NREPP users to learn if interventions are applicable to their specific needs and situations, and may also help to identify areas where additional studies are needed to address the effectiveness of interventions with diverse populations and in diverse locations. SAMHSA is aware that more evidence is needed on these topics. Please comment on SAMHSA’s approach in this area.’’

Number of respondents: 27 (25%).

Most respondents affirmed the importance of the issues raised in Question 5. Two respondents suggested that SAMHSA should facilitate research aimed at developing services for minority populations. Comments regarding what and how to report are noted below.

What To Report

Regarding what to report, respondents suggested tracking and reporting demographic changes; reporting the impact of interventions on different populations; and requiring programs that use NREPP interventions to report to SAMHSA on the impact on their client populations, as well as providers’ thoughts about the intervention’s applicability to various client populations.

The Oregon Office of Mental Health and Addiction Services suggested that SAMHSA ‘‘focus considerable effort on identifying and listing practices useful and applicable for diverse populations and rural areas. Providers and stakeholders from these groups have repeatedly expressed the concern they will be left behind if no practices have been identified which fit the need of their area. We need to take particular care to ensure that their fear is not realized.’’

The Committee for Children suggested reporting data for two separate dimensions: setting and population. Setting dimensions would include community data—size of community, community context (e.g., suburb, town), geographic location, community socioeconomic status—and agency data, which includes the type of agency (e.g., hospital, child care, school), characteristics (e.g., outpatient vs. inpatient, middle school vs. elementary school), size, and resources required for implementation. Population dimensions would include age, socioeconomic status, ethnicity, cultural identification, immigrant/acculturation status, race, and gender.

How To Report

Three respondents submitted suggestions for how to report on intervention effectiveness with diverse populations. The APA Evidence-Based Practices Committee suggested that SAMHSA develop ‘‘a comprehensive glossary that addresses definitions of different constituencies, populations, and settings.’’ The Family and Child Guidance Clinic and the Native American Health Center of Oakland both suggested that a panel of Native Americans be convened to decide which evidence-based programs and practices are effective for Native Americans, then submit a monograph describing these programs and practices.

Comments Addressing Question 6

Question 6. ‘‘To promote consistent, reliable, and transparent standards to the public, SAMHSA proposes that all existing programs on NREPP meet the prevailing scientific criteria described in this proposal, and that this be accomplished through required rereviews of all programs currently on NREPP. SAMHSA has considered an alternative approach that would ‘‘grandfather’’ all existing NREPP programs under the new system, but would provide clear communication that these existing programs have not been assessed against the new NREPP scientific standards. Please comment on which approach you believe to be in the best interests of SAMHSA stakeholders.’’

Number of respondents: 32 (29%).

Twenty-seven respondents proposed rereviewing existing programs under the revised NREPP criteria. Five respondents advocated grandfathering the programs into NREPP without review. Highlights of these viewpoints are provided below.

Arguments for Rereview

The Committee for Children wrote that a grandfathering system ‘‘may give the impression to NREPP users, right or wrong, that ‘grandfathered’ interventions aren’t as good as those that have undergone the new review process.’’

Another respondent supported a single review process to assure programs that ‘‘all programs and practices are being rated according to a consistent set of criteria, and therefore that the adoption of an intervention by a provider can be made with confidence.’’

Two researchers (both SAMHSA Model Program affiliates) noted that grandfathering will ‘‘water down’’ the NREPP criteria, and recommended establishing a mechanism to remove programs from NREPP when the evidence warrants.

A program developer called for a gradual transition from Model Program to rereview:

I suggest that SAMHSA maintain the current Model Program designation and grant these programs status within the new NREPP for up to 3 years. During that time period the existing programs would be screened against the new review criteria and provided an opportunity to obtain additional research findings, if needed, in order to help achieve evidence-based status within the new NREPP. * * * Many current model programs have invested extensive time and financial resources to reference SAMHSA Model Program status in their informational, training, and curricula materials, under the auspices of their partnership agreements with the SAMHSA Model Program Dissemination Project. They did this in good faith. While the SAMHSA Model Program Project has been disbanded, it is reasonable to expect SAMHSA to honor their agreements with the model programs for a period of time during the transitional phase. During this transitional phase I recommend that the model program not be earmarked as not having been assessed against the new NREPP scientific standards, but rather that they have been found to be effective under the former NREP and are awaiting review under the new criteria.’’

Arguments for Grandfathering

Those who argued for grandfathering previous Model Programs discussed the possible detrimental effects that not grandfathering would have.

One respondent described taking away the Model Program designation as ‘‘a breaking of faith that is just not acceptable. A subjective change in criteria does not justify harming programs that previously met the grade in all good faith * * * It also makes it hard for the end user to take the list seriously, especially if they have already expended considerable resources to replace a non-evidence-based program with one currently designated evidence-based.’’

Another respondent described the destabilizing effects and potential impact on credibility of programs:

Imagine if the ‘‘model’’ you just selected this year at the cost of thousands of dollars (and redesigned your prevention delivery system upon) is somehow diminished or lessened in ‘‘scientific’’ credibility. Would you not begin to wonder if you could trust the next ‘‘model’’ to hold credibility? * * * There is a very real need to be careful about the criteria, and planning for a smooth and gentle segue for change * * * at the grassroots level if programs are rotating on and off of the registry system. One might well ask, how could a ‘‘model’’ program of today not [be] worthy of some level of inclusion tomorrow?

Yet another respondent pointed out that not grandfathering programs could pose financial problems for organizations offering model programs. Since some organizations may only receive funding for programs designated as ‘‘model programs,’’ they may not be able to offer the programs while awaiting rereview.

Comments Addressing Question 7

Question 7. ‘‘What types of guidance, resources, and/or specific technical assistance activities are needed to promote greater adoption of NREPP interventions, and what direct and indirect methods should SAMHSA consider in advancing this goal?’’

Venue, Channel, and Format for Promoting Adoption of NREPP Interventions

Number of respondents: 7.

Proposed strategies for promotion (venue, channel, and format) include the following:

• Identify stakeholders and take the information to them (e.g., through conferences, journals, professional magazines, professional newsletters, physicians, churches, and PTAs).
• Convene program developers and state administrators for regular meetings about programs and implementation.
• Showcase NREPP programs at national, regional, and state conferences.
• Develop fact sheets about NREPP programs (in collaboration with the program developers).
• Conduct training on NREPP programs through the Addiction Technology Transfer Centers (ATTCs).
• Work with the Office of National Drug Control Policy’s National Media campaign.
• On the NREPP Web site, offer downloadable information on programs as well as a way for consumers to contact the program developers for more information.

(Note: SAMHSA’s Model Program Web site currently does provide program summaries and contact information for program developers).

Technical Assistance for Promoting Adoption of NREPP Interventions

Number of respondents: 30 (28%).

Many respondents noted the importance of providing technical assistance to those looking to adopt NREPP-listed interventions. The Oregon Office of Mental Health and Addiction Services wrote, ‘‘The adoption of new practices by any entity is necessarily a complex and long-term process. Many providers will need technical support if adoption and implementation is to be accomplished effectively. Current resources are not adequate to meet this challenge.’’

Another respondent suggested that SAMHSA identify point people, either at the Federal level or through the CAPTs, who can ‘‘partner with developers to gain a clear understanding of their evidence-based interventions and become knowledgeable enough to accurately discuss them with community-based preventionists.’’

A group of university researchers agreed that substantial training and technical assistance are required for the effective implementation of preventive interventions. They recommended using SAMHSA’s Communities That Care, which has been shown to increase the adoption of tested and effective preventive interventions in communities, to increase adoption of NREPP interventions.

The National Student Assistance Association Scientific Advisory Board recommended that SAMHSA use existing effective program and practice structures, such as Student Assistance Programs, for technical assistance, resources, and guidance.

Guidance on Adopting NREPP Interventions

Number of respondents: 10 (9%).

Several respondents recommended that SAMHSA provide guidance to individuals and organizations looking to adopt NREPP interventions. The Center for Evidence-Based Interventions for Crime and Addiction wrote, ‘‘We do not believe that just providing information about model programs on the Web will result in much diffusion of the innovation. NREPP must pay attention to training, dissemination, fidelity, and sustainability.’’

The Society for Prevention Research suggested that SAMHSA survey decisionmakers and practitioners to determine their perceptions of NREPP as well as about other factors influencing their decisions in order to determine how to encourage adoption of NREPP interventions.

The APA Evidence-Based Practice Committee recommended that SAMHSA ‘‘anticipate misuses of NREPP so as to insure that funding bodies do not mistakenly assume that improving treatment comes from confining treatment to a list of recommended techniques.’’

Resources for Promoting NREPP Interventions

Number of respondents: 27 (25%).

Many respondents articulated ways that SAMHSA could support and promote NREPP interventions. One common suggestion was that SAMHSA should provide the funding for and/or help create the infrastructure that is required for program implementation. For example, the California-based Coalition of Alcohol and Drug Associations wrote:

The existing treatment infrastructure cannot handle the expectation for data collection. It is currently unlikely that most community-based treatment programs could meet the standard to be listed on the registry. How can the infrastructure be strengthened? What funding streams is SAMHSA promoting to accomplish this? * * * The initiative promises technical assistance, but this is not [a] substitute for missing infrastructure. The financial resources to support such efforts [have] always been absent, yet the expectations and demands continue to be placed upon underfunded community-based providers, driving some out of business and requiring others to reduce services.

The Coalition of Alcohol and Drug Associations also asked how SAMHSA plans to protect providers from exploitation: ‘‘Already there are examples of large sums of money being asked for training materials on interventions developed with tax dollars. Consultants representing particular practices (especially those listed on RFAs or on SAMHSA lists) are charging fees of $3,000 per day. This is not something most nonprofits can afford.’’

Another respondent, a private citizen, suggested that SAMHSA fund Services to Science grants, ‘‘a category of funding which was originally designed by SAMHSA but [is] rarely utilized.’’

The State Associations of Addiction Services suggested that SAMHSA ‘‘consider new mechanisms for funding the development of the organizational capacity needed by providers to implement and sustain evidence-based practices. Such mechanisms might require new legislative authority and/or new funding.’’

Comments Addressing Question 8

Question 8. ‘‘SAMHSA is committed to consumer, family, and other nonscientist involvement in the NREPP process. The panels convened by SAMHSA and described earlier in this notice suggested that these stakeholders be included specifically to address issues of intervention utility and practicality. Please comment on how consumer, family, and other nonscientist stakeholders could be involved in NREPP.’’

Development of NREPP Process

Number of responses: 22 (20%).

A number of respondents discussed the need to involve nonscientist stakeholders (primarily providers) in developing the NREPP process. Seven respondents said consumers should be involved in NREPP development. The Pennsylvania Department of Health pointed out that ‘‘the use of such approaches depends heavily on local, state, and national networks of community-based providers who need to be in a position to be an active participant in discussions related to the evaluation of interventions, practices, and programs.’’

The Oregon Office of Mental Health and Addiction Services argued that ‘‘Practices that are not readily acceptable by consumers and families may have limited usefulness, regardless of the evidence of technical adequacy. Consumers and families should be involved in advising SAMHSA at every level of design, development and implementation of NREPP. SAMHSA may wish to establish a specific consumer and family advisory group to provide advice on NREPP issues.’’

Community Anti-Drug Coalitions of America suggested that nonscientists should review publications and recommendations to ensure they are clear to nonresearchers.

Role in NREPP Reviews

Number of respondents: 21 (19%)

Suggestions for NREPP reviews included the following:

• Involve consumers and practitioners in reviewing programs.
• Have practitioners assess the degree to which a program is implementable.
• Have consumer groups rate programs’ utility.
• Have clinicians review materials for clarity.

Comments Addressing Question 9

Question 9. ‘‘SAMHSA has identified NREPP as one source of evidence-based interventions for selection by potential agency grantees in meeting the requirements related to some of SAMHSA’s discretionary grants. What guidance, if any, should SAMHSA provide related to NREPP as a source of evidence-based interventions for use under the agency’s substance abuse and mental health block grants?’’

Technical Assistance

Number of respondents: 11 (10%).

A number of respondents suggested that SAMHSA provide training to users on the NREPP review process, as well as guidance on the appropriate use of NREPP and how to avoid misuse. For example, Student Assistance Programs (SAPs) and CAPTs could be used as technical assistance resources. One respondent wrote, ‘‘SAMHSA needs to make it clear that the NREPP ratings are established as recommendations for the field, rather than as demands upon agencies and programs—that it discourages thinking of NREPP-approved programs or practices as a finite list and encourages efforts that further refine and extend these programs and practices to new populations and settings.’’

Another respondent noted that government agencies responsible for block grant allocation may need protection from mandates about using NREPP interventions that may not be affordable or appropriate for their client populations.

Regulation

A number of respondents provided recommendations related to regulation and funding priority tied to NREPP. Twelve respondents said block grant funds should not be restricted based on NREPP status. The Society for Prevention Research and several other organizations recommended giving priority to NREPP programs, while reserving some funds specifically for innovation. One respondent suggested that block grant funding should give priority to NREPP interventions. The Maryland Alcohol and Drug Abuse Administration argued that state authority should supersede Federal authority in block grant allocation.

Another respondent recommended giving funding priority to systems that implement practices known to be effective, except where evidence-based practices have not yet been identified: ‘‘Although it is clear that funding cannot entirely be limited to existing evidence-based programs because of the chilling effect on innovation that such a stance would have, nevertheless, it might be appropriate to require that a certain percentage of block grant dollars be committed to the dissemination and use of block grant monies, or to establish additional incentives for the adoption of such programs.’’

One respondent warned of the potential danger of unfunded mandates: ‘‘The worst case scenario is that best of practices could cost the most money but by law or regulation become an unfunded mandate for a government-funded or not-for-profit program.’’

The APA Practice Association noted that as NREPP is voluntary, ‘‘applicants should not be penalized for studying programs or interventions that are not on the NREPP.’’

Two organizations, the State Associations of Addiction Services and California Alcohol and Drug Programs, considered the revised NREPP approach to be too new to use as a block grant requirement.

Comments Addressing Question 10

Question 10. ‘‘SAMHSA believes that NREPP should serve as an important, but not exclusive source, of evidence-based interventions to prevent and/or treat mental and substance use disorders. What steps should SAMHSA take to promote consideration of other sources (e.g., clinical expertise, consumer or recipient values) in stakeholders’ decisions regarding the selection, delivery and financing of mental health and substance abuse prevention and treatment services?’’

Number of respondents: 25 (23%).

The following suggestions were noted:

• Develop a directory of other sources of evidence-based practices. Some suggested providing links to these sources on the NREPP Web site.
• Use an external advisory committee to identify other sources of evidence-based practices.
• Include a disclaimer page that includes an introduction consistent with the issues raised in Question 10. Advertising or other promotional material created around NREPP could also include this information.
• List other sources of evaluation research such as the Collaborative for Academic, Social, and Emotional Learning, the U.S. Department of Education, the Office of Juvenile Justice and Delinquency Prevention, and the National Institute of Mental Health.

The National Association of State Alcohol/Drug Abuse Directors wrote that its Exemplary Awards Program should ‘‘serve as an ‘incubator’ for programs that may wish to consider submitting into the NREPP process.’’

Comments Addressing Question 11

Question 11. ‘‘SAMHSA anticipates that once NREPP is in operation, various stakeholders will make suggestions for improving the system. To consider this input in a respectful, deliberate, and orderly manner, SAMHSA anticipates annually reviewing these suggestions. These reviews would be conducted by a group of scientist and nonscientist stakeholders knowledgeable about evidence in behavioral health and the social sciences. Please comment on SAMHSA’s proposal in this area.’’

Number of respondents: 35 (32%).

Many of the 35 responses stated that annual review of suggestions from stakeholders is important. Four respondents noted that feedback should be reviewed more frequently than once per year. Other themes included the following:

• Use the annual review process as a mechanism for fostering innovation.
• Use marketing strategies to encourage participation in the annual review process.
• Solicit annual feedback from NREPP applicants whose programs have been labeled effective, as well as those whose programs have not been labeled effective.
• Compare NREPP results to those in other similar systems.
• Include a mechanism in NREPP for programs to be dropped from, or improve their status on, the registry (possible through the annual review).
• Periodically conduct a meta-analysis of evaluation results (possible through the annual review).
• To ensure the stability of NREPP, the criteria should be maintained without changes for a set period of time (e.g., 5 years).

Comments Beyond the 11 Posted Questions

Twenty-two respondents (20%) submitted comments on issues that were relevant but not specifically within the parameters of the 11 posted questions. These are summarized below.

Programs Versus Practices

Fourteen respondents (13%) objected to using the terms ‘‘programs’’ and ‘‘practices’’ as if they were interchangeable. One private citizen who submitted comments wrote:

It is important to distinguish between the value of rating practices and the value of rating programs. Although it makes sense for reviewers to rate the quality/strength of evidence regarding a treatment practice, it is a much different proposition to rate the effectiveness of a program. The effectiveness of a treatment program is a function, among other things, of the treatment practices it employs, the ancillary services (e.g., employment counseling) it provides, the qualities and behaviors of its treatment providers * * * One could imagine a very ineffective program using evidence-based practices (e.g., one having disengaged or poorly trained counselors), and a very effective program that used other than evidence-based practices (e.g., one with committed, empathic counselors using practices that had not yet been subjected to research). Furthermore, given the multiple elements that contribute to a program’s overall effectiveness, its effectiveness could change rapidly (e.g., when a charismatic program leader leaves, when there is significant counselor turnover, when funding source/amount changes, etc.). Thus, it makes much less sense to rate the effectiveness of individual programs than it does to rate the strength of evidence supporting specific treatment practices.

Terminology

The APA Evidence-Based Practices Committee suggested using a site glossary to define diagnostic terminology and client populations and communities.

Standard Outcomes

One respondent recommended including a standard set of outcomes to be evaluated.

Effect of Including Mental Health Interventions

One national organization expressed a concern that including mental health interventions will detract from the focus on substance abuse:

The proposed expansion of NREPP to include substance abuse treatment and mental health will dramatically dilute the focus of substance abuse prevention. The resources NREPP require will necessarily be diluted across a broader range of issues and inevitably detract from a focused mission of supporting efforts to prevent substance abuse.

Reporting the Date of Reviews

One respondent recommended that SAMHSA document and report the date on which a review was conducted. This will allow users to know how much time has passed since the review and prompt them to search for more recent evidence if needed.

Rationale for Revising NREPP

One respondent questioned if SAMHSA had sufficiently evaluated the existing system before deciding to revise it.

Subpart A.—Federal Register Notice Comment Codebook

Comment ID Number:
Coded by:
Date coded:
Coded by: (each item is coded by two individual coders)
Date coded:
Entered by:
Date entered:

1. Respondent Category
1.1 Commenter Name
1.1.1 First
1.1.2 MI
1.1.3 Last
1.2 Location
1.2.1 City
1.2.2 State
1.2.3 ZIP code
1.2.4 Unknown
1.3 Domain Interest
1.3.1 SAP
1.3.2 SAT
1.3.3 MHP
1.3.5 Unknown
1.4 Affiliation
1.4.1 Private
1.4.2 Organization
1.4.2.1 National
1.4.2.2 State
1.4.2.3 Local
1.4.2.4 Unknown
1.5 Functional Role
1.5.1 Provider
1.5.2 Researcher
1.5.3 Consumer
1.5.4 Multiple
1.5.5 Unknown
1.6 Response Level
1.6.1 Nonresponsive
1.6.2 Routine
1.6.3 Noteworthy (responder or comment content)
2. Topical Themes
2.1 Will the proposed NREPP system identify effective interventions
2.1.1 General, not criteria specific
2.1.2 Individual-level outcome criteria
2.1.3 Population/policy/system-level outcome criteria
2.1.4 Utility descriptors
2.1.5 Exclusion due to lack of funding
2.1.6 Negative impact on minority populations
2.1.7 Negative impact on program innovation
2.1.8 Lack of acknowledgment of provider factors
2.1.9 Use of other agencies’ standards and resources
2.1.10 Reliance on developers for submitting applications
2.1.11 Generalizability issues
2.2 How can stakeholders be engaged to identify priority review areas
2.2.1 Identification (of priority areas)
2.2.2 Engagement (of stakeholders)
2.3 How should statistical significance and effect size be used to judge effectiveness
2.3.1 Statistical significance
2.3.2 Effect size
2.3.3 General, NEC
2.4 Should NREPP use multiple categories of effectiveness
2.4.1 General, not outcome specific
2.4.1 Pro
2.4.2 Con
2.4.2 Individual-level outcome rating categories
2.4.2.1 Pro
2.4.2.2 Con
2.4.3 Population/policy/system-level outcome rating categories
2.4.3.1 Pro
2.4.3.2 Con

2.5 How can NREPP best provide information on population-specific needs and situations
2.5.1 General comment
2.5.2 Venue (e.g., organized events/meetings, national or regional organizations)
2.5.3 Channel (distribution mechanisms, e.g., listservs, clearinghouses, etc.)
2.5.4 Format (media type, document type, e.g., fact sheets, white papers, policy publications, etc.)
2.6 Should current NREPP programs be ‘‘grandfathered’’ or rereviewed
2.6.1 Grandfathered
2.6.2 Rereviewed
2.6.3 General, NEC
2.7 How should SAMHSA promote greater adoption of NREPP interventions
2.7.1 General comment
2.7.2 Venue
2.7.3 Channel
2.7.4 Format
2.7.5 Technical assistance
2.7.6 Guidance
2.7.7 Resources
2.8 How should nonscientist stakeholders be involved in the NREPP process
2.8.1 General comment
2.8.2 Venue, channel, format
2.8.3 Potential stakeholders
2.8.4 Involvement in the development of the NREPP process
2.8.5 Involvement in program reviews
2.9 What relationship should exist between NREPP and SAMHSA block grants
2.9.1 Technical assistance provision
2.9.2 Funding support
2.9.3 Regulatory (required to use)
2.10 What additional sources of information should be considered regarding SAMHSA services
2.10.1 Steps SAMHSA should take
2.10.2 Source
2.11 How should an annual review of NREPP procedures and practices be conducted
2.12 Other issues
2.12.1 Program vs. practice

Subpart B.—Comments on SAMHSA’s Federal Register Notice: Frequencies and Percentages

TABLE 1.—CHARACTERISTICS OF RESPONDENTS


[N=135]

n Percent

Domain interest (not mutually exclusive)

Substance abuse prevention ................................................................................................................................................... 68 50.4


Substance abuse treatment ..................................................................................................................................................... 48 35.6
Mental health promotion .......................................................................................................................................................... 22 16.3
Mental health treatment ........................................................................................................................................................... 20 14.8
Unknown .................................................................................................................................................................................. 33 24.4

Affiliation

Private ...................................................................................................................................................................................... 90 66.7


National organization ............................................................................................................................................................... 16 11.9
State organization .................................................................................................................................................................... 10 7.4
Local organization .................................................................................................................................................................... 14 10.4
Unknown organization ............................................................................................................................................................. 5 3.7

Functional role

Provider .................................................................................................................................................................................... 53 39.3


Researcher .............................................................................................................................................................................. 36 26.7
Consumer ................................................................................................................................................................................ 4 3.0
Multiple roles ............................................................................................................................................................................ 21 15.6
Unknown .................................................................................................................................................................................. 21 15.6

Respondent clout

Noteworthy ............................................................................................................................................................................... 51 37.8


Responsive .............................................................................................................................................................................. 58 43.0
Unanalyzable ........................................................................................................................................................................... 26 19.3

Current program status

Affiliated with a current program ............................................................................................................................................. 10 7.4


No known affiliation with a current program ............................................................................................................................ 125 92.6
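(Note: Percentages in Table 1 are calculated against all 135 respondents, per the table’s stated base of N=135; for example, the 68 respondents interested in substance abuse prevention represent 68 ÷ 135 ≈ 50.4 percent. Because domain interests are not mutually exclusive, these percentages can sum to more than 100.)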

TABLE 2.—COMMENTS REGARDING THE PROPOSED NREPP SYSTEM ACCOMPLISHING ITS GOALS
[Question 1]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents
General, not criteria


specific 2 ................ 11 78.6 4 50.0 2 100 2 66.7 16 84.2
Individual-level out-
come criteria ......... 1 7.1 1 12.5 0 0.0 1 33.3 14 73.7


TABLE 2.—COMMENTS REGARDING THE PROPOSED NREPP SYSTEM ACCOMPLISHING ITS GOALS—Continued
[Question 1]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

Population-, policy-,
or system-level
outcome criteria .... 2 14.3 4 50.0 1 50.0 1 33.3 14 73.7
Utility descriptors ...... 4 28.6 1 12.5 0 0.0 0 0.0 3 15.8
Funding .................... 7 50.0 3 37.5 1 50.0 0 0.0 3 15.8
Minority populations .. 1 7.1 0 0.0 1 50.0 0 0.0 2 10.5
Program innovation .. 4 28.6 4 50.0 2 100 0 0.0 2 10.5
Provider factors ........ 4 28.6 4 50.0 1 50.0 1 33.3 4 21.1
Use of other agen-
cies’ standards and
resources .............. 4 28.6 2 25.0 0 0.0 0 0.0 12 63.2
Developers submit-
ting applications .... 1 7.1 0 0.0 0 0.0 0 0.0 2 10.5
Generalizability ......... 7 50.0 5 62.5 2 100 0 0.0 5 26.3

‘‘Responsive’’ respondents

General, not criteria


specific 2 ................ 0 0.0 0 0.0 4 40.0 2 100 18 43.9
Individual-level out-
come criteria ......... 0 0.0 0 0.0 1 10.0 0 0.0 6 14.6
Population-, policy-,
or system-level
outcome criteria .... 0 0.0 0 0.0 2 20.0 0 0.0 5 12.2
Utility descriptors ...... 0 0.0 0 0.0 0 0.0 0 0.0 7 17.1
Funding .................... 0 0.0 0 0.0 5 50.0 0 0.0 9 22.0
Minority populations 0 0.0 0 0.0 2 20.0 0 0.0 7 17.1
Program innovation .. 0 0.0 0 0.0 3 30.0 0 0.0 6 14.6
Provider factors ........ 0 0.0 0 0.0 4 40.0 0 0.0 4 9.8
Use of other agen-
cies’ standards and
resources ................ 0 0.0 0 0.0 3 30.0 0 0.0 6 14.6
Developers submit-
ting applications .... 0 0.0 0 0.0 0 0.0 0 0.0 1 2.4
Generalizability ......... 0 0.0 0 0.0 7 70.0 1 50.0 21 51.2
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.
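(Note: As footnote 1 indicates, the percentages in this and the following tables are computed against the number of respondents in each column who commented on the question, not against all 135 respondents. In the first row above, for example, the implied denominator for ‘‘noteworthy’’ national organizations is 14, since 11 ÷ 14 ≈ 78.6 percent.)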

TABLE 3.—COMMENTS REGARDING HOW SAMHSA MIGHT ENGAGE INTERESTED STAKEHOLDERS TO DETERMINE
INTERVENTION PRIORITY AREAS FOR REVIEW
[Question 2]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

Identification of pri-
ority areas 2 ........... 3 42.9 0 0.0 0 0.0 0 0.0 2 100
Engagement of
stakeholders ......... 5 71.4 1 100 1 100 0 0.0 1 50.0

‘‘Responsive’’ respondents

Identification of pri-
ority areas 2 ........... 0 0.0 0 0.0 1 50.0 0 0.0 1 33.3
Engagement of
stakeholders ......... 0 0.0 0 0.0 2 100 0 0.0 3 100
1 All percentages are calculated based on those providing comments.


2 These categories are not mutually exclusive.


TABLE 4.—COMMENTS REGARDING STATISTICAL SIGNIFICANCE AND EFFECT SIZE


[Question 3]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

Statistical signifi-
cance 2 .................. 1 25.0 0 0.0 1 50.0 0 0.0 11 84.6
Effect size ................. 2 50.0 3 100 1 50.0 1 100 13 100
General ..................... 2 50.0 0 0.0 0 0.0 0 0.0 2 15.4

‘‘Responsive’’ respondents

Statistical signifi-
cance 2 .................. 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Effect size ................. 0 0.0 0 0.0 3 100 0 0.0 6 85.7
General ..................... 0 0.0 0 0.0 0 0.0 0 0.0 1 14.3
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 5.—COMMENTS REGARDING THE USE OF MULTIPLE CATEGORIES OF EFFECTIVENESS

[Question 4]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

General, not outcome


specific:.
General com-
ment 2 ............ 2 20.0 0 0.0 0 0.0 0 0.0 3 20.0
Pro ..................... 10 100 3 100 1 100 0 0.0 12 80.0
Con .................... 0 0.0 0 0.0 0 0.0 0 0.0 1 6.7
Individual-level out-
come rating cat-
egories:
General com-
ment ............... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Pro ..................... 1 10.0 0 0.0 0 0.0 0 0.0 0 0.0
Con .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Population-, policy-,
or system-level
outcome rating cat-
egories:
General com-
ment ............... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Pro ..................... 0 0.0 1 33.3 0 0.0 0 0.0 0 0.0
Con .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0

‘‘Responsive’’ respondents

General, not outcome


specific:.
General com-
ment 2 ............ 0 0.0 0 0.0 1 50.0 0 0.0 3 37.5
Pro ..................... 0 0.0 0 0.0 1 50.0 1 100 6 75.0
Con .................... 0 0.0 0 0.0 1 50.0 0 0.0 0 0.0
Individual-level out-
come rating cat-
egories:
General com-
ment ............... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Pro ..................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Con .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Population-, policy-,
or system-level
outcome rating cat-
egories:
General com-
ment ............... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0


TABLE 5.—COMMENTS REGARDING PROPOSED OUTCOME RATING CATEGORIES—Continued
[Question 4]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

Pro ..................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0


Con .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 6.—COMMENTS REGARDING SAMHSA’S APPROACH FOR INCORPORATING INFORMATION ON THE EXTENT TO WHICH
INTERVENTIONS HAVE BEEN TESTED WITH DIVERSE POPULATIONS AND IN DIVERSE SETTINGS
[Question 5]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

General comment 2 .. 6 100 2 100 1 100 0 0.0 12 100


Venue ....................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Channel .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Format ...................... 1 16.7 0 0.0 0 0.0 0 0.0 0 0.0

‘‘Responsive’’ respondents

General comment 2 .. 0 0.0 0 0.0 1 100 0 0.0 4 80.0
Venue ....................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Channel .................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Format ...................... 0 0.0 0 0.0 0 0.0 0 0.0 1 20.0
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 7.—COMMENTS REGARDING WHETHER ALL EXISTING PROGRAMS ON NREPP SHOULD BE REREVIEWED OR
‘‘GRANDFATHERED’’
[Question 6]

                                                      Noteworthy                         Responsive
                                                 n    Percent of those              n    Percent of those
                                                      providing comments                 providing comments

Comments from individuals affiliated with an existing NREPP program


(8 individuals [3 Noteworthy, 5 Responsive] provided comments on this question)

Rereview* ......................................................................................................................................... 2 66.7 1 20.0


Grandfather ...................................................................................................................................... 1 33.3 3 60.0
General comment ............................................................................................................................ 1 33.3 2 40.0

Comments from individuals not known to be affiliated with an existing NREPP program
(29 individuals [21 Noteworthy, 8 Responsive] provided comments on this question)

Rereview .......................................................................................................................................... 19 90.5 5 62.5


Grandfather ...................................................................................................................................... 0 0.0 1 12.5
General comment ............................................................................................................................ 2 9.5 2 25.0

*Note: These categories are not mutually exclusive. There were instances of individuals who both commented specifically on whether to re-
review or grandfather a program and also provided a general comment with regard to this question.


TABLE 8.—COMMENTS REGARDING GUIDANCE, RESOURCES, AND/OR TECHNICAL ASSISTANCE TO PROMOTE GREATER
ADOPTION OF NREPP INTERVENTIONS
[Question 7]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

General comment 2 .. 3 30.0 2 25.0 0 0.0 0 0.0 2 11.8


Venue ....................... 0 0.0 0 0.0 0 0.0 0 0.0 0 0.0
Channel .................... 2 20.0 0 0.0 0 0.0 0 0.0 1 5.9
Format ...................... 1 10.0 0 0.0 0 0.0 0 0.0 0 0.0
Technical assistance 5 50.0 5 62.5 1 100 0 0.0 11 64.7
Guidance .................. 4 40.0 0 0.0 0 0.0 0 0.0 1 5.9
Resources ................ 6 60.0 5 62.5 0 0.0 1 100 3 17.6

‘‘Responsive’’ respondents

General comment 2 .. 0 0.0 0 0.0 0 0.0 0 0.0 3 16.7


Venue ....................... 0 0.0 0 0.0 0 0.0 0 0.0 2 11.1
Channel .................... 0 0.0 0 0.0 1 20.0 0 0.0 3 16.7
Format ...................... 0 0.0 0 0.0 0 0.0 0 0.0 1 5.6
Technical assistance 0 0.0 0 0.0 2 40.0 0 0.0 6 33.3
Guidance .................. 0 0.0 0 0.0 1 20.0 0 0.0 4 22.2
Resources ................ 0 0.0 0 0.0 3 60.0 0 0.0 9 50.0
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 9.—COMMENTS REGARDING HOW CONSUMER, FAMILY, AND OTHER NONSCIENTIST STAKEHOLDERS COULD BE
INVOLVED IN NREPP
[Question 8]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

General comment 2 .. 0 0.0 0 0.0 1 50.0 0 0.0 1 50.0


Venue, channel, for-
mat ........................ 2 20.0 0 0.0 0 0.0 0 0.0 1 50.0
Potential stake-
holders .................. 7 70.0 5 71.4 0 0.0 0 0.0 0 0.0
Involvement in the
development of the
NREPP process .... 5 50.0 4 57.1 1 50.0 0 0.0 0 0.0
Involvement in pro-
gram reviews ........ 6 60.0 5 71.4 1 50.0 0 0.0 0 0.0

‘‘Responsive’’ respondents

General comment 2 .. 0 0.0 0 0.0 1 16.7 0 0.0 1 5.6


Venue, channel, for-
mat ........................ 0 0.0 0 0.0 1 16.7 0 0.0 4 22.2
Potential stake-
holders .................. 0 0.0 0 0.0 4 66.7 1 100 14 77.8
Involvement in the
development of the
NREPP process .... 0 0.0 0 0.0 4 66.7 0 0.0 8 44.4
Involvement in pro-
gram reviews ........ 0 0.0 0 0.0 2 33.3 1 100 6 33.3
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.


TABLE 10.—COMMENTS REGARDING GUIDANCE SAMHSA SHOULD PROVIDE FOR USE UNDER THE AGENCY’S SUBSTANCE
ABUSE AND MENTAL HEALTH BLOCK GRANTS
[Question 9]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

Technical assist-
ance 2 .................... 1 11.1 2 50.0 0 0.0 0 0.0 1 8.3
Funding support ....... 4 44.4 3 75.0 1 100 1 100 9 75.0
Regulatory ................ 6 66.7 1 25.0 0 0.0 0 0.0 2 16.7

‘‘Responsive’’ respondents

Technical assist-
ance 2 .................... 0 0.0 0 0.0 1 50.0 0 0.0 2 18.2
Funding support ....... 0 0.0 0 0.0 2 100 0 0.0 9 81.8
Regulatory ................ 0 0.0 0 0.0 1 50.0 0 0.0 2 18.2
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 11.—COMMENTS REGARDING STEPS SAMHSA SHOULD TAKE TO PROMOTE CONSIDERATION OF OTHER SOURCES
OF EVIDENCE-BASED INTERVENTIONS
[Question 10]

National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

Steps SAMHSA
should take 2 ......... 4 80.0 1 100 0 0.0 0 0.0 12 100
Source ...................... 1 20.0 0 0.0 1 100 1 100 0 0.0

‘‘Responsive’’ respondents

Steps SAMHSA
should take 2 ......... 0 0.0 0 0.0 2 100 0 0.0 2 66.7
Source ...................... 0 0.0 0 0.0 0 0.0 0 0.0 2 66.7
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

TABLE 12.—COMMENTS REGARDING ANNUAL REVIEWS OF SUGGESTIONS FOR IMPROVING THE SYSTEM
[Question 11]

National org. State org. Local org. Unknown org. Private

n % n % n % n % n %

‘‘Noteworthy’’ respondents

General comment ..... 8 100 3 100 1 100 0 0.0 14 100

‘‘Responsive’’ respondents

General comment ..... 0 0.0 0 0.0 2 100 0 0.0 7 100


1 All percentages are calculated based on those providing comments.

TABLE 13.—ADDITIONAL COMMENTS NOT CLASSIFIED ELSEWHERE


National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Noteworthy’’ respondents

Other issues 2 ........... 4 66.7 1 25.0 1 50.0 0 0.0 1 100


Defining terms .......... 5 83.3 3 75.0 1 50.0 0 0.0 1 100


TABLE 13.—ADDITIONAL COMMENTS NOT CLASSIFIED ELSEWHERE—Continued


National org. State org. Local org. Unknown org. Private

n %1 n % n % n % n %

‘‘Responsive’’ respondents

Other issues 2 ........... 0 0.0 0 0.0 1 50.0 0 0.0 5 71.4
Defining terms .......... 0 0.0 0 0.0 2 100 0 0.0 2 28.6
1 All percentages are calculated based on those providing comments.
2 These categories are not mutually exclusive.

Subpart C.—Comments on Specific Evidence Rating Criteria

Some of the respondents to SAMHSA's August 2005 Federal Register notice submitted comments about specific evidence rating criteria. A summary and highlights of key comments about these criteria are presented below.

Intervention Fidelity

Two respondents commented on this criterion. One noted that it is difficult to monitor or confirm how treatment is delivered and how staff are trained in programs with complex approaches, such as community reinforcement or family training.

Comparison Fidelity

Eleven respondents commented on this criterion. Ten of the respondents, a group of researchers from a major university, wrote:

The comparison fidelity evidence quality criterion assumes the implementation and fidelity monitoring of a ‘‘comparison condition.’’ In universal and selective prevention trials, this is not standard protocol. Rather, individuals or communities selected for comparison/control conditions receive standard prevention services available in the community. In such studies, it does not make sense to measure the ‘‘fidelity’’ of the comparison condition. However, as currently scored, this criterion will penalize prevention studies. I recommend the criterion and rating system be changed to reflect this difference between prevention and treatment research.

Nature of Comparison Condition

Fourteen respondents provided comments on this criterion. One respondent, a director of research and evaluation for a prevention program, noted:

Many program participants are drawn from underserved or marginalized populations, e.g., incarcerated youth, the mentally ill, linguistically isolated subgroups, or those suffering from Human Immunodeficiency Virus (HIV). For these populations, there may be no option to withhold active treatment only to the intervention group, due to legal requirements, health and safety considerations, or other ethical constraints. The American Evaluation Association (AEA) duly notes this consideration in its 2003 commentary on scientifically based evaluation methods.

Another service provider noted that studies that include the target intervention, comparison intervention, and attention control ‘‘would require funding at extremely high levels to have enough N in each group for statistical analysis. To conduct such a study in today's economic climate is probably impractical.’’

A private citizen who submitted comments wrote:

This is a critical criterion and should be weighted more heavily than many, if not all, of the other criteria. With the proposed system, if one were trying to ‘‘game the system,’’ it would be advantageous to choose a comparison intervention that was ineffective (and thus receive a low score on this criterion), so as to increase the likelihood of a significant treatment effect. Nevertheless, the practice being evaluated could have ‘‘strong evidence’’ by scoring highly on other criteria.

A group of university researchers said that it is unclear how prevention practices being compared to existing prevention services would be scored using this criterion.

Assurances to Participants

One respondent questioned ‘‘whether such studies [without documented assurances to participants] should ever clear the bar for NREPP consideration. If investigators do not observe appropriate procedures to safeguard study participants' interests, it is at least questionable whether their products should receive any degree of attention and support from SAMHSA.’’

Participant Expectations

Three respondents commented on this criterion. Two respondents listed potential problems with controlling expectations in school settings. For example, for an intervention to be implemented effectively by teachers, the teachers would have to be trained and therefore would be aware of the intervention they implement.

Two respondents pointed out that expectations might be an active component of the intervention. One wrote that ‘‘trying to control [expectations] might reduce generalization of the eventual findings. In addition, given current ethical guidelines and human subjects policies, it is hard to see how one could 'mask' study conditions in many studies. In obtaining consent, one has to tell participants about the conditions to which they might be assigned and it is likely that participants will know to which condition they have been assigned.’’

Data Collector Bias

Three respondents commented on this criterion. One noted, ‘‘Changes to this criterion should recognize the critical need to ensure the fidelity of psychosocial treatment interventions. Fidelity, in these cases, can only be ensured through staff awareness of the actions required of them. Masking conditions actually inhibits psychosocial treatment fidelity.’’

Selection Bias

Three respondents commented on this criterion. One suggested that approaches other than random assignment, such as blocking variables of interest, should qualify for the highest score on this item. Another pointed out that random assignment to psychosocial interventions might not be possible due to ethical problems with nondisclosure. He suggested rewording the item to clarify that random assignment does not refer only to ‘‘blinding’’ participants to their treatment condition.

Attrition

Two respondents commented on this criterion. One pointed out that the criterion is unclear, and that ‘‘attrition needing adjustment’’ is not defined, nor is the difference between ‘‘crude’’ and ‘‘sophisticated’’ methods of adjusting for attrition. This respondent also pointed out that ‘‘sophisticated’’ does not necessarily mean better than ‘‘crude’’ (this comment also applied to the Missing Data criterion).

Theory-Driven Method Selection

Eleven respondents commented on this criterion. A group of university researchers wrote:

This is an important criterion. However, this criterion should recognize that a number of preventive interventions seek to address and reduce risk factors or enhance protective factors that research has shown are common shared predictors of a range of drug use, mental health, and other outcomes. It is important to explicitly recognize this fact in formulating and describing this criterion * * * Not all reviewers, especially those from treatment backgrounds, will be familiar with the concept of addressing shared predictors of broader outcomes in preventive trials in order to affect wide-ranging outcomes. This criterion needs to educate reviewers about this in the same way that the criterion currently warns against ‘‘dredging’’ for current significant results.

Subpart D.—Criterion-Specific Themes for Population-, Policy-, and System-Level Outcomes

Logic-Driven Selection of Measures

A group of researchers from a major university suggested that this item and the parallel item for individual-level outcomes, Theory-Driven Measure Selection, should have the same label.

Intervention Fidelity

The seven respondents who commented on this criterion observed that interventions must be adapted for individual communities to be effective. The criterion as written does not account for this.

Nature of Comparison Condition

One respondent stated that there is not consensus among evaluation researchers on this topic, and until there is, ‘‘we should reserve judgment on how best to define the nature of comparison conditions within community level interventions.’’ She also pointed out, ‘‘Since the collective behaviors of members in each community will vary * * * how can they possibly be compared to each other in a valid and reliable way.’’

Data Collector Bias

A group of university researchers pointed out that the item assumes archival data are unbiased, while they may be biased by institutional practices. They suggested that the highest rating ‘‘be reserved for studies in which data collectors were masked to the population's condition.’’

Another respondent, a national organization, wrote:

The very nature of coalition work requires coalition members to be involved in its evaluation and research efforts. It is culturally detrimental and unethical to work with coalitions in such a way that they are not involved in the evaluation process. Expecting the data collectors to be blind to the efforts of the community means that the researchers are outside the community and would have no understanding of the context in which the coalition works. Many evaluators and researchers view this as the absolute wrong way to work with coalitions. Criterion Seven [Data Collector Bias] runs counter to participatory research, which is the standard in working with coalitions.

Population Studied

Eleven respondents commented on this criterion. One respondent stated that quasi-experimental time-series designs might be as internally valid as randomized control designs, and felt this should be reflected in the criterion.

A group of university researchers advocated excluding single-group pre-/posttest design studies from NREPP. They wrote, ‘‘A group randomized design with adequate numbers of groups in each condition holds the greatest potential for ruling out threats to internal validity in community-level studies. This criterion should be expanded to provide a rating of four for group randomized studies with adequate Ns.’’

Subpart E.—Comment for Children's Suggestions for Utility Descriptors

1. Implementation Support

Regarding the ease of acquiring materials: is there centralized ordering for all materials? What implementation support materials are included in the initial program cost, and are they adequate? Are basic program updates and replacement parts all easily available?

Regarding start-up support, research suggests that there are several features that are important to the effectiveness and sustainability of programs. These include an active steering committee, administrator support, engagement of family members, and whole-school implementation (for school-based programs). Do the basic program materials provided supply adequate guidance for effectively gaining these sources of support? On the other hand, some clients are not in a position to achieve all of these goals. Is it possible to effectively implement the program without them? Are needs assessment tools offered? This is important for determining whether implementation should take place at all. What is the nature of the start-up implementation support? What is the nature of the ongoing implementation support? Is client support differentiated for new and experienced clients? Do client support personnel have adequate training to answer sophisticated questions from the most highly experienced program implementers? Is there implementation support through a variety of media? What support is there for transfer of learning? For example: practice beyond specific lessons; opportunities for the population served to demonstrate, and be reinforced for, skills beyond specific lessons; support for staff awareness of skills, how to recognize skills, and how to reinforce skills; examples typical in the daily setting; materials for engaging family members of the population served; materials for engaging staff outside the implementers of the program (e.g., residential housekeeping staff, school playground monitors); support for engaging community members outside the implementation setting; what training is required; and what training is available beyond that which is required?

2. Quality Monitoring

Are the tools supplied for quality monitoring user-friendly and inexpensive? How well are they adapted specifically to the program? What are their psychometric characteristics?

3. Unintended or Adverse Events

No further comments.

4. Population Coverage

Are the materials appropriate to the population to be served in regard to, for example, length of lessons, vocabulary, concepts and behavioral expectations, and teaching strategies?

5. Cultural Relevance and Cultural Competence

To what extent was cultural relevance addressed during the development of the program? Is there a theoretical basis to the program that addresses cultural relevance? Were stakeholders from a variety of relevant backgrounds engaged in the development process? How early in the development process were they involved? In what ways were they involved? Were professionals with multicultural expertise involved in the development process? How early in the development process were they involved? In what ways were they involved?

6. Staffing

Since FTEs are often difficult to estimate and estimates may therefore be unreliable, the required time should be estimated for the following: required training time, on-site start-up activities, implementer preparation time per week, lesson length × number of lessons per implementer, and time required for other activities.
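To make the staffing estimate concrete, the following is a purely illustrative calculation; none of these figures appear in the comments, and they are assumed only for the sake of the example. With 16 hours of required training, 4 hours of on-site start-up activities, 1 hour of preparation per week over a 12-week program, twelve 45-minute lessons, and 3 hours of other activities, the estimate per implementer would be

\[
16 + 4 + (1 \times 12) + (0.75 \times 12) + 3 = 44 \text{ hours.}
\]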

7. Cost

No further comments on this descriptor except to reiterate that cost considerations play into several of the other descriptors.

8. Motivational Issues Affecting Implementation

We suggest that consideration be given to examining what further motivational issues may impact whether the programs are implemented and sustained with fidelity. These include: appeal of materials and activities for the population to be served; appeal of materials and activities for the staff who will implement the programs; support of the program for the preexisting goals and programs of the site (e.g., school-based programs that support academics); how well the program otherwise integrates with existing goals, programs, and activities of the site (e.g., teachers are expected to direct student discussions, but not therapy); support offered for adapting the program to specific local populations; and fit of materials to the typical structures of the setting (e.g., short enough lessons to fit within a class period, necessary equipment is usually available in the setting).

[FR Doc. 06–2313 Filed 3–13–06; 8:45 am]
BILLING CODE 4160–01–M

DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT

[Docket No. FR–5037–N–12]

Notice of Submission of Proposed Information Collection to OMB; Deed-in-Lieu of Foreclosure (Corporate Mortgagors or Mortgagors Owning More than One Property)

AGENCY: Office of the Chief Information Officer, HUD.

ACTION: Notice.

SUMMARY: The proposed information collection requirement described below has been submitted to the Office of Management and Budget (OMB) for review, as required by the Paperwork Reduction Act. The Department is soliciting public comments on the subject proposal.

Mortgagees must obtain written consent from HUD's National Servicing Center to accept a deed-in-lieu of foreclosure when the mortgagor is a corporate mortgagor or a mortgagor owning more than one property insured by the Department of Housing and Urban Development (HUD). Mortgagees must provide HUD with specific information.

DATES: Comments Due Date: April 13, 2006.

ADDRESSES: Interested persons are invited to submit comments regarding this proposal. Comments should refer to the proposal by name and/or OMB approval number (2502–0301) and should be sent to: HUD Desk Officer, Office of Management and Budget, New Executive Office Building, Washington, DC 20503; fax: 202–395–6974.

FOR FURTHER INFORMATION CONTACT: Lillian Deitzer, Reports Management Officer, AYO, Department of Housing and Urban Development, 451 Seventh Street, SW., Washington, DC 20410; e-mail Lillian Deitzer at Lillian_L_Deitzer@HUD.gov or telephone (202) 708–2374. This is not a toll-free number. Copies of available documents submitted to OMB may be obtained from Ms. Deitzer.

SUPPLEMENTARY INFORMATION: This notice informs the public that the Department of Housing and Urban Development has submitted to OMB a request for approval of the information collection described below. This notice is soliciting comments from members of the public and affected agencies concerning the proposed collection of information to: (1) Evaluate whether the proposed collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (2) Evaluate the accuracy of the agency's estimate of the burden of the proposed collection of information; (3) Enhance the quality, utility, and clarity of the information to be collected; and (4) Minimize the burden of the collection of information on those who are to respond; including through the use of appropriate automated collection techniques or other forms of information technology, e.g., permitting electronic submission of responses.

This notice also lists the following information:

Title of Proposal: Deed-in-Lieu of Foreclosure (Corporate Mortgagors or Mortgagors Owning More than One Property).
OMB Approval Number: 2502–0301.
Form Numbers: None.
Description of the Need for the Information and Its Proposed Use: Mortgagees must obtain written consent from HUD's National Servicing Center to accept a deed-in-lieu of foreclosure when the mortgagor is a corporate mortgagor or a mortgagor owning more than one property insured by the Department of Housing and Urban Development (HUD). Mortgagees must provide HUD with specific information.
Frequency of Submission: On occasion.

                                                        Number of          Annual           Hours per         Burden
                                                        respondents    ×   responses    ×   response      =   hours

Reporting Burden ..............................................    600             0.041            0.5             12.5
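The burden figure follows the calculation shown in the table (number of respondents × annual responses per respondent × hours per response). Taken at face value, 600 × 0.041 × 0.5 equals 12.3 hours; the stated total of 12.5 hours is consistent with the 0.041 figure being a rounded rate of roughly 25 annual responses spread across 600 respondents (about 0.0417 each). This reconciliation is an inference, not stated in the notice:

\[
600 \times \tfrac{25}{600} \times 0.5 = 12.5 \text{ burden hours.}
\]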

Total Estimated Burden Hours: 12.5.
Status: Extension of a currently approved collection.

Authority: Section 3507 of the Paperwork Reduction Act of 1995, 44 U.S.C. 35, as amended.

Dated: March 9, 2006.
Lillian L. Deitzer,
Departmental Paperwork Reduction Act Officer, Office of the Chief Information Officer.
[FR Doc. E6–3616 Filed 3–13–06; 8:45 am]
BILLING CODE 4210–67–P

DEPARTMENT OF THE INTERIOR

Fish and Wildlife Service

Draft Conservation Agreement for the Yellow-Billed Loon (Gavia adamsii)

AGENCY: U.S. Fish and Wildlife Service, Interior.

ACTION: Notice of document availability for review and comment.

SUMMARY: We, the U.S. Fish and Wildlife Service, announce the availability of the Draft Conservation Agreement for the Yellow-billed Loon (Gavia adamsii) for public review and comment.

DATES: Comments on the draft conservation agreement must be received on or before April 13, 2006.

ADDRESSES: Copies of the conservation agreement are available for inspection, by appointment, during normal business hours at the following location: U.S. Fish and Wildlife Service, Fairbanks Fish and Wildlife Field Office, 101 12th
