
Implementation and Outcomes:

Results in Osseo and Minnesota

Eric Kloos and Aaron Barnes


Minnesota Department of Education
November 20, 2014


[Graphic: PBIS implementation stages (Exploration, Installation, Initial Implementation, Full Implementation). School implementation takes time: 2-4 years.]

PBIS Exploration

Application information is verification that exploration activities have started
Teams have prerequisites in place and are highly likely to be ready for the installation stage

School Example of Exploration

Attended professional development on PBIS
Data-based definition of need
80% buy-in
Resources allocated to support training, coaching, and evaluation
Awareness and knowledge of key PBIS components
Building district administrative support for PBIS

Installation

Representative team attends training and brings information back to school staff
Team meets to develop/review action plans
Baseline data is collected (fidelity, outcome)
Install a data system
Select a building coach to help facilitate meetings, collect data, support action planning, and work with staff throughout the school

Installation (continued)

Develop systems for communication and data review/sharing with school staff
The team drafts 3-5 positive school-wide expectations, with a matrix for teaching expectations across school environments
Key features of PBIS are installed by all school staff members

Initial Implementation

School staff adopts the school-wide expectations
Teaching matrix is taught to students
Data system is in place to produce regular data
Team establishes a regular meeting schedule, collects and reviews data
School staff agrees on and operationalizes consistent classroom-managed behaviors and office-managed behaviors

Initial Implementation (continued)

Recognition system is in place
Violation system is well defined, with a continuum of consequences in place
Ongoing action planning
Coach facilitates team meetings, synthesis of data, and networking with other coaches
Staff attend training to continue learning key features and concepts in PBIS

Full Implementation

Completed full scope and sequence of PBIS content
Data and evidence of implementing PBIS at a quality standard
Information on efforts and outcomes is shared with the school community
Implementation data is used to identify tier 2 and tier 3 supports

Full Implementation (continued)

Implementation data is being used to refine and focus school-wide efforts and is found in action plans and school improvement plans
Coaching support is an operational norm within the school

The way we roll

Cohort 8 SET Results
Fall 2012 - Spring 2014


Sustainability

New research identifies four factors that predict sustained implementation of PBIS:

District Priority
School Priority
Capacity Building
Team Use of Data

School team/staff skill, regular team meetings, data collection, use of data for decision making, and presenting data to staff and community

(McIntosh et al., 2013)

Creating Implementation-Informed Expectations at a School Level

In Minnesota, baselines are rising (average baseline SET = 69), but there are
still predictable differences between schools starting training and
sustaining schools (average SET = 90, BoQ = 84).
What features are similar between baseline schools and sustaining schools?

Administrator is an active PBIS team member (96% baseline schools/97% sustaining schools)
Administrator reports that team meetings occur (98%/98%)

What features are different between baseline schools and sustaining schools?

Documented system of teaching expectations (46%/83%)
Teaching of expectations has occurred this year (74%/94%)
School-wide behavior program has been taught/reviewed with staff this year (78%/97%)
Team provides discipline data summary to staff at least 3 times per year (50%/91%)
90% of team members report that discipline data is used for decision-making (57%/97%)

Why do we measure implementation across time in a school?


Because it varies!


Sharing Data and Outcomes:
Disciplinary Reductions for District and State

One school example of progress

Discipline data for black students across 3 years of implementation

Year    | Total Referrals | Black Student Enrollment | Enrollment % | # of Black Students w/ Referrals | % of Black Students w/ Referrals | Referrals for Black Students | % of Overall Referrals
2011-12 | 1666            | 233                      | 33.2         | 141                              | 59                               | 1010                         | 60.6
2012-13 | 1215            | 220                      | 29           | 123                              | 51.7                             | 748                          | 61.6
2013-14 | 911             | 203                      | 30.4         | 96                               | 48.2                             | 511                          | 56.1

Year      | Black Student Enrollment | Enrollment % | Total Referrals
2011-2012 | 233                      | 33.2         | 1666
2012-2013 | 220                      | 29           | 1215
2013-2014 | 203                      | 30.4         | 911

A Focused Look

Over time, enrollment is dropping (overall and for black students)
Black students are maintaining roughly the same proportion of the student body
Overall referrals are decreasing dramatically

Year      | Total Referrals | # of Referrals for Black Students | % of Overall Referrals
2011-2012 | 1666            | 1010                              | 60.6
2012-2013 | 1215            | 748                               | 61.6
2013-2014 | 911             | 511                               | 56.1

A Focused Look (continued)

One measure of disproportionality, the percentage of overall referrals received by black students, is just starting to decline.
But the number of referrals received by black students during that same time has been cut almost in half: 499 fewer referrals, and 45 fewer students who received one or more referrals.
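
As a sanity check, these summary figures can be recomputed directly from the referral table above. The short Python sketch below does exactly that; the dictionary layout and variable names are illustrative, not part of the original slides.

```python
# Referral data from the table above: total referrals, referrals received by
# black students, and number of black students with one or more referrals.
data = {
    "2011-12": {"total": 1666, "black_referrals": 1010, "black_students_referred": 141},
    "2012-13": {"total": 1215, "black_referrals": 748, "black_students_referred": 123},
    "2013-14": {"total": 911, "black_referrals": 511, "black_students_referred": 96},
}

# Share of overall referrals received by black students (the disproportionality
# measure discussed above).
for year, d in data.items():
    share = 100 * d["black_referrals"] / d["total"]
    print(f"{year}: {share:.1f}% of overall referrals")  # 60.6, 61.6, 56.1

# Reductions from 2011-12 to 2013-14.
fewer_referrals = data["2011-12"]["black_referrals"] - data["2013-14"]["black_referrals"]
fewer_students = (data["2011-12"]["black_students_referred"]
                  - data["2013-14"]["black_students_referred"])
print(f"{fewer_referrals} fewer referrals, {fewer_students} fewer students "
      f"with one or more referrals")  # 499 fewer referrals, 45 fewer students
```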

Resources
Minnesota PBIS:
http://pbismn.org

The Minnesota Department of Education:


http://education.state.mn.us

The Active Implementation Hub:


http://implementation.fpg.unc.edu/about-the-ai-hub
Eric.Kloos@state.mn.us
Aaron.Barnes@state.mn.us

