• A recent paper from the Brookings Institution signed by six experts concluded that "value-added data has an important role to play in teacher evaluation systems" because "it predicts more about what students will learn from the teachers to which they are assigned than any other source of information."
• The paper also concludes that "value-added scores for individual teachers turn out to be about as reliable as performance assessments used elsewhere for high stakes decisions."

• No measure of teacher effectiveness can be perfect; that's why HISD's new evaluation system balances value-added results with other information, such as results of teacher-generated assessments and rigorous classroom observations by appraisers, to get a more complete picture of a teacher's performance.
• Researchers plainly state that value-added data should not be discarded or ignored when making key personnel decisions, including evaluation.

 
A teacher's value-added results can fluctuate somewhat from year to year, for reasons that have nothing to do with the teacher's underlying effectiveness. For
example, a teacher might be an especially good or bad fit for a particular group of students. But despite
these normal fluctuations and the uncertainty inherent in any measure, relatively few teachers who
earn top value-added scores early in their careers go on to earn bottom scores later, and vice-versa. In
fact, a recent study showed that value-added is as reliable as real-world measures of performance in
many other professions, including batting averages in baseball.
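The reliability claim above can be made concrete with a toy simulation (my own sketch, not the EVAAS methodology): if each year's score is treated as a stable teacher effect plus classroom-level noise of similar size, adjacent-year scores correlate at a moderate but meaningful level.

```python
import numpy as np

# Toy illustration of "reliability" as year-to-year correlation (an
# assumption-laden sketch, not the EVAAS model): each observed annual
# score is a stable underlying effect plus classroom-level noise.
rng = np.random.default_rng(42)

n = 1000
skill = rng.normal(0, 1, n)           # stable underlying effectiveness
year1 = skill + rng.normal(0, 1, n)   # observed score, year 1
year2 = skill + rng.normal(0, 1, n)   # observed score, year 2

r = np.corrcoef(year1, year2)[0, 1]
print(round(r, 2))  # near 0.5 when signal and noise variances are equal
```

Raising the noise relative to the stable signal lowers this correlation, which is one reason evaluation systems combine several years of data rather than relying on a single year.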

Classroom observations and value-added results capture different aspects of teaching, and each complements the other. Most
observations revolve around the teacher's actions, on the assumption that certain actions will lead to
student learning. Value-added measures student learning more directly. It also provides a picture of a
teacher's success over the course of an entire school year, rather than snapshots of performance on a
handful of days throughout the year.



A report written by Battelle for Kids, commissioned by the Bill & Melinda Gates Foundation and released on May 10, 2011, discusses various growth and value-added models. According to the report's characteristics of an "Advanced Value-Added Model," the SAS EVAAS© model falls within this definition:

"Complex value-added models typically develop their estimates of students' achievement using multiple prior test scores … [and] include[s] other school or student factors in an attempt to more reliably estimate the influence of educators on student learning.

Complex value-added models employ more sophisticated statistical approaches to minimize the effect of external factors and improve the reliability of their estimates. The most sophisticated models apply strategies to handle missing student test data, measurement error associated with the tests, multiple educators sharing instructional responsibility and many other factors in an attempt to produce the most reliable estimates of effectiveness. These models regularly produce information about the confidence in the value-added estimate to better inform conclusions that can be made from the data." (P. 6)
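As a rough illustration of the covariate-adjustment idea in the quoted passage, the sketch below uses entirely simulated data; the two-prior-score regression and all variable names are my simplification, not the SAS EVAAS model. It regresses current scores on prior scores, then averages the residuals by teacher as a naive value-added estimate.

```python
import numpy as np

# Drastically simplified value-added sketch (not SAS EVAAS): adjust each
# student's current score for multiple prior scores, then average the
# leftover (residual) achievement by teacher.
rng = np.random.default_rng(0)

n_teachers, per_class = 40, 25
teacher = np.repeat(np.arange(n_teachers), per_class)
effect = rng.normal(0, 3, n_teachers)        # true teacher effects (unknown in practice)

prior1 = rng.normal(500, 50, teacher.size)   # two prior-year test scores
prior2 = 0.8 * prior1 + rng.normal(0, 30, teacher.size)
current = 0.5 * prior1 + 0.3 * prior2 + effect[teacher] + rng.normal(0, 10, teacher.size)

# Regress current scores on the priors, keep what the priors cannot explain.
X = np.column_stack([np.ones(teacher.size), prior1, prior2])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
residual = current - X @ beta

# A teacher's estimate is the mean residual of his or her students.
estimate = np.array([residual[teacher == t].mean() for t in range(n_teachers)])
r = float(np.corrcoef(effect, estimate)[0, 1])
print(round(r, 2))  # recovers the simulated teacher effects well
```

Real "complex" models go much further, as the report notes: handling missing scores, test measurement error, and shared instructional responsibility, and attaching confidence measures to each estimate.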

The report identifies 7 key considerations in the selection of a growth model. These are: (1) Intended uses; (2) Inputs for analysis; (3) Measurement error and uncertainty; (4) Results and outputs; (5) Communications, training and support; (6) Experience, expertise and capacity; and (7) Costs. These should not be ignored by local districts creating a teacher appraisal system or by the state in adopting and creating a new teacher appraisal and development system.


The SAS EVAAS© methodology and its reliability have been reviewed by a number of independent organizations:

• The SAS EVAAS© for K-12 student projection methodology was reviewed by the Government Accountability Office (GAO) in 2006 and is referenced in its July report to the House Education Committee, which is available on the official GAO website.
• Four US Department of Education peer review committees have approved the reliability of SAS' student projection methodology. With four prior scores, the multiple correlation is higher 3 years in advance than the simple correlation between adjacent years.
• The National Governors Association included SAS EVAAS© in its 2003 Data Toolkit.
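The multiple-correlation claim in the second bullet can be illustrated with simulated data. This is my own toy construction, not SAS' analysis: it shows only the general statistical point that a prediction built from several prior scores correlates more strongly with an outcome than any single adjacent-year score does.

```python
import numpy as np

# Toy illustration (not SAS' analysis): predicting a score from multiple
# prior years beats the simple correlation between two adjacent years.
rng = np.random.default_rng(1)

n = 5000
ability = rng.normal(0, 1, n)                               # stable latent achievement
years = [ability + rng.normal(0, 1, n) for _ in range(5)]   # 5 noisy annual scores
target, priors = years[-1], np.column_stack(years[:4])

simple_r = np.corrcoef(years[-2], target)[0, 1]  # adjacent-year correlation

# Multiple correlation: correlation of the least-squares prediction
# from four prior scores with the target year's score.
X = np.column_stack([np.ones(n), priors])
beta, *_ = np.linalg.lstsq(X, target, rcond=None)
multiple_r = np.corrcoef(X @ beta, target)[0, 1]

print(round(simple_r, 2), round(multiple_r, 2))
```

Averaging over several noisy prior measurements cancels much of the year-specific noise, which is why the combined prediction tracks the outcome more closely.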

Researchers at the RAND Corporation have examined the statistical issues surrounding value-added models such as SAS EVAAS© in several published papers, including:

• On analytic issues in value-added modeling: McCaffrey, D. F., Han, B. and Lockwood, J. R. (2008). "Value-Added Models: Analytic Issues." A paper prepared for the National Research Council and the National Academy of Education, Board on Testing and Accountability Workshop on Value-Added Modeling, Nov. 13-14, 2008, Washington D.C.
• On controlling for differences among students: Lockwood, J. R. and McCaffrey, D. F. (2007). "Controlling for Individual Heterogeneity in Longitudinal Models, with Applications to Student Achievement." Electronic Journal of Statistics, Vol. 1, 223-252.
• On using value-added results in teacher pay decisions: McCaffrey, D. F., Han, B. and Lockwood, J. R. (2008). "From Data to Bonuses: A Case Study of the Issues Related to Awarding Teachers Pay on the Basis of the Students' Progress." A paper presented at the conference on Performance Incentives: Their Growing Impact on American K-12 Education, Feb. 28-29, 2008, National Center on Performance Incentives at Vanderbilt University.

Many states and school districts have incorporated value-added data into policy decisions. For example, Louisiana uses value-added data to measure the effectiveness of teacher preparation programs. Ohio, North Carolina and Tennessee specifically use SAS EVAAS© in their states' accountability systems. Tennessee has also passed policy to include value-added in teacher appraisals.

Four of the largest districts in Texas are using SAS EVAAS©. Several districts, including Ft. Worth ISD, Lubbock ISD, Austin ISD, Northside ISD, Longview ISD and districts participating in the TAP Program, use SAS EVAAS© for a variety of purposes.
