
PLANNING FOR EFFECTIVE ASSESSMENT

Module developed by Lenore Adie
Overview
 Planning for effective assessment
 Backward mapping
 Designing quality assessment tasks
 Types of assessment
 Writing criteria sheets

Planning principles
Assessment for learning should be part of effective planning of teaching and learning.

A teacher's planning should provide opportunities for both learner and teacher to obtain and use information about progress towards learning goals. It also has to be flexible to respond to initial and emerging ideas and skills. Planning should include strategies to ensure that learners understand the goals they are pursuing and the criteria that will be applied in assessing their work. How learners will receive feedback, how they will take part in assessing their learning and how they will be helped to make further progress should also be planned.
More planning guidelines
In addition, teachers need to produce plans with:
 An emphasis on learning intentions and how these will be
shared with pupils
 Assessment criteria for feedback and marking
 Peer and self-assessment
 Differentiated classroom groups
 Built-in review time
 Flexibility to meet learning needs
 Notes on pupils who need additional or consolidation work
 Time for guided group sessions and explicit formative
assessment opportunities
 Adjustments highlighted or crossed out: what did or did not
work and why.
Backward mapping
 Planning to ensure that all students have the skills to successfully complete the task/learning experience (Wiggins & McTighe, 1998, p. 9)
 What do you value?
 What do you want students to demonstrate as a result of their learning?
 How are you going to get them there?
 How is assessment going to be incorporated into your pedagogy?
ASSESSMENT TASK

[Diagram: backward mapping from the assessment task through three planning strands]
 Learning experiences: pedagogies, thinking skills, key questions, etc. for each learning experience
 Multiliteracies: explicate all of the (multi)literacy demands in each learning experience. Think about: technical language, semiotics, digital, numeracy, oral, aural, kinaesthetic, information, etc.
 Formative assessment strategies: at key junctures you need to implement some AfL strategies so students and teacher can monitor progress towards learning goals, e.g. work in progress, diagnostic tests, self/peer reports, feedback checklists, teacher-student interviews, open questions, etc.
Unit/lesson planning - AfL
 What do I want my students to learn? (What
are the learning intentions?)
 How will I – and they – know that they have
met the learning intention? (includes
success criteria)
 What classroom activities will help my
students to meet the learning intention?
(backward planning)
 How can I build in opportunities for the
students to receive feedback about their
progress towards the learning intention?
 What opportunities can I provide for
students to evaluate their own progress and
act on feedback?
Planning assessment tasks
 Identify the outcomes/ learning intentions that you will be
addressing.
 What will the students know and be able to do?
 What evidence needs to be collected for students to
demonstrate proficiency?
 What will students be doing?
 What assessment strategies/techniques would be suitable for students to demonstrate this understanding?
Design the assessment task
 What will the final product look like?
 Who will be the audience?
 What student needs should be considered?
 What forms/types of assessment have the students already been involved in this year?
 How long will the students have to complete this task?
 Design an assessment task sheet in student-appropriate language that is easy to read and visually appealing
 Include task title, purpose, overview and relevance, learning intention, success criteria, instructions, due date (see the sketch after this list)
 Consider any support materials you need to provide, for example, to assist students to keep track of each aspect of the task
http://education.qld.gov.au/qcar/pdfs/placemat.pdf
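As a rough illustration only, the task sheet elements listed above could be gathered into a simple structure while drafting. The field names and values below are hypothetical, not a prescribed QCAR format.

```python
# Illustrative sketch only: field names and values are hypothetical examples,
# not a prescribed task sheet format.
task_sheet = {
    "title": "Investigating friction",                    # hypothetical task title
    "purpose_and_relevance": "Why the task matters to students",
    "learning_intention": "What students are expected to learn",
    "success_criteria": ["Criterion 1", "Criterion 2"],   # what counts as meeting the intention
    "instructions": ["Step 1", "Step 2"],
    "due_date": "Week 8",
    "support_materials": ["Planning checklist"],          # helps students track each aspect of the task
}

# A quick completeness check before handing the sheet to students.
required = ["title", "purpose_and_relevance", "learning_intention",
            "success_criteria", "instructions", "due_date"]
missing = [field for field in required if not task_sheet.get(field)]
print("Missing fields:", missing or "none")
```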
Examples of Authentic Assessment Practices
Collecting evidence, for example:
 e-portfolios
 video portfolios
 photographic evidence
 tests
 work samples
 interviews
Task sheet example

http://education.qld.gov.au/schools/linkup2006/docs/session3/spacestudtasksheetp234.doc
Activity
 Generate some ideas for a task sheet for an assessment that could be conducted in your teaching area
 Consider the skills that the students may need to develop to complete this task, and start to backward map, working towards an outline of a unit plan
Writing guides to making judgements
 Guides to making judgements: matrix and continua
 Elements of guides to making judgements
 Writing criteria and standards
 Basic Rubric/Matrix
 The Grading Master/Continua
Guides to making judgements
 Rubrics or matrices
 used to measure student performance against a pre-determined set of criteria
 Grading masters/continua/poles
 similar to a rubric, but contain the necessary features to support the nature of complex, multifaceted tasks that assess multiple knowledges, understandings, skills and dispositions (Matters, 2005)
Elements of Guides to Making Judgements
 Criteria/assessable element/success criteria (what is being assessed)
 Clear and explicit for the types of evidence students will produce
 Levels of performance/achievement/standard
 For example, A–E
 Standards descriptors
 Sufficient and clear enough to provide advice to students (and other assessors) for making judgements
 Describe the expected quality of performance

Remember to consider the audience of the criteria sheet.
Writing criteria and standards
Criteria
 A property, quality, characteristic or attribute of a student response (Sadler, 2008, p. 2)
 Are directly related to the curriculum intent
 Reflect what is valued by the assessment task
 Need to be sufficiently different from, and independent of, each other (Education Queensland, 2004)
 Beware of over-specifying criteria (breaking them down into smaller and smaller parts)
 Results in a distorted judgement that fails to capture the complexity of the response and the process of judgement formation
 Often important aspects of the work are overlooked
 Interrelationships between the criteria are lost as the focus shifts to a specific aspect
 Too many criteria will become unmanageable

Standards
 A particular degree or level of quality (Sadler, 2008, p. 2)
 Are sufficient in number to differentiate between performances
 Standards descriptors involve two variables – element and degree
 Standards descriptors are written in positive terms (Education Queensland, 2004)
Developing task-specific descriptors

Quality word(s) + assessable elements in the particular task

For example:
 Insightful application of science procedures to plan and conduct investigations
 Clear and accurate communication using illustrations, representations and terminology
 Perceptive reflection on science investigations, values, perspectives and learning

Consider and discuss the effectiveness of each of the ‘quality’ words used in this example.
http://www.qsa.qld.edu.au/assessment/3166.html
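A minimal sketch of the "quality word(s) + assessable element" pattern above: the quality words echo the A–E lists on the next slide, and the assessable element is borrowed from the first example, but the pairing itself is only a drafting aid, not a QSA procedure.

```python
# Sketch: drafting standards descriptors as quality word + assessable element.
# The words follow the A-E lists shown on the next slide; the element is borrowed
# from the example above. Treat the output as draft wording to refine.
quality_words = {
    "A": "Insightful",
    "B": "Thoughtful",
    "C": "Suitable",
    "D": "Variable",
    "E": "Minimal",
}
assessable_element = "application of science procedures to plan and conduct investigations"

for level, word in quality_words.items():
    print(f"{level}: {word} {assessable_element}")
```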
Descriptors - qualities
This is an example from the Queensland Studies Authority of the types of words that could be used to describe quality in student work.

A: Comprehensive, Insightful, Proficient, Discerning, Well-reasoned, Clear, Perceptive, Controlled, Skilful, Accurate, Significant, Well-justified
B: Thorough, Thoughtful, Logical, Coherent, Effective, Purposeful, Informed, Accurate, Proficient
C: Satisfactory, Suitable, Competent, Relevant, Credible, Sound, Appropriate, Functional
D: Narrow, Variable, Disjointed, Superficial, Vague
E: Rudimentary, Minimal, Unclear, Cursory

http://www.qsa.qld.edu.au/assessment/3166.html
Bloom’s taxonomy verbs
Bloom’s taxonomy provides words that help to distinguish the quality of a response.

Knowledge: Count, Define, Describe, Draw, Find, Identify, Label, List, Match, Name, Quote, Recall, Recite, Sequence, Tell, Write

Comprehension: Conclude, Demonstrate, Discuss, Explain, Generalize, Identify, Illustrate, Interpret, Paraphrase, Predict, Report, Restate, Review, Summarize, Tell

Application: Apply, Change, Choose, Compute, Dramatize, Interview, Prepare, Produce, Role-play, Select, Show, Transfer, Use

Analysis: Analyze, Characterize, Classify, Compare, Contrast, Debate, Deduce, Diagram, Differentiate, Discriminate, Distinguish, Examine, Outline, Relate, Research, Separate

Synthesis: Compose, Construct, Create, Design, Develop, Integrate, Invent, Make, Organize, Perform, Plan, Produce, Propose, Rewrite

Evaluation: Appraise, Argue, Assess, Choose, Conclude, Critique, Decide, Evaluate, Judge, Justify, Predict, Prioritize, Prove, Rank, Rate, Select

http://www.teach-nology.com/worksheets/time_savers/bloom/
Basic Rubric/Matrix

[Diagram of a basic rubric/matrix: criteria down one axis, levels of achievement from best to worst across the other, with standards descriptors in the cells]
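One way to picture the matrix above is as a grid of criteria by levels of achievement, with a standards descriptor in each cell. The sketch below is only an illustration; the criterion and descriptors are placeholder wording built from the quality-word examples earlier, not an actual QSA rubric.

```python
# Illustrative sketch of a basic rubric/matrix: criteria down one axis,
# levels of achievement (best to worst) across the other, and a standards
# descriptor in each cell. Criterion and descriptors are placeholders.
rubric = {
    "Communication of findings": {
        "A": "Clear and accurate communication using illustrations, representations and terminology",
        "B": "Effective communication using illustrations and terminology",
        "C": "Appropriate communication using some terminology",
        "D": "Variable communication with limited use of terminology",
        "E": "Unclear communication",
    },
}

def standards_descriptor(criterion: str, level: str) -> str:
    """Return the descriptor used to judge work on a criterion at a given level."""
    return rubric[criterion][level]

print(standards_descriptor("Communication of findings", "C"))
```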
Some things to consider about rubrics

 Rubrics help ensure that assessment is criteria-based and that standards are transparent and explicit.
 The format is familiar for most intended audiences.
 However, the simplicity of the matrix format can disguise real complexities in its design and use:
 The traditional matrix format requires that the number of significant and discernible differences used in judging quality be the same for all criteria → ‘manufacturing’ distinctions in quality where they may not exist → obfuscating standards, biasing grades and making discussion of standards more difficult
 Formats require that the ‘quantum’ of achievement between adjacent standards descriptors is also the same (or close to it), i.e. the gap between each level of achievement is the same → results are biased if standards descriptors do not have this quantum property

(Matters, 2005)

http://school.discoveryeducation.com/schrockguide/assess.html
This is an excellent guide to rubrics and is linked to several
The Grading Master/continua
Standards descriptors are positioned along a continuum (‘pole’)
 This allows the number of standards and their placements to vary for each criterion
 Judgements can be placed anywhere along the pole
 A student could have achieved all of one descriptor at, for example, a C level and some of the descriptors of a B level; the judgement would therefore be placed between a B and a C, depending on the evidence of the quality of the work
 In reporting on this student's achievement, the teacher would state that the child has achieved all of [the C standards descriptors] and some of [the B standards descriptors]
Features of a grading master
A grading master contains at least the following features (a rough sketch of one pole follows this list):
 Standards referents, which are located on each pole with the highest descriptor of the desirable features located in the top region
 The number of standards referents will vary depending on the task, and may vary on different poles
 The size of the dots on the poles is representative of the difficulty of making decisions about the quality of student work; the smaller the dot, the lower the order of skills being assessed
 To reach an overall judgement, trade-offs need to be made in terms of what the task was designed to capture
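As a rough sketch of the continuum idea (an assumption about representation, not the published grading master format), standards referents can be thought of as points at varying positions along each pole, with a judgement recorded anywhere between them, as in the B/C example on the previous slide.

```python
# Illustrative sketch of one pole of a grading master: standards referents sit at
# varying positions (0.0 = bottom, 1.0 = top), and a judgement may be placed
# anywhere along the pole. Criterion wording and positions are placeholders.
pole = {
    "criterion": "Planning and conducting investigations",
    "referents": [
        (0.10, "E: Minimal use of procedures"),
        (0.30, "D: Variable use of procedures"),
        (0.50, "C: Suitable application of procedures"),
        (0.75, "B: Thoughtful application of procedures"),
        (0.95, "A: Insightful application of procedures"),
    ],
}

# Evidence shows all of the C descriptor and some of the B descriptor,
# so the judgement sits between the C and B referents.
judgement = 0.6

below = max((pos, text) for pos, text in pole["referents"] if pos <= judgement)
above = min((pos, text) for pos, text in pole["referents"] if pos > judgement)
print(f"Achieved all of '{below[1]}' and some of '{above[1]}'")
```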
[Annotated example of a grading master, showing: the desirable features of the assessment response; the aspects of the task being assessed; a ‘brick wall’ (students must achieve all of these criteria to receive a grade above the wall); a non-discountable pole; and expanders and clarifiers]

[Another example of a grading master, with no clear ‘B’ standard and two non-discountable poles. From QSA Assessment Bank, Science, Years 6 & 7, The force of friction]

Activity

Using the assessment task sheet that you have designed, start to develop a grading master for this task.
You will need to consider the criteria that you are assessing and the standards descriptors to define these.
Use the Bloom’s taxonomy verbs to assist you in differentiating the levels of achievement.
References
 Bloom’s taxonomy verbs. (2007). Retrieved February 04, 2009 from
http://www.teach-nology.com/worksheets/time_savers/bloom/
 Glasson, T. (2009). Improving student achievement: a practical guide to assessment for
learning. Carlton, Australia: Curriculum Corporation. (See Chapter 1 Learning Intentions)
 Kathy Schrock’s Guide for Educators. (2008). Teacher helpers: Assessment and rubric
information. Retrieved February 12, 2009 from
http://school.discoveryeducation.com/schrockguide/assess.html
 Klenowski, V. (2006). Evaluation report of the pilot of the 2005 Queensland Assessment
Task (QAT). Retrieved from http://www.qsa.qld.edu.au/research/reports.html.
 Matters, G. (2005). The grading master: A simpler way. EQ Australia(2), 12-15.
 Queensland. Department of Education, Training and the Arts. (2004). Growing an
assessment culture. Issue No. 4. Queensland Government: AccessEd.
http://education.qld.gov.au/curriculum/assessment/pdfs/gac4in.pdf
 Queensland Studies Authority (2009). Assessment Bank. Retrieved January 22, 2009 from
http://www.qsa.qld.edu.au/assessment/3162.html
 Sadler, D. R. (2007). Perils in the meticulous specification of goals and assessment criteria. Assessment in Education, 14(3), 387-392.
 Sadler, D. R. (2008). Indeterminacy in the use of preset criteria for assessment and
grading [Electronic Version]. Assessment and Evaluation in Higher Education. Retrieved
May 02, 2008 from http://dx.doi.org/10.1080/02602930801956059
 Wiggins, G. & McTighe, J. (1998). Understanding by design. Upper Saddle River, NJ:
Prentice Hall. (See chapter 1 What is backward design?)
More resources
 The Jon Mueller Authentic Assessment Toolbox website provides information on constructing authentic tasks, designing rubrics for marking and using portfolios:
http://jonathan.mueller.faculty.noctrl.edu/to
