
Isaiah 41:10

“God walks with you.”

Assessment of Learning
DEFINITION OF TERMS
ASSESSMENT Process of gathering, describing or quantifying information about student performance.
It may include paper-and-pencil tests, extended responses and performance assessments;
the latter are usually referred to as "authentic assessment" tasks.
MEASUREMENT A process of obtaining a numerical description of the degree to which an individual
possesses a particular characteristic. It answers the question “How much?”
EVALUATION Process of examining the performance of students. It also determines whether or not the
students have met the lesson's instructional objectives.
TEST Instrument or systematic procedure designed to measure the quality, ability, skill or
knowledge of students by giving a set of questions in a uniform manner. It answers the
question “How does an individual student perform?”
TESTING Method used to measure the level of achievement or performance of the learners. It also
refers to the administration, scoring and interpretation of an instrument designed to
elicit information about performance in a sample of a particular area of behavior.
TYPES OF MEASUREMENT

NORM-REFERENCED
- Designed to measure the performance of a student compared with other students.
- Each individual is compared with other examinees and assigned a score, usually expressed as a percentile, a grade-equivalent score or a stanine. The achievement of the student is reported for broad skill areas, although some norm-referenced tests do report student achievement for individual skills.
- The purpose is to rank each student with respect to the achievement of others in broad areas of knowledge and to discriminate between high and low achievers.

CRITERION-REFERENCED
- Designed to measure the performance of students with respect to some particular criterion or standard.
- Each individual is compared with a pre-determined set of standards for acceptable achievement. The performance of the other examinees is irrelevant. A student's score is usually expressed as a percentage, and student achievement is reported for individual skills.
- The purpose is to determine whether each student has achieved specific skills or concepts and to find out how much students know before instruction begins and after it has finished.
- AKA objective-referenced, domain-referenced, content-referenced and universe-referenced.

DIFFERENCES
- A norm-referenced test typically covers a large domain of learning tasks, with just a few items measuring each specific task; a criterion-referenced test typically focuses on a delimited domain of learning tasks, with a relatively large number of items measuring each specific task.
- A norm-referenced test emphasizes discrimination among individuals in terms of their relative level of learning; a criterion-referenced test emphasizes what individuals can and cannot perform.
- A norm-referenced test favors items of average difficulty and typically omits very easy and very hard items; a criterion-referenced test matches item difficulty to the learning tasks, without altering item difficulty or omitting easy or hard items.
- Interpretation of a norm-referenced test requires a clearly defined group; interpretation of a criterion-referenced test requires a clearly defined and delimited achievement domain.
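To make the contrast concrete, the short sketch below (Python, standard library only; the scores, norm-group data and function names are illustrative and not taken from this handout) reports the same raw score in two ways: as a percentile rank relative to a norm group, and as a percentage of the total items against a fixed criterion.

    from bisect import bisect_left, bisect_right

    def percentile_rank(score, norm_group):
        """Norm-referenced view: position of the score relative to other examinees."""
        ordered = sorted(norm_group)
        below = bisect_left(ordered, score)           # examinees scoring lower
        tied = bisect_right(ordered, score) - below   # examinees with the same score
        return 100.0 * (below + 0.5 * tied) / len(ordered)

    def criterion_percentage(score, total_items):
        """Criterion-referenced view: score against a fixed standard; other examinees are irrelevant."""
        return 100.0 * score / total_items

    # Hypothetical data: a 40-item test, one student answers 30 items correctly.
    norm_group = [18, 22, 25, 27, 28, 30, 31, 33, 35, 38]   # scores of other examinees
    print(percentile_rank(30, norm_group))        # interpretation depends on the group
    print(criterion_percentage(30, 40))           # 75.0; interpretation depends on the cut-off

Note that the percentile rank changes whenever the norm group changes, while the criterion percentage does not, which mirrors the distinction drawn in the list above.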
TYPES OF ASSESSMENT

Placement Assessment
- Concerned with the entry performance of the student.
- Its purpose is to determine the prerequisite skills, the degree of mastery of the course objectives and the best mode of learning.
- Purposes:
1. Determine the level of competence of the students
2. Identify students who already have knowledge about the lesson
3. Determine the causes of learning problems and formulate a plan for remedial action

Diagnostic Assessment
- Type of assessment given before instruction.
- Aims to identify the strengths and weaknesses of the students regarding the topics to be discussed.

Formative Assessment
- Used to monitor the learning progress of the students during or after instruction.
- Purposes:
1. Provide immediate feedback to both student and teacher regarding the successes and failures of learning
2. Identify the learning errors that are in need of correction
3. Provide information to the teacher for modifying instruction to improve learning

Summative Assessment
- Usually given at the end of a course or unit.
- Purposes:
1. Determine the extent to which the instructional objectives have been met
2. Certify student mastery of the intended outcomes; used for assigning grades
3. Provide information for judging the appropriateness of the instructional objectives
4. Determine the effectiveness of instruction
MODES OF ASSESSMENT

Traditional Assessment
- Assessment in which students typically select answers or recall information to complete the assessment. Tests may be standardized or teacher-made, and may be multiple-choice, fill-in-the-blanks or matching type.
- Indirect measures of assessment, since the test items are designed to represent competence by extracting knowledge and skills from their real-life context.
- Items on standardized instruments tend to test only the domain of knowledge and skill, to avoid ambiguity for the test takers.
- One-time measures that rely on a single correct answer to each item. There is limited potential for traditional tests to measure higher-order thinking skills.

Performance Assessment
- Assessment in which students are asked to perform real-world tasks that demonstrate meaningful application of essential knowledge and skills.
- Direct measures of student performance, because the tasks are designed to incorporate contexts, problems and solution strategies that students will use in real life.
- Designed as ill-structured challenges, since the goal is to help students prepare for the complex ambiguities of life.
- Focuses on processes and rationales. There is no single correct answer; instead, students are led to craft polished, thorough and justifiable responses, performances and products.
- Involves long-range projects, exhibits and performances that are linked to the curriculum.
- The teacher is an important collaborator in creating tasks, as well as in developing guidelines for scoring and interpretation.

Portfolio Assessment
- A collection of a student's works, specifically selected to tell a particular story about the student.
- A portfolio is not a pile of student work that accumulates over a semester or year; it contains a purposefully selected subset of student work.
- Measures the growth and development of students.
FACTORS TO CONSIDER: GOOD TEST ITEM

VALIDITY
Degree to which the test measures what it intends to measure. It is the usefulness of the test for a given purpose. A valid test is always reliable.

RELIABILITY
Consistency of the scores obtained by the same person when retested using the same instrument or one that is parallel to it.

ADMINISTRABILITY
The test should be administered uniformly to all students so that the scores obtained will not vary due to factors other than differences in the students' knowledge and skills. There should be clear provision of instructions for the students, the proctors and the scorer.

SCORABILITY
The test should be easy to score. Directions for scoring should be clear. The test developer should provide the answer sheet and the answer key.

APPROPRIATENESS
Mandates that the test items the teacher constructs must assess the exact performances called for in the learning objectives. The test items should require the same performance of the student as specified in the learning objectives.

ADEQUACY
The test should contain a wide sampling of items to determine the educational outcomes or abilities, so that the resulting scores are representative of the total performance in the areas measured.

FAIRNESS
Mandates that the test should not be biased against the examinees. It should not be offensive to any examinee subgroup. A test can only be good if it is also fair to all test takers.

OBJECTIVITY
Represents the agreement of two or more raters or test administrators concerning the score of a student. If two raters who assess the same student on the same test cannot agree on the score, the test lacks objectivity and neither judge's score is valid; thus, a lack of objectivity reduces test validity in the same way that a lack of reliability influences validity.
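As a rough numerical illustration of reliability and objectivity (not a procedure prescribed by this handout; all scores and names below are hypothetical), the sketch estimates test-retest reliability as the Pearson correlation between two administrations of the same test, and objectivity as the proportion of students on whose scores two raters agree exactly.

    def pearson(x, y):
        """Test-retest reliability: correlation between first and second administrations."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    def exact_agreement(rater1, rater2):
        """Objectivity: proportion of students given the same score by both raters."""
        same = sum(1 for a, b in zip(rater1, rater2) if a == b)
        return same / len(rater1)

    # Hypothetical scores for five students.
    first_try  = [30, 25, 38, 22, 35]
    second_try = [32, 24, 37, 23, 36]
    rater_a    = [18, 15, 20, 12, 17]
    rater_b    = [18, 14, 20, 12, 17]

    print(round(pearson(first_try, second_try), 2))   # near 1.0 means consistent scores
    print(exact_agreement(rater_a, rater_b))          # 0.8: raters agree on 4 of 5 scores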