
AUTHENTIC AND PERFORMANCE ASSESSMENT

IN THE CLASSROOM
Course Instructor

Dr. Arturo Olivárez, Jr., Professor


Education Building Office # 801D
(915) 747-5341
aolivarez3@utep.edu
Office Hours: W 1:00-5:00 p.m. and Th 2:00-5:00 p.m., or by appointment

Course Information

TED 5303 (3 Credit Hours)


Fall Semester 2008
Thursday (5:00 - 7:50 pm)
Liberal Arts Building (Rm. 105)

Course Description

The course is an introduction to the nature of authentic and performance assessment practices in the
constructivist classroom, including the use of instruments such as rubrics, portfolios, and individual and
group assessment-related projects. More specifically, the course emphasizes the importance of authentic
assessment of student learning within the classroom context. We will examine topics such as the
historical roots of assessment, uses of evaluation tests, ethical considerations, and the technical and
methodological principles involved in developing and evaluating assessment materials.

Course Purpose

This is an introductory course focusing on basic assessment concepts, including the construction of
measures of cognitive achievement and ability typical of educational settings. Topics include test
planning, item writing, test tryout, item analysis, reliability, validity, criterion-referencing,
norm-referencing, item banking, test equating, and item bias. Students write items, critique items written
by others, construct tests, try out and revise tests, and develop test manuals to document the process of
test development and the quality of their tests.

Required Reading and Other Instructional Resources

Nitko, A. J., & Brookhart, S. M. (2007). Educational assessment of students (5th ed.). Upper Saddle
River, NJ: Prentice Hall/Merrill Education.

Optional reading textbook

Marzano, R. J., Pickering, D., & McTighe, J. (1993). Assessing student outcomes: Performance
assessment using the dimensions of a learning model. Alexandria, VA: Association for Supervision
and Curriculum Development.

Optional research documents will be made available throughout the semester as appropriate.

Student Learning Outcomes

The course’s objectives will require the student to acquire and build upon several skills that will enable
the individual to attain a degree of mastery and competence by the end of the instructional period. To
that end, the course will emphasize the evaluation of mastered material by delineating targeted
performance outcomes and their respective assessments. The following table lists the most relevant
student learning outcomes for the course.

Table 1. Student learning outcomes and assessments

By the end of the course, the student will be able to do the following. To evaluate these outcomes, the
faculty member will use the assessment procedures listed with each outcome.

1. Understand and apply basic theories and practices of assessment in the classroom, including the
   validity and reliability of assessment results.
   Assessed by: (a) course graded assignments, (b) quizzes, and (c) unit exams.

2. Understand, craft, and use a variety of classroom assessment techniques, including true-false items,
   multiple-choice items, essay assessment tasks, higher-order thinking, problem solving, and critical
   thinking.
   Assessed by: (a) course graded assignments, (b) quizzes, and (c) unit exams.

3. Demonstrate understanding of performance, portfolio, and authentic assessment typically found in
   classrooms, including rating scales and scoring rubrics.
   Assessed by: (a) course graded assignments, (b) quizzes, and (c) unit exams.

4. Compare, contrast, and apply fundamental concepts in formative evaluation, using diagnostic
   assessments to evaluate student progress.
   Assessed by: (a) course graded assignments, (b) quizzes, and (c) unit exams.

5. Understand and apply standardized tests and their interpretation; use scholastic aptitude, career
   interest, attitude, and personality tests in the classroom.
   Assessed by: (a) course graded assignments, (b) quizzes, (c) the final exam, and (d) the individual
   class project.

Assignments, Evaluation Procedures, and Grading Policy

Chapter exercises: These exercises, in the form of discussion, application, and short-answer questions,
will be assigned periodically from each chapter as homework and for class discussion.

Exams: There will be four major in-class unit exams. These exams will be based on assigned
textbook readings and class discussions/activities. (10 points per exam; 40 possible exam points)
Individual course project: Each person will be involved in developing or crafting an assessment
tool. You may choose which type of assessment you would like to work on during the semester.
See handouts # 1 and # 2.

This assignment is designed to foster your learning by connecting the concepts learned in class with your
experiences as a classroom teacher, having you think practically about and use assessment concepts, and
to help you focus on an area of research that could lead to actual data collection and/or a master’s thesis.
I advise you to follow the deadlines (to be given later).

Students who wish to make a fifteen-minute oral presentation on their individual project may do so for
extra credit.

Class Evaluation

Chapter exercises      19 @ 1%       10%
Individual project      1 @ 30%      30%
Unit exams              4 @ 15%      60%
Total                                100%
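
As an illustration only, the weights in the table above can be combined into a final course percentage as
follows. This is a minimal sketch in Python; the component names and sample scores are hypothetical.

```python
# Sketch of how the course components combine into a final percentage.
# Weights follow the Class Evaluation table above; sample scores are hypothetical.

WEIGHTS = {
    "chapter_exercises": 0.10,   # chapter exercises, 10% of the course grade
    "individual_project": 0.30,  # individual project, 30%
    "unit_exams": 0.60,          # unit exams, 60%
}

def final_percentage(component_scores):
    """Each score is the percent earned (0-100) on that component."""
    return sum(WEIGHTS[name] * score for name, score in component_scores.items())

# Example: 90% on exercises, 85% on the project, 88% average across the exams.
print(round(final_percentage({"chapter_exercises": 90,
                              "individual_project": 85,
                              "unit_exams": 88}), 1))  # -> 87.3
```
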
Grade Assignment

90.0-93.9  A-     94.0-96.9  A     97.0-99.9  A+
80.0-83.9  B-     84.0-86.9  B     87.0-89.9  B+
70.0-73.9  C-     74.0-76.9  C     77.0-79.9  C+
60.0-63.9  D-     64.0-66.9  D     67.0-69.9  D+
Below 60   F
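
A minimal sketch, in the same spirit, of the letter-grade mapping in the table above (assuming final
scores on a 0-100 scale):

```python
def letter_grade(score):
    """Map a final course percentage to a letter grade per the table above."""
    if score < 60:
        return "F"
    bands = [(97.0, "A+"), (94.0, "A"), (90.0, "A-"),
             (87.0, "B+"), (84.0, "B"), (80.0, "B-"),
             (77.0, "C+"), (74.0, "C"), (70.0, "C-"),
             (67.0, "D+"), (64.0, "D"), (60.0, "D-")]
    for cutoff, letter in bands:
        if score >= cutoff:
            return letter

print(letter_grade(87.3))  # -> "B+"
```
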

Course Instructional Methods

The instructional methods for delivering the course material will focus on the following didactic
processes and procedures:
1. Introduction and exposition of new material via instructor-led presentation, using MS PowerPoint
   syntheses of significant chapter content.
2. Instructor-led illustration of chapter material using educational and data-driven problems
   from textbook exercises or other relevant sources.
3. Student-led solution of similar chapter exercises or problems, with opportunities to
   work individually and/or collaborate in groups.
4. Student-led question-and-answer sessions.
5. Instructor-led summary and discussion of presented chapter material or evaluation of the material
   taught.

Class Policies / Statements

Course Expectations

Class participation. You are expected to attend class and participate in discussions and activities.
Although no points are derived from class participation, it may become important in assigning
borderline grades (not to exceed 1 percentage point).
Workload Policy. This is a 3-credit course, which entails at least 3 contact hours per week plus a
minimum of 6 additional hours of work outside of class per week for the student to receive an average
grade in the course. Assignments for some chapters in the textbook may be finished within a couple of
hours; however, the material in the later chapters increases in difficulty, and the time needed to tackle
some of the assigned exercises increases accordingly. Please make appropriate adjustments!

Course Preparation. Students are expected to come prepared to each class meeting. The student is
expected to (1) read the assigned chapter content and material and (2) complete any class or homework
assignments.

Attendance. Due to the nature of the course and the amount of material to be covered, attendance in this
class is mandatory and will greatly influence your overall performance in the course. Please inform the
instructor of an impending absence in advance. Class attendance plays an important role in expressing
your commitment and professionalism, and it is a critical factor in your successful completion of the
course.

Civility in the classroom

Students are expected to help maintain a classroom environment that is conducive to learning. To
ensure that all students have an opportunity to gain the most from time spent in class, and unless
otherwise approved by the instructor, students are prohibited from using cellular phones or beepers,
eating or drinking in class, making offensive remarks, surfing the internet (for those with laptops),
reading newspapers, sleeping, or engaging in any other form of distraction. Ad hominem remarks or
disparaging comments about gender, ethnicity, religion, and so on will not be tolerated. Inappropriate
behavior in the classroom will result in, at a minimum, a request to leave the class.

Americans with Disabilities Act

The university is committed to the principle that in no aspect of its programs shall there be differences
in the treatment of persons because of race, creed, national origin, age, sex, or disability, and that equal
opportunity and access to facilities shall be available to all. If you require special accommodations in
order to participate, please contact me as soon as possible so that the necessary accommodations can be
made. The student should present appropriate verification from the UTEP Office of Compliance. There
is no requirement that accommodations be made prior to completion of this approved university process.

Honor Code

For courses in which students are assigned to generate reports, literature reviews, and research projects, I
take our standards of professional ethics seriously, as I expect all members of the academic community to
do. Any form of cheating or plagiarism will result in a failing grade for this course. Any paper or class
assignment you submit for this class must not have been submitted for any other class. Resubmission of
any paper in this class will result in an "F" grade for that paper. No written work may be submitted for
academic credit more than once. If you have any questions about how this may apply to a paper you are
considering for this class, please ask. You may work in teams, but the work that is turned in for credit
must be entirely the fruit of your own effort and time.

Cell phones, Wireless Laptops and PDAs

Please turn off cell phones and beepers so as not to disturb others during class time. Please advise the
instructor when personal circumstances require different communication access. Laptops may be used to
capture lecture notes and materials, but not to access internet sites unrelated to the course lecture.

Final Word

1) Please be courteous to your classmates and instructor. See "Civility in the classroom" above.
2) I reserve the right to change procedures, readings, and topics as necessary, with ample warning.
3) If you must miss more than three class meetings, I advise you to take the course at another time.
4) Class attendance is expected and strongly encouraged. Contact the instructor if you have to be
   absent so that missed material, assignments, and exams can be made up in a timely manner.
Tentative Class Schedule, Fall Semester 2008

Aug. 28: Introduction, syllabus, and plan of course; Classroom decision making and using assessment
(Introduction). Text: Chapter 1

Sept. 3: Classroom decision making and using assessment (Continued); Describing the goals and
learning targets of instruction. Text: Chapters 2 and 3

Sept. 11: Validity of assessment results (Introduction and Conclusion); Reliability of assessment results
(Introduction). Text: Chapters 3, 4, and 5

Sept. 18: Reliability of assessment results (Conclusion); Review of previous material; Planning for
integrating assessment and instruction (Introduction). Text: Chapters 4, 5, and 6

Sept. 25: EXAM #1, Chapters 1-5 (two-hour exam); Planning for integrating assessment and instruction
(Conclusion); Completion, short-answer, and true-false items (Introduction). Text: Chapters 6 and 7

Oct. 2: Completion, short-answer, and true-false items (Conclusion); Multiple-choice and matching
exercises (Introduction). Text: Chapters 7 and 8

Oct. 9: Multiple-choice and matching exercises (Conclusion); Essay assessment tasks. Text: Chapters
8 and 9

Oct. 16: Essay assessment tasks (Overview); Higher-order thinking, problem solving, and critical
thinking. Text: Chapters 9 and 10

Oct. 23: Higher-order thinking, problem solving, and critical thinking (Continued); Review of previous
material; EXAM #2, Chapters 6-10. Text: Chapter 10

Oct. 30: Introduction to the next unit: performance, portfolio, and authentic assessment; How to craft
performance tasks, projects, portfolios, rating scales, and scoring rubrics. Text: Chapters 11 and 12

Nov. 6: Formative evaluation using informal diagnostic assessments. Text: Chapter 13

Nov. 13: Preparing your students to be assessed and using students' results to improve your assessments;
Evaluating and grading student progress. Text: Chapters 14 and 15

Nov. 20: Review of previous material; EXAM #3, Chapters 11-15; Standardized achievement tests;
Interpreting norm-referenced scores (Introduction). Text: Chapters 16 and 17

Nov. 27: THANKSGIVING BREAK (no class)

Dec. 4: Finding and evaluating published assessments; Scholastic aptitude, career interests, attitudes, and
personality tests. Text: Chapters 18 and 19

Dec. 11: EXAM #4, Chapters 16-19 (with selected chapter materials included in this test only)

Additional Bibliography

American Educational Research Association, American Psychological Association, & National Council
on Measurement in Education. (1999). Standards for educational and psychological testing.
Washington, DC: American Educational Research Association.

Bejar, I. I., Embretson, S., & Mayer, R. E. (1987). Cognitive psychology and the SAT: A review of
some implications. Princeton, NJ: Educational Testing Service.

Bloom, B. S., Hastings, J. T., & Madaus, G. F. (1971). Handbook on formative and summative
evaluation of student learning. New York: McGraw-Hill.

Bloom, B. S., Madaus, G. F., & Hastings, J. T. (1981). Evaluation to improve learning. New York:
McGraw-Hill.

Dunbar, S. B., Koretz, D. M., & Hoover, H. D. (1991). Quality control in the development and use of
performance assessments. Applied Measurement in Education, 4(4), 289-304.

Gagne, R. M., Briggs, L. J., & Wager, W. W. (1988). Principles of instructional design (3rd ed.). New
York: Holt, Rinehart and Winston. (Chapter 7, Defining Performance Objectives, and Chapter 13,
Assessing Student Performance)

Haladyna, T. M., & Downing, S. M. (1989). A taxonomy of multiple-choice item-writing rules. Applied
Measurement in Education, 2, 37-50.

Harnisch, D. L. (1983). Item response patterns: Applications for educational practice. Journal of
Educational Measurement, 20, 191-206.

Joint Committee on Testing Practices. (1988). Code of fair testing practices in education. Washington,
DC: National Council on Measurement in Education.

Kolen, M. J. (1988). Traditional equating methodology. Educational Measurement: Issues and Practice,
7(3), 29-36.

Lane, S. (1993). The conceptual framework for the development of a mathematics performance
assessment. Educational Measurement: Issues and Practice, 12(2), 16-23.

Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex performance-based assessments:
Expectations and validation criteria. Educational Researcher, 20(8), 15-21.

Mehrens, W. A. (1992). Using performance assessment for accountability purposes. Educational
Measurement: Issues and Practice, 11(1), 3-9.

Messick, S. (1989). Meaning and values in test validation. Educational Researcher, 18(2), 5-11.

Millman, J., & Greene, J. (1989). Test specification and development of tests of achievement and ability.
In R. L. Linn (Ed.), Educational measurement (3rd ed.). New York: Macmillan.
Millman, J., & Arter, J. A. (1984). Issues in item banking. Journal of Educational Measurement, 21(4),
315-330.

Nitko, A. J. (1995). A model for curriculum-based national examinations for decisions about minimum
learning competence, certification, and selection of students. In A. J. Nitko, Beyond Catchwords:
Congruence and Articulation in Curriculum, Distribution, and Assessment. Jakarta: Madecor
Career Systems in association with Pusat Pengembangan Agribisnis.

Quellmalz, E. S. (1991). Developing criteria for performance assessments: The missing link. Applied
Measurement in Education, 4(4), 319-331.

Scheuneman, J. D., & Bleistein, C. A. (1989). A consumer’s guide to statistics for identifying
differential item functioning. Applied Measurement in Education, 2, 255-275.

Shavelson, R. J., & Baxter, G. P. (1991). Performance assessment in science. Applied Measurement in
Education, 4(4), 319-331.

Stiggins, R. J. (1987). Design and development of performance assessments. Educational Measurement:
Issues and Practice, 6(3), 33-42.

Stiggins, R. J., Griswold, M. M., & Wikelund, K. R. (1989). Measuring thinking skills through classroom
assessment. Journal of Educational Measurement, 26, 233-246.

Wright, B. D., & Bell, S. R. (1984). Item banks: What, why, how. Journal of Educational Measurement,
21(4), 331-345.

Anastasi, A. (1988). Psychological testing (6th ed.). New York: Macmillan Publishing Co.

Allen, M. J., & Yen, W. M. (1979). Introduction to measurement theory. Monterey: Brooks/Cole.

Angoff, W. H. (1964). Technical problems of obtaining equivalent scores on tests. Educational and
Psychological Measurement, 1, 11-13.

Berk, R. A. (Ed.). (1982). Handbook of methods for detecting test bias. Baltimore, MD: Johns Hopkins
University Press.

Boring, E. G. (1923, June 6). Intelligence as the tests test it. The New Republic, pp. 35-37.

Campbell, D. T., & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-
multimethod matrix. Psychological Bulletin, 56, 81-105.

Cohen, R. J., Montague, P., Nathanson, L. S., & Swerdlik, M. E. (1988). Psychological testing: An
introduction to tests & measurement. Mountain View, CA: Mayfield Publishing Co.

Cohen, R. J., & Swerdlik, M. E. (1999). Psychological testing and assessment (4th ed.). Mountain View,
CA: Mayfield Publishing Company.

Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt,
Rinehart and Winston.

Cronbach, L. J., & Gleser, G. C. (1965). Psychological tests and personnel decisions (2nd ed.). Urbana:
University of Illinois Press.
Ebel, R. L. (1973). Evaluation and educational objectives. Journal of Educational Measurement, 10, 273-
279.

Gorsuch, R. L. (1997). Exploratory factor analysis: Its role in item analysis. Journal of Personality
Assessment. 68, 532-560.

Guilford, J. P. (1954a). Psychometric Methods. New York: McGraw-Hill.


Hambleton, R. K., & Novick, M. R. (1973). Toward an integration of theory and method for criterion-
referenced tests. Journal of Educational Measurement, 10, 159-170.

Kaufman, A. S., & Kaufman, N. L. (1990). Kaufman Brief Intelligence Test (K-BIT): Manual. Circle
Pines, MN: American Guidance Service.

Lord, F.M., & Novick, M. R. (1968). Statistical theories of mental test scores. Menlo Park, CA: Addison-
Wesley.

Messick, S. (1995). Validity of psychological assessment. American Psychologist, 50, 741-749.

Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York: McGraw-Hill.

Pedhazur, E. J., & Schmelkin, L. P. (1991). Measurement, design, and analysis: An integrated approach.
Hillsdale, NJ: Lawrence Erlbaum Associates.

Thorndike, R. L. (1982). Applied psychometrics. Boston: Houghton Mifflin Co.

Wainer, H., & Braun, H. I. (Eds.). (1988). Test validity. Hillsdale, NJ: Lawrence Erlbaum Associates,
Publishers.
Graduate-Level Performance Assessment Project

ACHIEVEMENT TEST PROJECT I: Performance Assessment Project


The Assignment. For this assignment you will craft six performance tasks and corresponding scoring rubrics. The
tasks do not have to represent one unit of a curriculum; however, they should all be in the same curriculum area (or
course) and for the same grade/age level of students. The tasks must be targeted to persons who are in junior high
school (age 13) or older. You do not have to administer the performance tasks, but you need to develop a plan for
tryout, revision, item analysis, and test analysis.

Tasks to be developed. You need to craft one task of each of the following structured, on-demand types, with a
scoring rubric for each.

• A task the scoring of which focuses on assessing a process.

• A closed-response, paper-and-pencil task the scoring of which focuses on assessing a product.

• An open-response, paper-and-pencil task the scoring of which focuses on assessing a product.

• A task requiring the student to use resources/equipment beyond paper-and-pencil.

• A group task the scoring of which focuses on assessing two or more group collaboration/cooperation
learning targets.

• A plan and scoring rubric for a best works portfolio

Procedure. Each performance task should be crafted following the process shown in Figure 12.2 of the textbook.
The portfolio should be crafted following the six steps described in the textbook.

The final project write-up. The final project should be typed and consistent with the following outline.

I. Introduction and rationale


A. Description of the content/curriculum area
B. Statement of the intended use for the set of tasks and their place in the overall assessment plan for the
curriculum area/ course

II. Population
A. Describe the characteristics of the persons for whom the set of tasks is developed.
B. Describe the learning conditions (e.g., instruction, prerequisite knowledge, type of course, etc.) that the
population for which the set of tasks is proposed should have experienced.

III. The tasks (Insofar as possible, present each task on a separate page in a manner similar to the presentation of
the “Decision-Making Task” in Chapter 11 of the Nitko & Brookhart text.)
A. Task title
B. Task itself
C. Content standard(s)
D. Complex reasoning standard(s)
E. Information processing standard(s)
F. Effective communication standard(s)
G. Habits of mind standard(s) -- optional
H. Collaborative/cooperative standard(s) -- optional
I. Estimate of the time needed to complete the task
IV. The scoring rubrics (Insofar as possible, keep the scoring rubrics for each task on one page and insert that
page immediately following the task to which it applies.)
A. Scoring rubrics for each standard specified in III above

V. The portfolio
A. Description of the portfolio
B. Instructions/directions to students
C. Scoring rubrics for the portfolio

VI. Development/crafting procedures


A. Describe how the set of tasks/rubrics was developed (e.g., how content and performances were
identified, persons/sources consulted, steps taken to have tasks/rubrics evaluated by others, checklists
from text that were used, editing procedures followed, etc.) You may do this for the set of tasks as a
whole instead of for each task separately.
B. Describe how the portfolio/rubrics were developed (use similar format as A above).

VII. Proposed technical analyses


A. Item Analyses (for the set) -- see the illustrative sketch following this outline
1. What difficulty index should be used?
2. How should it be applied to your scoring rubrics?
3. What discrimination index should be used?
4. How should it be applied to your scoring rubrics?
5. What computer program should be used?
B. Reliability analyses (for the set)
1. Stability coefficients to be used, rationale, and how to do it
2. Internal consistency coefficients to be used, rationale, and how to do it
3. Scorer reliability coefficients to be used, rationale, and how to do it
C. Validity analyses (for the set) -- What evidence should be gathered for each of the categories below, and
how would you go about gathering it? (Note: write no more than 500 words per category.)
1. content representativeness and relevance
2. thinking processes and skills represented
3. consistency with other (criterion) assessments
4. reliability and objectivity
5. fairness to different types of students
6. economy, efficiency, practicality, instruction features
7. multiple assessment usage

VIII. Summary statement—Describe what you have learned, problems encountered, insights obtained, etc. (Note:
this section should not exceed one typed page.)
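
To make the item-analysis and reliability questions in Section VII (parts A and B) more concrete, the following is
a minimal computational sketch, not a required procedure. It assumes hypothetical polytomous rubric scores
arranged as a students-by-tasks matrix, uses the mean proportion of maximum rubric points as a difficulty index, a
corrected item-total correlation as a discrimination index, and coefficient (Cronbach's) alpha as one internal-
consistency estimate; your own project should justify whichever indices you choose.

```python
import numpy as np

# Hypothetical rubric scores: rows = students, columns = performance tasks.
# Each task here is scored 0-4; real score ranges would come from your rubrics.
scores = np.array([
    [4, 3, 2, 4, 3, 2],
    [3, 3, 1, 2, 2, 1],
    [2, 1, 0, 1, 1, 0],
    [4, 4, 3, 4, 4, 3],
    [1, 2, 1, 2, 1, 1],
    [3, 2, 2, 3, 3, 2],
], dtype=float)
max_points = 4.0

# Difficulty index for polytomous tasks: mean score divided by maximum possible score.
difficulty = scores.mean(axis=0) / max_points

# Discrimination index: correlation of each task with the total of the remaining
# tasks (a corrected item-total, point-biserial-style correlation).
n_tasks = scores.shape[1]
discrimination = np.array([
    np.corrcoef(scores[:, j], scores.sum(axis=1) - scores[:, j])[0, 1]
    for j in range(n_tasks)
])

# Coefficient (Cronbach's) alpha as an internal-consistency estimate.
item_vars = scores.var(axis=0, ddof=1)
total_var = scores.sum(axis=1).var(ddof=1)
alpha = (n_tasks / (n_tasks - 1)) * (1 - item_vars.sum() / total_var)

print("Difficulty:", np.round(difficulty, 2))
print("Discrimination:", np.round(discrimination, 2))
print("Alpha:", round(alpha, 2))
```

Scorer reliability (Section VII.B.3) would additionally require a second rater's scores; with two raters, one could
correlate the raters' totals or report percent exact agreement per task.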
