Beyond Student Attitudes: Chemistry Self-Concept Inventory
for Assessment of the Affective Component of Student Learning

Christopher F. Bauer
Department of Chemistry, University of New Hampshire, Durham, NH 03824; cfb@cisunix.unh.edu

Journal of Chemical Education, Vol. 82, No. 12, December 2005
Chemical Education Research, edited by Diane M. Bunce, The Catholic University of America, Washington, DC 20064


Curriculum innovations in chemistry are often designed
to address the important goals of improving student under-
standing and improving student attitude. To determine the
effects of these innovations, therefore, assessments of both
content and attitude must be used. Most chemistry instruc-
tors would be able to engage in a hearty and deep conversa-
tion about assessing content knowledge, bringing out different
nuances concerning what it means to understand chemistry,
and how the design of the assessment may help or hinder in
that regard. The same cannot be said about assessing attitudes.
The unfortunate consequences of this are that attitude is
seen as a unidimensional construct (which it is not) and that
assessing attitude seems as simple as putting together a few
good survey questions (which it is not). This point has been
strongly argued in excellent critical reviews over the past 30
years (1–11).
Many curriculum studies claim "We measured student
attitudes" by using one of the following approaches. In one
approach, students are presented a list of statements regard-
ing the curriculum activities to which they indicate their level
of agreement (e.g., strongly agree to strongly disagree). For
instance, items in the attitude assessment of many studies
include "The curriculum activities enhanced my learning"
and "I enjoyed the curriculum activities." Results might be
summarized by combining all items to create a single score
for each student. Another approach calculates the average re-
sponse on each item and ranks the items to show which state-
ments had the strongest agreement or disagreement. These
analyses are psychometrically weak for several reasons.
One reason is that the ability to look for changes over
time or for comparisons between student groups is limited
when items can only be answered sensibly by those who ex-
perience the curriculum activity. Hence, pre–post assessment
is not possible and a control group cannot answer the ques-
tions.
To explain other weaknesses, a chemical analogy will be
helpful. A chemical equivalent of "We measured student at-
titudes and report a total score for each student" is: "We mea-
sured the concentrations of all dissolved materials and report
the total combined concentration to be X mol/L." Few chem-
ists would find this type of reporting helpful. It would clearly
make more sense to present concentrations organized into
categories, e.g. cation species, anion species, chlorinated hy-
drocarbons, dissolved gases, turbidity. Similarly, to provide a
useful assessment of student attitudes, one must distinguish
among several different mental constructs rather than lump
everything together as "attitude." That these constructs are
distinct is well recognized in the psychology, sociology, and
education literature, and failure to make these distinctions
in evaluative instruments has been criticized (1–11). List 1
indicates some of those distinct constructs that are often in-
appropriately lumped together.
List 1. Distinct Mental Constructs Related to Attitude Often Unexamined in Curriculum Assessments

Terms related to attitude:
Attitude: a learned predisposition to respond favorably or unfavorably toward an attitude object (12).
Beliefs: personal knowledge or understandings that are antecedents of attitudes and subjective norms; they establish behavioral intentions (13, 14).
Interests: personal or situational preferences for particular activities (15).
Values: enduring beliefs regarding what should be desired, what is important, and what standards of conduct are acceptable, which influence or guide behavior (16).
Self-Concept: evaluation an individual makes and customarily maintains with respect to himself or herself in general or specific areas of knowledge (17, 18).
Self-Efficacy: self-perception of an ability to do something very specific (17, 19).
Self-Esteem: one's level of satisfaction with one's self-concept (18).

Terms related to scientific attitudes (3):
Scientific Habits of Mind: the shared values, attitudes, and skills of the cultural tradition of science (20).
Understanding of Nature of Science: scientific investigation as a human, social, and cognitive activity (21–23).
A chemical parallel for ranking survey items by average
score is to list every individual chemical substance found from
highest to lowest concentration. Again, this format is not very
useful. It is better to lump survey items into coherent groups
to provide the advantages of replication. Individual items bear
the signal-to-noise disadvantage of an n = 1 experiment,
whereas averaging several responses provides a more reliable
estimate. In psychometric terms, a larger n value increases
the reliability of a survey instrument (16, 24).
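The gain from pooling items can be made concrete with the Spearman–Brown relation, a standard psychometric result that is implied but not stated explicitly here. For n items whose average inter-item correlation is $\bar{r}$, the reliability of their composite is

$$\rho_{\text{composite}} = \frac{n\,\bar{r}}{1 + (n - 1)\,\bar{r}},$$

so, for example, ten items that each correlate only 0.3 with one another yield a composite reliability of about 0.81.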
In this article, a survey instrument is described for mea-
suring a student's self-concept as a learner of chemistry
(named the chemistry self-concept inventory, or CSCI). Self-
concept is a cognitive evaluation of one's ability in a domain
(15), a person's perception of self (11), an evaluation an indi-
vidual makes and customarily maintains with respect to him-
self or herself, in general or specific areas of knowledge (25).
It is a robust mental construct that psychologists and social
psychologists have delineated that is clearly pertinent to our
concern for students' learning. Self-concept has also been iden-
tified as a contributing component in expectancy models of
motivation and conceptual change (15, 26–32). Expectancy
models are based on the notion that individuals will choose,
and persist in doing, a task if they have a reasonable expecta-
tion for success. The CSCI instrument overcomes the short-
comings of the aforementioned approaches because its
structure includes a few, multiple-item subscales that are psy-
chometrically distinct yet have strong internal consistency.
Instrument Design
The chemistry self-concept inventory described here was
developed from the Self Description Questionnaire III
(SDQIII) (33). The SDQIII, developed for college-age ado-
lescent populations, has an extensive history of development
and strong validity and reliability characteristics (34–40). It
contains a number of subscales (e.g., mathematics, verbal, aca-
demic) but none in science. Related questionnaires, devel-
oped and evaluated for younger students by Marsh's group
(17, 35, 41–44), sometimes include a science subscale. Rarely,
however, is the self-concept construct delineated for a specific science
domain, such as chemistry, with strong psychometric char-
acteristics (45). Yet college students clearly are able to distin-
guish their feelings and performance among the various
scientific disciplines, such as chemistry versus biology versus
physics. Thus, it made sense to focus the instrument specifi-
cally for chemistry. There has been no concerted effort to
develop a generally applicable instrument with strong valid-
ity and reliability characteristics for the college chemistry
population.
Ten chemistry items were created by modestly reword-
ing the ten mathematics items in the SDQIII. Thirty addi-
tional items were taken from the SDQIII subscales named
"mathematics," "problem solving," and "general academics."
The forty items were placed on the survey page in a
pattern similar to that of the SDQIII. Every fourth item be-
longs to the same subscale. Otherwise items are randomly
distributed by direction (positive or negative) and specific
meaning. The instrument is available in the Supplementary
Material.
The survey form described in this manuscript has im-
portant physical design features intended to help a subject
focus carefully on item statements and on the response range.
Seven choices help strengthen the reliability of the instru-
ment and take advantage of the ability of adults to draw dis-
tinctions (16). Statements and choices are placed on the same
line. The central question is repeated on each page. The ex-
treme 1 and 7 positions are labeled "very inaccurate of me"
and "very accurate of me" several times down the page as an-
chors for the scale that are always in the field of view. In-
structions are brief, with the most important focus words
highlighted. Responses may be placed directly on the sheet,
or on a scanning form. Seven-choice scan sheets may be pur-
chased from a business forms company, such as NCS Pearson.
Evaluation Procedure
Typically, in the design of a new instrument, one starts
with a large number of possible items, and then seeks the
opinions of experts and conducts field tests to winnow the
item list (16, 46). Since the well-documented SDQIII in-
strument was already available, the sensible strategy was to
adapt appropriate parts. Nevertheless, when an instrument
is modified and applied to new populations it is necessary to
establish its validity and reliability in that setting (45). This
is analogous to good analytical chemistry practice in that one
should test the performance of a standard method after modi-
fication for application to new substrates (47, 48).
Student Populations
Students elected to participate through informed con-
sent. The instrument was initially tested using two cohorts
of students in a non-science majors college chemistry course
at a large, public, research-intensive university. Factor analy-
sis suggested a subscale structure similar to the SDQIII, but
the number of subjects in the sample was too small (n = 50)
to establish this reliably. Results were helpful in identifying
two items for which rewording improved clarity.
The revised instrument was subjected to factor analysis
using results from a larger group of college students at the
same institution. These general chemistry students (about
two-thirds in first year) represented a diverse set of majors in
engineering, sciences, human services, and liberal arts. The
inventory was administered in one specific lab room over an
entire week near the end of the first semester. Because stu-
dents are scheduled into lab sections independently from lec-
ture section, this included a cross section with respect to
lecture instructor, laboratory teaching assistant, academic
major, lab time preference, and day preference. The validity
of factor analytic results is strengthened when based on a rep-
resentative sample of respondents from the population of in-
terest (16, 46). The number of complete surveys was 379.
A different student cohort was surveyed in a subsequent
year from this same course to gather test–retest reliability data.
Students were selected in the same manner. The first survey
was presented during the first week of lab. The retest was com-
pleted the second week, using the false excuse that accidental
water damage from a building flood ruined the first survey
papers. The number of complete pairs of surveys was 65.
Additional student populations at the same institution,
chemistry majors (n = 13) and students acting as study group
leaders for the course (n = 19), provided other results for
comparison.
Table 1. Chemistry Self-Concept Inventory Factors and Loading Profile

Item / Inventory Statement Regarding Student Self-Concept | Loadings on Factors 1–5 (ranked by loading magnitude within each factor)

Mathematics Self-Concept
17. I am quite good at math. | 0.81 0.13 0.24 0.06 0.12
25. I have always done well in math classes. | 0.80 0.06 0.29 0.02 0.13
21.* I have trouble understanding anything based on math. | 0.78 0.23 0.03 0.10 0.09
5.* I have hesitated to take courses that involve math. | 0.75 0.13 0.07 0.04 0.05
13.* Math makes me feel inadequate. | 0.74 0.16 0.05 0.16 0.03
9. I have generally done better in math courses than in other courses. | 0.71 0.11 0.11 0.14 0.17
29.* I never do well on tests that require math reasoning. | 0.68 0.30 0.06 0.18 0.06
37.* I have never been very excited about math. | 0.64 0.23 0.18 0.14 0.09
33. At school, my friends always come to me for help in math. | 0.61 0.02 0.25 0.12 0.07
19.* I'm not much good at problem solving. | 0.55 0.32 0.02 0.17 0.26
1. I find many math problems interesting and challenging. | 0.47 0.25 0.17 0.09 0.02

Chemistry Self-Concept
24. I am quite good at dealing with chemical ideas. | 0.23 0.75 0.33 0.04 0.04
28.* Chemistry intimidates me. | 0.33 0.71 0.12 0.07 0.07
20.* I would hesitate to enroll in courses that involve chemistry. | 0.29 0.70 0.15 0.07 0.04
36. I have always done better in courses that involve chemistry than in most courses. | 0.16 0.69 0.07 0.18 0.22
16. When I run into chemical topics in my courses, I always do well on that part. | 0.23 0.69 0.38 0.04 0.07
40.* I have trouble understanding anything based on chemistry. | 0.29 0.68 0.07 0.14 0.13
4.* I have never been excited about chemistry. | 0.11 0.66 0.16 0.07 0.03
12. I find chemistry concepts interesting and challenging. | 0.03 0.65 0.14 0.18 0.09
32.* I have always had difficulty understanding arguments that require chemical knowledge. | 0.18 0.63 0.06 0.19 0.13
8. I participate confidently in discussions with school friends about chemical topics. | 0.01 0.51 0.36 0.06 0.02

Academic Self-Concept
18. I'm good at most academic subjects. | 0.09 0.06 0.72 0.29 0.05
26. I learn quickly in most academic subjects. | 0.18 0.10 0.64 0.26 0.03
34. I get good marks in most academic subjects. | 0.14 0.08 0.63 0.34 0.04
23. I have a lot of intellectual curiosity. | 0.05 0.09 0.60 0.17 0.34
39. I can often see better ways of doing routine tasks. | 0.15 0.05 0.51 0.05 0.34
7. I am good at combining ideas in ways that others have not tried. | 0.10 0.24 0.48 0.05 0.38

Academic Enjoyment Self-Concept
30.* I hate most academic subjects. | 0.04 0.04 0.00 0.74 0.08
22.* I'm not particularly interested in most academic subjects. | 0.03 0.03 0.06 0.73 0.12
6.* I hate studying many academic subjects. | 0.02 0.15 0.02 0.65 0.06
10. I like most academic subjects. | 0.08 0.02 0.34 0.57 0.08
14. I have trouble with most academic subjects. | 0.23 0.09 0.17 0.54 0.02
2. I enjoy doing work for most academic subjects. | 0.05 0.16 0.33 0.52 0.08
38.* I could never achieve academic honors, even if I worked harder. | 0.27 0.06 0.26 0.45 0.03

Creativity Self-Concept
11.* I wish I had more imagination and originality. | 0.02 0.01 0.04 0.06 0.68
27.* I am not very original in my ideas, thoughts, and actions. | 0.01 0.07 -0.07 0.20 0.66
31. I am an imaginative person. | 0.12 0.19 0.40 0.07 0.59
35.* I would have no interest in being an inventor. | 0.01 0.35 0.06 0.03 0.51

*Items that load with opposite signs must be reversed on the scale.
Data Analysis
Survey responses were manually transcribed or machine
scanned to numerical values from 1 to 7. Statistical tests were
performed using a combination of tools, particularly Micro-
soft Excel, Minitab 11, and SPSS 7.5 and 11.5.
Results
Exploratory Factor Analysis
Exploratory factor analysis is a statistical approach that
helps identify survey items that seem to belong together be-
cause of the similarity in patterns of responses by students.
Each group of items defines what is called a factor. The
procedure does not provide a black and white categorization.
It is necessary to run several analyses each with different con-
straints, and then to evaluate the results for interpretability
(16). Detailed discussion of the procedures available and the
decisions that must be made can be found in many sources
(16, 49).
To assess whether the data set would factor well, the
Kaiser–Meyer–Olkin (KMO) measure of sampling adequacy
(compares observed correlation coefficients with partial cor-
relation coefficients) was calculated. That calculated value was
0.89; a KMO measure greater than 0.5 is considered adequate.
Also, the matrix of simple correlations among CSCI items con-
tained a large number of values in midrange (0.3 to 0.7), in-
dicating the likelihood that the data set would factor well.
Factors were extracted by the principal components method.
A scree plot of eigenvalues and inspection of the percent of
explained variance indicated about 4–5 strong factors, with 7
having eigenvalues >1. Structure was explored by extracting
3–7 factors using varimax (orthogonal) rotation, and study-
ing the pattern and magnitude of the loading (degree of asso-
ciation) of each survey item on each factor. In an ideal case,
one would find that loadings would be close to 1 or −1 for
three or more survey items on one factor, and load near 0 for
the other factors, and that other factors would have a differ-
ent set of survey items with loadings near ±1 and 0. The sta-
bility of the loadings of individual items was compared
manually as the number of factors extracted was varied.
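For readers who want to reproduce this style of analysis on their own response matrix, the sketch below shows the main steps in Python with NumPy and pandas (the original work used Excel, Minitab, and SPSS; the file name responses.csv, the column layout, and the choice of five factors are placeholder assumptions): the KMO measure computed from observed and partial correlations, principal-components extraction of loadings, the eigenvalue-greater-than-one check, and a varimax rotation of the retained loadings.

```python
import numpy as np
import pandas as pd

def kmo(corr):
    """Kaiser-Meyer-Olkin measure: squared correlations relative to
    squared correlations plus squared partial correlations."""
    inv = np.linalg.inv(corr)
    scale = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / scale                    # partial correlation matrix
    np.fill_diagonal(partial, 0.0)
    r = corr.copy()
    np.fill_diagonal(r, 0.0)
    return (r**2).sum() / ((r**2).sum() + (partial**2).sum())

def varimax(loadings, max_iter=100, tol=1e-6):
    """Orthogonal (varimax) rotation of an unrotated loading matrix."""
    p, k = loadings.shape
    rot = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        lam = loadings @ rot
        u, s, vt = np.linalg.svd(
            loadings.T @ (lam**3 - lam @ np.diag((lam**2).sum(axis=0)) / p))
        rot = u @ vt
        if s.sum() < total * (1 + tol):       # stop when the criterion stops improving
            break
        total = s.sum()
    return loadings @ rot

# Hypothetical data file: one row per student, one column per survey item (1-7).
items = pd.read_csv("responses.csv")
corr = np.corrcoef(items.values, rowvar=False)

print(f"KMO = {kmo(corr):.2f}")               # > 0.5 suggests the data will factor well

eigvals, eigvecs = np.linalg.eigh(corr)       # principal components of the correlation matrix
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
print("Eigenvalues > 1:", int((eigvals > 1).sum()))

n_factors = 5                                 # repeat for several values and compare
unrotated = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
rotated = varimax(unrotated)

print(pd.DataFrame(rotated, index=items.columns,
                   columns=[f"Factor {i+1}" for i in range(n_factors)]).round(2))
```

Repeating the extraction for several numbers of factors and inspecting which loading pattern is most interpretable mirrors the manual comparison described above.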
This process helped identify five distinct factors, labeled
chemistry self-concept, mathematics self-concept, academic
self-concept, academic enjoyment self-concept, and creativ-
ity self-concept. Table 1 shows the survey statements orga-
nized by factor and ranked by loading magnitude within the
factor. These five factors explain 52% of the variance in the
data set. The high degree of relatedness of the items within
each factor permits the scores of these items to be combined
into a single subscale score. For instance, creativity self-con-
cept for one student consists of the sum of the values for items
11, 27, 31, and 35, with one adjustment. Items that load
with opposite signs (indicated by asterisk) must be reversed
on the scale. Subscale scores may also be calculated as an av-
erage of the item ratings. Here we report subscale scores as a
percent of the available range (i.e., a range of 1–7 becomes a
range of 0–100%).
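As a worked illustration of this scoring rule (the Supplementary Material supplies a spreadsheet template; the short Python function below is only a sketch of the same arithmetic, and the sample ratings are invented), a reverse-scored item rated r on the 1–7 scale is replaced by 8 − r, the adjusted ratings are averaged, and the average is mapped onto 0–100% of the available range.

```python
def subscale_percent(ratings, reverse_items):
    """Score one subscale; ratings maps item number -> rating on the 1-7 scale."""
    adjusted = [8 - value if item in reverse_items else value
                for item, value in ratings.items()]   # flip reverse-scored items
    mean = sum(adjusted) / len(adjusted)              # average item rating (1-7)
    return 100 * (mean - 1) / 6                       # map 1-7 onto 0-100%

# Creativity subscale (items 11, 27, 31, 35); items 11, 27, and 35 are reverse-scored.
student = {11: 3, 27: 2, 31: 6, 35: 4}                # hypothetical responses
print(round(subscale_percent(student, reverse_items={11, 27, 35}), 1))   # 70.8
```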
Two items (3 and 15) have no strong association (load-
ings between 0.01 and 0.37) with any factor. These two items
are not included in the subscale calculations and could be
removed from the instrument.
Validity was evaluated by comparing groups of students
whose responses on the subscales should be predictable: stu-
dents taking general chemistry, chemistry majors, and stu-
dents who were discussion group leaders for the course
(peer-led team learning model) (50). The group of peer lead-
ers contained no chemistry majors and about half had taken
one or two semesters of organic chemistry. Table 2 contains
the percentage score on each subscale for the three popula-
tions of students. Validity was also evaluated by comparing
instrument subscale scores with student course performance
(Table 3) and by comparing subscale scores with each other.
Table 2. Self-Concept Score by Student Population (Scale Range 0 to 100%)

Subscale (Self-Concept) | General Chemistry | Peer Leaders | Chemistry Majors
Mathematics*       | 58 | 70 | 80
Chemistry*         | 48 | 73 | 81
Academic*          | 68 | 75 | 77
Academic Enjoyment | 73 | 78 | 68
Creativity         | 64 | 61 | 68

*Significant differences among the three groups at p < 0.02 by analysis of variance.
Table 3. Correlations between Self-Concept Score and Achievement by Self-Concept Inventory Subscales (Pearson Correlation Coefficient Values)

Subscale           | Chemistry | Academic | Academic Enjoyment | Creativity | Course Grade
Mathematics        | 0.50*     | 0.26*    | 0.21*              | 0.01       | 0.39**
Chemistry          |           | 0.26*    | 0.22*              | 0.09       | 0.38**
Academic           |           |          | 0.51*              | 0.34*      | 0.21*
Academic Enjoyment |           |          |                    | 0.22*      | 0.18*
Creativity         |           |          |                    |            | 0.05

*Significant at p < 0.01.
Two types of reliability were evaluated (16, 24): subscale
internal consistency and test–retest replication (Table 4). In-
ternal consistency is the relationship of the items in a subscale
to each other. A statistic typically used is Cronbach's α, which
is based on the average correlation of items. Test–retest reli-
ability requires a second administration of the instrument to
the same population of students, after which simple correla-
tion coefficients are calculated.
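Neither statistic requires specialized software. Assuming a students-by-items array of responses for one subscale, the sketch below computes the standardized Cronbach's α (the form based on the average inter-item correlation described above) and the test–retest Pearson correlation between two administrations; the variable names are illustrative only.

```python
import numpy as np

def cronbach_alpha(items):
    """Standardized Cronbach's alpha; items has one row per student and
    one column per item of a single subscale."""
    corr = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    k = corr.shape[0]
    mean_r = (corr.sum() - k) / (k * (k - 1))   # average off-diagonal correlation
    return k * mean_r / (1 + (k - 1) * mean_r)

def test_retest(first_scores, second_scores):
    """Pearson correlation between subscale scores from two administrations."""
    return np.corrcoef(first_scores, second_scores)[0, 1]
```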
Discussion
Validity
The item loadings in Table 1 show a fairly clean distinc-
tion for five factors. Items that load strongly do so with val-
ues greater than |0.5|. Items that load weakly do so with values
less than |0.25|. The ten chemistry items created for this sur-
vey coalesce into a single factor. The ten mathematics items
are joined by item 19, which was grouped in the SDQIII
with a factor called problem solving. Students may be inter-
preting problem solving as mathematical because that is the
typical usage in a chemistry class.
The remaining three factors of the CSCI differ from the
SDQIII, where the same items are associated with only two
SDQIII factors: academic self-concept and problem-solving
self-concept. Three items split from each of these two SDQIII
factors and fall together under what is called here academic
self-concept. All of these items carry a sense of self-assess-
ment of intellectual ability. They also are all positively worded,
which may indicate that a response set effect contributes to
the distinctness of these items. Seven items load strongly un-
der what has been named academic enjoyment self-concept,
a factor that is not identified in the SDQIII. These items
express an emotional response regarding academics, as indi-
cated by the words "like," "hate," "trouble," "enjoy." Four
additional items are called creativity self-concept, because
all suggest a sense of creation or invention. Both of these lat-
ter factors contain both positive and negative items.
Evidence for content validity comes from comparing the
three student populations (general chemistry, study group
leaders, and chemistry majors) in Table 2. The mathematics
and chemistry subscale scores increase significantly (based on
analysis of variance) as the students' contact-time with chem-
istry increases, whereas the other subscales are essentially equal
across the groups. One would expect this: contact and study
of chemistry should enhance both chemistry and mathemati-
cal self-concepts without necessarily affecting more general
concepts of self. General academic self-concept is significantly
stronger in the peer leader and chemistry majors groups.
These students were all sophomores or beyond and had aca-
demic records of success. One would expect these groups to
have stronger academic self-concepts than the average first-
year student.
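The group comparison itself is a one-way analysis of variance on the subscale scores. A minimal sketch follows (SciPy is assumed here; the paper lists Excel, Minitab, and SPSS, and the score arrays are invented):

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical chemistry self-concept scores (0-100%) for the three populations.
general = np.array([45, 52, 60, 38, 55, 47])
peer_leaders = np.array([70, 75, 68, 80, 72])
majors = np.array([85, 78, 82, 90])

f_stat, p_value = f_oneway(general, peer_leaders, majors)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # p < 0.02 is the criterion reported for Table 2
```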
Additional evidence for content validity comes from
comparing self-concept score with chemistry course grade.
Based on expectancy theories of motivation, one should
expect achievement to lead to enhanced self-concept and
enhanced self-concept to motivate working to enhance
achievement (15). Thus there should be a significant rela-
tionship between self-concept in chemistry and chemistry
achievement. The correlations (Table 3), while significant,
are not large, indicating that the CSCI inventory is not sim-
ply another measure of academic ability. However, the cor-
relations are greatest for mathematics and chemistry
self-concepts, weaker for general academic self-concept and
enjoyment, and uncorrelated with the global self-concept
scale of creativity. The trend in correlations is consistent with
expectations.
Subscales are correlated at a statistically significant but
weak level (Table 3). The highest are between chemistry and
mathematics self-concepts, and between the academic and
academic enjoyment self-concepts. The magnitudes of these
subscale correlations are consistent with those found with the
SDQIII subscales.
Instrument vocabulary and reading level may be valid
for upper-level secondary students through adults, but the
instrument has not at this time been tested beyond the col-
lege populations. The tone of presentation is also important.
Students are more likely to take the survey seriously if it is
presented as an opportunity to help the instructor gain in-
sight into students' response to the curriculum.
The survey takes students about 10 minutes to complete.
One could shorten the survey to just the chemistry items and
thus shorten administration time. How this may affect re-
sults has not been tested. Interspersing the chemistry items
among other items may prevent subjects from adjusting their
responses to be consistent with how they responded to a par-
allel item; each item should be responded to independently.
A second advantage of multiple subscales is that the more
global self-concept scales provide an internal control against
which to compare the chemistry and mathematics subscales
scores. The chemical analysis equivalent is using an internal
standard.
Reliability
Reliability coefficient values (Table 4) are mostly above
0.7. For measuring affective domains, values at or above 0.7
are considered strong (16). The strongest reliabilities lie with
the specific content subscales, mathematics and chemistry.
The Cronbach α values here are consistent with those re-
ported for the SDQIII (33).
Table 4. Reliability Estimates for Subscale Internal Consistency and Retesting

Subscale (Self-Concept) | Cronbach's α | Test–Retest Correlation
Mathematics        | 0.91 | 0.90
Chemistry          | 0.90 | 0.86
Academic           | 0.77 | 0.64
Academic Enjoyment | 0.77 | 0.73
Creativity         | 0.62 | 0.85
Applicability
The chemistry self-concept inventory, a spreadsheet tem-
plate that automates calculations, and a User's Guide are all
included as Supplementary Material. The subscale scores
may be read as the degree to which students perceive them-
selves to be adept in:
Learning, understanding, and using chemistry knowl-
edge (chemistry self-concept)
Learning, understanding, and using mathematics
knowledge (mathematics self-concept)
Overall academic ability (academic self-concept)
Personal enjoyment of academic learning (academic
enjoyment self-concept)
Creativity (creativity self-concept)
The subscale scores do not mean much in an absolute
sense. It will be most informative to compare before and
after scores for individuals or groups over time, or to com-
pare different groups with each other.
Why would instructors want to measure the self-con-
cepts (i.e., attitudes) of their chemistry students? Three po-
tential applications are suggested.
First, students in entry-level courses have diverse inter-
ests, backgrounds, and learning approaches. Instructors of-
ten have only a diffuse awareness of the classroom atmosphere
based on interactions with a few individual students. Hav-
ing self-concept data pre- and post-instruction would pro-
vide more precise insight regarding how all students are
responding to the class: Does self-concept increase for some
and decrease for others, and what characteristics distinguish
those two groups of students?
Secondly, an instructor may implement a new teaching
approach and wonder whether it affects student self-concept
relative to the old approach. The old approach might have
been the same course last year, be a parallel section this year,
or be another time segment within the same course this year.
Appropriate consideration must be given to the equivalen-
cies of the student populations.
Finally, good teachers are interested in the intellectual
and emotional growth of their students. Exam scores pro-
vide information only about one aspect of students. The
CSCI provides a different way of coming to know students
individually and collectively, and provides instructors with
the challenge of improving student outcomes in this regard
also.
Acknowledgments
I thank postdoctoral research associates Kimberly Rickert
and Laurie Langdon, and Laboratory Coordinator Amy Lind-
say, for their assistance and advice.
Supplemental Material
The chemistry self-concept inventory, a spreadsheet tem-
plate that automates calculations, and a user's guide are avail-
able in this issue of JCE Online.
Literature Cited
1. Munby, H. An Investigation into the Measurement of Attitudes
in Science Education, ED237347, ERIC Clearinghouse for Sci-
ence, Mathematics, and Environmental Education; Ohio State
University: Columbus, OH, 1983.
2. Mayer, V. J.; Richmond, J. M. Science Education 1982, 66,
49–66.
3. Blosser, P. E. Attitude Research in Science Education. Informa-
tion Bulletin No. 1, ED259941, ERIC Clearinghouse for Sci-
ence, Mathematics, and Environmental Education; Ohio State
University: Columbus, OH, 1984.
4. Schibeci, R. A. Studies in Science Education 1984, 11, 26–59.
5. Osborne, J.; Simon, S.; Collins, S. Int. J. Sci. Educ. 2003, 25,
1049–1079.
6. Haladyna, T.; Shaughnessy, J. Science Education 1982, 66,
547–563.
7. Shrigley, R. L. Science Education 1983, 67, 425–442.
8. Shrigley, R. L.; Koballa, T. R.; Simpson, R. D. J. Res. Sci. Teach.
1988, 25, 659–678.
9. Gardner, P. L. Studies in Science Education 1975, 2, 1–41.
10. Koballa, T. R. Science Education 1988, 72, 115–126.
11. Shavelson, R. J.; Hubner, J. J.; Stanton, G. C. Rev. Educ. Res.
1976, 46, 407–441.
12. Mager, R. F. Developing Attitude Toward Learning, 2nd ed.;
Lake Publishing Co.: Belmont, CA, 1984.
13. Ajzen, I.; Madden, T. J. J. Exp. Soc. Psychol. 1986, 22, 453–
474.
14. Fishbein, M.; Ajzen, I. Beliefs, Attitude, Intention, and Behav-
ior: An Introduction to Theory and Research; Addison-Wesley:
Reading, MA, 1975.
15. Pintrich, P. R.; Schunk, D. H. Motivation in Education;
Prentice-Hall: Englewood Cliffs, NJ, 1996.
16. Gable, R. K. Instrument Development in the Affective Domain;
Kluwer-Nijhoff Publishing: Boston, MA, 1986.
17. Marsh, H. W.; Walker, R.; Debus, R. Contemp. Educ. Psychol.
1991, 16, 331–345.
18. Beane, J. A.; Lipka, R. P. Self-Concept, Self-Esteem, and the
Curriculum; Teachers College Press: New York, 1986.
19. Bandura, A. Psychol. Rev. 1977, 84, 191–215.
20. Rutherford, F. J.; Ahlgren, A. Science for All Americans; Ox-
ford University Press: New York, NY, 1990.
21. Schibeci, R. A.; Murcia, K. J. Coll. Sci. Teach. 2000, 29, 205–
209.
22. Giere, R. N. Explaining Science: A Cognitive Approach; Uni-
versity of Chicago Press: Chicago, 1988.
23. National Science Education Standards; National Research Coun-
cil: Washington, 1996.
24. Nunnally, J. C. Educational Measurement and Evaluation;
McGraw-Hill: New York, 1972.
25. Coopersmith, S. The Antecedents of Self-Esteem; Freeman: San
Francisco, 1967.
26. Covington, M. V. A Motivational Analysis of Academic Life
in College. In Higher Education: Handbook of Theory and Re-
search, Vol. 9, J. C. Smart, Ed.; Agathon Press: New York,
1993.
27. Markus, H.; Nurius, P. In Self and Identity: Psychosocial Per-
spectives, Yardley, K., Honess, T., Eds.; Wiley and Sons: New
York, 1987, pp 157–172.
28. House, D. J. International Journal of Instructional Media 1994,
21, 1–13.
29. Pintrich, P. R.; Marx, R. W.; Boyle, R. A. Rev. Educ. Res. 1993,
63, 167–199.
30. Ziegler, A.; Heller, K. A. Journal of Secondary Gifted Educa-
tion 2000, 11, 144–153.
31. Shrigley, R. L. J. Res. Sci. Teach. 1990, 27, 97–113.
32. Marsh, H. W.; Yeung, A. S. J. of Educ. Psychol. 1997, 89, 41–
54.
33. Marsh, H. W.; O'Neill, R. Journal of Educational Measurement
1984, 21, 153–174.
34. Byrne, B. M.; Shavelson, R. J. J. of Educ. Psychol. 1986, 78,
474–481.
35. Marsh, H. W.; Smith, I. D.; Barnes, J.; Butler, S. J. of Educ.
Psychol. 1983, 75, 772–790.
36. Marsh, H. W.; Shavelson, R. Educational Psychologist 1985,
20, 107–123.
37. Byrne, B.; Shavelson, R. J. Am. Educ. Res. J. 1987, 24, 365–
385.
38. Marsh, H. W.; Barnes, J.; Hocevar, D. Journal of Personality
and Social Psychology 1985, 49, 1360–1377.
39. Atlas, J. A.; Gable, R. K.; Isonio, S. In The Thirteenth Mental
Measurements Yearbook; Impara, J. C., Plake, B. S., Eds.; Uni-
versity of Nebraska Press: Lincoln, Nebraska, 1998.
40. Wylie, R. C. Measures of Self-Concept; University of Nebraska
Press: Lincoln, 1989.
41. Marsh, H. W.; Yeung, A. S. Am. Educ. Res. J. 1997, 34, 691–
720.
42. Marsh, H. W. Multivariate Behav. Res. 1987, 22, 457–480.
43. Marsh, H. W. Am. Educ. Res. J. 1993, 30, 841–860.
44. Larocque, L. Preadolescent Self-Concept and Self-Concept / Aca-
demic Achievement Relations: Investigating Multidimensional and
Hierarchical Structures within and across Gender. Ph.D. Dis-
sertation, University of Ottawa, Ottawa, Canada, 1999.
45. Byrne, B. American Psychologist 2002, 57, 897–909.
46. Misiti, F. L. J.; Shrigley, R. L.; Hanson, L. Science Education
1991, 75, 525–540.
47. Taylor, J. Analytical Chemistry 1981, 53, 1588A.
48. Taylor, J. Analytical Chemistry 1983, 55, 600A.
49. Kline, P. An Easy Guide to Factor Analysis; Routledge: Lon-
don, 1994.
50. Gosser, D. K.; Roth, V. J. Chem. Educ. 1998, 75, 185–187.
