
QUANTITATIVE RESEARCH DESIGNS
1. DESCRIPTIVE RESEARCH
2. CORRELATIONAL RESEARCH
3. EXPERIMENTAL RESEARCH
4. HISTORICAL QUANTITATIVE
RESEARCH
5. CAUSAL RESEARCH
Identify the most appropriate
quantitative research design:
1. A study on the effect of gender in
the attitude of youth toward sports
drinks.
2. A study on the thoughts of Grade 12 students on difficulties in solving mathematical problems
3. A study on the effect of remediation activities on the performance of students in mathematics
4. A study on the probable number of students in a school in the next school year, based on the school's enrolment statistics over a five-year period
METHODS OF COLLECTING DATA

1. Interview Method
-data are obtained through oral
exchange of questions and answers by
the researcher and the respondents.
-conducted face to face, by telephone, or by mobile phone
2. Questionnaire Method
-data are supplied by the respondents through a set of questions prepared by the researcher
-handed out in printed form, or sent through e-mail or other forms of technology
-used when the study needs a large sample size
3. Document Method
-previously gathered and stored data are obtained by the researcher
4. Observation Method
-data are acquired in an actual situation and recorded through the researcher's direct observation
5. Experiment Method
-data are gathered through an experimentation process
-used when the cause(s) of the phenomenon being studied are scrutinized
Identify the most appropriate
way of collecting data using the
quantitative method:
1. A study on the most preferred
internet search engine
2. A study on the effect of sunlight on
the growth of seedlings
3. A study on the average family
income and food expenditures
4. A study on a baby’s reaction to
different types of toys
DepEd National Training of
Trainers VTalisayon
 Simple Random Sampling – each subject (from whom data will
be collected) has equal chance of being selected (usually
without replacement)
 Purposive Sampling – sample is selected based on
researcher’s purpose or criterion (often easy access to subjects
or data)
 Stratified random sampling – based on stratum, random
selection of group w/in stratum (e.g., Region 4 for stratum
region)
 Cluster random sampling – based on cluster (natural grouping), random selection of group w/in cluster (e.g., street within cluster village)
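The random schemes above can be sketched in a few lines of Python. The student names, strand labels, and class sections below are hypothetical; purposive sampling is omitted because it involves no random draw, only the researcher's criterion.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical sampling frame: 12 students tagged by strand (the strata).
students = [("S%02d" % i, "STEM" if i < 6 else "HUMSS") for i in range(12)]

# 1. Simple random sampling: every student has an equal chance of
#    selection, drawn without replacement.
simple = random.sample(students, 4)

# 2. Stratified random sampling: draw separately within each stratum so
#    every strand is represented.
def stratified_sample(frame, per_stratum):
    by_stratum = {}
    for student in frame:
        by_stratum.setdefault(student[1], []).append(student)
    picks = []
    for members in by_stratum.values():
        picks.extend(random.sample(members, per_stratum))
    return picks

strata_sample = stratified_sample(students, 2)

# 3. Cluster random sampling: randomly pick whole natural groupings
#    (hypothetical class sections) and take everyone inside them.
sections = {"7-A": students[0:4], "7-B": students[4:8], "7-C": students[8:12]}
chosen = random.sample(sorted(sections), 2)
cluster_sample = [s for name in chosen for s in sections[name]]
```

Note the difference in what is randomized: individual students (simple), students within each stratum (stratified), or whole sections (cluster).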
 Description of the subjects, number & grade
level (for students), naming school and its
location, unless w/ confidentiality conditions.
Otherwise, only location, type, and size of
school are stated.
 Number of classes, class size, and number of
teachers are indicated.
 How students and teachers are selected
(sampling procedure) is described.

After data collection, put actual sample:
sample size, distribution of subjects by sex &
average age.
If applicable, describe relevant characteristics of the teacher(s), like sex, teaching experience, educational attainment, & related training.
Use past tense.

RESEARCH TECHNIQUES OR
INSTRUMENTS
 Enumerate instruments to be used. Indicate
if researcher-made, adopted or modified (give
author, year, & reliability coefficient).
 Quickly describe pilot-testing of instruments.
 Indicate how researcher-made instruments will
be content-validated.
 Indicate how the reliability coefficient, commonly Cronbach alpha, will be computed for rating scales and tests. Search the Internet for an online Cronbach alpha calculator.
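Instead of an online calculator, Cronbach alpha can also be computed directly; a minimal pure-Python sketch (the ratings used in any example data are illustrative, not from a real instrument):

```python
def cronbach_alpha(responses):
    """responses: one list of item ratings per respondent."""
    k = len(responses[0])  # number of items

    def variance(values):  # population variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    item_vars = [variance([row[i] for row in responses]) for i in range(k)]
    total_var = variance([sum(row) for row in responses])
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

When every respondent rates all items identically across items, alpha reaches 1.0, its maximum.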

Commonly Used Research Instruments
INTERVIEW AS A TOOL TO GATHER DATA
1. In quantitative research, the interview is structured to elicit responses that will be treated statistically
2. An interview schedule is prepared
3. Questions should be properly worded
4. Each question should be limited to a single idea
5. Pilot test the questions to be used
6. Remove or reword ambiguous questions
7. Devise a coding system
Commonly Used Research Instruments
QUESTIONNAIRE AS A TOOL TO GATHER DATA
-used to reach a larger number of respondents in a shorter period of time
2 Basic Types of Questions in Conducting Survey
1. OPEN-ENDED QUESTIONS
-used when there are many possible
responses to a particular problem

-respondents may answer freely and provide
a detailed response to a query
-easy to construct but difficult to tabulate
and analyze
e.g. What is your reason for joining extracurricular activities?
2. CLOSED-ENDED QUESTIONS
-have limited, predetermined responses (e.g., gender: male or female)
-more difficult to construct but easier to tabulate and analyze
-easier for the respondents to answer

TYPES OF CLOSED-ENDED QUESTIONS

1. DICHOTOMOUS QUESTIONS
-require respondents to answer two-point
questions such as “yes” or “no”; “satisfied” or
“unsatisfied”
2. MULTIPLE CHOICE QUESTIONS
-require the respondents to choose one among the different choices enumerated
"How many hours per day do you spend studying?"
• Respondents may be required to check one among five choices:
• Less than 1 hour
• 1 to less than 2 hours
• 2 to less than 3 hours
• 3 to less than 4 hours
• 4 hours or more
3. RANK ORDER QUESTIONS


-require respondents to indicate their order of preference from a list of options
Example: Respondents may be asked to rank a given list of fruit juices according to preference, with 1 as the most preferred and 6 as the least preferred

4. RATING SCALE
-require respondents to rate their
agreement or disagreement with a particular
statement.
-uses Likert scale
Example: 5-point Likert scale

strongly agree to strongly disagree
great extent to least extent
very often to never
Observation Activity
• Observe the research instruments. List down descriptions regarding their parts, questions, and overall makeup. (5 mins)

Parts | Questions | Overall makeup


CRITICAL ELEMENTS THAT NEED TO BE
CONSIDERED IN A QUESTIONNAIRE
• WHICH MUST COME
FIRST, VALIDITY OR
RELIABILITY?
• CAN THEY BE DONE
SIMULTANEOUSLY?
No matter how well the objectives are written, or how clever the items, the quality and usefulness of an examination are predicated on validity and reliability.

Validity & Reliability

BOTH CONCEPTS RELATE TO THE SCORE, NOT THE TEST!

We don't say "an exam is valid and reliable."
We do say "the exam score is reliable and valid for a specified purpose." KEY ELEMENT!

Reliability
"Was the measure of student ability reached by accident or chance, or was it reached through a clear, stable, meaningful examination?"

Reliability answers questions about the stability and clarity of a test or questionnaire.

Reliability is a STATISTIC: specifically, a correlation ranging from 0 (completely unreliable) to 1 (perfectly reliable).

Reliability is also a measure of error within an examination: poor wording, ambiguity, correct-answer issues, failure to link with objectives, etc.
Content/Action + Error
Content/Action: the information we seek and our best hope for obtaining it.
Error: our human frailty and inability to write effective questions.
Reliability comes in multiple forms, including:
Test-Retest Reliability
Split-Half Reliability

Test-Retest Reliability
Administer the test to students; after some time, administer it again. Are the students performing in a consistent manner?
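Test-retest reliability is simply the correlation between the two administrations; a sketch in Python, where the six students' scores are hypothetical:

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores of six students on the same test, given twice
# a few weeks apart.
first_admin = [30, 25, 41, 18, 36, 28]
second_admin = [32, 24, 40, 20, 35, 30]
test_retest_reliability = pearson_r(first_admin, second_admin)
```

Students keeping roughly the same rank order across the two sittings yields a coefficient near 1.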
Split-Half Reliability
Create two smaller, content-balanced test halves and administer them. Do students perform similarly on each test half?
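Split-half reliability correlates the two half-test scores and then steps the coefficient up to full length with the Spearman-Brown formula (the correction is standard practice, though the slide does not name it); a sketch with hypothetical 0/1 item scores:

```python
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(item_matrix):
    """Correlate odd- vs even-item half scores, then step up to full
    test length with the Spearman-Brown formula."""
    odd = [sum(row[0::2]) for row in item_matrix]
    even = [sum(row[1::2]) for row in item_matrix]
    r_half = pearson_r(odd, even)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown correction

# Hypothetical 0/1 scores of five students on a six-item quiz.
scores = [
    [1, 1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0, 0],
]
full_test_reliability = split_half_reliability(scores)
```

The odd/even split is one common way to get two content-balanced halves without reordering items.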
Validity
"Did the test we created really measure what we hoped it would measure?"

Validity relates to questions about the reasonableness of the content, criterion, and construct associated with the learning objectives and assessment.

Content-Related Validity
Does the content we have selected appropriately represent our construct (top-level domain)?

Construct Validity
Is there really such a concept as what we created as a construct? (e.g., does "reading ability" really exist?)
Criterion-Related Validity
Does our test adequately link the content to the construct? Is the test a good measure of the construct?
Examination quality rests on both validity and reliability.


How do we determine the content validity of a research instrument?
• For Validity, present the instrument to
experts for suggestions on content; then
incorporate suggestions.
• DO WE STILL NEED TO SUBJECT A
RESEARCH TOOL DIRECTLY LIFTED FROM
TESTING CENTERS AND PREVIOUS
STUDIES?
Guide in the Selection of Validators
• For SHS research, at least three (3) validators
are desired.
• In selecting validators, consider work experiences and educational qualifications (e.g., if the study is about student behavior, the validators must be experts on student behavior: GCs, advisers, teachers).
SAMPLE VALIDATION TOOL
Indicators (rated 5, 4, 3, 2, 1):
1. The items are valid representatives of the scope.
2. The items are clear and do not warrant misconceptions.
3. The items lead to an acceptable answer.
4. The items are free from any errors.
5. Generally, the questionnaire measures what it intends to measure.
SAMPLE COMPUTATION ON VALIDITY

Indicators                                               V1  V2  V3  Mean
The items are valid representatives of the scope.         5   5   5  5.00
The items are clear and do not warrant misconceptions.    4   5   4  4.33
The items lead to an acceptable answer.                   4   4   3  3.67
The items are free from any errors.                       4   4   4  4.00
Generally, the questionnaire measures what it
  intends to measure.                                     4   4   4  4.00
Overall Mean                                                         4.20
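The validator means reduce to simple averaging; a Python sketch mirroring the sample computation above (indicator labels shortened for brevity):

```python
# Ratings from three validators (V1-V3) on a 5-point scale,
# one list per indicator, following the sample computation.
ratings = {
    "items represent the scope":           [5, 5, 5],
    "items are clear":                     [4, 5, 4],
    "items lead to an acceptable answer":  [4, 4, 3],
    "items are free from errors":          [4, 4, 4],
    "measures what it intends to measure": [4, 4, 4],
}

# Per-indicator mean across validators, then the overall mean.
indicator_means = {k: sum(v) / len(v) for k, v in ratings.items()}
overall_mean = sum(indicator_means.values()) / len(indicator_means)
```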
VALIDITY CALCULATOR
How do we determine
reliability of research
instrument?
• Subject the instrument to pilot testing;
then use appropriate statistical tools
• KR-20
• Split-half/ Parallel forms
• Cronbach-alpha
• KR-21
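For dichotomous (right/wrong) test items, KR-20 plays the role Cronbach alpha plays for rating scales; a minimal sketch with hypothetical 0/1 scores:

```python
def kr20(item_matrix):
    """Kuder-Richardson formula 20 for 0/1-scored items."""
    k = len(item_matrix[0])
    n = len(item_matrix)
    # p*q per item, where p is the proportion answering correctly
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in item_matrix) / n
        pq_sum += p * (1 - p)
    totals = [sum(row) for row in item_matrix]
    mean = sum(totals) / n
    var_total = sum((t - mean) ** 2 for t in totals) / n  # population variance
    return (k / (k - 1)) * (1 - pq_sum / var_total)

# Hypothetical 0/1 scores of five examinees on a six-item test.
scores = [[1, 1, 1, 1, 1, 0],
          [1, 1, 1, 0, 0, 0],
          [1, 0, 0, 0, 0, 0],
          [1, 1, 1, 1, 1, 1],
          [0, 0, 1, 0, 0, 0]]
test_reliability = kr20(scores)
```

KR-21 is a shortcut that assumes all items are equally difficult; when that assumption holds, the two formulas agree.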
When do we subject an instrument to
reliability?
• Who and how many respondents must compose the reliability testing, and what must be considered?
SAMPLE COMPUTATIONS
• SAMPLE RELIABILITY CALCULATOR
• reliability calculator.xls
Level of Reliability
Range        Interpretation
1.00         Perfect
0.71-0.99    Very High
0.51-0.70    High
0.21-0.50    Low
0.01-0.20    Negligible
0.00         No
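The interpretation table can be applied mechanically; a small helper (the function name is ours, the thresholds are the table's):

```python
def interpret_reliability(r):
    """Verbal interpretation matching the Level of Reliability table."""
    if r >= 1.00:
        return "Perfect"
    if r >= 0.71:
        return "Very High"
    if r >= 0.51:
        return "High"
    if r >= 0.21:
        return "Low"
    if r >= 0.01:
        return "Negligible"
    return "No"
```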
HOW DO WE ADMINISTER?
• Face-to-face questionnaire administration, where an
interviewer presents the items orally.
• Paper-and-pencil questionnaire administration, where
the items are presented on paper.
• Computerized questionnaire administration, where
the items are presented on the computer.
• Adaptive computerized questionnaire administration,
where a selection of items is presented on the
computer, and based on the answers on those items,
the computer selects following items optimized for the
testee's estimated ability or trait.
 Add structure/parts and number of items of each
instrument.
 Add Table of Specifications for researcher-made
tests.
 Add details of content validation: who the experts are, without naming them.
 Add details of pilot-testing (when and who), and describe the group (similar to the group to which the instruments will eventually be administered).


 Clearly and completely describe how the intervention will be implemented, such that the reader can replicate the intervention.
 Describe what happens in the comparison group.

Quickly describe whose permission will be sought and the arrangements to be made to administer the instruments.
Describe when the instruments will be administered and who will administer them.
Add details on arrangements and administration of instruments, if needed.

Describe the analysis to be done for each research question, following the sequence in the Statement of the Problem.
State if tests of hypotheses will be done and for what purpose.
Indicate that tests of hypotheses will be done at the .05 level of significance.
 Level of significance or p value = the number of cases out of 100 in which the results are due to chance alone.

A p value of .05 = in 5 out of 100 cases, the results are due to chance alone. Good news?
Add scoring system for instruments.

Avoid giving formulas or standard procedures


for statistical tools (reader is expected to
know these or look them up in statistics
references).

 Descriptive Survey
One Internet site for online computation for Chi Square test
http://graphpad.com/quickcalcs/chisquared2/
Key in actual and expected frequencies or percentages of each
category of a group. Outputs are chi square value and p value.
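The statistic behind such calculators is a short sum; a minimal Python sketch with hypothetical survey frequencies (the 3.841 cutoff is the standard .05 critical value at one degree of freedom):

```python
def chi_square(observed, expected):
    """Goodness-of-fit chi-square statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical search-engine survey: 60 vs 40 respondents choosing
# between two options, against an expected even 50/50 split.
stat = chi_square([60, 40], [50, 50])

# With 1 degree of freedom, the .05 critical value is 3.841; a statistic
# above that cutoff indicates a statistically significant preference.
significant = stat > 3.841
```

Online calculators add the p value by evaluating the chi-square distribution; the statistic itself is just this sum.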
 Correlational/Causal
One Internet site for online computation of correlation coefficient and p value:
http://www.socscistatistics.com/tests/pearson/Default2.aspx
Key in data per person on two variables, x and y. Outputs are the correlation coefficient and p value.

 Causal Study
Example: Linear regression represented by the equation for a
line,
y = mx + b
where: (Since x predicts y or y depends on x)
x = independent variable, y = dependent variable
m or the slope of the line = regression coefficient

 In the social sciences, a commonly used statistical software package is SPSS (Statistical Package for the Social Sciences).
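The least-squares fit itself needs no special software; a Python sketch using hypothetical five-year enrolment figures, echoing the earlier enrolment-projection example:

```python
def linear_regression(x, y):
    """Ordinary least-squares fit of y = m*x + b; returns (m, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope m is the regression coefficient; b is the intercept.
    m = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return m, my - m * mx

# Hypothetical enrolment over five school years (x = year index),
# projected one year ahead.
years = [1, 2, 3, 4, 5]
enrolment = [700, 720, 745, 770, 790]
m, b = linear_regression(years, enrolment)
projection_year6 = m * 6 + b
```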

STUDENTS’ FAMILIARITY, INTEREST,
CONCEPTUAL KNOWLEDGE AND
PERFORMANCE ON BASIC AND
INTEGRATED SCIENCE PROCESS
SKILLS
METHODS

This chapter presents the following: the research methods used, the respondents and sampling procedures, the research locale, the instruments used, and the procedure of analysis.


Research Design

The descriptive method of research was employed in this study, using a questionnaire-checklist and a multiple-choice test on basic and integrated science process skills.

Descriptive research is used to obtain information concerning the current status of phenomena, to describe "what exists" with respect to variables or conditions in a situation. Its principal aims are to describe the nature of a situation as it exists at the time of the study and to explore the causes of particular phenomena (Travers, 2000).

A set of questionnaires was utilized to adequately describe the competencies on basic and integrated science process skills, and the data were presented in descriptive-interpretative form.


Research Design

This study gathered data on students' familiarity, interest, and conceptual knowledge using a survey questionnaire, and used a multiple-choice test to gather data on students' performance on basic and integrated science process skills. These variables were correlated with the attainment of students' competency on basic and integrated science process skills.
Participants
The respondents of the study were 776 secondary students of Juan R. Liwag Memorial High School and San Roque National High School in the Division of Gapan City during School Year 2012-2013. The two (2) secondary schools were selected from the Division of Gapan City through purposive sampling, since these schools offered the curriculum for special science classes. Purposive sampling is a design based on choosing individuals as samples according to the purposes of the researcher; they were chosen as part of the sample because of good evidence that they are representative of the total population (Calmorin and Calmorin, 1996).
Participants
• The sample was drawn through random

sampling. There were 194 grade 7, 194 second

year, 193 third year and 195 fourth year

students for a total of 776 respondents.

• Table 1 presents the breakdown of the

sample.
Table 1
Subjects of the Study

Population (N)
Year Level   JRLMHS Special   JRLMHS Regular   SRHS Special   SRHS Regular
Grade 7            80              1065              43            283
Second             81              1023              37            276
Third              92               833              33            251
Fourth             93               835              54            238
Total             346              3756             167           1048

Sample (n)
Year Level   JRLMHS Special   JRLMHS Regular   SRHS Special   SRHS Regular   Special Total   Regular Total   Total
Grade 7           44                62              32              56             76             118          194
Second            44                62              32              56             76             118          194
Third             44                62              32              55             76             117          193
Fourth            45                61              33              56             78             117          195
Total            177               247             129             223            306             470          776


Research Site

The study was conducted at the Department

of Education of Gapan City with two (2)

secondary schools purposively selected namely

Juan R. Liwag Memorial High School and San

Roque National High School since these schools

offered the curriculum for special science

classes.
Materials and Instruments
The research instruments utilized in this study were adapted from the work of Miles (2010) and Mungandhi (2005). The validated survey questionnaire was adapted to investigate the familiarity, interest, conceptual knowledge of, and performance on basic and integrated science process skills of secondary students.


Materials and Instruments
• The validated instrument has three (3) sections. The first part covers the students' familiarity with and interest in science process skills, measuring the respondents' awareness of the different science process skills and the extent to which they think they are interested in them.

• Adjacent to students' familiarity with and interest in science process skills is the secondary students' conceptual knowledge of science process skills, which is determined using Section 2 of the instrument. It includes the definitions and explanations from which students select the terms relating to science process skills.

• The two factors should be complementary to the students' performance on science process skills, which is determined using the fifty-item (50) test of basic and integrated process skills.


Materials and Instruments
• Familiarity with Science Process Skills Questionnaire
The research instrument used in the study was adapted from the Familiarity with Science Process Skills survey questionnaire of Miles (2010). This instrument was found to have a Cronbach alpha reliability coefficient of 0.953. The questionnaire contains 13 terms on science process skills, with three responses to choose from for each skill. The Likert format was used with the following response mode:
Term not familiar to me 1
Term familiar to me but not understood 2
Term familiar to me and I understand its meaning 3


Materials and Instruments
• Interest in Science Process Skills Questionnaire
The research instrument used in the study was adapted from the Interest in Science Process Skills survey questionnaire of Miles (2010). This instrument was found to have a Cronbach alpha reliability coefficient of 0.957. The questionnaire contains 13 terms on science process skills, with three responses to choose from for each skill. The Likert format was used with the following response mode:
Not at all interested in learning more 1
Interested in learning more 2
Very interested in learning more 3


Materials and Instruments
• Conceptual Knowledge of Science Process Skills Test
The conceptual knowledge of science process skills test was adapted from the Conceptual Knowledge of Science Process Skills survey questionnaires of Miles (2010) and Sta Ines (2012). It was found to have a Cohen's kappa inter-rater reliability of 0.250. This instrument determines the participants' conceptual knowledge of science process skills through definitions or explanations of twelve (12) science process skills, in which participants were asked to mark the definitions or explanations which correspond to the terms on science process skills.



Materials and Instruments
Conceptual Knowledge of Science Process Skills Test
The definitions or explanations of the science process skills were adapted using a variety (one to ten) of other definitions or explanations found in published research, as compiled by Miles (2010). The selection and development of these definitions by Miles (2010) were guided by the following:
1. The definitions referenced were taken from four research articles (Lancour, 2004; Valentino, 2000; Longfield, 2002; Emereole, 2008); five books devoted to the science process skills (Ostlund, 1992; Funk et al., 1985; Gega and Peters, 1998; Rezba et al., 2007; Chiappetta & Koball, 2010); and from the National Science Teachers Association (Padilla, 1990).
2. Based on item analysis done by Miles (2010), this instrument was checked for inter-rater reliability and yielded a Cohen's kappa rating of 0.250.
Materials and Instruments
Conceptual knowledge of Science process skills

test

The Likert-format was used in this instrument

using the following response mode:

Incorrect 1

Partially Correct 2

Correct 3
Materials and Instruments
Performance on Science Process Skills Test
This instrument was a compilation of test items from published, reliable, and valid science process skills performance tests constructed and developed by different authors. The first nineteen (19) multiple-choice items are categorized as basic science process skills, taken from the test on basic and integrated science process skills developed by Miles (2010), with a Cohen's kappa inter-rater reliability score of 0.764.


Materials and Instruments
Performance on Science Process Skills Test
Test items 20-49 were taken from the work of Mungandhi (2005), which consists of a test on integrated science process skills with a Cohen's kappa inter-rater reliability score of 0.81. The researcher included one (1) more item to complete the fifty-item (50) test on basic and integrated science process skills. This item is classified as an integrated science process skill and was also taken from the work of Miles (2010). The table below shows the summary of the content of the test of basic and integrated science process skills.


Materials and Instruments
Table 2
Content of the Test of Basic and Integrated Science Process Skills

Item Number                              No. of Items   Science Process Skill                   Type of Skill
1, 2, 3                                        3        Classification                          Basic
4, 5, 6                                        3        Predicting                              Basic
7, 8, 9                                        3        Inferring                               Basic
10, 11, 12, 13                                 4        Measuring                               Basic
14, 15, 16                                     3        Communicating                           Basic
17, 18, 19                                     3        Observing                               Basic
23, 24, 27, 29, 31, 34, 37, 44, 47, 50        10        Graphing and interpreting data          Integrated
22, 33, 35                                     3        Experimenting                           Integrated
28, 32, 36, 40, 43, 46                         6        Stating hypotheses                      Integrated
20, 26, 30, 38, 41, 42                         6        Operational definitions                 Integrated
21, 25, 39, 45, 48, 49                         6        Identifying and controlling variables   Integrated
Materials and Instruments
Reliability of the Instruments
Since this instrument is a compilation of test items from performance tests constructed and developed by different authors, the researcher ensured the reliability of the test. According to Rudner (1994), fundamental to the evaluation of any test instrument is the degree to which test scores are free from measurement error and are consistent from one occasion to another when the test is used with the target group. It should also be sufficiently reliable to permit stable estimates of the ability of individuals in the target group.
Materials and Instruments
Reliability of the Instruments
• The instrument was subjected to a test-retest with fifteen (15) third-year students from the Special Program in the Arts (SPA) at Juan R. Liwag Memorial High School. It was administered during the 2nd semester of School Year 2012-2013. Student scores were processed with QI Macros 2010. The reliability coefficient was computed on the basis of the answers given by the students. After the first test, the same instrument was given to the same group of SPA students after three weeks.


Materials and Instruments
Reliability of the Instruments

The reliability is the correlation between the scores on the two administrations. Using the Pearson product-moment correlation coefficient, the performance test was found to have a reliability coefficient of 0.89, which denotes a high relationship between the results of the first and second tests. This means the results are consistent and the scores of the two tests are similar. This reliability coefficient is within the recommended range of 0.7-1.0 (Adkins, 1974). Gall and Borg (1996), however, proposed a reliability coefficient of 0.8 or higher as sufficiently reliable for most research purposes.


Materials and Instruments
Data Collection

The table below shows the timeline for the completion

of the instrument, preparation, data collection and data

analysis of this research study.


Materials and Instruments
DATE                     METHOD
June-September 2012      Gathering of existing data; collection of instruments
October 2012             Human subjects committee request and approval
October 2012             Adapting and modifying existing instruments for familiarity, interest, conceptual knowledge, and performance on science process skills
November 2012            Administering the demographic form, the Familiarity with and Interest in Science Process Skills Questionnaires, and the Conceptual Knowledge of Science Process Skills test
December 2012            Administering the performance test on basic and integrated science process skills
January-February 2013    Data analysis of the demographic form and the Familiarity with and Interest in Science Process Skills Questionnaires
January-February 2013    Data analysis of the Conceptual Knowledge of Science Process Skills test
January-February 2013    Data analysis of the Performance on Science Process Skills test
Data Analysis
Familiarity with and Interest in Science Process Skills Questionnaire
To determine the secondary students' familiarity with science process skills, the data were analyzed for the mean on individual process skills. The overall familiarity with science process skills was determined by computing the mean scores. The data were then analyzed for differences among participant sub-groups (year level, curriculum enrolled), using the t-test for two uncorrelated samples and the one-way analysis of variance (ANOVA) test.
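The t-test for two uncorrelated samples reduces to the pooled-variance t statistic sketched below; the p-value is then read from a t table or software. The familiarity ratings used are hypothetical, not the study's data.

```python
def independent_t(x, y):
    """Pooled-variance t statistic for two uncorrelated samples;
    degrees of freedom are len(x) + len(y) - 2."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    ssx = sum((v - mx) ** 2 for v in x)  # sum of squared deviations
    ssy = sum((v - my) ** 2 for v in y)
    pooled_var = (ssx + ssy) / (nx + ny - 2)
    se = (pooled_var * (1 / nx + 1 / ny)) ** 0.5
    return (mx - my) / se

# Hypothetical mean familiarity ratings: special vs regular class.
t = independent_t([2.8, 2.6, 2.9, 2.7], [2.2, 2.4, 2.1, 2.3])
```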
Data Analysis
Familiarity with and Interest in Science Process Skills
Questionnaire
The same data analysis procedure described above was used to determine students' interest in science process skills. The weighted means of the responses were computed. The scales used to verbally describe the weighted means are as follows:
Familiarity with Science Process Skills
High Familiarity 2.4 - 3.00
Moderate Familiarity 1.61 - 2.39
Low Familiarity 1.00 - 1.60

Interest in Science Process Skills


Very Interested in Learning More 2.4-3.00
Moderately Interested in learning more 1.61-2.39
Not at all Interested in learning more 1.00-1.60
Data Analysis
Conceptual Knowledge of Science Process Skills Test
Students' conceptual knowledge was examined using the Conceptual Knowledge of Science Process Skills Test, in which participants were asked to mark the definitions or explanations which correspond to the terms on science process skills. The definitions or explanations of the science process skills were adapted using a variety (one to ten) of other definitions or explanations found in published research, compiled by Miles (2010).
Data Analysis
Conceptual Knowledge of Science Process Skills Test
Correct responses received a coding point value of 3, partially correct responses a point value of 2, and incorrect responses a point value of 1. The same data analysis procedure described above was used to determine students' conceptual knowledge of science process skills. The weighted means of the responses were computed. The scales used to verbally describe the weighted means are as follows:
Conceptual Knowledge on Science process Skills
High Conceptual Knowledge 2.4 - 3.00
Moderate Conceptual Knowledge 1.61 - 2.39
Low Conceptual Knowledge 1.00 - 1.60
Data Analysis
Performance Test on Science Process Skills
Performance on science process skills was measured using scores from the Performance Test on Basic and Integrated Science Process Skills. Each item was evaluated based on the number of participants correctly answering the question, using frequencies and percentages. Items that addressed the same process skill were then analyzed together, yielding percentages incorrect for the skill. This was done by calculating the overall incorrect results for all items that represented the skill. The data were analyzed for trends among sub-groups using both the t-test for two uncorrelated samples and one-way analysis of variance (ANOVA), as described in the familiarity and interest data analysis.
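The percentage-incorrect aggregation described here can be sketched as follows; the per-item correct counts are hypothetical, while N = 776 is the study's sample size and the skill groupings follow Table 2.

```python
# Hypothetical counts of correct answers (out of 776 respondents) for
# the items measuring two skills, grouped as in Table 2.
N = 776
correct_by_skill = {
    "classification": [610, 540, 580],  # items 1-3
    "predicting": [430, 455, 390],      # items 4-6
}

def percent_incorrect(correct_counts, n):
    """Overall percentage incorrect across all items measuring a skill."""
    incorrect = sum(n - c for c in correct_counts)
    return 100.0 * incorrect / (n * len(correct_counts))

by_skill = {skill: percent_incorrect(counts, N)
            for skill, counts in correct_by_skill.items()}
```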


Data Analysis
Relationship among Demographic Profile, Familiarity, Interest, Conceptual Knowledge and Performance

The Pearson Product-Moment Correlation was used to determine the correlations among the four variables (familiarity, interest, conceptual knowledge, and performance), as well as the relationship between the demographic profile (curriculum and year level) and the students' performance on the test on basic and integrated science process skills.
