
Computers & Education 58 (2012) 775–786


Embedding game-based problem-solving phase into problem-posing system for mathematics learning
Kuo-En Chang a, *, Lin-Jung Wu a, Sheng-En Weng a, Yao-Ting Sung b
a Graduate Institute of Information and Computer Education, National Taiwan Normal University, 162 Ho-Ping East Road, Sec. 1, Taipei 10610, Taiwan
b Department of Educational Psychology and Counseling, National Taiwan Normal University, 162 Ho-Ping East Road, Sec. 1, Taipei 10610, Taiwan

Article history:
Received 20 January 2011
Received in revised form 4 October 2011
Accepted 6 October 2011

Keywords:
Interactive learning environments
Elementary education
Teaching/learning strategies
Applications in subject areas

Abstract: A problem-posing system is developed with four phases including posing problem, planning, solving problem, and looking back, in which the "solving problem" phase is implemented by game-scenarios. The system supports elementary students in the process of problem-posing, allowing them to fully engage in mathematical activities. In total, 92 fifth graders from four different classes were recruited. The experimental group used the problem-posing system, whereas the control group followed the traditional paper-based approach. The study investigates the effects of the problem-posing system on students' problem-posing ability, problem-solving ability, and flow experiences. The results revealed more flow experiences, and higher problem-solving and problem-posing abilities in the experimental group. © 2011 Elsevier Ltd. All rights reserved.

1. Introduction

To help students actively engage in mathematical learning, numerous researchers have proposed incorporating problem-posing as
a supplement to instructional activities. It has been found that problem-posing activities can enhance students’ thinking and create more
learning opportunities for students at different levels (Baxter, 2005; Silver & Cai, 2005; Whitin, 2004). Tsubota (1987) noted that students
who usually appear inactive in classes become surprisingly active during problem-posing activities and adopt a positive attitude toward the
curriculum. The students also cross into other subject fields when generating problems. Tsubota believed that problem-posing is a cognitive
and metacognitive strategy: requiring students to focus on important concepts in the learning materials during problem-posing
improves their comprehension of the materials and allows them to monitor their understanding.
Numerous researchers have attempted to understand the relationship between problem-posing and problem-solving (Crespo, 2003;
English, 1998; Kar, Özdemir, İpek, & Albayrak, 2010; Keil, 1965; Leung & Silver, 1997; Nicolaou & Philippou, 2004; Silver & Cai, 1996). Keil
(1965) delivered a problem-posing instruction to approximately 800 sixth graders. He found that for mathematical problem-solving
ability, those in the experimental group, who generated problems on their own and then solved them, performed better than those in
the control group, who only solved problems from the textbook. Silver and Cai (1996) investigated the relationship between problem-posing
and problem-solving in a group of 509 middle school students. Problem-posing ability (scored according to the complexity of steps required
to solve the problem) was compared to the students’ performances on solving eight open-ended questions. They found a highly positive
correlation between problem-posing and problem-solving abilities. English (1997a, 1997b) also found a close relationship between the two
abilities. This suggests that the inspiration for problem-posing could come from the process of problem-solving. Problem-posing not only
trains students to identify the key elements in a question, but also enhances their problem-solving ability. Nicolaou and Philippou (2004)
investigated the relationship between problem-posing and mathematical achievement, and found a strong correlation between the ability
in problem-posing and general mathematical performance. Additionally, it was found that efficacy beliefs in problem-posing could predict
mathematical achievement fairly well. Kar et al. (2010) also found a significant relationship between the problem-posing and problem-
solving skills of prospective elementary mathematics teachers. Prospective teachers who posed two or more problems had higher
levels of problem-solving than those who posed fewer than two problems. A parallel was established between the number

* Corresponding author. Fax: +886 2 2392 2673.
E-mail address: kchang@ntnu.edu.tw (K.-E. Chang).

0360-1315/$ – see front matter © 2011 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2011.10.002

of posed problems and success in problem-solving. Thus, problem-posing is not independent from problem-solving (Cai & Hwang, 2002;
English, 2003; Lewis, Petrina, & Hill, 1998; Silver, 1994; Silver & Cai, 1996).
Leung (1993) adapted the four phases of problem-solving, introduced by Polya (1945), to the four phases of problem-posing: problem-
posing, planning, problem-solving, and looking back. She found that the problem-posing activities helped students solve problems that they
generated by themselves. When solving their own posed problems, they performed calculations and looked back at the process. They then
revised their posed problems and generated new ones. Through the process of problem-posing, students could clarify and enhance their
understanding of the concepts of the subject matter.
With the rapid advancements in information technology and the discovery of the boosting effect of problem-posing on problem-solving
abilities and motivation in mathematical learning, researchers have attempted to develop web-based problem-posing systems by applying
information technology (Barak & Rafaeli, 2004; Denny, Hamer, Luxton-Reilly, & Purchase, 2008; Fellenz, 2004; Hirai & Hazeyama, 2007;
Wilson, 2004; Yu, 2011; Yu & Liu, 2009; Yu, Liu, & Chan, 2005). Most of them focused on problem-posing by using different media formats,
and some included a peer-assessment function during students’ problem-posing activities (Yu, 2011; Yu & Liu, 2009; Yu et al., 2005).
However, the same difficulties that are encountered in traditional problem-posing activities also occur in web-based problem-posing
activities. Yu et al. (2005) investigated four web-based learning activities (problem-posing, peer-assessing, item-viewing, and drill-and-
practice) in mathematics, natural science, and social science. They found that most students (69%) felt that problem-posing was the
most difficult task of the four, especially in mathematics. Observations made during the study showed that lower-achieving students had an
especially difficult time generating questions. The students began to tire as the number of problem-posing exercises increased, which
adversely affected the learning outcomes. Furthermore, because the students usually only solve questions posed by themselves, there is
a lack of variation in problem types. As a result, students are not motivated to perform the activity, which makes it difficult to promote
problem-posing during the instruction (English, 1998; Mestre, 2002). Because of the drawbacks of problem-posing, most researchers believe
that problem-posing should be integrated with other activities to sustain students’ intrinsic motivation. For example, games could be added
to the original problem-posing activities (Umetsu, Hirashima, & Takeuchi, 2002; Yu et al., 2005).
Computer games as educational tools have been suggested as an intrinsic motivational factor that encourages curiosity and allows
learners to be in control of their own learning (Dickey, 2007; Huizenga, Admiraal, Akkerman, & ten Dam, 2009; Kumar, 2000; Papastergiou,
2009). Recent studies have shown that a game-based learning environment can facilitate cognitive performance such as geometric thinking
(Chang, Sung, & Lin, 2007), multiplication (Chang, Sung, Chen, & Huang, 2008), and taxonomic concept development (Sung, Chang, & Lee,
2008). Several studies on computer gaming have examined learning engagement (Barab, Sadler, Heiselt, Hickey, & Zuiker, 2007; Gee, 2003,
2005; Huizenga et al., 2009; Ketelhut & Schifter, 2011; Sanchez & Olivares, 2011; Squire, 2003; Squire & Jan, 2007; Squire & Jenkins, 2004),
and numerous researchers have investigated whether game-based learning increases learning motivation (Burguillo, 2010; Charsky &
Ressler, 2011; Huang, Huang, & Tschopp, 2010; Liu & Chu, 2010). Some researchers have found that game-based learning can produce a flow
learning experience (Admiraal, Huizenga, & Akkerman, 2011; Liu, Cheng, & Huang, 2011). Several researchers also suggest that being in
a state of flow could augment students’ motivation and learning process (Chen, Wigand, & Nilan, 1999; Keller, 2009).
Flow occurs when people fully engage in a task with personal satisfaction (Csikszentmihalyi & Csikszentmihalyi, 1988). Game users
immerse themselves in a flow experience during game play, and because motivation and flow experience are positively related, a game
that produces flow enhances players' motivation to keep playing. Several studies have investigated what sort of design features would enhance learning
engagement and motivation by measuring students’ flow experiences in a game-based learning context (Inal & Cagiltay, 2007; Kiili, 2005;
Lim, Nonis, & Hedberg, 2006).
This study proposed a system with four problem-posing phases: posing problem, planning, solving problem, and looking back, in which
the "solving problem" phase is implemented through game-scenarios to support students in the process of problem-posing, allowing them to
fully engage in the problem-posing activities. The study also investigated the effects of the problem-posing system on students' problem-
posing ability, problem-solving ability, and flow experience by examining the processes of the problem-posing system.

2. Problem-posing activities and system design

2.1. Problem-posing activities

The design of problem-posing activities in the study is based on the following two models: (1) the model proposed by Polya (1945), which
entails four phases of problem-solving (understand → plan → carry out → look back), and (2) the four phases of problem-posing that Leung
(1993) adapted from Polya's work (pose problems → plan → carry out → look back).
The problem-solving model of Polya shows that, when solving problems posed by others, the problem solver should understand the
content of the problem before planning the solution. Four phases are implemented as a top-down process. However, if the problem solver is
also the problem poser, he or she can immediately plan the solution and execute it (Leung, 1993). Finally, when looking back at the solving
process, the student might be prompted to create new problems. Once again, this induces the cycle of the four phases of problem-posing:
problem-posing, planning, carrying out, and looking back.
Therefore, the phases of problem-posing and problem-solving are connected to form a continuous cycle (Leung, 1993), through which
problem posers are able to reflect on whether the posed problems are appropriate. Since students either pose problems or solve those
generated by others, they have to identify key points in the question, determine the solution, solve it, and reflect upon it. After sufficient
practice, students will acquire the skills needed to solve similar problems, consequently increasing the processing speed in problem-solving.
The procedures of problem-posing and problem-solving are closely tied to each other and foster students’ synthesis and induction abilities
(Dillon, 1982).

2.2. System design

We implemented the system and developed its procedure based on Leung’s problem-posing model (1993), adapted from Polya. It has
four steps: posing problem, planning, solving problem, and looking back (see Table 1).

Table 1
Procedure of problem-posing.

Sequence  Four phases of problem-solving  Four phases of problem-posing   Procedure of the proposed problem-posing
          according to Polya (1945)       adapted by Leung (1993) from
                                          Polya's model
1         Understand                      Pose problems                   Self-posed problems
2         Plan                            Plan                            1. Attempt to solve self-posed problem
                                                                          2. Obtain feedback from the teacher
                                                                          3. Judge whether the solution is reasonable
                                                                          4. Refine problems
3         Carry out                       Carry out                       Solve posed problems in the game-based setting
4         Look back                       Look back                       1. More feedback from the teacher
                                                                          2. Get new ideas and be prompted to create new problems

Instructions are provided at each interface to help students understand the process of problem-posing. When they log into the system
and enter the interface of the problem-posing activity, they encounter four further phases.

1. Posing problems. Students are asked to pose problems in a given range and type them into the system. They type in the content of their
problems and the correct answers (see Fig. 1).
2. Planning. After students send out the posed problems, they proceed to the interface in which the posed problems are verified. All the
posed problems are listed, and students can refine them as they wish. By clicking on the trial button, students can try out the questions
they have created in the game-based setting. They obtain feedback from their teacher and judge whether the solution is reasonable.
They can use the refine button to refine their posed problems (see Fig. 2).
3. Solving problem. In this phase, students solve the posed problems in the game-based setting. The posed problems and their detailed
response options are shown on the left of the interface. The game field is displayed on the right of the interface, with response options
embedded into it. The system contains six interactive games: Coby the Cat, Millionaire, Star Wars, Rescuing the Mice, Uncle Tu Climbing
the Coconut Palm, and Racers.

For example, "Millionaire" (see Fig. 3) is a game with three lifelines (elimination, call-in, and call-out) that contestants can use when
they need help. Following Prensky's (2000) elements of educational game design, the six games incorporate three features, namely
a continuous game scenario, immediate feedback, and rewards, intended to pull students into a state of flow. Students try
to win the game, and at the same time they revisit and review the learning material (Lepper, Iyenger, & Corpus, 2005). In doing so, students
activate their prior knowledge and transfer knowledge from other venues (Oblinger, 2004). The game-scenarios in the "solving problem"
step were designed to help students achieve a state of flow, which allows us to investigate students' flow experiences during game play.

Fig. 1. The first phase of problem-posing: posing problems.



Fig. 2. The second phase of problem-posing: planning.

4. Looking back. After students complete the solving-problem phase, they are taken to the feedback interface, which provides
immediate information about their performance, such as the time spent, the number of correct answers, and their scores in that phase.
Students then receive feedback from their teacher. Finally, they return to the posing-problem phase and start generating a new problem.

3. Method

The purpose of the study is to understand the effect of the system on the problem-posing and problem-solving abilities and flow
experiences of elementary students. The topic used for research was the addition and subtraction of fractions with different denominators,
presented in the form of word problems. We designed the study to detect changes before and after treatment in the four dimensions of
problem-posing (that is, accuracy, flexibility, elaboration, and originality), in problem-solving (that is, equation-listing and calculation),
and in flow experiences.
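The mathematical procedure under study, adding and subtracting fractions with unlike denominators, follows the usual classroom steps: rescale to a common denominator, combine numerators, and reduce. A minimal sketch of that procedure (the function name and tuple representation are our own illustration, not part of the system):

```python
from math import gcd, lcm

def combine_fractions(a, b, op="+"):
    """Add or subtract two fractions given as (numerator, denominator)
    pairs: rescale both to the least common denominator, combine the
    numerators, then reduce to lowest terms."""
    (an, ad), (bn, bd) = a, b
    common = lcm(ad, bd)                 # least common denominator
    left = an * (common // ad)           # rescaled numerators
    right = bn * (common // bd)
    num = left + right if op == "+" else left - right
    g = gcd(num, common) or 1            # reduce to lowest terms
    return (num // g, common // g)
```

For example, 1/2 + 1/3 = 5/6 and 3/4 - 1/6 = 7/12, the kind of item students posed and solved in this study.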

3.1. Participants

The study involved four fifth-grade classes from an elementary school in Taipei, with 92 participants consisting of 48 males and
44 females with an average age of 11 years. The students were assigned to groups according to their classes. The two experimental
classes used the problem-posing system while the two control classes took the traditional paper-based approach to the problem-posing

Fig. 3. Screenshot of “Millionaire”.



activity. The experimental and control groups contained 45 and 47 students, respectively. Both the experimental and the control groups had
the same teacher for the experiment.

3.2. Experimental design

For this research we implemented a quasi-experimental design. The independent variable was the group treatment (that is, the problem-
posing system vs. the traditional paper-based instruction), and the dependent variables were the posttest scores for the four dimensions of
problem-posing and for overall problem-solving ability. The experiment was conducted once a week for two weeks, with the activity divided
into two sessions of 80 min. Before the experiment, both groups took the pretests for the problem-posing and problem-solving tests. Then,
the experimental classes were given an additional 40 min to try out the system on their own. Next, as shown in Fig. 4, the experimental
classes posed problems using the system for 20 min, and verified these self-posed problems for another 20 min. Students got feedback from
their teacher and were given 10 min to refine their self-posed problems. They were then asked to put their revised posed problems into the
system again. In the following 20 min, students were asked to solve the posed problems in the game-based setting. Finally, students obtained
feedback from the teacher and were given new ideas to create new problems for 10 min. The control classes were involved in the same
problem-posing activity by using the traditional paper-based method, as shown in Fig. 4. Students in the control group first carried out the
paper-based problem-posing activity for 20 min. Then, they were asked to verify their self-posed problems for 20 min and revise the
questions for another 10 min. For the next 20 min they solved the posed problems using paper and pencil. Finally, they received feedback
from the teacher, and over the next 10 min generated new questions. The Flow Experience Scale was given to both groups at the end of the
experiment. After the two-week experiment was finished, each participant was given the posttest of problem-posing and problem-solving.

3.3. Instruments

The problem-posing system was implemented on a tablet PC, which allowed handwriting input in the classroom.

3.3.1. Pre- and posttests for problem-posing


The study developed pre- and posttests for problem-posing to examine the changes in students’ problem-posing ability after the
experiment. In the pre- and posttests, participants were asked to pose five free questions and five semi-structured questions (with a given
stem) on the addition and subtraction of fractions with different denominators. The posed problems were scored for accuracy, flexibility,

Fig. 4. Procedure of the experimental group and control group. Both groups spent 160 min in total: a pretest; Phase 1, "Posing problems" (problem-posing activity, 20 min); Phase 2, "Planning" (verifying self-posed problems, 20 min, then revising them according to the teacher's feedback, 10 min); Phase 3, "Solving problem" (solving posed problems, 20 min); Phase 4, "Looking back" (obtaining the teacher's feedback and getting new ideas to create new problems, 10 min); phases 1–4 repeated in week 2; and a posttest. The experimental group carried out every phase system-based and the control group paper-based.



elaboration, and originality (Torrance, 1966). For each problem, the scores from all the dimensions were summed up to form the score for the
overall problem-posing.

(1) Accuracy: If a problem was posed accurately and without any help, one point would be awarded; otherwise, the problem would be
scored as 0.
(2) Flexibility: This refers to the number of correctly posed problem types. Although the number of posed problems
matters, variety in problem types is also encouraged, since it is an important indication of creativity, showing that participants are not
constrained to a single model. For problem-posing, high flexibility refers to the ability to apply a variety of models. For example,
problem A states, "Mike has two apples, and Helen has three apples. How many apples do they have in total?" Problem B states, "Mike has
two apples, and Helen has three more apples than Mike. How many apples do they have in total?" Problem B contains two problem types,
comparison and addition, while problem A consists only of addition. Hence, problem B is regarded as more flexible.
(3) Elaboration: This denotes the solving steps of the posed problem. One point would be given for each translation (i.e., representation),
comparison, or operation the problem requires. Problems not awarded for accuracy would not be awarded for elaboration either. In the
past, problem-posing was assessed according to the complexity of posed problems and the solving steps required (Getzels & Jackson, 1962).
The schema concept from cognitive psychology can be applied here for evaluating the elaboration of posed problems: how many mathe-
matical schemas are incorporated into a problem? For example, problem A states, "Mike has two apples, and Helen has three apples. How
many apples do they have in total?" Problem B states, "Mike has two apples, and Helen has three more apples than Mike. How many apples do
they have in total?" Problem B incorporates two problem types, comparison and addition, and requires two steps of
calculation. In contrast, problem A involves only addition, requiring a one-step operation. Therefore, problem B can be
deemed more elaborate.
(4) Originality: Another important characteristic of creativity is originality, meaning that the posed problem types differ from those posed
by others (Mayer, 1982). The scoring for originality was determined by the percentage occurrence of each problem type in the pilot. If the
posed problem type occurred in 5% or more of cases, one point would be given; types occurring in 2–4.99% of cases and in less than 2%
of cases were awarded 2 and 3 points, respectively. Problems not awarded for accuracy would not be awarded for
originality either.
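The four-dimension rubric can be summarized in code. This is an illustrative sketch only; the function signature and data representation (a set of problem types, a step count, and a pilot frequency) are our assumptions, not the authors' implementation:

```python
def score_posed_problem(accurate, problem_types, solving_steps, pilot_freq):
    """Score one posed problem on accuracy, flexibility, elaboration,
    and originality, following the thresholds described in the text.

    accurate:      was the problem posed correctly and without help?
    problem_types: set of problem types used (e.g. {"addition", "comparison"})
    solving_steps: number of translation/comparison/operation steps required
    pilot_freq:    fraction of pilot problems sharing this problem type
    """
    if not accurate:
        # Inaccurate problems earn no points on any dimension.
        return {"accuracy": 0, "flexibility": 0, "elaboration": 0,
                "originality": 0, "total": 0}
    accuracy = 1
    flexibility = len(problem_types)   # number of problem types used
    elaboration = solving_steps        # one point per required step
    if pilot_freq >= 0.05:             # common type: 5% and above
        originality = 1
    elif pilot_freq >= 0.02:           # 2-4.99%
        originality = 2
    else:                              # rare type: below 2%
        originality = 3
    total = accuracy + flexibility + elaboration + originality
    return {"accuracy": accuracy, "flexibility": flexibility,
            "elaboration": elaboration, "originality": originality,
            "total": total}
```

Under this sketch, problem B from the example (comparison plus addition, two solving steps) outscores problem A (addition only, one step) on both flexibility and elaboration.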

The initial draft of the test was reviewed by experts in mathematics education and four experienced elementary school teachers of grade
five and six mathematics. The revised draft of the test was later piloted on 70 participants from an elementary school in Taipei. From these,
10 participants were randomly selected to analyze their problem-posing performance according to the four dimensions (accuracy, flexibility,
elaboration, and originality). The scoring for problem-posing ability was performed by one of the researchers and an experienced
elementary school teacher. The inter-rater correlation coefficients were .863 for accuracy, .821 for flexibility, .782 for elaboration, and
.835 for originality, indicating acceptable reliability for the problem-posing tests.
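The inter-rater agreement reported above can be computed as a correlation between the two raters' score lists. The paper does not name the coefficient; a Pearson correlation is the common choice, sketched here (the data in the usage example are made up for illustration):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two raters' score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

Two raters whose scores rise and fall together yield r near 1; values around .8, as here, indicate strong but imperfect agreement.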

3.3.2. Pre- and posttests for problem-solving


The study developed pre- and posttests to examine the changes in students’ problem-solving ability after the experiment. The subject for
the pre- and posttests was addition and subtraction of fractions with different denominators, and both tests included twelve questions.
Experts in mathematics education and four elementary school teachers experienced in teaching mathematics to the fifth and sixth grades
were invited to comment on the initial draft of the tests. The tests were analyzed for internal consistency using the Kuder-Richardson
Formula 20. The reliability coefficients were .69 and .76 for the pre- and posttests, respectively.
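The Kuder-Richardson Formula 20 estimates internal consistency from dichotomous (right/wrong) item responses. A minimal sketch (the list-of-lists data representation is our own):

```python
def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomous (0/1) item scores.

    responses: one list per student, one 0/1 entry per item.
    KR-20 = (k / (k - 1)) * (1 - sum(p*q) / var(total scores)),
    where p is the proportion answering item i correctly and q = 1 - p.
    """
    n = len(responses)
    k = len(responses[0])
    totals = [sum(r) for r in responses]
    mean_total = sum(totals) / n
    # Population variance of total scores (the conventional choice here).
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    pq_sum = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n   # item difficulty
        pq_sum += p * (1 - p)
    return (k / (k - 1)) * (1 - pq_sum / var_total)
```

Values of .69 and .76, as reported for the pre- and posttests, fall in the range usually considered adequate for classroom achievement tests.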

3.3.3. The Flow Experience Scale


In this study, the Flow Experience Scale was adapted from the method of Hoffman and Novak (2009). The scale was divided into three sections.
The first section handled participants’ experiences of the problem-posing activity and computer usage. The second section assessed
dimensions of the flow state and participants’ perceptions of the problem-posing activity, which included three factors: antecedents of flow,
structural properties of flow, and the experience of flow. The third section handled the flow experience in the problem-posing activity.
Prior to the actual assessment, participants were required to read three descriptions of a flow experience. Then, the scale posed three
questions about the flow experience during the problem-posing activity: Did you experience the flow state described above? How
frequently did you experience flow? Did you feel that you were in flow during the problem-posing activity?
The scale ranged from 1 to 5, where 1 represents strongly disagree and 5 represents strongly agree. Unanswered items were considered
as missing data. The adapted scale of the flow experience was piloted with 198 sixth graders, with the obtained reliability coefficient of .791
indicating a moderate internal consistency.

4. Results

4.1. Analysis of problem-solving scores

Table 2 presents the mean and SD values of the pre- and posttests. One-way ANCOVA was used to examine the posttest scores after
excluding the effect of the pretest (that is, the covariate). Prior to the one-way ANCOVA, the test of homogeneity of within-class regression was
conducted for overall problem-solving, equation-listing, and calculation scores, for which the F values were 10.04 (p = .002 < .05), 9.88
(p = .00 < .05), and 5.59 (p = .02 < .05), respectively. These significant differences indicated that the two within-class regression slopes were
nonparallel and that the presumption of homogeneity of the within-class regression coefficient was not valid. Therefore, ANCOVA could not
be used to analyze the differences in the scores of the posttest. Instead, the Johnson–Neyman method was implemented to examine the
interaction effect of the instruction methods.
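The Johnson–Neyman procedure finds the range of covariate (pretest) values over which the group difference in predicted posttest scores is statistically significant. A numerical "floodlight" sketch of the idea under the usual interaction-regression model (our illustration, not the authors' code; numpy and scipy are assumed available):

```python
import numpy as np
from scipy import stats

def jn_significant_region(pretest, group, posttest, alpha=0.05, points=200):
    """Fit posttest = b0 + b1*group + b2*pretest + b3*group*pretest and
    return the grid of pretest values at which the estimated group
    difference d(x) = b1 + b3*x is significant at level alpha."""
    n = len(posttest)
    X = np.column_stack([np.ones(n), group, pretest, group * pretest])
    beta, *_ = np.linalg.lstsq(X, posttest, rcond=None)
    resid = posttest - X @ beta
    sigma2 = resid @ resid / (n - 4)          # residual variance
    cov = sigma2 * np.linalg.inv(X.T @ X)     # covariance of coefficients
    tcrit = stats.t.ppf(1 - alpha / 2, n - 4)
    grid = np.linspace(pretest.min(), pretest.max(), points)
    diff = beta[1] + beta[3] * grid           # group difference at each x
    se = np.sqrt(cov[1, 1] + grid ** 2 * cov[3, 3] + 2 * grid * cov[1, 3])
    return grid[np.abs(diff / se) > tcrit]
```

When the two regression lines are nonparallel, the significant region is typically one-sided, matching the finding below that only students with lower pretest scores differed between groups.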

Table 2
Mean (SD) values of pretest and posttests for problem-solving.

                      Equation-listing   Calculation    Overall problem-solving
                      Mean (SD)          Mean (SD)      Mean (SD)
Experimental group
  Pretest (n = 45)    10.16 (1.86)       8.11 (2.69)    18.27 (4.39)
  Posttest (n = 45)   10.71 (1.63)       9.24 (2.35)    19.95 (3.70)
Control group
  Pretest (n = 47)    9.97 (2.32)        7.83 (2.43)    17.77 (4.48)
  Posttest (n = 47)   9.68 (2.42)        7.87 (2.89)    17.55 (5.11)
Total
  Pretest (n = 92)    10.04 (2.10)       7.97 (2.55)    18.01 (4.42)
  Posttest (n = 92)   10.18 (2.13)       8.54 (2.71)    18.73 (4.61)

Figs. 5–7 show the regression intersection point and significant difference point for the posttest scores in overall problem-solving,
equation-listing and calculation as derived by the Johnson–Neyman method and the regression lines of the two groups.
Examination of the interactions revealed that the posttest scores (maximum score = 24) for overall problem-solving did not differ
between the two groups for pretest scores between 19.39 and 22.27. However, for pretest scores lower than 19.39, the posttest performance
of the experimental group was better than that of the control group, indicating that students with pretest scores lower than 19.39 were more
likely to improve in overall problem-solving after using the system.
The posttest scores for tests of equation-listing (maximum score = 12) did not significantly differ between the groups with pretest scores
between 8.72 and 10.94, whereas the posttest performance of the experimental group was better than that of control group students with
pretest scores lower than 8.72. This indicates that students with pretest scores lower than 8.72 were more likely to exhibit improved
equation-listing after using the system.
The posttest scores for calculation (maximum score = 12) revealed that when pretest scores were lower than 10.56, the posttest performance of
the experimental group was better than that of the control group. There was no difference between the two groups when pretest scores were
between 10.56 and 11.87, indicating that students with pretest scores lower than 10.56 benefited more from the system.
For the posttest scores of overall problem-solving, equation-listing, and calculation (full marks of 24, 12, and 12, respectively), the
interaction and the points of significant differences of the within-class regression slopes were examined using the Johnson–Neyman
method. The two groups showed no significant differences for pretest scores in the upper ranges (19.39–24, 10.56–12, and 8.72–12).
Therefore, neither the experimental group nor the control group appeared to be more effective in promoting
overall problem-solving, equation-listing, and calculation. However, when the pretest score was lower than 19.39 (overall problem-solving
score, see Fig. 5), 10.56 (equation-listing score, see Fig. 6), or 8.72 (calculation score, see Fig. 7) the experimental group was found to be more
effective than the control group. In other words, students with lower pretest scores benefited more from the problem-posing system.

Fig. 5. Regression lines of the experimental and control groups on the test scores for overall problem-solving.

Fig. 6. Regression lines of the experimental and control groups on the test scores for equation-listing.

4.2. Analysis of problem-posing scores

Table 3 presents the results of the pre- and posttests, including the sample size, mean, and SD values of the scores in the control and
experimental groups. One-way ANCOVA was used to examine the posttest scores after excluding the effect of the pretest (that is, the
covariate). First, tests of the homogeneity of the regression coefficients of the posttest scores were conducted for the overall problem-posing,

Fig. 7. Regression lines of the experimental and control groups on the test scores for calculation.

Table 3
Mean (SD) values of pretest and posttests for problem-posing.

                      Accuracy      Flexibility   Elaboration   Originality   Overall problem-posing
                      Mean (SD)     Mean (SD)     Mean (SD)     Mean (SD)     Mean (SD)
Experimental group
  Pretest (n = 45)    2.89 (1.53)   2.78 (1.72)   3.36 (2.01)   4.96 (3.08)   13.98 (8.03)
  Posttest (n = 45)   4.47 (.79)    5.4 (1.83)    6.56 (2.05)   9.18 (3.33)   25.60 (7.30)
Control group
  Pretest (n = 47)    2.94 (1.67)   3.15 (1.73)   3.26 (2.02)   5.38 (3.43)   14.72 (8.52)
  Posttest (n = 47)   3.53 (1.32)   3.91 (1.75)   5.13 (2.39)   7.15 (3.54)   19.72 (8.49)
Total
  Pretest (n = 92)    2.91 (1.59)   2.97 (1.73)   3.30 (2.00)   5.17 (3.26)   14.36 (8.25)
  Posttest (n = 92)   3.99 (1.18)   4.64 (1.93)   5.83 (2.33)   8.14 (3.57)   22.60 (8.43)

Table 4
Summary data of overall problem-posing, accuracy, flexibility, elaboration, and originality from ANCOVA.

Source of variance               SS        df   MS        F        p
Overall problem-posing  Pretest  1217.71   1    1217.71   24.36*   .00
                        Group    884.08    1    884.08    17.69    .00
                        Error    4448.50   89   49.98
Accuracy                Pretest  31.19     1    31.19     36.67*   .00
                        Group    20.84     1    20.84     24.49    .00
                        Error    75.71     89   .85
Flexibility             Pretest  45.69     1    45.69     16.75*   .00
                        Group    60.99     1    60.99     22.36*   .00
                        Error    242.78    89   2.73
Elaboration             Pretest  73.26     1    73.26     17.38*   .00
                        Group    43.94     1    43.94     10.43*   .00
                        Error    375.09    89   4.21
Originality             Pretest  185.90    1    185.90    18.83*   .00
                        Group    112.49    1    112.49    11.39*   .00
                        Error    878.63    89   9.87

*p < .05.

Table 5
Summary of the independent t-test on Flow Experience Scale scores.

Group         N    Mean    SD      t
Experimental  41   78.93   12.54   5.39***
Control       39   62.85   14.15
Total         80   71.08   15.53

***p < .001.

accuracy, flexibility, elaboration, and originality scores, for which the F values were 1.46 (p = .23 > .05), 2.68 (p = .11 > .05), .05 (p = .48 > .05), 2.03 (p = .16 > .05), and .32 (p = .57 > .05), respectively. These results satisfy the assumption of homogeneity of regression coefficients. ANCOVA was then used to examine the data (see Table 4). The overall problem-posing, accuracy, flexibility, elaboration, and originality scores were significantly higher in the experimental group than in the control group (F = 17.69, p < .01 for overall problem-posing; F = 24.49, p < .01 for accuracy; F = 22.36, p < .01 for flexibility; F = 10.43, p < .01 for elaboration; and F = 11.39, p < .01 for originality).

4.3. Analysis of the Flow Experience Scale

Table 5 shows the results of the Flow Experience Scale, including the sample sizes, means, and SDs in the control and experimental groups. An independent t-test indicated that flow experience scores were significantly higher in the experimental group than in the control group (t = 5.39, df = 78, p < .001).
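For reference, an independent-samples t-test of this form can be sketched with scipy. The scores below are synthetic draws matching the reported group means and SDs, not the study's raw data.

```python
# Sketch of the reported flow-experience comparison: Student's t-test on
# synthetic group scores generated from the published means and SDs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
exp_flow = rng.normal(78.93, 12.54, 41)   # experimental group, n = 41
ctrl_flow = rng.normal(62.85, 14.15, 39)  # control group, n = 39

# Classic equal-variance t-test; df = 41 + 39 - 2 = 78, as in the paper.
t_stat, p_value = stats.ttest_ind(exp_flow, ctrl_flow, equal_var=True)
print(f"t({len(exp_flow) + len(ctrl_flow) - 2}) = {t_stat:.2f}, "
      f"p = {p_value:.4f}")
```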

5. Discussion and conclusions

This study involved the development and implementation of a problem-posing system and explored its effectiveness for developing problem-posing ability, problem-solving ability, and flow experience. Three conclusions were reached. First, the overall problem-posing, accuracy, flexibility, elaboration, and originality scores were significantly higher in the experimental group than in the control group. Second, the Flow Experience Scale results indicated that flow experience scores were significantly higher in the experimental group than in the control group. Finally, the problem-posing system was effective at improving the performance of students with lower problem-solving scores.

For the four dimensions of problem-posing ability and their total scores, the study found that the experimental group performed significantly better than the control group. Compared with traditional problem-posing activities, the scenarios in the system may have provided students with more opportunities to reflect on their problem-posing techniques by examining the problems they posed. The features of the system allowed students to improve their problem-posing skills: students in the experimental group repeatedly returned to the problem-solving interface to improve their scores, and in doing so had more opportunities to view and solve additional posed problems. Through the system's functions and the comments and feedback given by the teacher, students could refine their problem-posing skills. Thus, students in the experimental group performed better on the problem-posing posttest.
For example, a student from the experimental group posed the following problem in the pretest: “John has 2/3 of a pack of cookies. After
giving his classmate 1/6 of the pack of cookies, how much is left for John?” For the posttest, the same student posed the following question:
“Jenny has 1/4 of a box of chocolates, which is less than Mark who has 3/8 of the box. How much of a box of chocolates do they have
altogether?” The examples show that the student progressed in terms of flexibility, from using one problem type to integrating two problem
types. The elaboration dimension evolved from one-step to two steps, and the student also employed the keywords “less” and “altogether”
to evoke problem-solving. A problem solved by the student was: “Peter ate 4/9 of a cake. Leo had 2/5 more than Peter. How much of the cake
did they have altogether?” From this question, it is evident that students reflected on and adjusted their problem-posing skills during the process of game-based problem-solving, and that these behaviors in turn influenced their problem-posing performance. This result is compatible with the study by Kaberman and Dori (2009), which found that stimulating students to pose problems enables them to become aware of their own cognitive processes and to self-regulate with respect to the learning task.
The problem-posing activities played an important role in students’ understanding of mathematics concepts. To succeed in problem-
posing activities, students must integrate their own experience and knowledge. When their knowledge was inadequate to accomplish
the activity, they had to revisit the materials to add to information temporarily stored in their memories. In other words, the system
encouraged students to focus again on materials they had already encountered, and apply strategies such as review, elaboration, organi-
zation, planning, and adjustment to further process information and make more connections with the mathematical concepts.
The results of the Flow Experience Scale indicate that the experimental group was in a higher state of flow than the control group. This result agrees with Barak and Rafaeli's (2004) finding that students who were highly motivated to engage in online problem-posing activities achieved higher scores on their final examination. While in a state of flow, students spend more time participating in the activities to achieve a satisfying learning outcome (Ellington, Adinall, & Percival, 1982). In contrast with the
experimental group, the traditional problem-posing instruction in the control group was constrained by the classroom environment.
Participants became tired of the task as they continued to pose or solve more problems, which adversely affected their learning outcomes
(Yu et al., 2005). Conversely, the experimental group was infused with challenges, which aroused and sustained students’ interest in the
problem-posing activities. The result is consistent with the contention that computer games can induce a higher interest level in students,
which in turn enables them to experience a higher level of flow (Inal & Cagiltay, 2007; Raybourn & Bos, 2005).
Finally, we found that the system was effective at improving the performance of students with lower problem-solving scores (see Figs. 5–7). This is only partially consistent with several results suggesting that problem-posing instruction can enhance problem-solving ability in general (English, 1997a, 1997b; Skinner, 1991; Tsubota, 1987). The discrepancy might be caused by a ceiling effect for high performers (Berger, 1992): the posttest scores of high performers in both groups were already close to the full mark, which resulted in nonsignificant differences between the groups.
Conversely, there was a significant difference in problem-solving posttest scores between the low performers of the two groups. The system might have enhanced the motivation of the experimental group and thereby improved their problem-solving ability: the immediate feedback and reward characteristics of games (for example, game scores and rankings) encouraged students to complete the problem-solving tasks. This is consistent with previous research indicating that game-based activities can induce and sustain motivation in learning (Burguillo, 2010; Huang et al., 2010; Liu & Chu, 2010), and with the research by Yu et al. (2005) on ways to improve low achievers' problem-solving performance. In accordance with the results of the Flow Experience Scale, the problem-posing system indeed augmented motivation and engagement in learning, which in turn could induce better performance.
Being able to correctly pose mathematical problems is important for understanding mathematical concepts, and enhancing learning engagement by organizing thoughts and posing new problems has positive effects on learners. The experimental results indicate that the system can improve the learning engagement (that is, flow experience) of fifth graders and their ability to pose word problems involving the addition and subtraction of fractions with different denominators. The study has shown that problem-posing activities enhanced by technological tools can benefit the learning process. This is consistent with Freitas and Jameson's (2006) contention that technology can help create meaningful learning experiences and lead to successful learning outcomes.
We propose several issues for future studies. First, scaffolds could help students with lower prior knowledge improve their problem-posing performance; further research may therefore design scaffolds for problem-posing processes and examine their effectiveness. Second, more game mechanisms to assist problem-posing could be included, such as brainstorming, mind tools, and sentence puzzles. Finally, because gender is also an important factor in predicting problem-posing and problem-solving performance, future research might investigate the influence of the problem-posing system on gender differences.

Acknowledgements

This research project was supported by grants from the National Science Council, Taiwan, Republic of China (Contract No. NSC99-2511-S-
003-026-MY3, NSC99-2631-S-003-001, NSC99-2631-S-003-003, NSC100-2631-S-003-007).

References

Admiraal, W., Huizenga, J., & Akkerman, S. (2011). The concept of flow in collaborative game-based learning. Computers in Human Behavior, 27(3), 1185–1194.

Barab, S. A., Sadler, T., Heiselt, C., Hickey, D., & Zuiker, S. (2007). Relating narrative, inquiry, and inscriptions: a framework for socio-scientific inquiry. Journal of Science
Education and Technology, 16(1), 59–82.
Barak, M., & Rafaeli, S. (2004). On-line question-posing and peer-assessment as means for web-based knowledge sharing in learning. International Journal of Human-Computer
Studies, 61, 84–103.
Baxter, J. A. (2005). Some reflections on problem posing: a conversation with Marion. Teaching Children Mathematics, 12, 122–128.
Berger, M. J. (1992). The Department of Labor’s glass ceiling initiative: a new approach to an old problem. Labor Law Journal, 7, 421–429.
Burguillo, J. C. (2010). Using game theory and competition-based learning to stimulate student motivation and performance. Computers & Education, 55(2), 566–575.
Cai, J., & Hwang, S. (2002). U.S. and Chinese students’ generalized and generative thinking in mathematical problem solving and problem posing. Journal of Mathematical
Behavior, 21(4), 401–421.
Chang, K. E., Sung, Y. T., Chen, Y. L., & Huang, L. H. (2008). Learning multiplication through computer-assisted learning activities. Computers in Human Behavior, 24(6), 2904–2916.
Chang, K. E., Sung, Y. T., & Lin, S. Y. (2007). Developing geometry thinking through multimedia learning activities. Computers in Human Behavior, 23(5), 2212–2229.
Charsky, D., & Ressler, W. (2011). “Games are made for fun”: lessons on the effects of concept maps in the classroom of computer games. Computers & Education, 56, 604–615.
Chen, H., Wigand, R. T., & Nilan, M. (1999). Flow activities on the Web. Computers in Human Behavior, 15(5), 585–608.
Crespo, S. (2003). Learning to pose mathematical problems: exploring changes in preservice teachers’ practices. Educational Studies in Mathematics, 52, 243–270.
Csikszentmihalyi, M., & Csikszentmihalyi, I. S. (1988). Optimal experience: Psychological studies of flow in consciousness. Cambridge, New York: Cambridge University Press.
Denny, P., Hamer, J., Luxton-Reilly, A., & Purchase, H. (2008). PeerWise: students sharing their multiple choice questions. In Proceeding of the Fourth International Workshop on
Computing Education Research (pp. 51–58), Sydney, Australia.
Dickey, M. D. (2007). Game design and learning: a conjectural analysis of how massively multiple online role-playing games (MMORPGs) foster intrinsic motivation.
Educational Technology Research and Development, 55(3), 253–273.
Dillon, J. T. (1982). Problem finding and solving. Journal of Creative Behaviour, 16, 97–111.
Ellington, H., Adinall, E., & Percival, F. (1982). A handbook of game design. London: Kogan.
English, L. D. (1997a). Promoting a problem-posing classroom. Teaching Children Mathematics, 4, 172–179.
English, L. D. (1997b). The development of fifth-grade children’s problem-posing abilities. Educational Studies in Mathematics, 34, 183–217.
English, L. D. (1998). Children’s problem posing within formal and informal contexts. Journal for Research in Mathematics Education, 29, 83–106.
English, L. D. (2003). Problem posing in elementary curriculum. In F. Lester, & R. Charles (Eds.), Teaching mathematics through problem solving. Reston, Virginia: National
Council of Teachers of Mathematics.
Fellenz, M. R. (2004). Using assessment to support higher level learning: the multiple choice item development assignment. Assessment & Education in Higher Education,
29(6), 703–719.
Freitas, S., & Jameson, J. (2006). Collaborative e-support for lifelong learning. British Journal of Educational Technology, 37(6), 817–824.
Gee, J. P. (2003). What video games have to teach us about learning. New York: Palgrave.
Gee, J. P. (2005). Why video games are good for your soul: Pleasure and learning. Melbourne: Common Ground.
Getzels, L. W., & Jackson, P. W. (1962). Creativity and intelligence: Explorations with gifted students. New York: Wiley.
Hirai, Y., & Hazeyama, A. (2007). Concerto II: a learning support system based on question-posing. In Proceedings of the 7th IEEE International Conference on Advanced Learning
Technologies (pp. 338–339), Niigata, Japan.
Hoffman, D. L., & Novak, T. P. (2009). Flow online: lessons learned and future prospects. Journal of Interactive Marketing, 23(1), 23–34.
Huang, W. H., Huang, W. Y., & Tschopp, J. (2010). Sustaining iterative game playing processes in DGBL: the relationship between motivational processing and outcome
processing. Computers & Education, 55(2), 789–797.
Huizenga, J., Admiraal, W., Akkerman, S., & ten Dam, G. (2009). Mobile game-based learning in secondary education: engagement, motivation and learning in a mobile city
game. Journal of Computer Assisted Learning, 25(4), 332–344.
Inal, Y., & Cagiltay, K. (2007). Flow experiences of children in an interactive social game environment. British Journal of Educational Technology, 38(3), 455–464.
Kaberman, Z., & Dori, Y. J. (2009). Metacognition in chemical education: question posing in the case-based computerized learning environment. Instructional Science, 37,
403–436.
Kar, T., Özdemir, E., İpek, A. S., & Albayrak, M. (2010). The relation between the problem posing and problem solving skill of prospective elementary mathematics teachers.
Innovation and Creativity in Education, 2(2), 1577–1583.
Keil, G. E. (1965). Writing and solving original problems as a means of improving verbal arithmetic problem solving ability. Doctoral dissertation, Indiana University.
Keller, J. M. (2009). Motivational design for learning and performance: The ARCS model approach. Springer.
Ketelhut, D. J., & Schifter, C. C. (2011). Teachers and game-based learning: improving understanding of how to increase efficacy of adoption. Computers & Education, 56, 539–
546.
Kiili, K. (2005). On educational game design: Building blocks of flow experience, Vol. 571. Tampere, Finland: Tampere University of Technology, Tampere University of Technology
Publication.
Kumar, D. (2000). Pedagogical dimensions of game playing. ACM Intelligence Magazine, 10(1), 9–10.
Lepper, M. R., Iyengar, S. S., & Corpus, J. H. (2005). Intrinsic and extrinsic motivational orientations in the classroom: age difference and academic correlates. Journal of
Educational Psychology, 97(2), 184–196.
Leung, S. S. (1993). The relation of mathematical knowledge and creative thinking to the mathematical problem posing of prospective elementary school teachers on tasks differing in
numerical information content. Unpublished doctoral dissertation, University of Pittsburgh.
Leung, S. K., & Silver, E. A. (1997). The role of task format, mathematics knowledge, and creative thinking on the arithmetic problem posing of prospective elementary school
teachers. Mathematics Education Research Journal, 9(1), 5–24, (Australia).
Lewis, T., Petrina, S., & Hill, A. M. (1998). Problem posing-adding a creative increment to technological problem solving. Journal of Industrial Teacher Education, 36(1), Retrieved
from: http://scholar.lib.vt.edu/ejournals/JITE/v36n1/lewis.html on 23 March 2011.
Lim, C. P., Nonis, D., & Hedberg, J. (2006). Gaming in a 3D multiuser virtual environment: engaging students in science lessons. British Journal of Educational Technology, 37(2),
211–231.
Liu, C. C., Cheng, Y. B., & Huang, C. W. (2011). The effect of simulation games on the learning of computational problem-solving. Computers & Education, 57(3), 1907–1918.
Liu, T. Y., & Chu, Y. L. (2010). Using ubiquitous games in an English listening and speaking course: impact on learning outcomes and motivation. Computers & Education, 55(2),
630–643.
Mayer, R. E. (1982). The psychology of mathematical problem solving. In F. K. Lester, & Garofalo. (Eds.), Mathematical problem solving: Issues in research (pp. 1–13). Phila-
delphia: Franklin Institute Press.
Mestre, J. P. (2002). Probing adults’ conceptual understanding and transfer of learning via problem posing. Journal of Applied Developmental Psychology, 23, 9–50.
Nicolaou, A. A., & Philippou, G. N. (2004). Efficacy beliefs, ability in problem posing, and mathematics achievement. In Proceedings of the 3rd International Biennial SELF
Research Conference Berlin, Self-concept, motivation and identity: Where to from here? 4–7 July, 2004.
Oblinger, D. (2004). The next generation of educational engagement. Journal of Interactive Media in Education, 2004(8), 1–18.
Papastergiou, M. (2009). Digital Game-Based Learning in high school Computer Science education: impact on educational effectiveness and student motivation. Computers &
Education, 52(1), 1–12.
Polya, G. (1945). How to solve it (2nd ed.). New York: Doubleday.
Prensky, M. (2000). Digital game-based learning. New York: McGraw-Hill Book.
Raybourn, E. M., & Bos, N. (2005). Design and evaluation challenges of serious games. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, USA (pp.
2049–2050).
Sanchez, J., & Olivares, R. (2011). Problem solving and collaboration using mobile serious games. Computers & Education, 57, 1943–1952.
Silver, E. A. (1994). On mathematical problem posing. For the Learning of Mathematics, 14(1), 19–28.
Silver, E. A., & Cai, J. (1996). An analysis of arithmetic problem posing by middle school students. Journal for Research in Mathematics Education, 27, 521–539.
Silver, E. A., & Cai, J. (2005). Assessing students’ mathematical problem posing. Teaching Children Mathematics, 12, 129–135.
Skinner, P. (1991). What’s your problem: Posing and solving mathematical problems, K-2. Portsmouth, New Hampshire: Heinemann.
Squire, K. D. (2003). Gameplay in context: Learning through participation in communities of civilization III players. Unpublished PhD thesis. Instructional Systems Technology
Department, Indiana University.
Squire, K. D., & Jan, M. (2007). Mad city mystery: developing scientific argumentation skills with a place-based augmented reality game on handheld computers. Journal of
Science Education and Technology, 16(1), 5–29.

Squire, K., & Jenkins, H. (2004). Harnessing the power of games in education. Insight, 3(1), 5–33.
Sung, Y. T., Chang, K. E., & Lee, M. D. (2008). Designing multimedia games for young children’s taxonomic concept development. Computers & Education, 50(3), 1037–1051.
Torrance, E. P. (1966). Torrance tests of creative thinking: Norms-technical manual. Englewood Cliffs, NJ: Personnel Press.
Tsubota, E. (1987). On children’s problem posing (grades 1 to 3). (Japan: Author).
Umetsu, T., Hirashima, T., & Takeuchi, A. (2002). Fusion method for designing computer-based learning game. ICCE 2002 Proceedings of the International Conference on
Computers in Education, 124–128.
Whitin, P. (2004). Promoting problem-posing explorations. Teaching Children Mathematics, 11, 180–186.
Wilson, E. V. (2004). ExamNet asynchronous learning network: augmenting face-to-face courses with student-developed exam questions. Computers & Education, 42(1),
87–107.
Yu, F. Y. (2011). Multiple peer-assessment modes to augment online student question-generation processes. Computers & Education, 56(6), 484–494.
Yu, F. Y., & Liu, Y. H. (2009). Creating a psychologically safe online space for a student-generated questions learning activity via different identity revelation modes. British
Journal of Educational Technology, 40(6), 1109–1123.
Yu, F. Y., Liu, Y. H., & Chan, T. W. (2005). A Web-based learning system for question-posing and peer assessment: pedagogical design and preliminary evaluation. Innovations in
Education and Teaching International, 42, 337–348.
