
Clicking towards complex learning: Eluding the seduction of the Technopoly

Inaugural Professorial Lecture of Prof Duan van der Westhuizen, University of Johannesburg

31 October 2007

INTRODUCTION

Today I will show that the near-evangelical fervour with which information technology (IT) has been slipped into society, and especially into educational institutions at all levels, has deified not only the very bearers of the good tidings, but also the good tidings themselves. The accompanying noise of the near-zealots has confounded the real issue that is at stake in institutions of learning: how do we help people to learn? I will explain how the misaligned (but not necessarily ill-intended) focus on the technology, and for the sake of the technology, has seduced the bearers of the technology into becoming part of what Neil Postman (1992) calls the Technopoly, or what Todd Oppenheimer (2003) refers to as Technopia. In the Technopoly, those teachers, masters and sages have perhaps lost sight of that which is their calling: to teach, to support, to scaffold, to mentor, to guide. They have, possibly unbeknown to them, been seduced by the dazzle of the technology, by the very nature of the hyperbolised promises that the increasingly powerful technology holds, and by the pervasiveness and often invisibility of the tool. This tool fits seamlessly into the very fabric of our daily existence and, in the context of teaching, gives seemingly quick and tangible results, as is so aptly evident when knowledge gets 'powerpointed' in the classroom. However, I will question what these results are and, indeed, how valuable they are. This will constitute the final part of this address.

In the first part of my talk, the often chequered history of educational technology will be narrated, from its early and optimistic (yet painful) introduction into institutions of learning, through several iterations of extravagant and sometimes misguided (but mostly well-intended) efforts, through to the tragic collapse of e-learning in the late nineties, and on to the rationality of the early twenty-first century. This history shows that, despite deliberate efforts to attend to the demands of good pedagogy, improved learning has not always been the focus when technology was introduced; instead it was the dazzle of the technology itself

on which the attention was placed. Indeed, the lure of the technology has been powerful: whether it was the technology of 40 years ago, which by today's standards seems simplistic, two-dimensional and even backward, or the modern technologies that are so pervasive today.

However, and I say this hesitantly, today's technology is different in many ways from the technology of even five years ago. New thinking has given rise to what is today commonly referred to as Web 2.0 technologies, and new thinking is precisely what Web 2.0 brings: not only new technologies and computer programmes, but also new ways of thinking, sharing, collaborating, working and, indeed, socialising. Furthermore, Web 2.0 technologies are often free, and sometimes open, therefore expanding not only access to these technologies, but also the ability to customise them. I argue that new thinking about these technologies, and the subsequent design of new affordances, gives us the best opportunity yet to use technology to support learning, and to accommodate the many complexities of this very process. I will demonstrate how the technology of today may become the mediating artefact in learning in complex ways. Yet research on learning with technology, unfortunately, gives a confusing picture. Indeed, according to research, technology is both the best and the worst thing ever to have happened to learning!

The question therefore remains, and continues to haunt us: how do we know what effects technology has on learning? Clearly, the sometimes confusing data and research findings produced by many agencies, projects and post-graduate research do not assist us in understanding how technology impacts on learning. Countless research results and meta-studies are flawed in their design, and I will show how their underlying epistemologies and theoretical frameworks are in fact so thin, if not non-existent, that their contribution to our understanding of learning with technology is rendered virtually unusable.

In this talk, I too need to make a contribution, and move beyond mere critique. Although constructive critique is healthy for the academy, I need to act as a path-finding agent. The linkage between research and technology use in education is clear: if the research on computer-supported learning is inadequate, the practice of using technology for learning will be too. To this end, I need to propose a research agenda and method that will direct the path precisely: what is the most appropriate way of researching the effects of technology on learning?

THE CREATION OF THE TECHNOPOLY

Thamus, a king in ancient Upper Egypt, sat listening to the god Theuth, who was the inventor of many things, including writing. Of writing, Theuth declared to Thamus: 'Here is an accomplishment, my lord the King, which will improve both the wisdom and the memory of the Egyptians.' Thamus was far too wise to be impressed by this, and he replied:

What you have discovered is a receipt for recollection, not for memory. And as for wisdom, your pupils will have the reputation for it without the reality: they will receive a quantity of information without proper instruction, and in consequence be thought very knowledgeable when they are for the most part quite ignorant. And because they are filled with the conceit of wisdom instead of real wisdom they will be a burden to society.

Thamus sounds an important warning here. He fears that memory will be confused with recollection, and he worries that wisdom will become indistinguishable from mere knowledge. If he lived today, he would warn against one-eyed prophets who see only what new technologies can do and are incapable of imagining what they will undo (Postman, 1992). Nevertheless, one should not be silent about the opportunities that new technologies make possible. Technology has undeniably impacted society in profound ways. However, a dissenting voice is sometimes needed to moderate the commotion created by the technophiles. While it is inescapable that every culture must negotiate with technology, the challenge is to do so intelligently.

In a sense, Thamus warns that those who will learn to write will develop an undeserved reputation for wisdom. Similarly, those who cultivate competence in the use of a new technology often become part of an elite group that is granted authority and prestige by those who do not have that competence. This I often experience in my own work situation. Simply because I have learnt certain technology skills (which I consider to be simple), colleagues often stand in awe of the speed and efficiency with which I navigate my keyboard and mouse, and of the things I can achieve using the computer. To some degree, I have experienced what is alluded to in the opening sentences of my address: being technologically deified! And it is in the very deification that the problem often lies: it seems impressive to those who are less in the know than the techno-gods, but it leads to a high receptivity to new technologies, accompanied by a belief that their benefits will eventually spread evenly among the entire population, who will get there soon. Concomitant with the surge in technological adoption is the status that information has achieved. In fact, today we often do not speak of computers, but rather of Information Technology. By some process, the computing power of the new machines has become Information Technology. But is information the new god of culture? When one looks at the problems that the world faces today (e.g. in the Middle East or Africa, or in the atmosphere), are these problems caused by a lack of information? Do we lack information about how to grow crops that can feed the starving millions? Do we lack information about actions that would prevent global warming? Is it a lack of information that causes crime and decay in our cities? Yet we see how the Technopolist stands firm in the belief that what the world needs is yet more information. And in this way information is reified, and elevated to a metaphysical status: information as both the means and the end of human creativity (Postman, 1992). Technopoly, therefore, refers to the deification of technology. Society seeks its authorisation in technology, finds its satisfaction in technology, and takes its orders from technology.

And so we find that the metaphor of the machine as human (or the human as machine) blurs our understanding of the world, and the blend finds its way into everyday language. In 1988, communication between the computers on the ARPANET (the originating network of the Internet as we know it today) slowed down, and then clogged completely. Almost immediately it was established that a software programme had attached itself to other programmes. This was called (in another human-machine metaphor) a virus. Technically, it was a worm: a computer programme explicitly designed to disable computers. However, the term virus stuck: it was familiar and understandable because of its human (and animal) connotations. Computers were therefore 'infected', while the virus was 'virulent' and 'contagious'. Attempts were made to 'quarantine' the infected computers and to develop a 'vaccine' against new attacks.

As technology seeps into our everyday existence, it is important to consider that the changes that technology brings are ecological. The use of the word ecological must be considered in the same way as the word is used by environmental scientists. Postman describes it as follows:

One significant change generates total change. If you remove the caterpillars from a given habitat, you are not left with the same environment minus caterpillars: you have a new environment, and you have reconstituted the conditions of survival; the same is true if you add caterpillars to an environment that has had none. This is how the ecology of media works as well. A new technology does not add or subtract something. It changes everything. Will students learn mathematics better by computers than by textbooks? Such questions have an immediate, practical value to those who ask them, but they are diversionary. They direct our attention away from the serious social, intellectual, and institutional crises that new media foster.

The sub-title of Postman's book is The Surrender of Culture to Technology. Clearly, he sounds a warning that we need to view technology intelligently, and think not only about what technology adds, but also about what it takes away. In the next section I will describe how the introduction of technology into education was often fundamentally flawed, and how it perhaps took away more than it added.

CHEQUERED HISTORY: SURRENDER OF EDUCATION TO THE TECHNOPOLY

A well-known saying goes: the only thing that we learn from history is that we learn nothing from history. When one views the history of educational technology, which probably began with the introduction of the written word, it is easy to see the power of the Technopoly, and an iterative pattern emerges. This will become clear in subsequent paragraphs. In 1922, Thomas Edison said: 'I believe that the motion picture is destined to revolutionize our educational system, and that in a few years it will supplant largely, if not entirely, the use of textbooks' (Oppenheimer, 2003). Ten years before this, Edison said that film 'makes it possible to touch every branch of human knowledge. The education of the future, as I see it, will be conducted through the medium of the motion picture, a visualized education, where it should be possible to obtain one hundred percent efficiency.'

A study on the use of films in education by H.A. Wise, which carefully used equivalent experimental and control groups and included measures of scientific validity, found that the group taught with films made statistically significant gains. Films were also found to be valuable in engaging the students' imagination, while boys benefited more than girls did. Finally, film encouraged low-ability students to learn factual information while helping high-ability students to acquire 'spirit and atmosphere'. Overall, however, Wise found that the benefit of classroom films depended on the particular subject matter, the course objectives, the students' knowledge base, and the skill of the teacher. He recommended that films be endorsed only for use as a supplement. In what may seem to be common sense, Wise concluded by stressing the need for better teacher training.

The next iteration of technology that would supposedly transform the educational landscape was the portable radio (Brown & Brown, 1994). William Levenson claimed that 'the time may come when a portable radio receiver will be as common in the classroom as is the blackboard'. And so followed other miracle-cure technologies: the overhead projector, the classroom television, the video cassette recorder (VCR), laser discs, and the behavioural teaching machines of the late 1950s and early 1960s, with claims that, with the help of teaching machines and programmed instruction, 'students could learn twice as much in the same time and with the same effort as in a standard classroom' (Oppenheimer, 2003).

Suffice it to say that the impact of these technologies on educational practice has been limited, to say the very least. Instead, let us turn our attention to the kind of technology that would change the world: the computer. In 1982, Time magazine, for the first time in fifty-five years, chose not to put a human being on its cover as its 'Man of the Year', but rather an artist's rendering of a personal computer. This technology, according to Stephen Toulmin of the University of Chicago, would 're-intellectualize the television generation'.

In February of the same year, a company called Digital Research released a new version of a programme called LOGO. LOGO was developed by MIT's South African-born Seymour Papert. Papert, who had spent five years with Jean Piaget in Paris, loved helping children work with computers and was fascinated by how they learn. LOGO was a computer programme that allowed children to explore microworlds, and in their interaction with these worlds, Papert believed, they would learn the rules of, among other things, mathematics. In his book Mindstorms, Papert (1980) describes how children's work with LOGO can make mathematical concepts come alive. He explained that by simply exploring the programme's procedures, a child is led naturally to physical activities and problem-solving that previously required didactic teaching.
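
To make the idea concrete, here is a minimal sketch of the kind of turtle-geometry exploration Papert describes. It is written in Python's built-in turtle module rather than in the original LOGO syntax (an assumption made purely for illustration): by varying the number of sides, a child can discover for herself that the turns made on any closed trip around a figure always add up to 360 degrees.

```python
# A sketch of LOGO-style turtle geometry, rendered here in Python's turtle module.
import turtle

def polygon(t, sides, length):
    """Walk the turtle around a regular polygon with the given number of sides."""
    for _ in range(sides):
        t.forward(length)
        t.left(360 / sides)  # the turns of any closed trip sum to 360 degrees

t = turtle.Turtle()
polygon(t, 4, 100)  # a square: four turns of 90 degrees
polygon(t, 6, 60)   # a hexagon: six turns of 60 degrees
turtle.done()
```

The point is not the code itself, but that the mathematical idea (angle, turn, closure) is encountered through the child's own experimentation rather than through didactic instruction.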

During the span of his career, Papert made several contestable statements. In 1975, for example, he delivered a speech in which he said that the practice of education would have undergone a fundamental change within twenty-five years. In the late 1980s he said:

Nothing is more ridiculous than the idea that this technology can be used to improve school. It's going to displace school and the way we understand school. Of course, there will always be, we hope, places where children will come together with other people and will learn. But I think that the very nature, the fundamental nature, of the school that we see is, in this process, coming to an end.

In 1996 he backtracked slightly, and admitted that education had not changed much. However, in 1999 he claimed that the school as it had been known up to then would have twenty more years, by which time it would be dead. He explained: 'I never thought a few hours a week of LOGO would make much difference; nothing will change unless it's complete, that is, until computers are used intensively, pervasively, throughout the academic day.'

In 1983, some states in the USA proposed making computer literacy so important that incoming teachers would be denied certification if they lacked this skill. Today, most schools in the USA require prospective teachers to submit a portfolio prior to interviews in which their computer skills are demonstrated. Several standards bodies, e.g. the International Society for Technology in Education (ISTE), publish extensive standards that prescribe the technology integration skills that teachers should have. The continued emphasis on computers as a prerequisite to good teaching is evident. In 1996, teachers were surveyed and they ranked computer skills and media technology as more essential than the study of European history, biology, chemistry, and physics; than dealing with social problems such as drugs and family breakdown; than learning practical job skills; and than reading modern American writers such as Steinbeck and Hemingway or classic authors such as Plato and Shakespeare (Oppenheimer, 2003).

One of the most significant projects, Apple Classrooms of Tomorrow (ACOT), was initiated in 1985 by Apple Computer. The emphasis was on teaching curriculum subjects using computer technology. Printers, scanners, laser-disc and videotape players, modems, CD-ROM drives, and abundant choices in software were made available. Apple provided training to teachers, and a staff member at each site was made available for both technical and instructional help. Unfortunately, it was found that several high-tech schools were actually faring worse on standardised test scores than low-tech schools. After ten years, Apple had found little empirical evidence that the ACOT programme produced greater student achievement.

However, in the middle nineties, the Internet, and specifically the World Wide Web, hit the world. A California educational task force said this about the Internet: 'More than any other single measure, computers and network technologies, properly implemented, offer the greatest potential to right what's wrong with our public schools' (Oppenheimer, 2003). The Internet brought the issue of convergence to the fore. This meant that the job of delivering curriculum into classrooms was reinforced by products and tools that could be used in homes.

On the heels of the Internet Age followed the Laptop Age. Thousands of schools worldwide were fitted out with laptops, or, as is the case in some schools in South Africa, schools simply required learners to bring their own laptops to school. From the hallowed halls of the Massachusetts Institute of Technology (MIT), the One Laptop per Child (OLPC) project was born under the leadership of Nicholas Negroponte. The project has developed a robust laptop that would cost less than $100. This is part of the vision of the OLPC project:
OLPC is not, at heart, a technology program. OLPC is a non-profit organization providing a means to an end: an end that sees children in even the most remote regions of the globe being given the opportunity to tap into their own potential, to be exposed to a whole world of ideas, and to contribute to a more productive and saner world community. (http://laptop.org/vision/mission/)

Then came the mobile learning age. With mobile learning, education is brought to small handheld devices like Personal Digital Assistants (PDAs) and cell phones. In recent weeks, a South African school teacher won acclaim for developing a cell phone-based web environment for the teaching of Mathematics. Just what children have wanted all their lives: a Mathematics teacher in their pockets! I say this tongue in cheek; of course cell phones can be used to deliver good Mathematics content to all corners of South Africa.

Although not within the primary thrust of this address, I would be neglectful not to speak of the impact of technology on Higher Education. The sheer size, glamour and aura of online education was initially educationally seductive (Van der Westhuizen & Henning, 2005). And, like elsewhere in the world, in South Africa the harsh reality of online learning soon replaced the initial euphoria that accompanied its introduction. Worldwide, the spending of millions on technology-based learning hardware and software is being questioned. Most universities have adopted Learning Management Systems (LMS), like the merged WebCT/Blackboard product. In South Africa WebCT/Blackboard has found particular favour and is used at universities like the University of Johannesburg, the University of Pretoria, Wits University and Stellenbosch University. Other universities are attempting to develop their own platforms, using open-source technology. I think here of KEWL, initiated at the University of the Western Cape (UWC), or the Open Learning System (OLS) at the University of KwaZulu-Natal. In addition, several open-source and free tools are available, of particular note Moodle and Sakai.

However, universities seem to be primarily concerned with the costs of implementing and maintaining web-based learning systems, and inevitably the questions are: Is it worth it? What are the benefits? Are the benefits quantifiable? The question of worth is not a simple one to answer, or even to ask. Institutions may translate worth into questions like: Are our students learning more, are they learning more efficiently, are they learning better? We argue that it is not end-of-year student performance that will shed light on the effectiveness of an online project as much as how students learn and how that reflects on their thinking and competencies (Van der Westhuizen & Henning, 2005).

We need not continue to reiterate here the cynicism that has followed the initial euphoria over the advent of e-degreeing. Suffice it to say that, apart from the e-vendors themselves, many place question marks over just about all aspects of the enterprise. Noble (1998) blames higher education institutions for becoming 'digital diploma mills'. He accuses university administrators and their commercial partners of forcing online education onto academia for financial and commercial benefit. The belief was that online education would cut costs, and that the return on investment would be better.

The 2000 ASTD Benchmarking Service of over 950 organizations in the USA suggests that there is a growing movement away from online learning in favour of traditional classroom-based methodologies (Saunders & Werner, 2002). The survey shows that e-learning projections have fallen from 23.0 percent for the year 2000, to 19.8 percent for 2001 and 18.2 percent for 2002. The collapse of the Internet boom meant that many commercial technology suppliers went bust, and private investment in e-learning dropped from $2.7 billion at its peak in 2000 to $400 million at the end of 2004. Amory, Dubbeld and Peters (2004) express concern over the commodification of educational content that has accompanied the increase in the use of technology in the classroom. They argue that the power of the computer, coupled with widespread Internet connectivity, should have led to better learning outcomes. Instead, they claim, technology supported the use of Reusable Learning Objects (RLOs), which make the deductive jump that learning can be assembled from a number of blocks, just as objects can be built using LEGO blocks. To this end, new standards were defined so that learning management systems could share such objects: the Sharable Content Object Reference Model (SCORM). According to Amory et al. (2004), content became a product defined through a number of standards that allow complex software to deliver these objects when required by a learner.

Learning management systems such as WebCT/Blackboard, which now support RLOs and SCORM, have recently increased their licence costs drastically. The business model of the original WebCT company certainly reflects this. I remember, as a pioneering user of WebCT on the then RAU campus, that the first licence for WebCT in 1998 cost less than $100. I am unsure what the costs are today, but the pay-and-pay-again nature of the licensing increases vendor profits while tying institutions into never-ending cycles of upgrades and rising operational costs.

What did education lose during all of these iterations of technology introduction into schools and higher education? In her 1995 book Life on the Screen: Identity in the Age of the Internet, Turkle describes a disturbing experience with a simulation game called SimLife, and she writes that experiences with simulations 'do not open up questions but close them down'. Turkle is concerned that simulation software fosters passivity, ultimately dulling people's sense of what they can create and change in the world. In another example of reticence to use computer technology, Hewlett-Packard, a large IT company, demonstrated its commitment to teacher education by spending $2.6 million to help forty-five school districts build maths and science skills with real materials. These included dirt, seeds, water, glass vials, and magnets. They did not use computers.

The literature abounds with examples of inappropriate and even wasteful use of computers in school education. One of the major problems with computers is the speed at which they become antiquated. Teachers soon say: 'This is too slow, we won't use it.' And so the ICT budgets of schools become black holes. In the USA, it is reported that some school districts, under instruction to cut budgets, spread their cuts across the entire curriculum: closing elementary schools, laying off teachers or freezing salaries for those who remained, adding students to already crowded classrooms, and, of course, cutting music classes and other programmes in the arts. However, most did not touch their technology budgets.

Some unsettling truths become apparent in the study of technology in schools and higher education. Although computers can undeniably be wonderfully useful in schools, it also seems that high technology is steering youngsters away from the messy, fundamental challenges of the real world and toward the hurried buzz and neat convenience of an unreal virtual world. It further becomes apparent that it may be teaching them that exploring two-dimensional on-screen worlds is more important than playing with real objects, or sitting down in a conversation with a friend, a parent, or a teacher (Oppenheimer, 2003).

However, there is some evidence that educators elsewhere may be using computer technology more wisely than they do in the United States: in Europe, there is a sense of cultural enrichment, according to Kozma (2003). Kozma led an international study in 2003 that examined 174 different school technology projects, many of them in Europe. He found that schools often do a lot of interesting work with basic e-mail, simply by making connections with students in neighbouring countries.

It is clear from the preceding paragraphs that the use of technology in education is not without problems. It is perhaps a romanticised idea that the mere presence of computers will lead to improved learning. Yet I am of the belief that some form of rationality is becoming apparent. Perhaps we are at a stage where the large-scale investments are being questioned, and compared to the gains. Perhaps the right questions are being asked, those questions that deal with pedagogy, and not technology. But there are also new kids on the block. I refer specifically to two fairly recent developments: the coming of age of Web 2.0 technologies, and the new validation and growth of Free and Open Source Software (FOSS).

CHANGING LANDSCAPES: THE NEW FACE OF TECHNOLOGY

The World Wide Web is the über-technology of the last decade, and its ubiquity has made it a powerful force in modern society. In its latest manifestation, a radical shift in thinking about what the Web is, and especially who owns it, is evident. This is often referred to as Web 2.0. Web 2.0 refers to second-generation web services, and includes social-networking websites, wikis and folksonomies. These are websites aimed at facilitating collaboration and sharing between users. The term was coined at the O'Reilly Media Web 2.0 conference in 2004. It is not an updated version of older web technologies or technical specifications. Instead, it refers to changes in the ways web developers and users use the web. Tim Berners-Lee, father of the World Wide Web, points to the fact that many of the technology components of Web 2.0 have existed since the early days of the Web. In many ways, Web 2.0 is 'an idea in people's heads rather than a reality' (www.wikipedia.org). Whether in the head or real, Web 2.0 technologies manifest in weblogs, social bookmarking websites, wikis, folksonomies, podcasts, RSS feeds, social networking software (Facebook and its cynical obverse, Enemybook; MySpace; Bebo), and web application programming interfaces (APIs) (as are found on many web services such as eBay and Gmail). In addition, new virtual worlds are opening up for computer users. The best example of this is Second Life, a three-dimensional virtual world that is entirely created by the millions of residents who inhabit it. It comprises 'a vast digital continent, teeming with people, entertainment, experiences and opportunity' (www.secondlife.com).

These web environments have one significant enhancement over read-only websites, in that they create reciprocity between the user and the provider. Within the Web 2.0 space, users are no longer only passive receivers of information; they become active contributors and creators. In other words, it is not only about downloading anymore, but also about uploading. Time and space prevent me from discussing all of these technologies here, but it would be apt to briefly discuss four environments that reflect the spirit and intention of Web 2.0, namely Facebook, Wikipedia, Merlot and ePals.

Facebook is a social networking web space that connects people with friends and others who work, study and live around them. People use Facebook to keep up with friends, upload an unlimited number of photos, share links and videos, and learn more about the people they meet (www.facebook.com). It currently has 50 million users, with 250 000 new users signing on per day, and three million users online at any one time. What Facebook aptly personifies is the need of people to connect, to share, to collaborate, to contribute and simply to stay in touch. It is not difficult to see how the tenets of social constructivism become living technology here. Although there are reports of companies blocking access to Facebook because of the non-productive time that workers spend on it, there is an important lesson to be learnt from the enormous popularity of these social networking sites. This has direct implications for educational strategies, and astute lecturers will make the right connections and find ways to use these technologies productively (i.e. for learning).

Wikipedia is a web-based encyclopaedia that is free, and to which any person can contribute. In the spirit of Web 2.0, wikis are collaborative websites that allow web users to contribute to the site by uploading information, or by altering existing information. The Wikipedia model relies on volunteers from all around the world who contribute to it. The more than 75,000 active contributors work on some 8,700,000 articles in more than 250 languages; there are more than 2 million articles in English. Within the Wikipedia model, both refereed and un-refereed articles are found, and anyone can edit articles. Although this means that information can be falsified, the sheer number of people who read and contribute pick up malicious entries very quickly, as a South African state official, who recently changed an article maliciously, found to his peril. Not only was his mischief picked up by many users very quickly, but he was traced right down to the desk where he sat. Nevertheless, in this model the ultimate peer review is found: the whole of the connected world.

More directly, and within the educational sphere, Merlot (www.merlot.org) and ePals (www.epals.net) are free and open web environments that connect teachers and learners from all over the world. Merlot is a provider of online educational content across several disciplines. Content within Merlot is peer reviewed, while the site facilitates communication and collaboration between members. ePals is a provider of school-safe collaborative learning products for learners, teachers and parents of all ages. It helps to connect learners from around the world so that they can interact with each other online in a safe, educational environment. The global community consists of more than eight million people and more than 128,000 classrooms in over 200 countries. ePals technology enables these learners, who speak over 136 different languages, to connect, share, collaborate and learn (www.epals.net).


These are but a few of the technology tools that are available to educators at present. And, to a certain degree, most users of ePals, Wikipedia or Merlot remain recipients of information, albeit collaboratively constructed information. However, other online tools exist that allow and support other cognitive processes: constructing, de-constructing, organising, reflecting, reviewing, collaborating, publishing, analysing, comparing, mind mapping, verifying, tracking evidence, auditing, and so forth. A key feature of many of these software tools is that they are free, and open. This is referred to as Free and Open Source Software (FOSS). FOSS comprises computer programmes or tools whose licences give anyone the freedom not only to run the program for any purpose, but also to access ('open') the source code and modify the programme, and to redistribute copies of either the original or the modified program, without having to pay royalties to the original developers. Often, no cost is involved in obtaining the software, barring media costs. Table 1 shows some examples of FOSS.
GNU: operating system
Moodle: course/learning management system
Drupal: content management system
Sakai: learning management system
Hot Potatoes: assessment software
Audacity: audio editing software
OpenOffice: open-source word processor, spreadsheet, etc.
WordPress: blogging software
Mozilla Firefox: web browser
Mozilla Thunderbird: email client

Table 1: Examples of FOSS

It is important to understand the underlying philosophy of FOSS, which has to do with the belief that people should be able to use software in ways that are socially useful. Software is different from material objects such as chairs, motorcars or houses: it can be copied and modified much more easily. It is because of these possibilities of copying and modification that software is as useful as it is (www.gnu.org/philosophy/). In this regard, Lyotard has this to say about the commodification of knowledge:
Knowledge in the form of an informational commodity indispensable to productive power is already, and will continue to be, a major, perhaps the major, stake in the worldwide competition for power. It is conceivable that the nation-states will one day fight for control of information, just as they battled in the past for control over territory, and afterwards for control of access to and exploitation of raw materials and cheap labor.

In a bold move, MIT opened up all of its learning materials to the world in 2002, in a mission to 'advance knowledge and educate students in science, technology, and other areas of scholarship'. This demonstrated its commitment to the spirit of accessible and free knowledge. The action is the antithesis of the proprietary stance of other vendors of knowledge, and it is aligned with the underlying philosophy of FOSS.

To summarize: new thinking about software, as manifested in the Web 2.0 movement, coupled with new thinking about the ownership of software, as encapsulated in the FOSS philosophy, gives education new opportunities to use technology optimally and appropriately. Education now, for the first time, has tools that have not been created with profit (as opposed to pedagogy) in mind. In addition, it has tools that address the requirements of pedagogy, tools that can support real constructivist learning.

COMPUTER AFFORDANCES AND COMPLEX LEARNING

Becker and Ravitz (2001) ask whether computers have become more compatible with the conditions of teaching, or whether Larry Cuban is right when he says that computers are really mismatched with the requirements and conditions of teaching.

It is important to consider how the affordances of computer technology can support learning, and specifically constructivist learning. The assumption here is that Web 2.0 tools are grounded in constructivist, constructionist and connectivist theory. Limitations of time and space again prevent me from exploring this topic fully. Therefore, I selected some tenets of learning in the constructivist paradigm (Jonassen et al., 1999) to highlight, extracting constructivist principles from the work of Jean Piaget (1896-1980) (active exploration and discovery), Lev Vygotsky (1896-1934) (the social context of learning; dialogue in knowledge construction) and John Dewey (1859-1952) (learning by doing). Other thinking on constructivism is also included. (There are different approaches to constructivist theory; typically, a distinction is made between cognitive and social constructivism. Here, I do not make that distinction; see e.g. Fosnot, 1996 and Cobb, 1996.) Table 2 maps these principles onto Web 2.0 affordances.

Constructivist learning principle: computer affordance (Web 2.0)

Exploration and independent inquiry: online resources like Wikipedia, Merlot, ePals and WebQuests
Interaction (learner-learner, learner-teacher, learner-content): email clients, social networking, tagging, expert inquiry
Shared knowledge and cooperative learning: wikis, weblogs (blogs), discussion groups, tagging software, folksonomies
Individualised knowledge production and representation: mind mapping, presentation tools, graphics editing tools, audio and video production tools, digital portfolios
Authentic and situated learning: real-world simulations, application tools, web construction, online role playing
Active engagement: wikis, weblogs (blogs), discussion groups, tagging software, folksonomies
Reflection and metacognition: instant messaging, blogging, wikis, discussion groups
Problem-based learning: virtual worlds, simulations, web construction
Complexity: hypertext, publishing tools, report writing, WebQuests, visualization tools
Scaffolding: graphing tools, modeling tools, simulations, visualization tools, intelligent tutoring systems

Table 2: Constructivist learning and Web 2.0 technologies

Technology has great potential to enhance student learning, but only if it is used appropriately (Bransford, Brown & Cocking, 2000). It extends learning beyond the classroom, and it supports learning in ways that have perhaps not been possible before. It gives access to vast amounts of information, and facilitates the construction and representation of knowledge in many different ways. Yet the mere presence of computers will not lead to better learning; for that we still need good teachers, and well-founded integration strategies.

Integration means 'the process of totally integrating the use of computers into the existing curriculum through learning activities that address the subject-area objectives' (Staff, 1988). Integration means that computers are in the service of the curriculum, while seeking to identify those places where the computer can increase learning effectiveness and enable learners to do what they otherwise could not. This means that there is no need to add new outcomes to the school's curriculum to deal with computers. Integration treats the computer in a natural way, as a fundamental tool for learning, and fosters invisibility (Lockard, 2001).

Although curriculum integration is a step in the right direction, in itself it may not be enough. Norton (1988) saw significant gains in the integration approach but found it insufficient nonetheless. Integration 'carries with it a set of unspoken assumptions which fail to recognize the unique potentials of the computer. [It] defines learning and education as content specific and content oriented and presupposes an existing curriculum that is best left unchallenged.' Norton argues that it is not enough to bolster a curriculum that needs to be changed. Schools attempting to create a constructivist learning environment using computers as mindtools (Jonassen, 1999) will find that the existing curriculum cannot remain static.

WHAT DO WE LEARN FROM EDUCATIONAL TECHNOLOGY RESEARCH?

In the preceding paragraphs, I have painted a picture showing that the use of technology in education may be flawed at several levels, despite the promise that the technology holds for learning. It would make sense to refer to the findings of research that has been done in this field: research should tell us what the learning gains are when technology is used for learning. Unfortunately, it is not as simple as this.

The research results are varied, and seemingly inconclusive. A meta-analysis of 254 studies by Kulik and Kulik (1991) reported that computers helped students learn 30 percent faster than they do when receiving traditional instruction. This study was repeated in 1999, with similar results. However, these studies have been criticised for recycling the same old pool of research, and adding new material as it comes in. Several peer-reviewed journals publish research findings on the (ostensible) benefits of educational technology and the pedagogical gains achieved (Lockard, 2001).
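
To make clear what such a meta-analysis actually computes, and why recycling the same pool of studies matters, here is a hedged sketch (not Kulik and Kulik's actual procedure, and with entirely hypothetical study data) of how study results are typically pooled: each study contributes an effect size weighted by the inverse of its sampling variance, so the pooled figure is only as good, and as independent, as the studies fed into it.

```python
# A hypothetical illustration of fixed-effect meta-analytic pooling.
from math import sqrt

# Hypothetical studies: (Cohen's d, sample size per group); not real data.
studies = [(0.45, 30), (0.20, 120), (0.60, 25), (0.35, 80)]

def variance_of_d(d, n_per_group):
    # Approximate sampling variance of Cohen's d for two equal-sized groups.
    return 2 / n_per_group + d ** 2 / (4 * n_per_group)

weights = [1 / variance_of_d(d, n) for d, n in studies]
pooled_d = sum(w * d for w, (d, _) in zip(weights, studies)) / sum(weights)
standard_error = 1 / sqrt(sum(weights))

print(f"Pooled effect size d = {pooled_d:.2f} (SE = {standard_error:.2f})")
```

A later analysis that merely adds a few new studies to the same old pool is then largely a re-statement of the earlier result rather than independent confirmation, which is exactly the criticism levelled above.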

The state of educational ICT research is generally accepted to be poor (Reeves, 2000). Dillon and Gabbard (1998) reviewed 500 papers for an article they prepared for the journal Review of Educational Research, and found that only 30 of these met the minimal criteria of good scientific studies for inclusion in their review. The reasons for the poor quality are varied, but most pertain to a lack of proper research design.

In 2002, I undertook a meta-analysis of research papers about computers for learning. I found that the majority of the research reports and other published writings focussed on case studies. While the number of reported case studies indicated strong growth in using computers for learning (there were 90 reported case studies across a variety of subject fields), most of the reports were general in nature, many of them focussing on student and course presenter experiences of the technology. Few of the reports used a specific pedagogical aspect, or a theoretical framework, as a point of departure. No apparent theories emerged from these studies. It was also significant what was not being reported: scant references were found to scaffolding, deep learning, coaching and meta-cognitive support, the role of motivation in learning, the use of metaphors in online environments, navigation design, and the interaction between hypermedia and cognition.

In 2005, I initiated a project to examine the scholarship of Educational ICT master's and doctoral research at seven South African universities. In the project, 103 doctoral and master's dissertations from 2000 to 2005 were examined by post-graduate students. Studies that are in progress (or have been completed) include studies on research design and methods of inquiry, trends in literature reviews and topics, theoretical frameworks, and theory generation. Today I can report on two of these studies.

In the first of the series of studies, the aim was to seek an understanding of how researchers (master's and doctoral students) employed theoretical frameworks during their studies. Particular attention was paid to the congruence (or lack of congruence) between the purpose of the inquiry, the theoretical framework in which the study was conducted, the research design, and the interpretation of the findings. It was concluded that a significant number of authors employed their theoretical frameworks in a very limited way, presenting findings that amounted to no more than descriptive results. This study of students' engagement with theoretical frameworks has shown that most of the work is poor in theory and that the studies are thus also poor scholarly outputs. It is clear that a link exists between methodologically and conceptually coherent studies and the use of theory. It has also shown that technicist approaches lead to technicist results (Agherdien, 2007). It is clear that the majority of these studies would not contribute to theory generation.

In the second of the series of studies, the aim was to establish which research designs were employed in the research sample, whether these designs were appropriate and aligned with the purpose of the studies, and whether the research designs were executed correctly. The vast majority of the studies made use of qualitative data, with case study most often cited as the research method, followed by action research. In some cases it was clear that the student did not have a clear or correct understanding of what the cited methodology entails. A substantial number of researchers did not justify their selected methodologies, and in some cases the selected methodologies did not match the purpose of the research. Only a few quantitative studies were undertaken, and then using the survey method. One experiment was undertaken, while a number of design experiments were conducted, notably at RAU (now part of the University of Johannesburg). Mostly, open coding was used as the data analysis technique. The studies generally resulted in guidelines and recommendations, with few explicit attempts being made either to generate or to contribute to theory.

I argue, then, that the theoretical thinness described in the previous paragraphs, together with the often inappropriate or inadequate research designs employed in the majority of these studies, will not lead to a better understanding of the effect of computers in the classroom. It would appear that a far more robust research agenda is necessary if we want to know whether, and how, computers should be used to support learning. We would have to avoid being seduced by the technology, and research the pedagogies that would support learning.

TOWARDS APPROPRIATE EDUCATIONAL TECHNOLOGY RESEARCH

The type of research that I would like to advocate here is inquiry into the thinking that grounds learning with computers itself (Van der Westhuizen & Henning, 2005). The theoretical framework that constitutes this type of learning draws on a wide range of theoretical substance, including: theory of media and of semiotics (Tomaselli & Sheperson, 1996); theory of distributed cognition (Brown, 2000; Brown, Collins & Duguid, 1989; Salomon, 1999); activity theory (Engeström, 1991); and theory of learning and of knowledge (Bransford, Brown & Cocking, 2000; Vygotsky, 1992; Kozulin, 1990; Rogoff, 1990; Phillips, 2000).

It appears that most of the research does not look sufficiently at the processes of student learning, and that there is also a dearth of comprehensive, integrated research. In addition, the research often confounds teaching methods with the teaching medium. As Clark (2004) and Salomon (1999) found, the field of computers and the learning they are meant to mediate is in itself an underdeveloped applied science (Van der Westhuizen & Henning, 2005). Research on using computers for learning will strengthen its knowledge base when the aim is to rigorously research the learning that the technology mediates, or is assumed to mediate, and to try to isolate the factor of instructional method itself from the analysis (Van der Westhuizen & Henning, 2005). This implies experimental or quasi-experimental designs along with other mixed-methods strategies. Although Reeves (2000) argues for less positivistic (implying experimental) research and for developmental inquiries that will shape the development of a theory of computer-supported learning, I do see a place for experimental or quasi-experimental designs: not aiming to measure student learning outcomes as much as to see change in learning over time in comparable computer-supported learning environments, in which researchers will be seeking aspects of processes, of reading texts, of expressing understanding and problem solving, of using hypermedia to solve personal intellectual puzzles, and so forth (Van der Westhuizen & Henning, 2005). The designs will thus not be experimental in the sense that they are positivistically driven. In fact, the experiments will source qualitative data that will be interpreted qualitatively and perhaps not even be converted to statistical discourse. The thinking behind such a research design is not to venture into a positivistic discourse, but to systematically and comparatively capture processes and change that can be applied as scientific evidence in the design of online programmes.

In essence I am arguing for scientifically and intellectually satisfying research designs, and for larger research agendas that can ground the development of models situated in appropriate theory. I suggest Stokes's (1997) matrix perspective on research, which presents research agendas with different envisaged outcomes: a study may have high practical value, or may make a contribution to fundamental understanding (theory), or both, or neither. I assert that the greatest challenges facing the use of computers for learning are situated at the interface of these two dimensions, and that research that contributes little to fundamental understanding (and therefore to theory) and has limited practical application may well be worthless.

Figure 1: Stokes's quadrant view of research (adapted from Stokes, 1997). The matrix crosses two questions: Is the research inspired by a quest for fundamental understanding (theory generation)? Is it inspired by considerations of use (practical application)? Pure basic research (e.g. Bohr's work on the atomic model) answers yes only to the first; use-inspired basic research (e.g. Pasteur's work on preserving milk) answers yes to both; purely applied research (e.g. the discoveries and research of Edison) answers yes only to the second.

Ultimately, computer-supported learning interventions and innovations need to be engineered via an integrated theory of learning with computers, which, in turn, will have to be developed in an integrated research programme. Such a theory does not exist, and this may be because the research that is conducted is often not coordinated in a design logic that exemplifies the capturing of learning itself, and is often also focused on the technology more than on the learning-with-the-technology (Van der Westhuizen & Henning, 2005).

One of the most favoured contemporary designs in educational research is that of design experiments. This design was originally conceived in school learning research by Brown and Collins (Brown, 1992). Design experiments capture learning in real (non-laboratory) contexts in which innovations or interventions are studied; they place educational experiments in real-world settings to find out what works in practice (Roosevelt-Haas, 2001). According to Cobb et al. (2003), design experiments entail both engineering particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations are similar to systematic variation in an experiment. Design experiments incorporate the notion of formative and summative evaluation of learner skills and knowledge demonstrated over time, penetrating into the learning processes as instructors and researchers negotiate instructional decisions (Brown, 1992). Appropriate and rigorous examination of computer-supported learning by means of design experiment approaches may therefore contribute to fundamental understanding of these electronically mediated learning situations. I advocate this type of design, with a stronger emphasis on capturing individual learning processes, all of which is currently possible by means of innovative methods of data capturing and analysis by specialists in the field of human learning (Henning, Van Rensburg & Smit, 2004). I suggest, then, that the most appropriate way to research the effectiveness of computers for learning, and to contribute to theory, is by making use of these types of designs with intact groups. Only then will we be able to elude the seduction of the Technopoly, and click towards complex learning.

AFTERWORD

There is no doubt in my mind that, as a human and cultural endeavour, we have no choice but to embrace technology. Some of us are still digital immigrants in this modern world. Yet, as the new generations of learners enter our classrooms as digital natives, we may very well find that these classrooms, and the practices within them, are anachronistic. Naisbitt wrote in 1982 that it is impossible to predict technological innovation as it 'weaves and bobs and lurches and sputters'. We have to accept that the future is uncertain, and that we have much to learn. Technology will increasingly become a catalyst for change, and teachers should be the pathfinders who show the route to meaningful and appropriate ways of using technology to support learning. In addition, we simply have to question the assumptions that propel the deployment of computers into institutions of learning. We also have to ask how technology can support the values of democracy, community and citizenship. And we have to use these technologies, in a way that is not surrender but pro-action, as extensions of our human capacities and contexts.


List of Sources
Agherdien, N. 2007. A review of theoretical frameworks in Educational ICT research at leading South African universities. Unpublished master's dissertation. University of Johannesburg.

Amory, A., Dubbeld, C. & Peters, D. 2004. Open content, open access and open source? Ingede: Journal of African Scholarship, 1(2).

Becker, H.J. & Ravitz, J.L. 2001. Computer use by teachers: Are Cuban's predictions correct? Paper presented at the 2001 Annual Meeting of the American Educational Research Association, Seattle.

Bransford, J.D., Brown, A.L. & Cocking, R. (Eds). 1999. How people learn: Brain, mind, experience and school. Washington, DC: National Academy Press.

Brown, A.L. 1992. Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2):141-178.

Brown, F.B. & Brown, Y. 1994. Distance education around the world. In: Willis, B. (Ed.). Distance education: Strategies and tools. New Jersey: Educational Technology Publications.

Brown, J.S., Collins, A. & Duguid, P. 1989. Situated cognition and the culture of learning. Educational Researcher, 18(1).

Clark, R.E. 1994. Media will never influence learning. Educational Technology Research and Development, 42(2).

Cobb, P., Confrey, J., Disessa, R. & Schauble, L. 2003. Design experiments in educational research. Educational Researcher, 32(1):9-13.

Dillon, A. & Gabbard, R. 1998. Hypermedia as an educational technology: A review of the quantitative research literature on learner comprehension, control, and style. Review of Educational Research, 68(3):322-349.

Henning, E., Van Rensburg, W. & Smit, B. 2004. Finding your way in qualitative research. Pretoria: Van Schaik Publishers.

Jonassen, D.H. 1999. Computers as mindtools for schools: Engaging critical thinking. Upper Saddle River: Merrill.

Kozma, R.B. 2003. Technology and classroom practices: An international study. Journal of Research on Technology in Education, 36(1).

Lyotard, J-F. 1979. The postmodern condition. Manchester: Manchester University Press. Available online: http://www.marxists.org/reference/subject/philosophy/works/fr/lyotard.htm.

Lockard, J. & Abrams, P.D. 2001. Computers for twenty-first century educators. New York: Longman.

Noble, D.F. 1998. Digital diploma mills: The automation of higher education. Available online: http://communication.ucsd.edu/dl/ddm3.html.

Oppenheimer, T. 2003. The flickering mind: Saving education from the false promise of technology. New York: Random House.

Papert, S. 1980. Mindstorms: Children, computers and powerful ideas. New York: Basic Books.

Postman, N. 1992. Technopoly: The surrender of culture to technology. New York: Vintage Books.


Reeves, T.C. 2000. Enhancing the worth of instructional technology research through design experiments and other development research strategies. Symposium on international perspectives on instructional technology research for the 21st century (Session 41.29), New Orleans, LA, USA.

Roblyer, M.D. & Knezek, G.A. 2003. New millennium research for educational technology: A call for a national research agenda. Journal of Research on Technology in Education, 36(1).

Rogoff, B. 1990. Apprenticeship in thinking: Cognitive development in social context. New York: Oxford University Press.

Roosevelt-Haas, M. 2001. The new perspectives in technology and education series. Harvard Graduate School of Education. Available online: http://www.gse.harvard.edu/news/features/tie10052001.html.

Saunders, P. & Werner, K. 2002. Finding the right blend for effective learning. Available online: http://www.wmich.edu/teachlearn/new/blended.htm.

Stokes, D.E. 1997. Pasteur's quadrant: Basic science and technological innovation. Washington, DC: Brookings Institution Press.

Van der Westhuizen, D. 2002. Online learning in the South African context: A meta-analysis of research trends, issues and topics. In: Proceedings of the 2002 SASE Conference. Pretoria: South African Society for Education.

Van der Westhuizen, D. & Henning, E. 2005. The student at risk in online learning: The case for appropriate research. In: Proceedings of the World Conference on Computers in Education (WCCE), July 2005, Stellenbosch.

Vygotsky, L. 1992. Thought and language. (Edited and revised by A. Kozulin, 6th edition.) Cambridge, MA: MIT Press.

