
OLTD 505 Week 3: Quality of OERs

May 15, 2014



A MOOC's Tale

This week I will attempt an assessment of a MOOC I took in February 2013, which, according to current thinking, was actually an xMOOC. MOOCs can be further classified by their inherent structure as network-based, task-based, or content-based. The MOOC I participated in would be defined as a content-based xMOOC.

I was curious about Ivy League universities offering free education, and I knew I needed to experience it first-hand. I was hesitant to engage in an unfamiliar format of online education (a MOOC) while also tackling an unfamiliar subject, so I opted for a course that required only a modest amount of time and covered a topic I knew well. I thought I would see what was new in the nutrition-and-disease knowledge base and perhaps pick up some best-practice concepts.
I enrolled in a MOOC called Nutrition for Health Promotion and Disease Prevention, offered by the University of California, San Francisco (UCSF). It ran under the auspices of Coursera and was six weeks in length. I completed the course and for my efforts received a Statement of Accomplishment signed by the instructor but bearing only a Coursera logo. It was an interesting experience, and not a completely foreign one, as I am well schooled in nutrition and disease and have an abundance of experience with online courses. What was unusual was discovering that, as of the first lecture, there were 26,000 students enrolled from all over the world. According to the instructor, this was an unexpected response to the offering.
The MOOC consisted of pre-taped, instructor-led lectures on a variety of nutrition- and disease-related topics. There was no interaction between the instructor and students, or from student to student, except that the two required assignments were peer-assessed. Each week the instructor spoke at length, approximately 20 minutes, about a specific disease and its link to nutrition, for example diabetes. Following each 5-minute video segment, 2-3 questions were posted, and answers were required before the video would continue. Correct answers and explanations were given immediately after the student submitted a response. In two or three of the weeks, the instructor interviewed guests who spoke to a particular topic.

The two assignments required some research: creating a 2,500-calorie diabetic diet, and journaling 48 hours of our own diet along with a breakdown of its caloric content. Each student was expected to peer-review at least 3 submissions, and we were given 4 criteria to consider in our reviews. I peer-reviewed at least 10 submissions for each assignment. For every peer review I did beyond my expected 3, I garnered automated praise extolling my virtues as a committed peer and student. I remember thinking that even if each student peer-reviewed the expected 3 submissions, there might still be many submissions left unreviewed, given there were 26,000 students in the course. I know now that many of those who signed up initially probably did not submit any assignments. My peer reviews included assignments from participants with English as a second language; some assignments were incomplete, but others were well done.

Grainne Conole suggests in her blog that classifying a MOOC as simply an xMOOC or a cMOOC is too simplistic, and she proposes a 12-dimension classification system that better captures the nature of MOOCs and can be used as a tool to critique or even design them. I chose to critique the MOOC discussed above using Conole's 12 dimensions, using low, medium, and high to denote the amount of evidence I found in the MOOC for each dimension.

Dimension: Degree of Evidence

Open: Low (no evidence of open educational resources)
Massive: High (26,000 participants enrolled)
Use of Multimedia: Low (video lectures; slide show of text)
Degree of Communication: Low (no discussion forums; no mechanism for input)
Degree of Collaboration: Low (pre-designed and predetermined content)
Learning Pathway: Low (one route through the course)
Quality Assurance: Unable to determine
Amount of Reflection: Low (no reflection component)
Certification: Low (Statement of Accomplishment only)
Formal Learning: Low (course informal and optional)
Autonomy: High (minimal tutor support; self-paced with deadlines)
Diversity: High (no prerequisites noted; open to anyone accessing the Coursera site)

The 12-dimension assessment breaks the MOOC into its parts, and a clearer picture emerges. This MOOC was obviously not designed with student engagement as a central tenet; perhaps it was a sheer-numbers experiment. Does the number of initial participants speak to the success of the course, or does it offer a glimpse into how ready people around the world are for free online education? I wonder whether the instructor was satisfied with her course offering or whether she felt one-dimensional. Teaching without discourse seems lonely and misguided. I also wonder how many participants outside Europe and North America actually learned anything from the course, as its topics highlighted primarily first-world issues, and I can't imagine that people living in developing countries are burdened with the nutritional landmines we currently experience in North America. This course was offered again in February 2014, and interestingly it now has prerequisites and the topics have changed slightly, but the pre-taped video lecture format remains the same. Information in…?

Would I participate in another MOOC? Would I recommend this form of online learning to others? Yes and yes, because I am convinced there is something to the idea of sharing education with others online. However, I am not sure that a massive number of participants (in the thousands) allows for any personalization of the learning. How do we learn, and how do we engage, with limited to no contact with other students or the instructor(s)?
