1 February 2018
ENGW3307
Consider a winter sky in evening, pale blues shot through with the rosy residue of a day’s
last sunlight, lingering for scarcely an hour before the onset of night. The act of observing is so
simple, but it involves so many steps. That soft light has to strike your eyes and reach the back of your retina, where rods and cones fire off electrical signals, converted into action potentials that run down nerve fibers to visual processing centers within the brain. At last comes conscious awareness, as the information arrives in the prefrontal cortices, while at the same time, just maybe, it evokes an emotional response through spreading activation in midbrain regions. All this, faster than thought, the
realization of something sublime. Language is a marvel all its own; with a few words I might
conjure the scene in your mind through a process no less complex than that initial experience.
In some sense, it was a foregone conclusion that I’d come to neuroscience. I always
wanted to understand why, why did the colors of the sky strike me so, why did the world appear
as it did, why this awareness? To understand the brain as a starting point on that journey made a
lot of sense to me. While the science often yields mixed results, recurring commonalities turn up that begin to paint a more conclusive picture of how we as human beings generally
perceive and interact with the world around us. I could wax introspective or spend hours reading
literature, and while this might help me understand nuances and cultural elements, neuroscience
gives us the tools to draw better conclusions about a general population. It’s never perfect, but I
think my coursework has done a good job of making me question the way things are done, and has pushed me to really understand the limitations and the genuine benefits of various approaches.
However, before I go delving any further into the major influences on how neuroscience
has been taught to me here, I should mention that I’ve also taken a few humanities-centric
courses, either interdisciplinary seminars or classes for my English minor. Of interest to me has
always been what art and storytelling, in its various forms, can reveal to us as forms of
self-expression. I’ve noticed that the style of teaching differs greatly from the usual COS
courses; usually they’re discussion-based and tend to be more subjective rather than centered
around a single lecturer presenting powerpoint slides and lots of factual information. The only
time this has proven false was when I took a class called Seminar in Cognitive Neuroscience last
semester, during which we read through a number of scientific papers on various topics, past and
present, and spent class periods discussing the ideas at length. I felt it was a valuable discourse to
have, and that it helped to contextualize a number of the topics and ideas, and also highlighted how differently a science class can be taught.
In that sense, lecture style classes have struck me as a method of convenience, rather than
the best practice in education. Classes that number in the hundreds of students all have to get
through a certain amount of factual material, and so the students sit for a few hours every week
and copy down notes from a PowerPoint to review at their leisure (or immediately before an exam). That’s not to say that the lecture courses aren’t important; I’ve learned from them, and that foundation is crucial for future understanding. But a more open classroom setting, used occasionally to offset the lectures and make learning more dynamic, could serve students better.
If we’re beginning with the sources for these trends in teaching, the seminar system
traces its roots to Socrates, or Plato’s descriptions of him, all the way back in ancient Greece, and
the concept of constantly questioning everything in order to learn more. The philosophical
tradition has had many more branches since then, and it would be an entire other discussion to
think about how ideas about aesthetics have changed over the centuries, or how literary thinking
in fiction has evolved since the advent of the novel at the hands of Cervantes, and how that has
influenced my coursework. I think that ideas from the humanities, be they historical or artistic in
nature, do still have bearing and relevance to the advance of science, though more as a tempering
force rather than a driving principle. The importance of empiricism and the emergence of the scientific method, on the other hand, are tied to Francis Bacon. The idea of a standardized method
for rationally investigating and making evidence-based claims about the world was crucial in
pushing science forward. The relation to older philosophical traditions, I think, remains with the idea that learning is done by engaging with the world and with others, and I find it strange that lectures have thus become such a dominant method within my own discipline, even with those interactive roots.
When it comes to neuroscience, which sits at this juncture of psychology and biology,
there have been a few key actors. From the physiological perspective, Santiago Ramón y Cajal
had a huge influence in contributing evidence to neuron theory and actually visualizing neurons
in painstaking detail. That anatomical basis has served as a crucial component of the field. I took
a course called Comparative Neurobiology with Professor Ayers, who does work in developing
robotic lobsters, and it especially emphasized the importance of physiological work in terms of
understanding how behavioral circuits function, and how the nervous system in general has been
organized in other organisms and ourselves. Another interesting development within the field resulted from the observations of someone who was not a neuroscientist, but a
doctor.
John Martyn Harlow wrote up the case study of Phineas Gage, a man who’d had an iron
bar shot through part of his skull and survived, though with large changes to personality. “The
equilibrium or balance, so to speak, between his intellectual faculties and animal propensities, seems to have been destroyed” (Harlow, 1869). Harlow went on to write about how vulgar and
lazy the formerly virtuous man had become. With that portion of brain missing, Gage couldn’t
self-regulate his own behavior. The importance of physical subjects makes itself more apparent
here; the inadvertent lesion and its effects led down the road to a better understanding of the
function of the frontal lobes in inhibitory and executive roles. We discussed the case in my
seminar class, and also went to see the skull (which is held at the Warren Anatomical Museum),
getting to learn some of the history of medicine in Boston. As an aside, the curator mentioned
that many of the artifacts on display came from cadavers stolen from graves by desperate
medical students and their mentors, back in 18th-century America. It seems hands-on learning has a long, if sometimes grim, history in the field.
On the more theoretical side of things, as part of that seminar class, a classmate and I led
a discussion on the mind-body problem in cognitive neuroscience; that is, the difficulty in
determining the degree to which the experience of the mind can be related to the physical matter
of the brain/body. It’s an interesting philosophical point, and while it’s easy to dismiss certain
thoughts on the subject, it involved a number of texts that I found helpful in illuminating the role
that such philosophical positions could play in informing scientific thought and study. Of these
was an essay structured like a dialogue between three different people and their teacher, which
examined three differing theories: dualism, eliminative materialism, and functionalism (Kendler,
2001). The question of how we frame the relationship between our brains and what we define as
‘us’ has implications for treatment and even personal culpability before the law (pleading
insanity, adolescent brains being more prone to risk-taking, therefore deserving greater leniency).
In addition, as part of leading the discussion, we looked at some work written by David
Marr, who wrote extensively about vision processing. He helped begin computational
neuroscience, and did so by positing three levels of explanation for information processing
systems: computational, algorithmic, and implementational. These correspond, respectively, to the ‘what’, the ‘how’, and the specific mechanism used (Marr, 1982). His influence in particular stands out,
because it reminds us that the brain’s particular hardware isn’t necessarily the most important
part; rather the organization achieved plays a more crucial role. It’s a step beyond the work of the
physiologists, and ties into the direction the field has moved with its new imaging tools. It gives
the sense that there remains a bigger picture to be seen and understood. In recent classes, it’s
become clear how much of a massive undertaking this is, and how much more we still need to learn.
I thought that particular discussion was wonderful, as were many others we had in that
class, and the fact that I recall those ideas so clearly today means that I definitely learned them.
My personal take on the mind-body problem, in light of the various ideas we’d read, was that
even if consciousness was entirely a result of our brains, even if we could reduce all the
complexity of our minds down to electrical impulses and chemical releases, it would not
diminish the importance or meaningfulness of our individual experiences. Those qualia and our conscious awareness are how we define and interact with one another as human beings. Some classmates had
some more hardline takes on the topic, whereas others went in the opposite direction and didn’t
want to rule out the possibility that ‘consciousness’ could exist beyond our matter. They all left a
mark on my thinking, and pushed me to develop and defend my own ideas better.
The way we learn is rarely ideal, and education takes many forms over the course of our
lives. A lot can depend upon the specific demands of a given field. Neuroscience sits at this juncture of psychology and biology: it is concerned with psychological explanations (because of the needs of clinical patients, and people
in general), but at the same time, it relies on rigorous scientific techniques to build upon the
pre-existing knowledge base we have, or even to replace old concepts as new evidence comes in.
The ethical demands of such work are rigorous and important, and sometimes taken for granted.
I’m grateful that in the course of my learning I’ve been given space, in the past and now, to
reflect on where I’ve been. I’m still learning, still striving to understand better, and maybe one
day, I’ll know exactly why the sky can bring a trace of tears to the edge of my eyes. Or, to be
less dramatic and more scientific, why people as a whole might perceive features of the natural world the way they do.
Acknowledgements
I would like to thank Aika Misawa and Maggie Turner for their criticism in peer reviews. The ideas I read were really helpful for revising the draft into its current form. Thanks also to Dr. Cecelia Musselman for pointing out ways to improve my phrasing and asking some good questions.
References
Harlow, J., & Massachusetts Medical Society. (1869). Recovery from the passage of an iron bar through the head. Boston: David Clapp & Son.
Marr, D. (1982). Vision: A computational investigation into the human representation and processing of visual information. San Francisco: W. H. Freeman.