
Digital Games as Programming Languages

In the mid-1960s, the RAND Corporation, a non-profit technological research institution,

developed an online "service" called JOSS that was deployed across a central server and
specialized computer terminals. The service was described (rather cryptically) in an in-house
report as a "special-purpose system designed to provide the user with a personal service through
remote computation" (Baker, 1966, p. v). To be more precise, JOSS was a scaled-down
programming system that users could interact with in real-time via a command-line interface. It
was, according to one RAND report, a "helpful assistant." Its designers stated quite emphatically
that JOSS was "not a language for programmers," but rather a tool for solving "complex numeric problems" for engineers and other professionals who did not necessarily have much experience working with computers (Baker, 1966, p. 2; Gimble, 1967, p. v). The concern for simplicity
informed all aspects of its design; it featured a "simple, easy-to-learn programming language," as
well as a shared file system in later versions (Northrop, 1967, p. v).
While JOSS was quite successful in its intended role, some of its users began to explore
its full potential, and in addition to being a helpful assistant it took on a far more ominous task:
the simulated prosecution of global nuclear war. Due to its relative simplicity as compared to programming languages like Fortran (one report describes the JOSS language as "close to conventional English," which is perhaps an exaggeration), RAND's in-house war gamers increasingly used the JOSS system to build and implement their programs. Among the most
important of these was a game that went by the (rather awkward) name of STROP, described in a
RAND report as "a highly aggregated central nuclear war game." Due to the complexity of
STROP's model, it was essential to automate the calculations required by the game's rules. The
end result was a text-only dialog session between the players and the JOSS server, with players
entering data and commands via a command-line prompt.
Over the next ten to fifteen years, these command-line, text-based strategy games became
increasingly popular not only as tools for military strategists, but also as forms of entertainment.
Early classics such as Hammurabi and Lunar Lander made the rounds across a variety of
platforms, presenting simple but nuanced game worlds that challenged players to supply values
for key variables that would then be processed within underlying mathematical models. This
back and forth would continue until an end state was reached. Such games were not overly
difficult to develop, particularly when high-level languages such as BASIC were used. With the
advent of the personal "home" computer in the late 1970s and early 1980s, text-based gaming
reached its zenith. A wide variety of books and magazines that catered to the home hobbyist
programmer were published, including David H. Ahl's classic BASIC Computer Games. The
practice of typing in a given program was not intended to be mere yeoman's work, however, as
these publications almost always encouraged users to modify and "play" with the games they
transcribed on their home machines. This process of adoption and adaption resulted in the
creation of many replicate programs, as users would borrow elements from exemplar programs
and then modify them to create new, but familiar, games and applications.
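The back-and-forth structure of games like Hammurabi can be sketched in a few lines of Javascript (the language I plan to use for my own development work, as discussed later in this proposal). The yield and consumption figures below are hypothetical stand-ins, meant only to illustrate the prompt-update-check loop:

```javascript
// A toy resource-management turn in the spirit of Hammurabi: each turn
// the player supplies a value for a key variable, and a simple
// mathematical model updates the game state until an end state is hit.
// All constants here are hypothetical, not taken from the real game.
function step(state, acresToPlant) {
  const planted = Math.min(acresToPlant, state.grain, state.acres);
  const harvest = planted * 3;          // assumed yield per acre
  const eaten = state.population * 20;  // assumed consumption per person
  const grain = state.grain - planted + harvest - eaten;
  return {
    acres: state.acres,
    population: grain < 0 ? state.population - 1 : state.population,
    grain: Math.max(grain, 0),
    over: grain < 0 && state.population <= 1  // crude end-state check
  };
}

// Drive the loop with scripted "player" inputs instead of a live prompt.
let state = { acres: 100, population: 5, grain: 150, over: false };
for (const input of [50, 60, 40]) {
  if (state.over) break;
  state = step(state, input);
  console.log(`population=${state.population} grain=${state.grain}`);
}
```

In period versions the inputs would arrive through an interactive prompt (INPUT in BASIC), but the underlying model-update loop is the same.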
These practices were once commonplace, but began to fade as increasingly complex,
graphics-based operating systems made it much more difficult to create programs that performed
all but the most rudimentary of functions. There are now, however, many individuals and institutions (the Raspberry Pi Foundation, for example) that are trying to revive hobbyist home computing, often with an emphasis on game programming. I believe, then, that there is a
pressing need to better understand the conditions that led to a sprawling trade in type-in text-
based games in the 1970s and 1980s. While projects such as RPi look to the past for inspiration,
they tend to overlook many of the critical technical and contextual elements that originally
fostered the growth of home hobbyist computing. A key issue here is the seeming lack of
interest in the interactive programming systems (IPS) model in which a programming language
such as Commodore BASIC could operate essentially as its own operating system, allowing
users to jump right into programming as soon as they switched on their computers. The IPS
paradigm, in its fully-realized form, provides something of a programming laboratory or
workshop in which programs at various stages of development and execution may coexist. This
sort of system, I believe, needs to serve as the foundation for modern hobbyist machines for them
to serve as tools for creative programming, particularly with respect to games.
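The flavour of such a system can be suggested with a small Javascript sketch: a toy interpreter in which numbered lines are stored as a program and unnumbered lines execute immediately, so that the interpreter is the entire operating environment. This is a loose illustration of the paradigm, not a reconstruction of Commodore BASIC:

```javascript
// A toy "machine" in the BASIC spirit: numbered lines are stored as a
// program, unnumbered lines are executed immediately, and the same
// interpreter doubles as the whole operating environment.
function makeMachine() {
  const program = new Map();  // line number -> statement
  function execute(stmt) {
    // Only one hypothetical statement is supported: PRINT <expression>
    if (stmt.startsWith("PRINT ")) return String(eval(stmt.slice(6)));
    return "?SYNTAX ERROR";
  }
  return function handle(line) {
    const m = line.match(/^(\d+)\s+(.*)$/);
    if (m) { program.set(Number(m[1]), m[2]); return "READY."; }
    if (line === "RUN") {
      return [...program.keys()].sort((a, b) => a - b)
        .map(n => execute(program.get(n))).join("\n");
    }
    return execute(line);  // immediate mode
  };
}

const machine = makeMachine();
machine("10 PRINT 2 + 2");
machine("20 PRINT 3 * 7");
console.log(machine("RUN"));         // prints "4" and "21"
console.log(machine("PRINT 1 + 1")); // immediate mode: prints "2"
```

Switching the machine on and typing a line of code were, in effect, the same gesture; the sketch above tries to capture that immediacy.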

Research Questions
To be more precise about my proposed study, I will aim to address the following research questions:
- What developments in the history of digital computing led to the eventual rise of
hobbyist game programming?
- How and why did turn-based text adventures and strategy games become so popular
in this era? In addition, why was programming often cast as a "game" in and of itself
in hobbyist books and magazines?
- Is it possible to operationalize this history so it may inform the development of new
game tools and genres? What would be the benefit(s)?
- Is it possible to break individual games down into constituent components? How
might we use these components as "building blocks" in the development of new
games and related applications (e.g. development tools)?
To address these issues, I propose to conduct a three-stage research and design process,
though I anticipate that there will be occasions in which I will be working on two or three of
them concurrently. Regardless, I will summarize them for now as if they are three linked but
discrete units, and then refine this plan in the section on methodology later in this proposal:
- For stage one, I will delineate and analyze the early history of command-line, text-
based digital games. Note that this will take my work in a much different direction as
compared to accounts of early game history that focus on arcade-style games such as
Pong and Spacewar. I will ensure, in particular, to address the following issues: the
emergence of the digital display screen, proto-game works such as Bouncing Ball (for
the Whirlwind system), the development of interactive time-share network systems,
the creation of the FOCAL and BASIC programming languages, type-in game books
such as BASIC Computer Games and magazines such as Compute!, and the evolution
of the interactive programming system paradigm.
- For the second stage, I will examine and analyze the mechanisms by which the game
programs of the hobbyist era were distributed, collected, and reinvented by the
individuals and institutions involved in hobbyist gaming practices. Rather than being
a historical study as in the previous phase, my approach here will be more theoretical
and conceptual. Woolgar's notion of the "technology as text" will play a significant
role in framing this approach (Woolgar, 1991).
- For the third stage, I will employ Schön's notion of design worlds as a means to reconceptualize games (command-line games at first, but perhaps shifting into other subgenres) as assemblages of various design elements that can be broken down and
reassembled to create new games. These elements include game-related data, game
rules, and interface elements such as a command prompt and parser. This is the area
of my project for which I am planning to develop (i.e. design and code) my own
game-related programs, though non-executable pseudo-code might be sufficient for
the purposes of my research.
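As a preliminary illustration of what such a decomposition might look like, the following Javascript sketch separates a game into the three kinds of elements named above: game-related data, game rules, and interface elements. The game itself (a Lunar Lander-style burn command) and all identifiers are hypothetical examples of the approach, not finished research artefacts:

```javascript
// A sketch of a game as an assemblage of design elements: the data,
// the rules, and the interface (parser) are separate pieces that can
// be swapped out to build new but familiar games.
const data = { fuel: 100, altitude: 50 };           // game-related data

const rules = {                                     // game rules
  burn(state, amount) {
    const burn = Math.min(amount, state.fuel);
    return { fuel: state.fuel - burn, altitude: state.altitude - 5 + burn };
  }
};

const parser = (line) => {                          // interface element
  const [verb, arg] = line.trim().split(/\s+/);
  return { verb: verb.toLowerCase(), amount: Number(arg) || 0 };
};

// Assemble the elements into a single playable turn.
function turn(state, line) {
  const cmd = parser(line);
  return cmd.verb === "burn" ? rules.burn(state, cmd.amount) : state;
}

console.log(turn(data, "BURN 3")); // { fuel: 97, altitude: 48 }
```

Swapping in a different rules object or parser would yield a new but familiar game, which is precisely the kind of reassembly the design-world framing is meant to support.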
As I mentioned before, these stages are not meant to be separate silos. Schön's design world
model, for example, will likely inform my research in the first and second stages. But this
categorical approach is useful in establishing the parameters and boundaries of my work.
The details for how I plan to approach these areas may be found in the chapter on
methodology. Before that, however, I will provide a brief review of the literature that supports
my proposed work.

Review of the Literature
My research interest and approach are informed by a set of theoretical and ontological
perspectives that, I believe, work together quite effectively, even though I cannot find any
preexisting scholarship in which this has been done. These perspectives are principally informed
by two metaphorical concepts. The first is the notion of technology as text, as proposed by
Woolgar (Woolgar, 1991; see also Grint & Woolgar, 1997). The second is the notion of the
design world as developed by Schön. In both cases, these concepts have been critiqued and
expanded upon by later scholarship, so that my understanding of each is informed both by the
original ideas, and by these refinements.

Apart from the programming phase of my work (which I will discuss shortly), my
proposed research will be almost entirely qualitative in nature. It will also focus largely on
documents, ranging from mainframe user manuals to firsthand accounts of specific events. A
major theme supporting this work, moreover, will be reflexivity. This is a rather specific
pathway, yet it is also one that is supported theoretically and methodologically by the work of
scholars such as David Altheide, who describes his approach as ethnographic content analysis
(ECA), though he also on occasion employs the term qualitative document analysis (QDA; see
Altheide, 1987; and Altheide et al., 2008). He also employs the notion of discourse throughout
his work, though not as systematically as scholars from the critical discourse analysis (CDA)
tradition (Altheide, 2000). For this reason, I also plan to bring in the work of Fairclough, who designed a three-level model of critical inquiry that can be extremely effective. I will also note
the work of Stephenson, who shapes Fairclough's methods into a model in which the historical
development of discourse is emphasized.
A major component of my work will strive for reflexivity, which provides a
methodological foundation to support certain aspects of my proposed research. In particular, the
program development that I wish to conduct here acts as a reflexive approach in which I strive to
learn more about both the subject at hand and my own interpretation of events. As Woolgar
notes, such reflexive work "suggests that all versions (descriptions, accounts) of technology be
granted no greater authority than any other outcome of textual production and interpretation"
(Woolgar, 1991, p. 41). Crucially, this includes our own scholarly output, with Woolgar noting
that "we as analysts conventionally privilege our own status vis-a-vis the relativized status of the
texts of others" (Woolgar, 1991, p. 41). By abandoning our lofty regard for our own work, then,
reflexive research provides a much more open space for creative and experimental work that is
considered the equal to more familiar forms of scholarship. As I will explain shortly, my
programming work will be used as a means to reflect on both my documents and my analysis of
these documents.

Much of Altheide's work rests upon an expanded understanding of the concept of
ethnography. Traditional ideas about ethnography, he notes, place heavy emphasis on physical presence; that is, the notion that the researcher must be in a physical setting, observing human
activity as it happens, in order to conduct ethnographic research (Altheide et al., 2008). While
not dismissing the importance of "immersion and involvement" in a given setting, he argues that
the notion of what constitutes setting shifts when "symbolic communication" is being studied.
Such research would need to be performed in the same "spirit" as traditional ethnography:
If the key element involves human beings in a situation, then an ethnographic
perspective entails being "there" with the people. However, if the research focus
is not on human action per se but rather on symbolic meanings and perspectives
within a different domain, then a research perspective and orientation can also be
said to be "ethnographic" if it is oriented to emergence, discovery, and description
(Altheide et al., 2008, p. 135).
From this perspective, ethnographic research is grounded in a value system, not a specific
mechanism by which data is gathered. Altheide then goes on to explain how this approach
would be implemented in a scenario where symbolic communication is emphasized:
Document analysis becomes ethnographic when the researcher immerses him or
herself in the materials and asks key questions about the organization, production,
relationships, and consequences of the content, including how it reflects
communication formats grounded in media logic. The focus initially is on
exploration, reading, looking, reflecting, and taking notes before more systematic
and focused observations are undertaken (Altheide et al., 2008, p. 135).
The key point here is that ethnographic researchers need to adopt a grounded, flexible approach
in which they allow the documentary evidence to lead them on a process of discovery and
analysis. This means that the overall goals of a given research project cannot be overly rigid, as
research findings may encourage a different tack with respect to a particular question. As
Altheide puts it, "[a]lthough categories and 'variables' initially guide the study, others are
allowed and expected to emerge throughout the study," leading to an overall process of "constant
discovery and constant comparison of relevant situations, settings, styles, images, meanings, and
nuances" (Altheide, 1987, p. 68; emphasis in original).
What Altheide is proposing, then, is an approach to engaging with documentary research
that breaks from older methods in which (ostensibly) objective and systematic analysis was
considered ideal (see Franzosi, 2008). This sort of work centred around the notion of the
protocol, defined by Altheide as "a list of questions, items, categories, or variables that guide
data collection from documents" (Altheide, 1996, p. 26). While QDA researchers also pose
questions and deal with categories and variables, these are all subject to change as progress is
made through the documentary evidence, whereas they are essentially fixed when using
traditional methods. As Altheide puts it, "the investigator is continually central" in ECA/QDA
research (Altheide, 1996, p. 16; as cited by Altheide et al., 2008, p. 128). And it is the
impressions made by the documents on the researcher that drive an ECA-based research project:
Interpretation in QDA emerges as the researcher is immersed in a community of
documents, as he or she converses with them by considering them together as a
community that can speak, and as he or she tracks his or her emerging
interpretations in this very community of documents (Altheide et al., 2008,
p. 128).

Since ECA/QDA is an approach to research that evolves based on subjective impressions
of relevant documents and texts, I must approach this section of the proposal having already
engaged with many of the materials that I will present here. As a consequence, the narrative I present here with respect to the "community of documents" that I will incorporate into my research will inevitably be informed by this earlier work. Perhaps more importantly, this narrative will
continue to evolve as I work, in line with what Altheide has to say about user-centric research.
What I am providing here, then, is a snapshot of what I am focused on at the moment, and where
I believe my research will take me in the future. There will inevitably be a speculative element
to this, but I will clearly indicate the moments in which I have had to make decisions with only
limited information to guide me.
As I began to conduct research for this project (as well as earlier papers and
presentations), I focused heavily on materials that could be found online. There are limitations
to such a strategy that I will consider shortly, but the abundance of online content was sufficient
to address most of my research questions, with the occasional print source necessary to round out
my arguments. These online repositories, moreover, are quite varied in character, ranging from
institutional archives such as MIT's Dome to hobbyist websites such as Bombjack, but the quality of the available documentation (which almost always takes the form of pdf files) is generally strong. I will briefly describe each repository below:
- Dome: a vast collection of digital documents drawn from the diversity of research
projects conducted at MIT over the years. Project Whirlwind (one of the earliest and most influential mainframe computers) is well represented here, with internal memos, reports, and meeting minutes available for essentially the entire lifespan of the machine. The archive is meant for MIT materials only, so research data is necessarily limited in scope.

- bitsavers.org: created by Al Kossow, now a curator at the Computer History Museum, bitsavers is "one of the largest on-line archives of historic computer hardware and software documentation in the world." It is particularly strong with respect to materials from the mainframe and minicomputer eras (roughly 1960s to 1970s), including the PDP line created by Digital Equipment Corporation. All of these materials have been scanned and rendered into pdf files.

- Bombjack: an online archive created and maintained by David Haynes, who goes by
the handle "DLH". Haynes' vast collection focuses on the hobbyist home computing
era, and in particular the Commodore line of computers that were built largely in the
1980s. The resources available here are extremely useful for my own research
purposes, with the type-in programming books from the 1980s featuring prominently.


- The Computer Magazine Archive: a component of the Internet Archive, TCMA
includes digital scans of most, if not all, of the leading hobbyist programming
magazines of the 1970s and 1980s. Among the most important for the purposes of
my project are Compute!, Creative Computing, and BYTE. Most are provided by
outside contributors, which means that quality of the scans can vary.

- The RAND Corporation, Published Research: a portal on RAND's website that
allows users to search through old RAND reports and other related documentation.
Some of these works can be directly downloaded as pdf files, but a substantial
proportion of older materials have yet to be scanned, and I have been obligated to
seek out printed reports on several occasions.

Beyond these repositories, searching via Google has often turned up many works that are
available from personal websites and other smaller-scale spaces. There is a certain trial-and-
error aspect to Google searches, however, which I find somewhat troubling. Search queries also have to be targeted to specific works, which virtually nullifies the potential for serendipitous
discoveries. Regardless, I have had to go in this direction to locate certain documents that I felt
were vital to my research.
There is, of course, a certain element of risk involved by relying so heavily on online
materials, with the issues of authenticity and completeness ranking among the most serious
concerns. I am not overly concerned about authenticity, at least with respect to pdf files. I would
be somewhat more suspicious if I were planning on employing documents that had been typed
out in HTML, rather than scanned.
Completeness, however, is an issue that needs addressing. For example, while a website such as Bombjack offers a vast selection of texts for Commodore computers, it does not archive scans of books written specifically for Commodore's competitors (it does have many "generic" titles, however, as well as those that are multi-platform). I would have to go elsewhere to find books related to, for example, Sinclair Research's "ZX" line of home computers, including the ZX81 and the ZX Spectrum. The World of Spectrum website, in fact, provides links to numerous scans of ZX books, but it is difficult to compare (quantitatively or otherwise) its offerings with those of Bombjack beyond the most rudimentary of metrics.
For other systems, Internet Archive's Folkscanomy portal hosts many collections of computer
and programming books from the hobbyist era, but quantities vary across platforms.
There are a few ways in which I will correct for potential problems of this nature. First
off, I will seek out information sources which may list materials that I have otherwise
overlooked. Contemporary magazine advertisements and catalogues may help in this area.
Secondly, I will be sure to follow up on references to these sorts of sources in the articles and
monographs that I will cite in my own work; I have already used this strategy successfully with
respect to RAND Reports. Lastly, since I will be adopting an ECA approach, my work will
accommodate a sprawling, somewhat haphazard primary source domain. My work will be more
exploratory than systematic, finding and chasing information and ideas that resonate with my
research questions. I will explain this process in more detail in a later section.

A practice that is fairly common. See, for example, Lemon Amiga, a website dedicated to the Amiga line
of computers (http://www.lemonamiga.com/).
Programming Design Worlds
As noted in the literature review for this proposal, Schön's ideas concerning design worlds form the conceptual foundation for the programming project(s) I would like to develop and engage with for this dissertation. These design worlds will be based largely on individual games, or games within the same genre. This sort of dissection of existing artefacts is one of the major approaches cited by Schön with respect to building a design world in the field of architecture:
There is a two-way interaction between design types and design worlds. On the
one hand, elements of a design world may be assembled to produce an artifact
that comes to function either in the practice of an individual designer or in a
larger design culture - as a type. On the other hand, the direction of causality
may be reversed. A vernacular building type with its constituent things and
relations, forms, materials, construction methods, ways of organizing space, and
symbolic vocabularies may 'loosen up' to provide the furniture of a design world
(Schön, 1998, p. 184).
It is this latter process (that is, working with existing games so that they "loosen up") that I will implement in my work. This strategy also has the advantage of flexibility, as it provides a means
of designing models so that final coding is not necessarily required for my purposes. By this I
mean that the models can be described in text and pseudo-code, and then converted into formal
code if time permits. I will, of course, strive to get to this final stage, but it may be glossed over
if there are complications elsewhere in my research that demand more of my time than I am
expecting. Before immediately getting into more methodological detail, however, I want to
discuss the value that I believe coding (pseudo-code or actual code) may add to my dissertation.
An overall goal of my research is to find out if there are links between hobbyist game
programming, programming languages, and programming environments. My current hypothesis
is that simple but powerful (relatively speaking, given the technological potential of the time)
languages implemented in highly-interactive systems (a term which I will discuss in more detail
shortly) are capable of providing spaces in which creative programming is accommodated and
supported. The BASIC systems that were implemented across so many home computing
platforms in the 1970s and 1980s met these criteria, or so I will argue. As mentioned in the
introduction, there are individuals and institutions that are currently trying to recapture the
"magic" of the hobbyist age, whether through new hardware systems such as the Raspberry Pi, or
new software systems such as Scratch, or combinations of these two approaches. None of these efforts, however, are targeted according to the principles I have set out here, though that does not
exactly mean that they do not meet them, at least in part. But I do want to create a controlled
space in which I can experiment with language and environment in an effort to meet the above
criteria. More importantly, however, I anticipate that, by coding, I will uncover new criteria to
add to the above. The historical research I will conduct for this project will provide me with a
better understanding of how hobbyist-friendly programming systems emerged and evolved. I do
not think that this work would be complete, however, without extrapolating from my findings by
designing my own simple coding systems. By taking this extra step, I believe I will gain insights
that would be elusive by adopting an exclusively research-based approach.

Raspbian (a Debian Linux implementation for the Raspberry Pi) comes with Scratch built in, as well as
Python and the Pygame library. But there are other implementations that more specifically target children and
focus on gaming, the most notable of which at the moment is the Kano system (http://www.kano.me/).
Despite this focus on coding tools, I recognize that my dissertation may evolve to the
point where it would not be feasible to engage in any programming practices. My work in other
phases of my project may simply take up all or most of the time I would need to properly code.

Development Tools
All of the tools that I plan to use are free, non-commercial and offer generous usage and
distribution policies, at least for the purposes of non-profit research and development. My
development language of choice will be Javascript. Despite its faults, I believe that Javascript is
an incredibly powerful development tool for two important reasons. First off, it utilizes the
browser engine for processes like interface layout that are more difficult to recreate on other
platforms. And secondly, Javascript is an interpreter-based language, as opposed to being
compiler-based, which means that techniques such as reflexive reprogramming and dynamic
code generation are possible. It is also a language that I have considerable experience with,
especially when compared to Java, Ruby or Python, the other languages that I might have
potentially gone with. My experience will allow me to avoid some of the more common
issues that arise in Javascript programming, such as confusion over typeless variables, and
problems working with the nuances of its object-oriented implementation.
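A small sketch of what I mean by dynamic code generation: because Javascript is interpreted, a string of source code can be compiled into a callable function at runtime via the Function constructor. The formula string below is an arbitrary, hypothetical example:

```javascript
// Program text generated at runtime is turned into executable code.
// compileFormula and its input string are hypothetical illustrations.
function compileFormula(expr) {
  // expr is a Javascript expression over a single variable x,
  // e.g. "x * x + 1". In real use the string would be validated first.
  return new Function("x", "return (" + expr + ");");
}

const f = compileFormula("x * x + 1");
console.log(f(3)); // 10
```

The same mechanism, suitably guarded, is what makes it feasible to host user-written game code inside a running Javascript environment.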
In order to create custom programming languages that work within a Javascript/HTML
space, I will use the Jison utility created by Zach Carter. Developed to be a Javascript version of the Unix/GNU tools Lex, Yacc, Flex, and Bison, Jison combines the functionality of a lexical
analyzer and a parser generator. What this means, essentially, is that it can take a formal
definition, or "grammar", of a programming language and use it to call up Javascript routines that
convert the code into JS that can be evaluated at runtime. In practice, this means that you can invent new programming languages (with new syntax, structure, and functionality) and get them to run like regular Javascript code, as long as the grammar is properly defined and implemented.
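To make the pipeline concrete, here is a hand-rolled miniature of what a Jison-generated parser does, written in plain Javascript rather than produced from a grammar file. The toy language (numbers joined by +) and all function names here are hypothetical:

```javascript
// A toy version of the lex-then-parse pipeline that Jison generates
// from a grammar: a lexer breaks input into tokens, a parser applies
// grammar rules, and the result is JS source evaluated at runtime.
function lex(src) {
  return src.match(/\d+|\+/g) || [];   // tokens: NUMBER or '+'
}

function parseToJs(tokens) {
  // Grammar: expr -> NUMBER ('+' NUMBER)* ; translated into JS source.
  if (!/^\d+$/.test(tokens[0])) throw new Error("expected a number");
  let js = tokens[0];
  for (let i = 1; i < tokens.length; i += 2) {
    if (tokens[i] !== "+") throw new Error("expected '+'");
    js += " + " + tokens[i + 1];
  }
  return js;
}

// Translate and evaluate at runtime, as a Jison-built parser would.
const source = "12 + 30 + 8";
const translated = parseToJs(lex(source));
console.log(eval(translated)); // 50
```

Jison generates this lexing and parsing machinery automatically from a declarative grammar; the sketch simply shows the translate-then-evaluate flow.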
The one major drawback to my use of Jison is that translated code needs to be evaluated
using Javascript's eval function. Under most circumstances, programmers are advised to avoid
using eval because there are typically more efficient ways to complete a given task, and it poses
a slight security risk if the code is deployed online. With respect to what I propose to do here, eval is by far the simplest way to go (and perhaps the only one), and I will stay offline by running my
programs on local node.js servers that will block outside connections. As these programs will then behave essentially like any other HTML/Web application, they could be run in a typical
Web browser, though I might include a lightweight small-scale browser such as node-webkit so
they behave more like desktop applications.

Node.js (http://nodejs.org/) is a Javascript platform that is able to serve as a lightweight application
server. It is therefore well-suited for small-scale projects like what I am proposing here.
Research & Analysis
Despite treating my proposed research and development sub-projects as separate entities,
I intend to integrate them quite closely, so that, for example, knowledge gained from a particular
primary source could then be tested on an updated version of an existing program (or spur the
creation of a new program altogether, though I will try to keep things from sprawling out too
much). Due to the nature of Altheide's ECA/QDA research, however, I cannot set out a detailed
plan explaining precisely how such integration will take place. What I can do here, however, is outline the general approach I will take.
As already discussed, I will borrow heavily from Altheide's notions of ECA and QDA: my perspective will be more reflexive in terms of how I navigate available sources, and, unlike in more systematic approaches to content analysis, the completeness of my dataset is not quite as critical. I will focus on sources that specialize in command-line games and on the materials they do have uploaded, keeping in mind that in this approach the researcher is central.

Background: Technology as Text

The type-in books and magazines discussed above serve as important evidence of the
discourses around home game programming that were prevalent at the time. I also believe,
however, that a more hands-on approach to these issues would be extremely beneficial from a
research perspective. I will propose, therefore, that the development of a series of modest
programming exercises should serve as an important element of my research, for which more
details will be provided throughout this proposal.

Note that the emphasis with this approach will fall on gaming practices, not just the
games themselves. The key point to keep in mind here is that the games themselves were
generally not particularly exciting or engaging, even by the standards of the day; most of them
used a simple question and answer format in which the player would, upon request, supply
values for certain variables. The possibility spaces, to borrow a term from Bogost, were quite
limited. I will argue that the fun was not to be had in playing games, but in building new games
and modifying existing games. As I will explain in more detail shortly, the documentary
evidence of the period supports this claim.
My proposed project, then, will focus on this evidence so that I might trace the historical
development of hobbyist game programming, going all the way back to the earliest years of
digital computing, when "display hacks" (screen-based animations that were sometimes interactive) first became popular.

The emphasis with this approach will fall on the users, not the games themselves.

Since many of these type-in games were fairly small and simple to follow, they were also easy to
adapt, as already noted.

With these goals in mind, I propose to use Woolgar's notion of "technology as text" to frame and
guide my work.

This means delving into the code that was listed in the hobbyist publications of the time and
selecting a subset of exemplar works (likely all or most will be games).

But for the purposes of my research, I will also need to abstract out the fundamental elements
that make up each program. This second step is vital if we are to understand the ways in which
users were able to analyze and modify game code. To support this aspect of the project, I
propose to plan and develop a series of programs in an ongoing programming project that will recreate certain aspects of the hobbyist programmer experience, particularly in this adoption and adaption phase.

The means by which these games and programs circulated, in fact,

There are people and institutions trying to bring back hobbyist practices
Need to understand why these systems were so popular

- Texts built and shared via design world. Rewriters borrow elements of design world
- When discussing how texts are shared, discuss time-share, ARPAnet, type-in books,
various other artefacts that make them vibrant cultural texts
- Possibly these are "types" as per Schön
- Or use the Logic of Architecture setup: primitives, relations, functions, axioms
- Game design language people have been trying to do this with games for some time

In the early 1980s, COMPUTE! Publications was one of the leading publishers of books
and magazines that featured "type-in" programs for the personal "home" computer user. The
printed program is a type of digital artefact that is now largely extinct. In this earlier era,
however, such programs were extremely popular. COMPUTE's works were typically filled with source
code for both games and applications, though games made up the bulk of the content that readers
could type in on their own machines. While there were limits to how much code a single user
could reasonably be expected to reproduce and then debug (it was quite easy to make
mistakes), this format was quite successful for a time. When it came to digital
gaming, however, COMPUTE was not only competing with other hobbyist magazines, but also
with the burgeoning commercial game industry.

COMPUTE! writers would often tout the benefits of the type-in approach. In an anthology of
games for Commodore's VIC-20, for example, they made the following claims:
[O]ne of the best things about typing in programs yourself is that you can see
exactly how another programmer created the effects you want to use in your own
games. You may soon find that the best computer game of all is programming
games for other people to play! (Carmichael, 1983, p. 8; emphasis added).
While programming is undoubtedly an enjoyable activity for many home hobbyists, declaring
that it is a "game" in and of itself may seem highly unusual. In a later volume of VIC-20 games
published by COMPUTE, this argument is repeated:
[W]hen you're through with the planning and programming, you'll discover a
great secret: Game designing is the best game of all. No matter how brilliant your
game is, no one will ever have as much excitement and frustration and satisfaction
and fun playing it as you had in creating it (Card, 1984, p. 35).
This passage, in addition to the larger article that contains it, provides us with a few more clues as
to how coding might work as a game. It suggests that game programming is a process with a
definite end goal: the creation of a new game. Yet it implies that users must also work through
various sub-challenges, each with its own rules and expectations, in order to get to this end point.
Success is by no means guaranteed, and there will be times when a given task will cause
significant frustration. Despite the seemingly open-ended nature of programming, this text
suggests that working on a specific programming project is very game-like in execution.
If we accept this premise, however, there is another issue that comes to the fore: why is it
that anyone would even want to "play" this programming game? The games themselves vary
widely with respect to quality and complexity; in the early years of home computing, they
were often simple, text-based exercises. Why would anyone want to create a computerized
version of blackjack, for example, when the real thing can be set up and played quite easily?
Game programming may be educational on some level (at the very least, it teaches users how
to program), but COMPUTE and its competitors generally presented programming as a "fun"
activity, not a pedagogical practice. Perhaps, then, there is something intrinsically appealing
about writing code. The popularity of hobbyist home computing in the 1970s and 1980s
suggests that this is in fact the case, or at least was the case at that time. If so, it is worth
knowing what conditions were in place for programming to flourish as a home hobby for so
many years. This is not a trivial issue, for, as I will show in my work, various organizations are
now attempting to design digital tools that will supposedly recreate these conditions.

Despite the advantages of using Woolgar's model, I recognize the need to address the
critical issues it does not cover, or covers insufficiently. I will focus on two in particular: user
agency, and text composition.

Design worlds are intended to be personal creations that can be adapted as needed by
users as they work through their design problems.

My overarching hypothesis, then, is that digital games can be broken down into
constituent elements, which may then be aggregated and organized into programming systems
that are capable of succinctly reproducing the original games, along with countless variants. A
typical programming system will include both a programming language and an environment
within which programs may be developed and executed, though this structure is not mandatory,
and there may even be cases where the user can select from multiple environments.
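As a purely hypothetical sketch of what such a decomposition might look like in practice (the element names and the guessing-game example are my own, not drawn from any historical system), consider a handful of constituent elements that can be aggregated to reproduce a simple game and to generate variants of it:

```python
import random

# Hypothetical "elements" abstracted from a simple guessing game.
# Recombining them reproduces the original game and yields variants.

def make_guessing_game(low, high, hint_fn):
    """Aggregate elements (number range, hint rule) into a playable game."""
    def play(input_fn=input, print_fn=print, secret=None):
        secret = secret if secret is not None else random.randint(low, high)
        turns = 0
        while True:
            guess = int(input_fn(f"GUESS A NUMBER ({low}-{high})? "))
            turns += 1
            if guess == secret:
                print_fn(f"YOU GOT IT IN {turns} TRIES.")
                return turns
            print_fn(hint_fn(guess, secret))
    return play

# The "original" game and a variant, built from the same elements.
classic = make_guessing_game(1, 100, lambda g, s: "TOO HIGH" if g > s else "TOO LOW")
vague = make_guessing_game(1, 100, lambda g, s: "HOT" if abs(g - s) <= 5 else "COLD")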

Smalltalk is the best example of this sort of system

The choice of elements is critical with respect to the character of these variants, but it is also
typically a highly-subjective process in which certain aspects of a given game are favoured and
highlighted over others. I will argue, however, that this level of subjectivity is in fact ideal, in
that it allows for there to be a significant degree of creativity on the part of the user.
Commodore BASIC is a war game DSL
The languages that are built out of these games do not need to conform to any particular
paradigm. A given language may be imperative or declarative, or both.

Will draw heavily on Woolgar

For my dissertation, then, I plan to address three separate but related concepts. At the
beginning of each new stage, I will introduce and work with a new theoretical model that I
believe to be useful in terms of supporting my arguments in that section.

For the first, I will make the case for treating programs as texts, borrowing from the "technology
as text" paradigm established by Woolgar (and later Grint & Woolgar). By doing so, I will focus
on two related datasets for which the notions of gaming and real-time interactivity were
particularly prominent: the games developed and shared by hobbyist programmers in the 1970s
and 1980s, as discussed above, and the interactive war games that were produced by the RAND
Corporation for several decades. Both of these text forms perpetuated widespread practices of
reading and rewriting, as we would expect when using the technology as text model.
The second argument I will make is that the development tools that were used to program
and play these games also behave as texts. A key element of this point is the fact that programs
developed in a particular environment will generally reflect the structure and operation of that
environment. This is particularly true for games: games as we know them did not exist until
real-time interactivity was technically feasible.

The programs made in these IPSs echo the structure of the system within which they are embedded.

The final argument will be, however, that any program, regardless of genre, can be considered a

Secondly, I will discuss a specialized form of game text
for which I will employ Schön's notion of the design world.

I will focus in particular on the notion of reflexive reading, in which Woolgar argues that all
"readers" of a given technology are simultaneously writers, forming their own texts that
encompass their perceived experiences. This leads to a situation where technologies become
"texts that produce texts," with the produced texts then becoming sites of additional rounds of
reflexive reading and writing (Woolgar, 1991, p. 42). I will argue, then, that type-in home
computer games

These texts need not take physical form, but, in the case of hobbyist programming, they typically
did take such a form, as specific game genres and mechanics were borrowed, adapted, and
borrowed again as new texts appeared in books and magazines.

I am focusing on games here because they were by far the most popular genre of program (the
number of game-only specialty books published at this time demonstrates this), which made
them a primary target for the sort of reflexive rewriting that Woolgar describes.

For my dissertation, then, I propose to pursue the general topic of game programming as a means
to both play in a design world and create (and re-create) text. Such text generally takes the form
of code, though a particular game program does not need to be "finished" in order to qualify; I
will, in fact, challenge the very notion of the finished program, in that well-formed design worlds
allow for continual reflective reprogramming. I will argue, moreover, that digital gaming is a
particularly active site for the rewriting of program texts. More than any other type of program
or application, games provide the most comprehensive new worlds/systems of thought

I want to pursue a special case in which program texts have the power to modify the design
worlds upon which they are built.

I will look at the following era

Magazines sites of rewriting
Each of these programs could serve as a text

Inherent rewriting of read text
Expand this, many things are games, such as Photoshop
To program is to read and write

To understand
Necessary to frame these texts within larger hobbyist space
What are the tools like? BASIC is designed for fun, specifically, BASIC as operating system

There are likely many game developers out there, at least, who would agree with these statements.
But we have to consider the larger context within which this text was situated in order to
understand its full impact. As with COMPUTE's other publications, this particular book was
produced not for professional software developers, but for personal computer enthusiasts; to
make this abundantly clear, COMPUTE! Publications advertised their flagship periodical as
"The Leading Magazine Of Home, Educational, And Recreational Computing" right on the front
cover. COMPUTE! Publications' products were intended for a wide audience.
This emphasis on programming, then, is intriguing; certainly, none of the mass-market
computer magazines (or websites) that are around today feature type-in games and utilities. Yet
in this VIC-20 programming book (a work that is, as the title suggests, wholly devoted to
gaming) there is a heavy emphasis on programming, as the following passage from the
foreword makes clear:
[T]o make this book as useful as possible, many of the games are accompanied by
explanations of how the program works. Chapters at the beginning and end of the
book will also help you learn how to write your own games (COMPUTE!'s first
book of VIC games, p. v).
The same work also touts the supposed benefits of working with printed code, claiming to the
reader that "you can see exactly how the game's creator brought off the effects you like. It will be
fairly easy for you to learn techniques that you can use in your own programs" (COMPUTE!'s
first book of VIC games, p. v). The actual games that are featured in the book are mentioned
briefly, but the clear emphasis is on programming. While the games themselves are described as
being "fun", then, it seems clear that the real fun was meant to be had in coding.
Examples like this reflect a major difference between the ways in which computer users
interacted with their systems in this earlier era as compared to now: programming used to be
a rather common, everyday practice, and one that was considered to be intrinsically
enjoyable. With some exceptions, coding is now thought of as something
of a specialized trade, important to computer scientists and software developers but largely
irrelevant for most other users. Devices like the Raspberry Pi are intended to turn the tide back
towards coding, but their reach is still somewhat limited. Regardless, important questions still
loom: why is it that programming was so popular several decades ago, and why did things
change? What lessons could we learn about this earlier era that could be applied to modern
computing practices? Can programming be returned to a position of prominence? Is this even a
good idea?
To address these and similar questions, I propose to conduct a research and development
project that will accomplish the following goals: first, I will demonstrate the importance of
digital games as honest and complete expressions of the systems upon which they are developed
and executed. Rather than drawing boundaries between digital games and other digital artefacts,
then, I will argue that gaming deserves a place of prominence with respect to the study of digital
technologies and information systems.

This work will build off of the idea, as expressed here by Kitchin and Dodge, that code may be
used to express new ontological perspectives:
Regardless of the nature of programming, the code created is the manifestation of
a system of thought – an expression of how the world can be captured,
represented, processed, and modeled computationally. ... Programming then
fundamentally seeks to capture and enact knowledge about the world – practices,
ideas, measurements, locations, equations, and images – in order to augment,
mediate, and regulate people's lives (Kitchin & Dodge, 2011, p. 26).

My final goal is to take this a step further by demonstrating how certain games and gaming
genres can actually serve as platforms upon which to build other programs (games or otherwise).

Lastly, I aim to show that players' writing of new game "texts" is the inevitable outcome of their
reading of existing game texts.
Two theoretical paradigms

I propose that my dissertation proceed along two related tracks. The first track will involve a
historical study, ranging from the 1960s to the early 1980s, in the general area of digital gaming
and game programming, but with a specific emphasis on interactive programming systems
(IPSs), that is, real-time development environments in which users could build, edit, and manage
programs in a specified high-level language. Such programming "workshops", running variants
of the BASIC programming language, served essentially as full-fledged operating systems for
many of the earliest home computers, and they have a history that goes back even further. The
important and influential JOSS system, for example, was developed by the RAND Corporation
in the mid-1960s. At the same time, the original BASIC language system developed by Kemeny
and Kurtz went online at Dartmouth. At DEC, the FOCAL language was directly inspired by
JOSS, while their PDP-BASIC would bring a JOSS/FOCAL-like interface to a BASIC system.
Crucially, all of these systems became important and influential platforms for gaming,
precisely because of the real-time interactivity they offered. At RAND, JOSS became an ideal
system to host and "referee" the elaborate war gaming exercises that were a critical component of
the firm's research. At DEC, there was an aggressive push to get their PDP machines into public
schools and universities, and the thriving DECUS user group became a hub for the exchange of
student and hobbyist programs, including a wide variety of games. It should be noted that
virtually all of these games were text-based, requiring the player to type in certain commands
and other information as needed, with the results output to a printer or monitor. The typical text-
based game, then, behaved quite similarly to the IPSs themselves. With some effort, I believe it
is even possible to think about such games as IPSs themselves.
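The parallel can be illustrated with a small sketch: both an IPS session and a text-based game amount to a read-evaluate-respond loop over typed commands. The following Python sketch is speculative (the handler names and command vocabulary are invented for illustration, not taken from JOSS or any DEC system):

```python
def run_session(handlers, commands):
    """A generic read-evaluate-respond loop: the same skeleton serves
    both an interactive programming system and a text-based game."""
    state, log = {"turn": 0}, []
    for line in commands:  # stands in for reading lines from a terminal
        verb, *args = line.split()
        state["turn"] += 1
        handler = handlers.get(verb, lambda s, a: f"UNKNOWN COMMAND: {verb}")
        log.append(handler(state, args))
    return log

# Configured as a war-game "referee": typed commands update the game state.
game_handlers = {
    "MOVE": lambda s, a: f"UNIT {a[0]} MOVES TO {a[1]}",
    "STATUS": lambda s, a: f"TURN {s['turn']}",
}
```

Replace the handler table with one that parses and stores program statements, and the same loop describes the command-line dialog of an IPS; this is the structural sense in which such games can be read as IPSs themselves.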
In order to frame and organize this research, I plan to apply the concept of the "design
world", as developed by Schön and defined as follows:
[Design worlds] are environments entered into and inhabited by designers when
designing. They contain particular configurations of things, relations and
qualities, and they act as holding environments for design knowledge (Schön,
1988, pp. 182-183).
This definition is rather abstract, which is reflective of Schön's wider interests in learning and
pedagogy. I believe, however, that the design world paradigm can be used effectively in the
study of programming systems, primarily by providing a means to break such systems down into
their constituent pieces. The "things, relations and qualities" of a given system are the product of
many years of research in computer science and engineering, and they
are the building blocks used to make games and other programs. The characteristics of the
system, then, are reflected in the programs developed within it, and games, I will argue, embody
their systems more effectively and more thoroughly than other types of programs.
For the second component of my dissertation (which I would work on in tandem with
my historical research), I propose using the same concepts to design, and possibly program, one
or more digital design worlds. In keeping with Woolgar's notion of reflexive research, these
design worlds would serve as alternate interpretations of my research findings. By engaging
with the processes of designing and implementing such systems, moreover, I hope to provide
insights into the benefits (and limitations) of the design world approach. My findings could, I
believe, also serve as guidelines with respect to future work in this area.

This sort of innovation may seem trivial to us, but the emergence of interactive computing was
not an inevitable or "organic" development. Rather, it was a concept that had to be invented to
address specific issues. In the case of the RAND Corporation, such a system was used to act as
an arbiter in war gaming exercises. War gaming was a major focus of RAND research, and one
of their primary concerns was the development of automated rule systems that could respond to
user commands and update the game state accordingly. If this all sounds vaguely familiar, it is
because virtually all digital war-based strategy games operate on this principle. In fact, many,
if not most, digital games from this era operated on this principle.

Technological innovation is, of course, only one aspect of a wider infrastructure that was
being built

of the digital tools, including the programming languages and programming environments that
emerged in the 1960s and 1970s within institutions such as MIT, The RAND Corporation, BBN
(Bolt, Beranek and Newman), and Xerox PARC.

I will also place a heavy emphasis on digital gaming
To address such questions, my proposed dissertation would proceed along two
simultaneous tracks. First, I will use the documentary evidence from the period to learn more
about the tools used by programmers.

I will argue that these implementations of BASIC were rudimentary digital "design worlds" in
which code could be built and executed within the same programming environment. I will also
argue that such systems effectively serve users for whom programming and play are inextricably
bound together. I borrow the term design worlds from the field of architecture, and in particular
the work of Schön. A design world, as Schön explains it, is a platform for creative work within a
well-defined space:
These are environments entered into and inhabited by designers when designing.
They contain particular configurations of things, relations and qualities, and they
act as holding environments for design knowledge (Schön, 1988, pp. 182-183).
In their fully realized form, then, design worlds are "workshops" that contain artefacts that are
useful for a given design problem. Crucially, however, these workshops are developed and
refined by the designers themselves. As Schön indicates, "[d]esigners construct their design
worlds not only through the shaping of materials but through interlocking processes of
perception, cognition and notation" (Schön, 1988, p. 183). Design world construction is
therefore a reflexive process: designers may continue to develop their design worlds as they
reflect on their work.

Such a model stands in contrast to contemporary programming practices, in which code is
created in highly-specialized development environments, and compiled and executed in wholly
separate spaces. This is not meant to be a critical comparison (IDEs and built-in compilers are
useful and effective). But a

How can we bring this back?

While the actual games contained in the text are mentioned briefly,
The real fun, it would seem, is in personal programming.

Beyond even the passage cited above, the introduction encourages users to develop their own

This was at a time, moreover, when programming was a much more essential element of home
Advantages of recreating this?
Served as an ecosystem for gaming
Program and play in same ecosystem

This argument is somewhat perplexing from a contemporary perspective. How could
programming be thought of as a game? We tend to think of programming as a problem-solving
tool, or as a means to an end.

For my dissertation, I propose to explore these questions, and use any insights I gain to
inform a larger discussion on what I consider to be artificial barriers between programmers and
users, and games and "non-games", for lack of a better term. I anticipate, moreover, that this
work will allow me to explore another potentially false dichotomy: that between code and data.

[older game theories]
While these theories and concepts have proven to be exceedingly effective as foundations
for research into digital games, they are also powerful ontological devices that inform research
practices. Just by using the term "game", in fact, we have already imposed a specific ideological
perspective upon our materials; as Suchman puts it, "systems of categorization are ordering
devices, used to organize the persons, settings, events or activities by whom they are employed
or to which they refer" (Suchman, 1993, p. 182). In this particular case, we are creating a
categorical dichotomy in which most digital artefacts can ostensibly be classified as either
"games" or "non-games". There are, of course, a diversity of perspectives with respect to how
we should define games, and what criteria an artefact much meet in order to be considered as a
game. There are also specific genres for example, educational games for which we might
claim certain artefacts serve both game-related and non-game related purposes. But these are
special cases; we would never, for example, try to classify a title in the Grand Theft Auto series
as a non-game, nor would we count Microsoft Word as a game. In all of these examples,
moreover, we are still employing the game/non-game paradigm, if only to conclude that certain
titles are not so easy to classify.
Consider the following alternative: rather than worrying about what is or is not a game,
what if we found a more general category in which we included all (or at least most) digital
artefacts? Such a scheme could, perhaps, adopt the notion of "technology as text" as developed
by Grint and Woolgar. This would be similar to the notion of the "cultural text", in which what
constitutes a text is expanded as follows:
[W]henever we produce an interpretation of something's meaning – a book,
television program, film, magazine, T-shirt or kilt, piece of furniture or ornament
– we treat it as a text. A text is something that we make meaning from (McKee,
2003, p. 4).
For Grint and Woolgar, the notion of technology as text allows for a new perspective on the
relationships between technologies, developers, and users; it "sets the frame for an examination
of the processes of construction (writing) and use (reading) of the machine" (Grint & Woolgar, 1997, p.
70). It is these processes that they focus on in much of their work, to the point where they
receive criticism for the seemingly passive role they accord the user/reader. While it is true that
the idea of "configuring the user" leans in this direction, they maintain that user readings are
"interpretatively flexible," though only within certain boundaries. Taking this a step further, they
argue that users write as much as they read, in that they generate new interpretations of a given
technology in question based on their own experiences with it.

What I propose here, then, is a research project in which I treat digital gaming as a locus
for powerful forms of experimentation and innovation with respect to the generation and
presentation of novel texts. By this I mean that the elements that go into the production of
specific types of games/texts may be delineated and extracted, and then repurposed for the
creation of new texts. Such elements could include routines that, for example, reproduce certain
aspects of mechanical physics, or control the strategic decisions made by opponent players in a
strategy game. Or we could, if possible, break such routines up into their constituent pieces, so
that we are dealing with such high-level programming constructs as loops and if-then clauses.
The depth to which we should explore a given game depends on a number of outside
factors, though access is a key issue.
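To illustrate what delineating and extracting such an element might look like, here is a hypothetical Python sketch: a physics routine of the sort found in lunar-lander-style games, isolated as a standalone element so that it could be reused in an entirely different game/text. The parameter values are arbitrary illustrations, not drawn from any actual game:

```python
# A hypothetical "extracted element": the physics routine of a
# lunar-lander-style game, isolated so it can be reused in new texts.

def apply_gravity(altitude, velocity, burn, dt=1.0, g=1.6, thrust=3.0):
    """Advance one time step: gravity pulls the craft down, an engine
    burn (0 or 1) pushes it up; altitude is clamped at the surface."""
    accel = burn * thrust - g          # net acceleration this step
    velocity += accel * dt
    altitude += velocity * dt
    return max(altitude, 0.0), velocity
```

Once pulled out of its original listing, the same routine could drive a descent game, a juggling game, or an artillery variant; the extraction is what turns a single game into raw material for new ones.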
The end goal, for the purposes of this project, is the system of elements that provides the
most effective toolkit for game/text reconstruction and redesign. Such systems constitute what
Schön calls "design worlds." The design world paradigm will play a critical role in this study,
acting as a structural foundation that links game/text elements to specific design practices.

Associated with code
Design world

I will do this by focusing in on "text-based" games, which are games that rely on typed-in
commands from players, and which output scrolling, line-by-line text via a printer or computer screen.

I will focus largely on game design and development, but this by no means is meant to diminish
the contributions of players in the reinterpretation process. I will, in fact, emphasize areas in
which design and play are not mutually exclusive, and I will then argue that players should be
provided with tools that allow them to more easily reconfigure the games they play. Many
games do in fact offer such tools, but their scope is typically limited. I believe that it is more
useful to focus in on areas and eras in the history of digital computing in which users were
provided with tools that allowed them to create their own games, and which actually blurred the
lines between player and developer. I am thinking here specifically of the period in which the
notion of "conversational computing" emerged, followed soon after by the rise of personal
"home" computers and hobbyist programmers. Historically, this would run rough from the early
1960s to the early 1980s, but with different areas of focus within more specific timeframes.

What I hope to achieve with this research is to develop methodological models by which we may
study digital game artefacts as texts, with an emphasis on tools and practices in which the roles
of player and programmer are closely linked.

develop digital game "ecosystems", that is, environments in which code in various stages of
development and execution may coexist.

Such a system resembles in many ways the Smalltalk programming environment, though there
are also significant differences.

Idea of exploring through code


Features such as game level editors (which may be found in a wide variety of titles and genres)
go in this direction, but are typically limited in their scope. The digital artefacts that users create
with these tools are, for the most part, inextricably tied to the games within which they are
embedded. That is not to say that

In a separate article, moreover, Woolgar invokes a reflexive, poststructuralist perspective in that
he argues that users write as they read. While technological artefacts generally encourage users
to adopt specific readings, individual users ultimately decide

They also emphasize the notion that readers are also active "writers"

In another work, Woolgar addresses such criticism by

Hobbyist programming area

As Suchman, citing the research of Sacks, indicates, "systems of categorization are ordering
devices, used to organize the persons, settings, events or activities by whom they are employed
or to which they refer" (Suchman, 1993, p. 182).

Categories politics

COMPUTE!'s first book of VIC games. (1983). Greensboro, NC: COMPUTE! Publications.

Altheide, D. L. (1987). Reflections: ethnographic content analysis. Qualitative Sociology, 10(1).
Altheide, D., Coyle, M., DeVriese, K., & Schneider, C. (2008). Emergent qualitative document
analysis. In S. N. Hesse-Biber, & P. Leavy (Eds.), Handbook of emergent methods
(pp. 127-151). New York: The Guilford Press.
Franzosi, R. (2008). Content analysis. London: SAGE Publications.
McKee, A. (2003). Textual analysis: a beginner's guide. London: SAGE Publications.
Schön, D. A. (1988). Designing: rules, types and words. Design Studies, 9(3), 181-190.