1. SUMMARY
We believe that the 2012-2013 high school CX debate topic should be:
“Resolved: That the United States Federal Government should
increase its regulation of the Internet.”
We explain our reasoning below.
2. INTRODUCTION
A lot of Americans use the Internet:
“A recent count put the number of Internet users in just the United States at
over 104 million; most (93 percent) use email and search the Web for
information (80 percent). Every day, 50 million Americans go online.
Researchers argue that the Internet is now a mainstream medium for nearly
every type of human communication, and one that, in the year 2000, achieved
“critical mass” in the United States.”
Hoffman, D. L., Novak, T. P., & Schlosser, A. E. (2003). Locus of Control, Web
Use, and Consumer Attitudes Toward Internet Regulation. Journal of Public
Policy & Marketing, 22(1), 41-57. Retrieved from EBSCOhost.
And the duty of regulating the Internet has largely fallen on the United
States. However, the United States has abdicated much of this power
in favor of self-regulatory regimes:
“Since the Internet was developed in the United States, and the country has
the largest number of Internet users, it is only natural that the American
government would be expected to take the lead in regulation. One of the
first official indications the world received about Internet regulation at the
international level from the United States was a call by President Clinton and
Vice-President Gore for self-regulation. While the concept of self-regulation
has historically occupied an important place in American mass media policy,
it was alien to most of the rest of the world. There are several definitions of
the concept of self-regulation. When applied to the Internet, it refers to the
role of the Internet community—namely, engineers, software developers,
analysts, network specialists, administrators and users—in determining the
standards, protocols, codes of ethics and direction of the Internet, with
minimal government participation, input and control. The Clinton-Gore
framework was an exhaustive survey of the administration's policies with
regard to the Internet or the Global Information Infrastructure (GII). This
expression represented a globalization of the nomenclature of the Internet
from the term “Information Superhighway” proposed by Vice-President Gore
in 1994, and embodied in the National Information Infrastructure.
The Clinton-Gore framework for global electronic commerce
acknowledged that the Internet had transformed the world into a planetary
plaza. The two leaders argued that the Internet, this global network of
networks, would thrive for the benefit of all people if it were allowed to
regulate itself. The framework was a unique, unprecedented document in
that, on the one hand, it was a diplomatic message addressed to
governments around the world, and, on the other hand, an executive order
addressed to a number of departments of the American government. In the
framework, President Clinton said in part that the actions of governments can
have a profound effect on the growth of electronic commerce. He therefore
appealed to governments around the world to act with restraint, “knowing
when to act and… when not to act” is important in matters of Internet
regulation. Clinton and Gore called on governments to adopt a market-
oriented approach to the Internet and appealed to the industry to regulate
itself. This self-regulation model soon became the de facto Internet
regulation standard at the national and international levels.”
“But then the tide reversed. Then came libel originating in distant countries,
stock manipulation from afar, worldwide domain name cybersquatting, sales
tax circumvention by citizens purchasing faraway goods, hate speech
websites located in countries protecting this kind of expression, online
casinos based within the territory of states encouraging this business as it
would almost exclusively affect foreign people in foreign countries while
generating tax revenues, and worse.
The dark side of the Web manifested itself, and it triggered a
movement for cultural and nationalistic withdrawal. People started to say that
they did not want outlandish foreigners to do the equivalent of standing in
the garden in front of their house doing things that are regarded with outright
repugnance in their community. The French were anxious at the thought of
there being, just around the corner, defiant Americans believing it is their
fundamental right to say whatever they want to say, even if it involves an
apology for Nazism.
In the United States, people were incensed about lax foreign
governments not cracking down on online casinos, which were intruding into
American homes and offices, computers, and mobile phones, to fuel
compulsive gambling.
Many countries became concerned about incitements to terrorism and
appeals to fund terrorist organizations flowing into their country simply by
dint of being globally accessible. Some governments began to consider
blocking by technical means local residents’ access to foreign Internet
sources that glorify terrorism.
Other governments grew increasingly apprehensive about the West
spreading its culture and values throughout the world by a mere information
transfer into territories which were previously exposed mainly to local
information. Suddenly, the free and global character of the Internet started to
be considered an evil. The global Internet community started to think that,
after all, it did not want to be a single community, but several, and that each
community should be allowed to live according to its internal fundamental
values, according to its own choices of public policy (in the sense of ordre
public), which partake of the expression of each nation’s Volksgeist.
The Internet should be free, most agreed, but only insofar as this
freedom stopped short of violating the fundamental principles underlying the
operation of each state’s legal system. The ‘ancient principles governing law
and politics within nations’ were being challenged.”
3.2 ICANN
Another more specific plan enforcement mechanism can be found in
the United States’ relationship with The Internet Corporation for
Assigned Names and Numbers, or ICANN. ICANN was created by the
United States in 1998:
“The Internet Corporation for Assigned Names and Numbers (ICANN)
coordinates the Internet’s technical function, essentially serving as its
manager. The non-profit corporation was created in 1998, under the auspices
of the U.S. Department of Commerce, to assume control of the Internet
domain name system (DNS). The DNS is a global, distributed database that
translates easy-to-remember mnemonic addresses, like www.digitalmedialaw.us,
into numerical identifiers, like 213.86.83.116, that computers can use to
locate websites and deliver e-mail.
Each numerical identifier – called an Internet Protocol address – leads
to one computer, the way a telephone number points to one phone. ICANN
ensures that domain names and their corresponding IP addresses are
globally unique, so the same address always leads to the same location.
ICANN is also responsible for the delegation of top-level domain (TLD)
names, like .com, .net, and .org, and country codes, like .us or .uk. It
accredits both the registries assigned to manage particular top-level
domains (like Verisign, which is responsible for housing all of the domains
ending in .com, .net, and .tv in its databases) and the companies that register
individual domain names. The Commerce Department has consistently
maintained that its eventual goal was to release ICANN from government
supervision as soon as it is ready to stand on its own. The original target date
for independence was 2000. On Sept. 30, 2009, the U.S. government signed a
new agreement with ICANN giving up unilateral control of the organization.”
[Digital Media Law]
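The name-to-number translation the card describes can be demonstrated directly. The sketch below is purely illustrative: the IP address 213.86.83.116 is taken from the card above, and the localhost lookup assumes a standard hosts file.

```python
import socket
import struct

# Resolve a mnemonic name to a numerical IP address -- the mapping
# the DNS performs globally. "localhost" resolves via the local
# hosts file, so no network access is needed.
ip = socket.gethostbyname("localhost")
print(ip)  # typically "127.0.0.1"

# The dotted-quad form is only a human-readable rendering: under the
# hood an IPv4 address is a single 32-bit number (here, the example
# address quoted in the card above).
packed = socket.inet_aton("213.86.83.116")
number = struct.unpack("!I", packed)[0]
print(number)  # the same address as one 32-bit integer
```

ICANN's role, in these terms, is to guarantee that each such number, and each name that maps to it, is globally unique, so that the same address always leads to the same location.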
Even though the United States has ceded its control over ICANN, America
could still seize control of it. However, such a move would be politically
unpopular:
“In reality the problem is less severe than this general view suggests. This is
so because, as Jack Goldsmith observes, enforcement jurisdiction is not
affected by this overlapping of a large number of laws. Enforcement
jurisdiction, one may recall, is the authority actually to enforce the law by
inducing or compelling compliance with it.
It is what gives regulation its teeth and makes it effective. This form
of jurisdiction has a strictly territorial basis, which means that in the absence
of extradition, which is unlikely to be granted with respect to the vast
majority of Internet matters, a state can enforce its laws only against in-state
actors, against entities with a presence on the territory of the state or with
assets there. The distinction between prescriptive and adjudicative
jurisdiction, on the one hand, and enforcement jurisdiction, on the other, is
what allowed Joseph Story, almost 200 years ago, to maintain that ‘whatever
force and obligation the laws of one country have in another, depends solely
upon [the latter’s] own express or tacit consent’.
It means that providers of Internet content need to worry mainly about
the regulations of the states in which they have a presence or assets.
Enforcement jurisdiction acts as a limiting factor, reducing the overlapping of
directly effective regulations to the various states where Internet actors
have a presence or assets, which falls somewhere short of all the nations of
the entire world. The submission of Internet actors to a worldwide range of
paper rules may be true, but their submission to effective rules is far more
limited.”
“There are three distinct domains in which the Internet may need
governance—that is, intentional and legitimate ordering. Each serves distinct
purposes and requires different kinds of processes and methods, so it is vital
to distinguish between them in order to develop high-level rules and
procedures. The first is technical standardization. This involves reaching
agreement about networking protocols and data formats and documenting
these agreements. Because standards structure the behavior of machines
and people, it is useful to consider them as part of an intentional ordering
process. The second is resource allocation and assignment. In the case of the
Internet, this means virtual resources—Internet identifiers such as domain
names and IP addresses, as well as protocol port numbers. These identifiers
require exclusive use, because they must be unique and exclusive to
function properly as an address. Resources may also be scarce and require
rationing. Allocation and assignment processes coordinate the distribution of
Internet resources to users, to maintain uniqueness and/or to ration
consumption. The third area of governance is human conduct, which is
governed by defining and enforcing regulations, laws, and policies. Whereas
the first two governance functions are concerned with the specification or
coordination of the technical system, the governance of human conduct
orders the actions of people. It includes global public policy for such areas as
spam, cybercrime, copyright and trademark disputes, consumer protection
issues, and public and private security.”
[The Internet and Global Governance]
“Over the past years, the merits of network neutrality regulation have
become a hot topic in telecommunications policy debates. Repeatedly,
proponents of network neutrality regulation have asked the Federal
Communications Commission to impose rules on the operators of broadband
access networks that forbid network operators to discriminate against third-
party applications, content or portals (“independent applications”) and to
exclude them from their network.
These proposals are based on the concern that in the absence of such
regulation, network operators may discriminate against these products and
that this behavior may reduce innovation by providers of these products to
the detriment of society.
Opponents of regulation deny the need for network neutrality
regulation.
They argue that regulation is not necessary because network
operators do not have an incentive to discriminate against independent
applications anyway, or, alternatively, that regulation is harmful because it
would reduce network operators’ incentive to upgrade their networks in the
future.”
[Towards an Economic Framework for Network Neutrality]
4.2 Gambling
Gambling is big business on the Internet:
4.4 Cybercrime
Cybercrime will be a large case area. Readers will probably be most
familiar with the use of viruses and worms to cause economic
damage:
“In 2008, the Computer Security Institute found that almost half of the 522
respondents to its annual Computer Crime and Security Survey experienced
a computer crime in the previous year.
Most frequently, organizations were the victims of malware – software
maliciously designed to harm other computers. The most common forms of
malware are worms and viruses. A computer virus is a parasitic program that
attaches itself to another application. When it is activated, it self-replicates
and spreads throughout a computer system and then, via shared files, to
other computers. Some viruses are simply pranks that spread strange
messages or pictures. Others do serious damage by erasing data and
corrupting hard drives. Computer worms engage in the same malicious
behavior, but do so independently. The essential difference between them is
that worms do not need a host application. Viruses and worms are most
commonly spread through e - mail attachments, links to infected websites,
P2P file sharing, and free software downloads from the Internet.”
[Digital Media Law, Ashley Packard]
But even if we do not need perfect enforcement, the opponents argue back,
we will still require sufficient enforcement to discourage the few among us
who will not adhere to a rule on their own; and even sufficient enforcement is
unlikely in global cyberspace.
Whenever, for example, an information provider is threatened by a
regulation in one state, it just needs to relocate the potentially violating
information to another jurisdiction with a more favorable regulation. Given
the market forces in a global network, over time certain states will turn their
liberal regulatory regimes into a competitive advantage, in essence offering
to providers of questionable content flags of convenience in the sea of
information called the Internet.
As Frances Cairncross put it, cyberspace causes the “death of
distance.” It ridicules traditional national borders and boundaries.
[The shape of governance: Analyzing]
5.2 Disadvantages
The traditional disadvantages for a domestic topic will apply to this
one. But negatives would also be able to link increased content
restrictions for America to more stringent requirements for those
abroad:
5.3 Counterplans
Other actors could implement affirmatives’ plans. For example, many
scholars are unconvinced that national action is the answer to the
Internet’s woes:
Instead, some authors propose that control of the Internet is best left
to private actors:
6. CRITICAL ISSUES
“The United States has weathered one major federal attack on the Internet
in the form of the Communications Decency Act of 1996, which was declared
unconstitutional. The U.S. is in the enviable position of being the only country
that has a constitutional guarantee of freedom of speech.”
Jana Varlejs, Associate Professor, Rutgers School of Communication,
Information and Library Studies, New Brunswick, New Jersey, USA. “Who
Censors the Internet and Why.” Paper invited for the International
Conference: Freedom of Expression, Censorship, Libraries, Riga, Latvia,
October 14-17, 1998.
“Overall, one can say that the instinct to censor is quite universal, and that
the stated aim is protection -- of children, of cultural values, of government
stability, and so on. The objective is to maintain control. But why is there this
need to be protective, to be in control? Perhaps underlying this instinct is the
more basic one of fear. As Godwin puts it in his discussion of why the
"Communications Decency Act" was passed by the United States Congress,
It’s that the theocratic right is driven by an irrational fear -- a fear that the
citizens and Congress can’t be trusted to do the right thing if they’re
presented with unvarnished, unmanufactured facts (Godwin, 1998, p. 301).
Across the world, similar fears seem to drive the censors, suggesting a
mistrust of free speech and people’s ability to deal with it rationally, and
perhaps an unacknowledged lack of confidence on the part of the censor in
his or her own infallibility.”
Jana Varlejs, Associate Professor. “Who Censors the Internet and Why.”
Paper invited for the International Conference: Freedom of Expression,
Censorship, Libraries, Riga, Latvia, October 14-17, 1998.
6.1.2 Borders
Negative teams will be able to argue that regulation of the Internet
creates new ‘borders’ within a previously ‘open’ space:
Finally, as with many other elements of the proposed topic, teams will
be able to cite good empirics of the critical implications of Internet
regulation:
“The effects of harmful web contents are chronically at issue in the Internet
regulation debate, leading to frequent calls for intervention and censorship.
When placed within the wider context of media effects, one is left wondering
whether there may indeed be ‘nothing new under the sun’ (Sutter, 2000) in
respect of ‘new’ technology (see also Grabosky, 2001). The virtual demon
evokes an image all too familiar to media scholars, namely that of audiences
being sucked into an impoverished and surrogate reality, causing all manner
of social ills. What such discourses tend to neglect is that if there is any harm
perpetrated, its causes and effects are often complex and multifarious.
The Internet is ‘only’ a medium: as the pro-ana community is keen to
emphasize, you do not catch anorexia from visiting a website. Anorexia is a
complex disorder and the pro-ana controversy is the latest instalment in the
long-running debate on the link between the consumption of media images
and eating disorders. It would not make for a sound public health policy if the
attempt to combat anorexia focused exclusively on the media side of the
disorder. In that sense, the virtual does not hold the key to solving anorexia
(and bulimia) as a ‘problem’ because it is neither sufficient nor necessary to
trigger an eating disorder. “
Lieve Gies, “How Material Are Cyberbodies? Broadband Internet and
Embodied Subjectivity,” Crime Media Culture 4 (2008): 311, p. 323.
6.1.5 Cyberfeminism
Negative teams will be able to critique affirmative plans from the
standpoint of cyberfeminism:
Author continues P. 2
6.1.6 Psychoanalysis
Finally, negative and affirmative teams alike will be able to critique
notions of the Internet from the standpoint of psychoanalysis, via
multiple link stories from either a Zizekian or Lacanian perspective:
6.2.1 Self-Regulation
Alternative options for negative teams will include un-ordered ‘self-
regulation’:
“Donath and Boyd (2004) attribute the growing success of social networking
websites (such as ‘Friendster’, ‘Facebook’ and ‘Myspace’) to users’ desire to
authenticate themselves by explicitly stating their connections with others
who know them (preferably from offline encounters) and who are therefore
able to confirm that they are indeed who they claim to be. In contrast with
the anarchical and criminogenic reputation of the Internet, Wall (2001: 167)
finds cyberspace ‘remarkably ordered’, a quality he believes is in no small
part attributable to different layers of governance, including a measure of
self-regulation imposed by users themselves. Self-authentication as a
means of establishing credibility undoubtedly testifies to users’ wish to
contribute to order online. Featherstone (2000) says of virtual reality: As in
all types of communication it is to be expected that forms and conventions
will emerge which provide the equivalents of everyday face-to-face cueing
devices, turn-taking in conversations, body language etc. which are driven by
the economizing imperative of being understood. (p. 615)”
Lieve Gies, “How Material Are Cyberbodies? Broadband Internet and
Embodied Subjectivity,” Crime Media Culture 4 (2008): 311, pp. 319-320.
“Within the present context, the question concerns more precisely how the
different solutions to the self-regulation of the Internet can encounter
their own conditions of increase in reflexivity. These conditions have to be
met if one wants to mobilise effectively the new reflexive resources which are
needed to face unprecedented ethical situations. However, literature on the
subject of self-regulation of the Internet already attempts to go beyond
the insufficiencies of actual solutions; hereby holding that self-regulated
networks can go beyond individual market behaviour by developing a certain
level of collective constraint which is different from the one emanating
directly from the government (Black [9]). One can think of forms of self-
regulation by delegation as in the case of the privatisation of the root by the
creation of the ICANN (Internet Corporation for Assigned Names and
Numbers) (Ogus [43], p. 596; Mueller [39], pp. 518-519) or of forms of
spontaneous emergence of voluntary constraints within user communities
(Poullet [46]; Ogus [43]). Nevertheless, these solutions are most of the time
limited to proposing a purely formal reflexivity of ethical codification or
juridical self-rule. To take into account the reflexivity of the actors and the
institutional frameworks in addition to the formal rules, two types of solutions
are proposed in literature on the subject. The first solution, which can be
described as ‘decentralized regulation’ (Lemley [29]) or ‘multi-regulation’
(Vivant [55])7, tries to take into account the reflexivity of the new actors
emerging in the field of the Internet. [From footnote 7: These are the terms
used in the field of Internet governance. One could prefer the term of
“polycentric” governance, used in the field of community management of
common goods studies, which has the advantage of showing that
decentralisation does not imply the absence of any coherence between the
subsystems. The use of this term, introduced by V. Ostrom, Ch. Thibout and
R. Warren, connotes a coherent manner of functioning of the system as a
whole through “various contractual and cooperative undertakings” between
the independent [units]. The footnote also references innovation (Kling [25],
p. 116) and the question of the real beneficiaries of the increase in
productivity in organisations through computerisation (Kling [25], p. 123).]
This solution focuses on the increase in reflexivity of the emerging actors
through the recurrent interaction between subsystems of normativity, such
as the interaction one can observe within the Internet Society between the
Internet Societal Task Force (ISTF) on the one hand and the Internet
Architecture Board (IAB) and the Internet Engineering Task Force (IETF) on
the other8. The second solution, which we describe as ‘co-regulation’ in the
strong sense, focuses on an institutional framing facilitating the responsibility
of the actors in favour of the research of common solutions, such as in the
proposition of the French and Australian coregulatory agencies.”
Tom Dedeurwaerdere (UCL-FNRS), “Ethics and Learning: From State
Regulation towards Reflexive Self-Regulation of the Information Society,”
paper presented at the World Computer Congress, Montreal, 2002, pp. 10-11.
“It is important to stress, however, that rather than taking us into entirely
new directions, our ‘cyber selves’ (Aas, 2007) constitute merely an additional
layer to already densely structured social identities. This is becoming ever
more pronounced as offline social rules are routinely relied on to order
cyberspace where they mimic familiar hierarchies and cannot be decoupled
from existing material inequalities. When considering the case for increased
regulation of the Internet, we would do well to bear in mind that social
control, often but not exclusively driven by commercial and political
governance imperatives, is already deeply enmeshed with cyberculture.”
Lieve Gies, “How Material Are Cyberbodies? Broadband Internet and
Embodied Subjectivity,” Crime Media Culture 4 (2008): 311, pp. 320 and 327.
Affirmative teams will also be able to cite good solvency advocates for
permutations concerning such critique alternatives:
“Accepting, however, that a strong case can be made for this ‘control’ vision
of the Internet, some have argued for the creation of new rights. It has
variously been suggested that the Internet requires the development of new
constitutional rights, or the creation of ‘digital rights’. Part of the debate
about rights relates back to the question of whether these are required
because the Internet presents a new ‘space’. The position of this paper is
that early hacker culture was not altogether naïve in thinking that the
Internet posed an opportunity for interacting differently to the ‘real’ world (or
creating a new ‘space’). This mode of real-time, decentralised and remote
communication arguably did present new opportunities to develop alternate
personas and non-geographically bounded relationships. The furor which
developed about this capacity, specifically the capacity to act deceptively
through adopting anonymity as a result of these technical characteristics, led
to debates about how to build ‘responsibility’ and ‘accountability’ between
Internet users online. Arguably, this debate has now been largely decided by
the process of commercialisation. As Lessig noted, the simplest route to
imposing ‘responsibility’ on individuals in cyberspace is by the way it is done
in the ‘real’ world – through identification. Anonymity and pseudonymity can
offer a form of strategic resistance for Internet users against ways that the
Internet could be used as a method of social control.”
Madeleine Mispel, “The ‘Strategic Value’ of Online Anonymity,” “Market
mindset?” section, p. 16.
Author continues p. 42
“In order to preserve the potential for deliberative democracy that the
Internet offers, it is necessary to have legal and administrative regulation.
This is one of the consequences which follows from the concept of a
normative public sphere. In the future, regulation of the Internet will
depend more and more upon the limitations imposed by commercial
interests, but it remains important that non-profit organizations should exist
in addition to commercial providers so that affordable access to the Internet
can be maintained. This is especially the case when one considers
developing countries and the so-called third world. For these countries and
for organizations such as Unesco, this access should be a life-line for the
political use of technology. At the same time politicians should keep clearly in
view the fact that access to information is part of the fundamental right of
citizens to unrestricted communication and interaction; and this is basic and
essential to a functioning democracy. The political and legal results of
maintaining a critical and deliberative public sphere would be seen in public
access to the Internet in places such as public buildings and libraries, as well
as free access to on-line archives, databases and commercial data-banks
such as Lexis-Nexis and Genios. The recommendation is simple, but in
combination with the relevance of a vital public sphere as it is pictured in the
model of deliberative democracy it is very important.”
Antje Gimmler, “Deliberative Democracy, the Public Sphere and the
Internet,” Philosophy & Social Criticism 27 (2001): 21, pp. 21 and 34.
As an umbrella term that includes the associated terms cyberspace and the
Web (World Wide Web), the Internet can refer to the actual network and the
exchange of data between computers. Many people use the Internet in a
seemingly straightforward way: sending and receiving personal email,
accessing public information, downloading maps, viewing merchandise and
making purchases online, and generally using the technologies for
information gathering and transmission. Internet can also refer to social
spaces where relationships, communities, and cultures emerge through the
exchange of text and images, either in real time or in delayed time
sequences. There is a long tradition of social interaction and community
development based on the capabilities of the Internet. In short, the Internet
can be perceived as a set of technological tools, a complex network of social
relations, a language system, a cultural milieu, and so forth. The way one
defines and frames the Internet influences how one interacts with Internet-
based technologies, as well as how one studies the Internet.
Annette N. Markham, “Internet Communication as a Tool for Qualitative
Research.”
(The authors would like to briefly clarify here the distinction between
the Internet and the World Wide Web. The Internet is a series of
interconnected computers that communicate using the Internet
Protocol Suite. The World Wide Web is a system of hyperlinked
documents accessed via the Internet. In other words, the Internet is
the framework and the Web is the app.)
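The layering described in the parenthetical can be made concrete by splitting a Web address apart: the host is an Internet-level name (the part DNS resolves), while the scheme and path are Web-level structure. A minimal sketch follows; the path /articles/index.html is a hypothetical example, not a real page on that site.

```python
from urllib.parse import urlparse

# A Web URL combines Internet-level naming with Web-level structure.
url = "http://www.digitalmedialaw.us/articles/index.html"
parts = urlparse(url)

print(parts.scheme)    # "http" -- the Web's transfer protocol
print(parts.hostname)  # "www.digitalmedialaw.us" -- the Internet name DNS resolves
print(parts.path)      # "/articles/index.html" -- the document within the site
```

Only the hostname belongs to the Internet's addressing framework; everything else is the Web "app" riding on top of it.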