In 1999 Cisco released a new type of router that could filter and analyze the
data flowing through networks. This technology made it possible to set
priorities and block packet flows from certain sources, allowing operators to
intervene not only to optimize network performance (which was Cisco's
original purpose) but also for commercial reasons. There is a wide debate
around net neutrality (the principle that all types of data packets should be
treated equally, without discriminating by source, content, or destination
user), which raises the question of whether governments or the corresponding
authorities should allow packet discrimination. Net neutrality advocates argue
that operators have the right to set different prices for the different speeds
they offer users, but not for access to particular content on the web.
The internet comprises three layers: the content layer (the material itself),
the logical layer (the algorithms and standards that allow the content layer
to be interpreted by machines), and the physical layer (end nodes such as PCs
and the related wired or wireless network devices). The content layer is
regulated (not only virtually but also with physical sanctions) because
copyrighted and other intellectual-property material, as well as outright
illegal content, may be shared over networks. At the logical layer there are
restrictions, for example against software tools that provide P2P
communication, and at the physical layer devices may be required to let the
government or regulating entity access encrypted data and intercept
communications.
The end-to-end principle refers to how the network functions: content travels
through nodes connected by pipes whose only job is to carry the data to the
next node, without using it for any special purpose; that role belongs to the
final application. The end-to-end principle has helped the internet grow and
has increased competition and innovation by allowing a far wider range of
applications to use the internet.
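The principle above can be sketched in a few lines of Python (a toy
illustration, not a real protocol; the node names and message are invented):
intermediate nodes only forward packets, and all application logic, here
reassembling a message, lives at the endpoints.

```python
def forward(packet, path):
    """A 'pipe' node: passes the packet to the next hop unchanged."""
    for node in path:
        packet = dict(packet)  # a node may copy a packet, but never interprets its payload
    return packet

def send_message(text, path):
    # The sending endpoint splits the message into sequenced packets...
    packets = [{"seq": i, "payload": ch} for i, ch in enumerate(text)]
    delivered = [forward(p, path) for p in packets]
    # ...and the receiving endpoint reorders and reassembles them: the
    # "final application function" happens only at the ends of the network.
    delivered.sort(key=lambda p: p["seq"])
    return "".join(p["payload"] for p in delivered)

print(send_message("hi net", ["nodeA", "nodeB", "nodeC"]))  # → hi net
```

Because the pipes stay dumb, any new application can run over them without
changes inside the network, which is the growth argument made above.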
Operators use several access-tiering methods that change the internet's
default settings, giving them control and allowing them to set different
prices for access. There are three methods: best efforts (which treats all
data equally using a first-in, first-out queue in which the origin and the
destination are irrelevant), needs-based discrimination (which follows the
best-efforts rule until the network becomes congested, and then prioritizes
latency-sensitive packets), and active discrimination (in which the operator
inspects every packet and prioritizes it according to predetermined rules,
without taking network congestion into account as a decision parameter).
Companies such as Google and Yahoo have servers all around the world where
they store data that travels through the pipes, using access tiering to make
their pages load more rapidly; intermediary service providers also use it as a
tool for shortening the path that data packets must travel to reach their
destination. For example, in an analysis of the plans of Verizon (a US
operator), experts concluded that 80% of its network capacity is used for its
own provided services (such as VoIP and IPTV, over optical-fiber
technologies) and the other 20% for users who may need Verizon's pipes to
make their data packets reach their destination.
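The difference between the first and third methods can be shown with two
standard data structures (a minimal sketch; the packet names and priority
values are invented for illustration): best efforts is a plain FIFO queue,
while active discrimination is a priority queue that always serves the
highest-priority packet regardless of congestion.

```python
import heapq
from collections import deque

# (name, priority): 0 = latency-sensitive, larger numbers = less urgent.
packets = [("video", 0), ("email", 1), ("voip", 0), ("backup", 2)]

# Best efforts: first in, first out — origin, content, and priority are ignored.
fifo = deque(packets)
best_efforts_order = [fifo.popleft()[0] for _ in range(len(packets))]

# Active discrimination: the operator inspects every packet and always serves
# the highest priority first, whether or not the network is congested.
pq = []
for arrival, (name, prio) in enumerate(packets):
    heapq.heappush(pq, (prio, arrival, name))  # arrival index breaks ties
priority_order = [heapq.heappop(pq)[2] for _ in range(len(packets))]

print(best_efforts_order)  # ['video', 'email', 'voip', 'backup']
print(priority_order)      # ['video', 'voip', 'email', 'backup']
```

Needs-based discrimination would sit between the two: it serves the FIFO queue
normally and switches to the priority queue only when a congestion threshold
is exceeded.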

http://wiki.commres.org/pds/NetworkNeutrality/NetNeutrality.pdf
The basic model contemplates three main parties: Internet Service Providers
(ISPs), Content Providers (CPs), and users. It considers a monopolistic ISP
and two CPs. Under neutrality the ISP cannot give any kind of priority to
either CP and provides its network line at no charge, while without
neutrality the priority privilege can be sold to either CP, making access
tiering a direct violation of network neutrality.
Treating the network as a queue with a Poisson statistical distribution, the
consumers' arrival rate is denoted λ and the service rate μ; processing times
are exponentially distributed with mean 1/μ. This is a close approximation to
the packet arrival process when there is a large number of customers, no
single one of which determines the final system performance (and the
customers' decisions are independent of each other). In a short-run analysis
μ has a fixed value, while in a long-run analysis (which considers investment
incentives) it tends to take different values. A neutral network works under
the principle of first in, first delivered, while in a discriminatory network
packets are delivered in an order that depends on their priority. As for
delivery times per user, in a neutral network the expected waiting time is
the same for everyone, while in a system with priorities the values tend to
be much lower or much higher depending on the assigned priority.
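The last claim can be checked numerically with the standard M/M/1 formulas
(the specific rates below are invented for illustration; λ_k is the Poisson
arrival rate of class k and μ the common exponential service rate):

```python
mu = 10.0               # service rate (packets per unit time)
lam1, lam2 = 3.0, 3.0   # arrival rates: high- and low-priority classes
lam = lam1 + lam2
rho1, rho = lam1 / mu, lam / mu   # utilizations (must stay below 1)

# Neutral (FIFO) M/M/1: every packet sees the same mean queueing wait.
w_fifo = lam / (mu * (mu - lam))

# Non-preemptive two-class priority queue with the same mu for both classes.
r = lam / mu**2                      # mean residual service seen on arrival
w_hi = r / (1 - rho1)                # prioritized packets wait less...
w_lo = r / ((1 - rho1) * (1 - rho))  # ...low-priority packets wait more

print(f"FIFO wait: {w_fifo:.4f}")
print(f"priority waits: high={w_hi:.4f}, low={w_lo:.4f}")
# The traffic-weighted average wait matches the neutral case (conservation):
print(f"weighted average: {(lam1 * w_hi + lam2 * w_lo) / lam:.4f}")
```

With these rates the high-priority wait falls below the FIFO value and the
low-priority wait rises above it, while the traffic-weighted average is
unchanged: prioritization redistributes delay between users rather than
creating capacity.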
Without going into much detail: in a neutral network, in the case described,
both CPs can be treated symmetrically, so the monopolistic ISP's goal is to
serve all users equally. Consequently, the ISP gains wider market
opportunities by offering the priority privilege, which makes competition
between the CPs more pronounced, so that the CP with higher priority ends up
with more subscribed customers.
http://www.econstor.eu/bitstream/10419/26435/1/577510908.PDF
http://www.techpolicyinstitute.org/files/wallsten_unbundling_march_2009.pdf

http://globalvoicesonline.org/2010/09/04/chile-first-country-to-legislate-netneutrality/

https://itunews.itu.int/En/3352-Net-neutrality-to-regulate-or-not-toregulate.note.aspx
