
How Microcontrollers Work

by Marshall Brain
http://electronics.howstuffworks.com/microcontroller.htm

Microcontrollers are hidden inside a surprising number of products these days. If your microwave
oven has an LED or LCD screen and a keypad, it contains a microcontroller. All modern
automobiles contain at least one microcontroller, and can have as many as six or seven: The engine is
controlled by a microcontroller, as are the anti-lock brakes, the cruise control and so on. Any device
that has a remote control almost certainly contains a microcontroller: TVs, VCRs and high-end
stereo systems all fall into this category. Nice SLR and digital cameras, cell
phones, camcorders, answering machines, laser printers, telephones (the ones with caller ID, 20-
number memory, etc.), pagers, and feature-laden refrigerators, dishwashers, washers and dryers (the
ones with displays and keypads)... You get the idea. Basically, any product or device that interacts
with its user has a microcontroller buried inside. In this article, we will look at microcontrollers so
that you can understand what they are and how they work. Then we will go one step further and
discuss how you can start working with microcontrollers yourself -- we will create a digital clock
with a microcontroller! We will also build a digital thermometer. In the process, you will learn an
awful lot about how microcontrollers are used in commercial products.
What is a Microcontroller?
A microcontroller is a computer. All computers -- whether we are talking about a personal desktop
computer or a large mainframe computer or a microcontroller -- have several things in common:

 All computers have a CPU (central processing unit) that executes programs. If you are sitting
at a desktop computer right now reading this article, the CPU in that machine is executing a
program that implements the Web browser that is displaying this page.
 The CPU loads the program from somewhere. On your desktop machine, the browser program
is loaded from the hard disk.
 The computer has some RAM (random-access memory) where it can store "variables."
 And the computer has some input and output devices so it can talk to people. On your desktop
machine, the keyboard and mouse are input devices and the monitor and printer are output
devices. A hard disk is an I/O device -- it handles both input and output.
The desktop computer you are using is a "general purpose computer" that can run any of thousands
of programs. Microcontrollers are "special purpose computers." Microcontrollers do one thing well.
There are a number of other common characteristics that define microcontrollers. If a computer
matches a majority of these characteristics, then you can call it a "microcontroller."

 Microcontrollers are "embedded" inside some other device (often a consumer product) so that
they can control the features or actions of the product. Another name for a microcontroller,
therefore, is "embedded controller."
 Microcontrollers are dedicated to one task and run one specific program. The program is stored
in ROM (read-only memory) and generally does not change.
 Microcontrollers are often low-power devices. A desktop computer is almost always plugged
into a wall socket and might consume 50 watts of electricity. A battery-operated microcontroller
might consume 50 milliwatts.
 A microcontroller has a dedicated input device and often (but not always) has a small LED or
LCD display for output. A microcontroller also takes input from the device it is controlling and
controls the device by sending signals to different components in the device. For example, the
microcontroller inside a TV takes input from the remote control and displays output on the TV
screen. The controller controls the channel selector, the speaker system and certain adjustments
on the picture tube electronics such as tint and brightness. The engine controller in a car takes
input from sensors such as the oxygen and knock sensors and controls things like fuel mix and
spark plug timing. A microwave oven controller takes input from a keypad, displays output on
an LCD display and controls a relay that turns the microwave generator on and off.
 A microcontroller is often small and low cost. The components are chosen to minimize size and
to be as inexpensive as possible.
 A microcontroller is often, but not always, ruggedized in some way. The microcontroller
controlling a car's engine, for example, has to work in temperature extremes that a normal
computer generally cannot handle. A car's microcontroller in Alaska has to work fine in -30
degrees F (-34 C) weather, while the same microcontroller in Nevada might be operating at 120
degrees F (49 C). When you add the heat naturally generated by the engine, the temperature can
go as high as 150 or 180 degrees F (65-80 C) in the engine compartment. On the other hand, a
microcontroller embedded inside a VCR hasn't been ruggedized at all.
The actual processor used to implement a microcontroller can vary widely. For example, the cell
phone shown on Inside a Digital Cell Phone contains a Z-80 processor. The Z-80 is an 8-
bit microprocessor developed in the 1970s and originally used in home computers of the time. The
Garmin GPS shown in How GPS Receivers Work contains a low-power version of the Intel 80386,
I am told. The 80386 was originally used in desktop computers. In many products, such as
microwave ovens, the demand on the CPU is fairly low and price is an important consideration. In
these cases, manufacturers turn to dedicated microcontroller chips -- chips that were originally
designed to be low-cost, small, low-power, embedded CPUs. The Motorola 6811 and Intel 8051 are
both good examples of such chips. There is also a line of popular controllers called "PIC
microcontrollers" created by a company called Microchip. By today's standards, these CPUs are
incredibly minimalistic; but they are extremely inexpensive when purchased in large quantities and
can often meet the needs of a device's designer with just one chip. A typical low-end microcontroller
chip might have 1,000 bytes of ROM and 20 bytes of RAM on the chip, along with eight I/O pins.
In large quantities, the cost of these chips can sometimes be just pennies. You certainly are never
going to run Microsoft Word on such a chip -- Microsoft Word requires perhaps 30 megabytes of
RAM and a processor that can run millions of instructions per second. But then, you don't need
Microsoft Word to control a microwave oven, either. With a microcontroller, you have one specific
task you are trying to accomplish, and low-cost, low-power performance is what is important.
In How Electronic Gates Work, you learned about 7400-series TTL devices, as well as where to buy
them and how to assemble them. What you found is that it can often take many gates to implement
simple devices. For example, in the digital clock article, the clock we designed might contain 15 or
20 chips. One of the big advantages of a microcontroller is that software -- a small program you
write and execute on the controller -- can take the place of many gates. In this article, therefore, we
will use a microcontroller to create a digital clock. This is going to be a rather expensive digital
clock (almost $200!), but in the process you will accumulate everything you need to play with
microcontrollers for years to come. Even if you don't actually create this digital clock, you will learn
a great deal by reading about it. The microcontroller we will use here is a special-purpose device
designed to make life as simple as possible. The device is called a "BASIC Stamp" and is created
by a company called Parallax. A BASIC Stamp is a PIC microcontroller that has been customized
to understand the BASIC programming language. The use of the BASIC language makes it
extremely easy to create software for the controller. The microcontroller chip can be purchased on
a small carrier board that accepts a 9-volt battery, and you can program it by plugging it into one of
the ports on your desktop computer. It is unlikely that any manufacturer would use a BASIC Stamp
in an actual production device -- Stamps are expensive and slow (relatively speaking). However, it
is quite common to use Stamps for prototyping or for one-off demo products because they are so
incredibly easy to set up and use. They are called "Stamps," by the way, because they are about as
big as a postage stamp. The specific BASIC Stamp we will be using in this article is called the
"BASIC Stamp Revision D". The BASIC Stamp Revision D is a BS-1 mounted on a carrier board
with a 9-volt battery holder, a power regulator, a connection for a programming cable, header pins
for the I/O lines and a small prototyping area. You could buy a BS-1 chip and wire the other
components in on a breadboard. The Revision D simply makes life easier. You can see from the
previous table that you aren't going to be doing anything exotic with a BASIC Stamp. The 75-line
limit (the 256 bytes of EEPROM can hold a BASIC program about 75 lines long) for the BS-1 is
fairly constraining. However, you can create some pretty neat stuff, and the fact that the Stamp is so
small and battery operated means that it can go almost anywhere.
Variables
All variables in the BS-1 have pre-defined names (which you can substitute with names of your
own). Remember that there are only 14 bytes of RAM available, so variables are precious. Here are
the standard names:
 w0, w1, w2...w6 - 16-bit word variables
 b0, b1, b2...b13 - 8-bit byte variables
 bit0, bit1, bit2...bit15 - 1-bit bit variables
Because there are only 14 bytes of memory, w0 and b0/b1 are the same locations in RAM, and w1
and b2/b3 are the same, and so on. Also, bit0 through bit15 reside in w0 (and therefore b0/b1 as
well).
I/O Pins
You can see that 14 of the instructions in the BS-1 have to do with the I/O pins. The
reason for this emphasis is the fact that the I/O pins are the only way for the BASIC Stamp to talk
to the world. There are eight pins on the BS-1 (numbered 0 to 7) and 16 pins on the BS-2 (numbered
0 to 15). The pins are bi-directional, meaning that you can read input values on them or send output
values to them. The easiest way to send a value to a pin is to use the HIGH or LOW functions. The
statement high 3 sends a 1 (+5 volts) out on pin 3. LOW sends a 0 (Ground). Pin 3 was chosen
arbitrarily here -- you can send bits out on any pin from 0 to 7. There are a number of interesting I/O
pin instructions. For example, POT reads the setting on a potentiometer (variable resistor) if you
wire it up with a capacitor as the POT instruction expects. The PWM instruction sends out pulse-
width modulated signals. Instructions like these can make it a lot easier to attach controls and motors
to the Stamp. See the documentation for the language for details. Also, a book like Scott
Edward's Programming and Customizing the BASIC Stamp Computer can be extremely helpful
because of the example projects it contains.
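To give a feel for how these pieces fit together, here is a short sketch in the Stamp's BASIC dialect. The pin numbers, the POT scale value and the variable name are arbitrary choices for illustration, not values taken from this article; check the Stamp documentation and calibrate the POT scale before wiring anything:

```basic
symbol level = b2        ' rename byte variable b2 for readability

main:
  pot 0, 100, level      ' read a potentiometer/capacitor pair wired to pin 0
                         ' (the scale value 100 is a placeholder; the manual
                         ' describes how to calibrate it for your parts)
  pwm 3, level, 20       ' emit a burst of PWM on pin 3 whose duty cycle
                         ' tracks the potentiometer setting
  goto main              ' repeat forever
```

With an LED and current-limiting resistor on pin 3, the LED's apparent brightness would roughly follow the knob -- a two-instruction knob-to-output loop that would take a fair pile of gates to build in discrete logic.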
Playing with a BASIC Stamp
If you would like to
play with a BASIC Stamp, it's very easy to get started. What you need is a desktop computer and
a BASIC Stamp starter kit. The starter kit includes the Stamp, a programming cable and an
application that you run on your desktop computer to download BASIC programs into the Stamp.
You can get a starter kit either from Parallax (the manufacturer) or from a supplier like Jameco (who
should be familiar to you from the electronic gates and digital clock articles). From Parallax, you
can order the BASIC Stamp D Starter Kit (part number 27202), or from Jameco you can order part
number 140089. You will receive the Stamp (pictured below), a programming cable, software and
instructions. The kit is $79 from both suppliers. Occasionally, Parallax runs a special called "We've
Bagged the Basics" that also includes Scott Edward's Programming and Customizing the BASIC
Stamp Computer. Hooking up the Stamp is easy. You connect it to the parallel port of your PC.
Then you run a DOS application to edit your BASIC program and download it to the Stamp. To run
the program in this editor, you hit ALT-R. The editor application checks the BASIC program and
then sends it down the wire to the EEPROM on the Stamp. The Stamp then executes the program.
In this case, the program produces a square wave on I/O pin 3. If you hook up a logic probe or LED
to pin 3 (see the electronic gates article for details), you will see the LED flash on and off twice per
second (it changes state every 250 milliseconds because of the PAUSE commands). This program
would run for several weeks off of a 9-volt battery. You could save power by shortening the time
that the LED is on (perhaps it is on for 50 milliseconds and off for 450 milliseconds), and also by
using the NAP instruction instead of PAUSE.
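The square-wave program itself is not reproduced in this excerpt, but a minimal version in the Stamp's BASIC would look something like the following sketch (pin 3 and the label name are arbitrary):

```basic
main:
  high 3        ' drive pin 3 to +5 volts: LED on
  pause 250     ' wait 250 milliseconds
  low 3         ' drive pin 3 to ground: LED off
  pause 250
  goto main     ' repeat forever: the pin changes state every 250 ms
```

And the power-saving variation described above might be sketched as:

```basic
main:
  high 3
  pause 50      ' LED on for only 50 ms
  low 3
  nap 5         ' low-power sleep in place of PAUSE (roughly half a
                ' second here; exact NAP durations are in the manual)
  goto main
```

Because NAP puts the chip into a low-power sleep instead of busy-waiting, and the LED spends most of each cycle dark, the same 9-volt battery lasts considerably longer.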
10 Technologies Kids Already Don’t Know How to Use
by BERNADETTE JOHNSON
http://electronics.howstuffworks.com/everyday-tech/10-technologies-kids-dont-know11.htm

There are always generation gaps between people born in different eras, but nothing makes them so
obvious as technology. New gadgets come into being and become obsolete within just a few years.
Lots of items that, to us adults, seem like recent arrivals have already evolved beyond recognition
for children who are old enough to use the latest high-tech gadgetry (which
these days probably starts at 3). At the time of this writing, 15-year-olds were born around 2000, 10-
year-olds around 2005 and 5-year-olds around 2010. These kids were born during or shortly after
some pretty major shifts in our tech, from hardcopy to cloud-based software, from difficult-to-
access Internet to 24/7 connectivity, from pagers and cell phones to smartphones, and from desktops
and laptops to mobile computing. A lot of devices have, in fact, been driven to extinction (or near
extinction) by smartphone apps that replicated their functionality. With that in mind, here are 10 bits
of tech today's adults used as youngsters, or even just a few years ago, that now seem like barely
decipherable ancient relics to today's always-wired children.

The Original iPod


The original iPod, released in 2001, was a portable digital music player with a monochrome display.
Kids today find many things about it baffling [source: Fine Brothers]. It required headphones to use
since it didn't have built-in speakers, and it didn't have a touchscreen. You had to turn a physical
scroll wheel to navigate through your songs. The original also didn't really do anything but let you
store and play songs. And you couldn't even buy and load songs from the Internet from the device
itself. You had to physically connect it to your computer via a cable to download new content.
iTunes wasn't even introduced until 2003, so there was no easy single purchase point for your songs.
Like many outmoded devices, they were large and clunky by today's standards. They weren't as
intuitive to navigate and didn't do nearly as much as newer mobile devices. The iPod has since
morphed into the iPod Touch, which is a full-fledged mini tablet with wireless connectivity that can
play music, movies, games and more. It's pretty much an iPhone without the phone, which is way
more modern kids' speed. And pretty much all smartphones can carry a lot of digital music, making
a dedicated MP3 player unnecessary for most people.

Computers
Using old-school computers took a lot of technical savvy, especially the early ones that booted you
directly onto a command line and required connection to external storage devices and other
components. They've gotten more user-friendly over the years, with more intuitive graphical user
interfaces (GUIs). With the newest OSes, you don't have to do much (if any) configuration, which
is making the background processes and setup of computers a bit of a mystery to modern kids. And
more and more computing is done using mobile devices, which require even less tinkering in the
background. According to a survey by educational nonprofit organization Project Tomorrow, in
2013, 64 percent of students primarily connected to the Internet through 3G/4G mobile devices and
23 percent through a smart TV or gaming console [source: Riedel]. Most people probably don't
know command-line directives, but modern kids also find powering up old desktops kind of foreign,
since you have to turn on the computer, monitor and all other external peripherals separately rather
than hit one friendly power button. The younger generations also increasingly don't know how to do
things like change WiFi and other configuration settings, troubleshoot computer issues and reinstall
the OS (which might be necessary when the computer has contracted a virus). With most new smart
mobile devices, you turn them on and they work, and you download apps from an app store and they
just run. If something goes terribly wrong, you turn them off and back on, or jump out of the app
and back in or hand them to a professional to fix, rather than doing the tinkering that was once par
for the course with older computers.

Flip-phone Texting
Texting has been a common form of communication for years, beginning in earnest
after cellular providers started allowing text messaging across competing networks around 2001.
Smartphones with fully alphanumeric virtual keyboards have become all the rage, especially since
the iPhone came out in 2007. But in the earlier texting days, people were doing a lot of it on flip
phones and other similar cell phones, most of which had tiny displays and push-button number pads
rather than touchscreen alphanumeric keyboards. You would have to use the number keys to type
text, with each number or symbol representing multiple letters or other characters. You'd have to hit
a number multiple times until your letter came up, and spaces and other special characters could
usually be typed using the star or pound keys. This may be second nature to those of us who were
around for the cell phone revolution, but to kids who grew up with virtual keyboards that show all
the letters, this seems unintuitive, slow and hard to master.

Film Cameras
Another technology that didn't make it unscathed through the digital revolution was the film camera.
Digital cameras have technically been around for decades, but they first appeared in their current
form in the mid-1990s as small point-and-shoot cameras with LCD view screens. They began to
surpass film cameras in sales around 2003, and they took over the market almost entirely within just
a few years. As a result, children today have grown up almost entirely with the instant gratification
of digital cameras. When you show a child an old-school film camera, they're unlikely to know that
it needs film, how to load or advance the film or how to get it to take a picture. And the lack of
a preview screen baffles them, never mind the steps and expense necessary to get the film developed
so you could actually see your photos. Now that most smartphones have high-resolution cameras,
they're cutting into the digital camera market, so the point-and-shoot camera in general may be a
puzzlement to the kids of the near future.

Pagers
Kids today also missed the heyday of the pager. Before cell phones became widely adopted,
the pager (also sometimes called a beeper) was the other commercial choice for instant mobile
communication. Popular in the 1990s, beepers allowed people to send the pager holder a phone
number or other numeric message by placing a call to the pager's number then typing the message
number. When the pager beeped, the owner knew to look at the tiny LCD screen to see the message
and to find a phone to return the call. Starting in the late 1990s, pagers were supplanted by the cell
phone, which gave users the ability to make calls or send text messages from just about anywhere.
The now ubiquitous smartphone offers even more mobile communication options.
The few remaining pagers in action are now mainly used in industries like health care, where getting
someone's immediate attention is important. So a few of today's kids who decide to become doctors
may one day need to learn to use them. But even in health care settings, pagers are slowly being replaced
by smartphones.

Cartridge Game Consoles


Early home gaming consoles like the Atari 2600, Atari 7800, ColecoVision and original Nintendo
Entertainment System (NES) read software from game cartridges — hard, boxy contraptions that
contained a sort of internal motherboard with exposed metal connection points at the end that made
a connection when they were inserted into the gaming device. When faced with an old gaming
system, it takes even teenagers a moment to figure out how to insert the cartridges and power up the
system. They also have trouble figuring out what to do when the game doesn't work right away,
which often required removing and reinserting the cartridge, sometimes several times, to get the
game going. Although some handheld gaming systems use little cartridges, the last home cartridge
console was the Nintendo 64, released in 1996. All others at that point had started to move to CD-
like optical media. Having constantly connected high-speed Internet and growing up in the age of
the mobile computing devices have also made downloading software second nature to kids, more so
than fiddling with physical storage media.

CD and DVD Software Media


It wasn't long ago that you absolutely had to buy hard copies of your software applications to get
them onto your computer. Prepackaged software started in the 1970s and '80s on cassette tapes
or floppy disks and eventually evolved into higher capacity CD-ROMs and DVD-ROMs. In a lot of
cases, say if you wanted to play a game, the disk would have to be in the CD or DVD drive or you
couldn't play. You'd also burn any data you needed to move from computer to computer onto CD-
R/RW or DVD-R/RW disks. But now you're more likely to move data using little USB flash drives,
or send it via the Internet. And lots of computing happens on pared down devices like netbooks,
tablets or smartphones, which rely on downloads rather than software installation from physical
media. High-speed Internet in the home has become ubiquitous, and you've likely become
accustomed to downloading your software from the cloud even on regular computers, so
manufacturers are dispensing with these built-in drives on some of the more full-featured laptops.
You can perform a number of tasks in the cloud that used to require dedicated software on our
computers, including creating documents, storing and editing photos, and checking or sending email.
Although kids might be familiar with putting game disks in a gaming console, even those are moving
heavily toward downloads. And a lot of kids today do most of their computing on mobile devices
that don't require insertion of any physical media.

PDAs
Before the iPhone spurred the smartphone market with its 2007 debut, most people's cell phones
were mainly used as phones, of all things. They slowly began to do things like hold music and run
rudimentary apps, but for the most part, if you wanted to take digital notes or access a calendar or
the like in the early 2000s, you needed another dedicated productivity device — the personal digital
assistant (PDA). These were the predecessors to smartphones and included Palm Pilots, Windows
Pocket PCs and Blackberries. Some required writing or navigating in a special touchpad area of the
device with a stylus, and others had built-in keyboards. They ran apps like today's smartphones, but
there weren't that many to choose from and you couldn't just download them on the fly. The earliest
PDAs didn't even have wireless connectivity. To get data uploaded or downloaded, they had to be
connected to a computer via a serial cable. They were useful, but a far cry from today's wirelessly
connected, app-loaded smartphones, which allow users to do many things that used to necessitate
carrying multiple devices. Today's children have never known a world without the Internet, and the
youngest have grown up surrounded by easily portable devices that can connect to the 'net to send
and receive all their data. You could even say that our smartphones, and non-phone devices like
tablets and the iPod Touch, are next-gen PDAs.

Dial-up Modems
With high-speed Internet common in most homes, hopping online is largely an instant and silent
affair. But before many of today's kids were born, unconnected people had to take multiple steps to get
computers onto the Internet. It required dialing into your ISP (Internet Service Provider) using a
phone number via an external (or later internal) dial-up modem. Older models even required placing
a rotary phone headset onto a cradle on the modem. Connecting via a modem was noisy because it
literally placed a phone call and sent analog signals over the phone lines. This would tie up the phone
line, and the slow data-transfer method meant downloading and uploading took a while. Most kids
today likely don't recognize the modem noise and don't know the torture of watching a picture draw
itself onto the computer screen from the top down at a snail's pace. Most mobile devices and WiFi-
enabled modern computers detect any local WiFi networks automatically. You just have to choose
the network you want to connect to (such as your home WiFi network), type your password and
boom, you're off and running, able to simply open a browser and surf the Internet, or go to a built-
in app store and download software and entertainment media. If your password is saved on your
device, you may only have to do this once. And when smartphones don't have access to WiFi (or the
WiFi is turned off), they simply connect by default to the carrier's cellular network, through which
they can also send and receive Internet data.

Cassette Tapes
Many adults today grew up with cassette tapes (among other older audio media), and younger adults
grew up consuming their music on CDs. But today's children and teens were born in the age of
digital music, which took off not long after the invention of the MP3 in the mid-1990s and the
Napster music sharing site's 1999 debut. Now most people consume music through iTunes, Google
Play and the Amazon Digital Music store, or online streaming services such as Pandora and Spotify.
The CD sections of stores have shrunk considerably, and many kids have never even seen a cassette
tape. Working with old cassette players is challenging for them: They have to figure out how to
insert the tape on the right side to get to the song they want to hear, something you don't even have
to think about with CDs or digital music, which allow you to jump right to the song you want to
play. With tape, going from one song to another usually required fast forwarding or rewinding and
often repeatedly stopping, hitting play and listening until you got to the right point on the tape.
Heaven forbid the tape might jam and unspool and have to be wound back into the cassette. Only
having access to the songs on one tape in a cassette player at a given time is a far cry from the
situation today, when hundreds or thousands of songs can be instantly accessed on a mobile device,
with any other music available through the Internet.
There's nothing like researching kids' use of technology to realize how old you are. Some of the
things on the list that I now use daily seem like they've been around forever, but when I think back,
the iPhone really hasn't been with us for much of my life, and we haven't had high-speed WiFi and
relatively inexpensive cellular data plans for very long. Those ushered in the age of the touchscreen
tablet and constant connectivity, which disrupted and transformed just about all of our gadgets. Now
just about everything has downloadable apps and we send all kinds of data to and through our
phones, even health data from wearable devices. And I can't imagine life without video streaming.
I love all this newfound computing capability, but it's fun to revisit some of the obsolete gadgets of
my youth.
Forty years of the internet: how the world changed for ever

Oliver Burkeman
https://www.theguardian.com/technology/2009/oct/23/internet-40-history-arpanet

Towards the end of the summer of 1969 – a few weeks after the moon landings, a few days after
Woodstock, and a month before the first broadcast of Monty Python's Flying Circus – a large grey
metal box was delivered to the office of Leonard Kleinrock, a professor at the University of
California in Los Angeles. It was the same size and shape as a household refrigerator, and outwardly,
at least, it had about as much charm. But Kleinrock was thrilled: a photograph from the time shows
him standing beside it, in requisite late-60s brown tie and brown trousers, beaming like a proud
father.
Had he tried to explain his excitement to anyone but his closest colleagues, they probably wouldn't
have understood. The few outsiders who knew of the box's existence couldn't even get its name
right: it was an IMP, or "interface message processor", but the year before, when a Boston company
had won the contract to build it, its local senator, Ted Kennedy, sent a telegram praising its
ecumenical spirit in creating the first "interfaith message processor". Needless to say, though, the
box that arrived outside Kleinrock's office wasn't a machine capable of fostering understanding
among the great religions of the world. It was much more important than that.
It's impossible to say for certain when the internet began, mainly because nobody can agree on what,
precisely, the internet is. (This is only partly a philosophical question: it is also a matter of egos,
since several of the people who made key contributions are anxious to claim the credit.) But 29
October 1969 – 40 years ago next week – has a strong claim for being, as Kleinrock puts it today,
"the day the infant internet uttered its first words". At 10.30pm, as Kleinrock's fellow professors and
students crowded around, a computer was connected to the IMP, which made contact with a second
IMP, attached to a second computer, several hundred miles away at the Stanford Research Institute,
and an undergraduate named Charley Kline tapped out a message. Samuel Morse, sending the first
telegraph message 125 years previously, chose the portentous phrase: "What hath God wrought?"
But Kline's task was to log in remotely from LA to the Stanford machine, and there was no
opportunity for portentousness: his instructions were to type the command LOGIN.
To say that the rest is history is the emptiest of cliches – but trying to express the magnitude of what
began that day, and what has happened in the decades since, is an undertaking that quickly exposes
the limits of language. It's interesting to compare how much has changed in computing and the
internet since 1969 with, say, how much has changed in world politics. Consider even the briefest
summary of how much has happened on the global stage since 1969: the Vietnam war ended; the
cold war escalated then declined; the Berlin Wall fell; communism collapsed; Islamic
fundamentalism surged. And yet nothing has quite the power to make people in their 30s, 40s or 50s
feel very old indeed as reflecting upon the growth of the internet and the world wide web. Twelve
years after Charley Kline's first message on the Arpanet, as it was then known, there were still only
213 computers on the network; but 14 years after that, 16 million people were online, and email was
beginning to change the world; the first really usable web browser wasn't launched until 1993, but
by 1995 we had Amazon, by 1998 Google, and by 2001, Wikipedia, at which point there were 513
million people online. Today the figure is more like 1.7 billion.
Unless you are 15 years old or younger, you have lived through the dotcom bubble and bust, the
birth of Friends Reunited and Craigslist and eBay and Facebook and Twitter, blogging, the browser
wars, Google Earth, filesharing controversies, the transformation of the record industry, political
campaigning and activism, the media, publishing, consumer banking, the pornography
industry, travel agencies, dating and retail; and unless you're a specialist, you've probably only been
following the most attention-grabbing developments. Here's one of countless statistics that are liable
to induce feelings akin to vertigo: on New Year's Day 1994 – only yesterday, in other words – there
were an estimated 623 websites. In total. On the whole internet. "This isn't a matter of ego or
crowing," says Steve Crocker, who was present that day at UCLA in 1969, "but there has not been,
in the entire history of mankind, anything that has changed so dramatically as computer
communications, in terms of the rate of change."
Looking back now, Kleinrock and Crocker are both struck by how, as young computer scientists,
they were simultaneously aware that they were involved in something momentous and, at the same
time, merely addressing a fairly mundane technical problem. On the one hand, they were there
because of the Russian Sputnik satellite launch, in 1957, which panicked the American defence
establishment, prompting Eisenhower to channel millions of dollars into scientific research, and
establishing Arpa, the Advanced Research Projects Agency, to try to win the arms technology race.
The idea was "that we would not get surprised again," said Robert Taylor, the Arpa scientist who
secured the money for the Arpanet, persuading the agency's head to give him a million dollars that
had been earmarked for ballistic missile research. With another pioneer of the early internet, JCR
Licklider, Taylor co-wrote the paper, "The Computer As A Communication Device", which hinted
at what was to come. "In a few years, men will be able to communicate more effectively through a
machine than face to face," they declared. "That is rather a startling thing to say, but it is our
conclusion."
On the other hand, the breakthrough accomplished that night in 1969 was a decidedly down-to-earth
one. The Arpanet was not, in itself, intended as some kind of secret weapon to put the Soviets in
their place: it was simply a way to enable researchers to access computers remotely, because
computers were still vast and expensive, and the scientists needed a way to share resources. (The
notion that the network was designed so that it would survive a nuclear attack is an urban myth,
though some of those involved sometimes used that argument to obtain funding.) The technical
problem solved by the IMPs wasn't very exciting, either. It was already possible to link computers
by telephone lines, but it was glacially slow, and every computer in the network had to be connected,
by a dedicated line, to every other computer, which meant you couldn't connect more than a handful
of machines without everything becoming monstrously complex and costly. The solution, called
"packet switching" – which owed its existence to the work of a British physicist, Donald Davies –
involved breaking data down into blocks that could be routed around any part of the network that
happened to be free, before getting reassembled at the other end.
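The packet-switching idea can be sketched in a few lines of Python. This is an illustration only, not the actual IMP protocol, and the function names are invented: a message is split into numbered blocks, the blocks may arrive in any order over whatever routes happen to be free, and the receiver reassembles them by sequence number.

```python
import random

def to_packets(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Split a message into (sequence_number, block) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Rebuild the message regardless of packet arrival order."""
    return "".join(block for _, block in sorted(packets))

packets = to_packets("What hath God wrought?")
random.shuffle(packets)      # simulate packets taking different routes
print(reassemble(packets))   # What hath God wrought?
```

Because no block depends on a single dedicated line between two machines, any free path through the network will do, which is what made large networks affordable.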
"I thought this was important, but I didn't really think it was as challenging as what I thought of as
the 'real research'," says Crocker, a genial Californian, now 65, who went on to play a key role in
the expansion of the internet. "I was particularly fascinated, in those days, by artificial intelligence,
and by trying to understand how people think. I thought that was a much more substantial and
respectable research topic than merely connecting up a few machines. That was certainly useful, but
it wasn't art."
Still, Kleinrock recalls a tangible sense of excitement that night as Kline sat down at the SDS Sigma
7 computer, connected to the IMP, and at the same time made telephone contact with his opposite
number at Stanford. As his colleagues watched, he typed the letter L, to begin the word LOGIN.
"Have you got the L?" he asked, down the phone line. "Got the L," the voice at Stanford responded.
Kline typed an O. "Have you got the O?"
"Got the O," Stanford replied.
Kline typed a G, at which point the system crashed, and the connection was lost. The G didn't make
it through, which meant that, quite by accident, the first message ever transmitted across the nascent
internet turned out, after all, to be fittingly biblical:
"LO."
Frenzied visions of a global conscious brain
One of the most intriguing things about the growth of the internet is this: to a select group of
technological thinkers, the surprise wasn't how quickly it spread across the world, remaking
business, culture and politics – but that it took so long to get off the ground. Even when computers
were mainly run on punch-cards and paper tape, there were whispers that it was inevitable that they
would one day work collectively, in a network, rather than individually. (Tracing the origins of
online culture even further back is some people's idea of an entertaining game: there are those who
will tell you that the Talmud, the book of Jewish law, contains a form of hypertext, the linking-and-
clicking structure at the heart of the web.) In 1945, the American presidential science adviser,
Vannevar Bush, was already imagining the "memex", a device in which "an individual stores all his
books, records, and communications", which would be linked to each other by "a mesh of associative
trails", like weblinks. Others had frenzied visions of the world's machines turning into a kind of
conscious brain. And in 1946, an astonishingly complete vision of the future appeared in the
magazine Astounding Science Fiction. In a story entitled A Logic Named Joe, the author Murray
Leinster envisioned a world in which every home was equipped with a tabletop box that he called a
"logic":
"You got a logic in your house. It looks like a vision receiver used to, only it's got keys instead of
dials and you punch the keys for what you wanna get . . . you punch 'Sally Hancock's Phone' an' the
screen blinks an' sputters an' you're hooked up with the logic in her house an' if somebody answers
you got a vision-phone connection. But besides that, if you punch for the weather forecast [or] who
was mistress of the White House durin' Garfield's administration . . . that comes on the screen too.
The relays in the tank do it. The tank is a big buildin' full of all the facts in creation . . . hooked in
with all the other tanks all over the country . . . The only thing it won't do is tell you exactly what
your wife meant when she said, 'Oh, you think so, do you?' in that peculiar kinda voice . . ."
Despite all these predictions, though, the arrival of the internet in the shape we know it today was
never a matter of inevitability. It was a crucial idiosyncrasy of the Arpanet that its funding came
from the American defence establishment – but that the millions ended up on university campuses,
with researchers who embraced an anti-establishment ethic, and who in many cases were
committedly leftwing; one computer scientist took great pleasure in wearing an anti-Vietnam badge
to a briefing at the Pentagon. Instead of smothering their research in the utmost secrecy – as you
might expect of a cold war project aimed at winning a technological battle against Moscow – they
made public every step of their thinking, in documents known as Requests For Comments.
Deliberately or not, they helped encourage a vibrant culture of hobbyists on the fringes of academia
– students and rank amateurs who built their own electronic bulletin-board systems and eventually
FidoNet, a network to connect them to each other. An argument can be made that these unofficial
tinkerings did as much to create the public internet as did the Arpanet. Well into the 90s, by the time
the Arpanet had been replaced by NSFNet, a larger government-funded network, it was still the
official position that only academic researchers, and those affiliated to them, were supposed to use
the network. It was the hobbyists, making unofficial connections into the main system, who first
opened the internet up to allcomers.
What made all of this possible, on a technical level, was simultaneously the dullest-sounding and
most crucial development since Kleinrock's first message. This was the software known as TCP/IP,
which made it possible for networks to connect to other networks, creating a "network of networks",
capable of expanding virtually infinitely – which is another way of defining what the internet is. It's
for this reason that the inventors of TCP/IP, Vint Cerf and Bob Kahn, are contenders for the title of
fathers of the internet, although Kleinrock, understandably, disagrees. "Let me use an analogy," he
says. "You would certainly not credit the birth of aviation to the invention of the jet engine. The
Wright Brothers launched aviation. Jet engines greatly improved things."
The spread of the internet across the Atlantic, through academia and eventually to the public, is a
tale too intricate to recount here, though it bears mentioning that British Telecom and the British
government didn't really want the internet at all: along with other European governments, they were
in favour of a different networking technology, Open Systems Interconnect. Nevertheless, by July
1992, an Essex-born businessman named Cliff Stanford had opened Demon Internet, Britain's first
commercial internet service provider. Officially, the public still wasn't meant to be connecting to the
internet. "But it was never a real problem," Stanford says today. "The people trying to enforce that
weren't working very hard to make it happen, and the people working to do the opposite were
working much harder." The French consulate in London was an early customer, paying Demon £10
a month instead of thousands of pounds to lease a private line to Paris from BT.
After a year or so, Demon had between 2,000 and 3,000 users, but they weren't always clear why
they had signed up: it was as if they had sensed the direction of the future, in some inchoate fashion,
but hadn't thought things through any further than that. "The question we always got was: 'OK, I'm
connected – what do I do now?'" Stanford recalls. "It was one of the most common questions on our
support line. We would answer with 'Well, what do you want to do? Do you want to send an email?'
'Well, I don't know anyone with an email address.' People got connected, but they didn't know what
was meant to happen next."
Fortunately, a couple of years previously, a British scientist based at Cern, the physics laboratory
outside Geneva, had begun to answer that question, and by 1993 his answer was beginning to be
known to the general public. What happened next was the web.
The birth of the web
I sent my first email in 1994, not long after arriving at university, from a small, under-ventilated
computer room that smelt strongly of sweat. Email had been in existence for decades by then – the
@ symbol was introduced in 1971, and the first message, according to the programmer who sent it,
Ray Tomlinson, was "something like QWERTYUIOP". (The test messages, Tomlinson has said,
"were entirely forgettable, and I have, therefore, forgotten them".) But according to an unscientific
poll of friends, family and colleagues, 1994 seems fairly typical: I was neither an early adopter nor
a late one. A couple of years later I got my first mobile phone, which came with two batteries: a very
large one, for normal use, and an extremely large one, for those occasions on which you might
actually want a few hours of power. By the time I arrived at the Guardian, email was in use, but only
as an add-on to the internal messaging system, operated via chunky beige terminals with green-on-
black screens. It took for ever to find the @ symbol on the keyboard, and I don't remember anything
like an inbox, a sent-mail folder, or attachments. I am 34 years old, but sometimes I feel like
Methuselah.
I have no recollection of when I first used the world wide web, though it was almost certainly when
people still called it the world wide web, or even W3, perhaps in the same breath as the phrase
"information superhighway", made popular by Al Gore. (Or "infobahn": did any of us really, ever,
call the internet the "infobahn"?) For most of us, though, the web is in effect synonymous with the
internet, even if we grasp that in technical terms that's inaccurate: the web is simply a system that
sits on top of the internet, making it greatly easier to navigate the information there, and to use it as
a medium of sharing and communication. But the distinction rarely seems relevant in everyday life
now, which is why its inventor, Tim Berners-Lee, has his own legitimate claim to be the progenitor
of the internet as we know it. The first ever website was his own, at CERN: info.cern.ch.
The idea that a network of computers might enable a specific new way of thinking about information,
instead of just allowing people to access the data on each other's terminals, had been around for as
long as the idea of the network itself: it's there in Vannevar Bush's memex, and Murray Leinster's
logics. But the grandest expression of it was Project Xanadu, launched in 1960 by the American
philosopher Ted Nelson, who imagined – and started to build – a vast repository for every piece of
writing in existence, with everything connected to everything else according to a principle he called
"transclusion". It was also, presciently, intended as a method for handling many of the problems that
would come to plague the media in the age of the internet, automatically channelling small royalties
back to the authors of anything that was linked. Xanadu was a mind-spinning vision – and at least
according to an unflattering portrayal by Wired magazine in 1995, over which Nelson threatened to
sue, led those attempting to create it into a rabbit-hole of confusion, backbiting and "heart-slashing
despair". Nelson continues to develop Xanadu today, arguing that it is a vastly superior alternative
to the web. "WE FIGHT ON," the Xanadu website declares, sounding rather beleaguered, not least
since the declaration is made on a website.
Web browsers crossed the border into mainstream use far more rapidly than had been the case with
the internet itself: Mosaic launched in 1993 and Netscape followed soon after, though it was an
embarrassingly long time before Microsoft realised the commercial necessity of getting involved at
all. Amazon and eBay were online by 1995. And in 1998 came Google, offering a powerful new
way to search the proliferating mass of information on the web. Until not too long before Google, it
had been common for search or directory websites to boast about how much of the web's information
they had indexed – the relic of a brief period, hilarious in hindsight, when a user might genuinely
have hoped to check all the webpages that mentioned a given subject. Google, and others, saw that
the key to the web's future would be helping users exclude almost everything on any given
topic, restricting search results to the most relevant pages.
Without most of us quite noticing when it happened, the web went from being a strange new
curiosity to a background condition of everyday life: I have no memory of there being an
intermediate stage, when, say, half the information I needed on a particular topic could be found
online, while the other half still required visits to libraries. "I remember the first time I saw a web
address on the side of a truck, and I thought, huh, OK, something's happening here," says Spike
Ilacqua, who years beforehand had helped found The World, the first commercial internet service
provider in the US. Finally, he stopped telling acquaintances that he worked in "computers", and
started to say that he worked on "the internet", and nobody thought that was strange.
It is absurd – though also unavoidable here – to compact the whole of what happened from then
onwards into a few sentences: the dotcom boom, the historically unprecedented dotcom bust, the
growing "digital divide", and then the hugely significant flourishing, over the last seven years, of
what became known as Web 2.0. It is only this latter period that has revealed the true capacity of the
web for "generativity", for the publishing of blogs by anyone who could type, for podcasting and
video-sharing, for the undermining of totalitarian regimes, for the use of sites such as Twitter and
Facebook to create (and ruin) friendships, spread fashions and rumours, or organise political
resistance. But you almost certainly know all this: it's part of what these days, in many parts of the
world, we call "just being alive".
The most confounding thing of all is that in a few years' time, all this stupendous change will
probably seem like not very much change at all. As Crocker points out, when you're dealing with
exponential growth, the distance from A to B looks huge until you get to point C, whereupon the
distance between A and B looks like almost nothing; when you get to point D, the distance between
B and C looks similarly tiny. One day, presumably, everything that has happened in the last 40 years
will look like early throat-clearings — mere preparations for whatever the internet is destined to
become. We will be the equivalents of the late-60s computer engineers, in their horn-rimmed
glasses, brown suits, and brown ties, strange, period-costume characters populating some dimly
remembered past.
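Crocker's point about exponential growth can be made concrete with a little arithmetic. The sketch below is purely illustrative (the doubling rate and numbers are invented, not his figures): under constant doubling, the gap between early milestones shrinks to insignificance relative to later values.

```python
def growth(periods: int, start: int = 1, factor: int = 2) -> list[int]:
    """Value at each period under constant exponential growth."""
    return [start * factor**p for p in range(periods)]

values = growth(6)               # [1, 2, 4, 8, 16, 32]
a_to_b = values[1] - values[0]   # the early gap: 1
c_to_d = values[4] - values[3]   # a later gap: 8
# Relative to the latest value, the early A-to-B distance is tiny:
print(a_to_b / values[-1])       # 0.03125
```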
Will you remember when the web was something you accessed primarily via a computer? Will you
remember when there were places you couldn't get a wireless connection? Will you remember when
"being on the web" was still a distinct concept, something that described only a part of your life,
instead of permeating all of it? Will you remember Google?
How Smartphones Work
by Dave Coustan, Jonathan Strickland & John Perritano
HTTP://ELECTRONICS.HOWSTUFFWORKS.COM/SMARTPHONE6.HTM
Isn’t it great when science fiction becomes science fact? If you’re a little older, you probably wanted
a communication device just like the one Captain Kirk used in the TV series “Star Trek” when you
were growing up. Kirk and the crew of the USS Enterprise could talk over vast distances with these
personal communication devices. Without the “communicator,” the order to “beam us up, Mr. Scott”
would have fallen on deaf ears, and we all know what would have happened to Kirk if he didn’t
have any bars on his device. Now that we’re well into the 21st century, our “communicators” make
the ones on “Star Trek” seem like antiques. Not only can we talk to one another on our smartphones,
but we can text, play music or a game, get directions, take pictures, check e-mail, find a great
restaurant, surf the Internet, watch a movie. You get the idea. Smartphones are cell phones on
steroids. Why is that? Unlike traditional cell phones, smartphones, with their big old memories,
allow individual users like you and me to install, configure and run applications, or apps, of our
choosing. A smartphone offers the ability to configure the device to your particular way of doing
things. The software in the old-style flip phones offers only limited choices for reconfiguration,
forcing you to adapt to the way they are set up. On a standard phone, whether or not you like the
built-in calendar application, you’re stuck with it except for a few minor tweaks. But if that phone
were a smartphone, you could install any compatible calendar application you liked. Here's a list of
some of the additional capabilities smartphones have, from intuitive to perhaps less so:
- Manage your personal info including notes, calendar and to-do lists
- Communicate with laptop or desktop computers
- Sync data with applications like Microsoft Outlook and Apple's iCal calendar programs
- Host applications such as word processing programs or video games
- Scan a receipt
- Cash a check
- Replace your wallet. A smartphone can store credit card information and discount or membership card info
- Pay bills by downloading apps such as PayPal and CardStar
- Allow you to create a WiFi network that multiple devices can use simultaneously. That means you can access the Internet from your iPad or laptop without a router or another peripheral device.
The Layers of a Smartphone
Everyone has a smartphone, or so it seems. In fact, there were an estimated 1.4 billion smartphones
in the world as of December 2013 [source: Koetsier]. People are constantly talking on them, taking
pictures, surfing the Internet and doing dozens of other things, including shopping for cars. Captain
Kirk would be jealous. At their core, smartphones, and all cell phones for that matter, are mini radios,
sending and receiving radio signals. Cell phone networks are divided into specific areas called cells.
Each cell has an antenna that receives cell phone signals. The antenna transmits signals just like a
radio station, and your phone picks up those signals just as a radio does. Smartphones use cell phone
network technology to send and receive data (think phone calls, Web browsing, file transfers).
Developers classify this technology into generations. Do you remember the first generation? It
included analog cell phone technology. However, as cell phone technology progressed, the protocols
became more advanced. In 2014, cell phones are in the world of the fourth generation, or 4G.
Although most carriers are expanding their 4G technology, some companies, such as Samsung, are
developing 5G technology, which if recent tests are any indication, will allow you to download an
entire movie in less than a second. You can read more about network technologies and protocols in
the article How Cell Phones Work.
Smartphone Hardware and Software
As long as we're talking details, let's have a quick look at smartphone hardware. Every smartphone
runs on a processor. Along with the processor, smartphones also have computer chips that provide
functionality. Phones with cameras have high-resolution image sensors, just like digital cameras.
Other chips support complex functions such as browsing the Internet, sharing multimedia files or
playing music without placing too great a demand on the phone’s battery. Some manufacturers
develop chips that integrate multiple functions to help reduce the overall cost (fewer chips
per phone helps offset production costs). You can visualize software for smartphones as a software
stack. The stack consists of the following layers:
- kernel -- management systems for processes and drivers for hardware
- middleware -- software libraries that enable smartphone applications (such as security, Web browsing and messaging)
- application execution environment (AEE) -- application programming interfaces, which allow developers to create their own programs
- user interface framework -- the graphics and layouts seen on the screen
- application suite -- the basic applications users access regularly such as menu screens, calendars and message inboxes
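One way to picture the stack just listed is as an ordered series of layers, each serving the one above it. The sketch below is purely illustrative, using the article's layer names; it does not model any real phone OS.

```python
# Hypothetical model of the smartphone software stack, ordered from the
# hardware upward. Each layer relies on all of the layers beneath it.
SOFTWARE_STACK = [
    ("kernel", "process management and hardware drivers"),
    ("middleware", "libraries for security, Web browsing, messaging"),
    ("application execution environment", "APIs for third-party programs"),
    ("user interface framework", "on-screen graphics and layouts"),
    ("application suite", "menus, calendars, message inboxes"),
]

def layers_below(layer_name: str) -> list[str]:
    """Everything a given layer sits on top of."""
    names = [name for name, _ in SOFTWARE_STACK]
    return names[:names.index(layer_name)]

print(layers_below("user interface framework"))
# ['kernel', 'middleware', 'application execution environment']
```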
Smartphone Operating Systems
The most important software in any smartphone is its operating system (OS). An operating system
manages the hardware and software resources of smartphones. Some platforms cover the entire
range of the software stack. Others may only include the lower levels (typically the kernel and
middleware layers) and rely on additional software platforms to provide a user interface framework.
We've added some snapshots of specific smartphone operating systems.
Designed primarily for touch-screen mobile devices, Android, sometimes called Droid, is the
operating system that most mobile phones used as of comScore's February 2014 numbers.
Developed by Google, Android is widely considered revolutionary because its open-source model
allows people to write program code and applications for the operating system, which means
Android is evolving constantly. Smartphone users can decide whether to download the
applications. Moreover, Android can run multiple applications at once, allowing users to be
multitasking mavens. And get this: any hardware manufacturer is free to produce its own Android
phone by using the operating system. In fact, many smartphone companies do just that, and
Android's app store offers hundreds of thousands of apps. Apple is always innovating, and iOS allows
iPhone screens to be used simply and logically. Touted by Apple as the "world's most advanced
mobile operating system," iOS supports access to everything from sports scores to restaurant
recommendations. As of publication, its latest version, iOS 7, allows for automatic updates and a
control center that gives users access to their most-used features. It also makes surfing the net easier
with an overhaul of the Safari browser. Reviewers say that
Windows Phone 8 (WP8) is as simple to use as iOS and as easy to customize as Android. Its
crowning achievement is LiveTiles, which are programmed squares that users can rearrange on their
screen to easily access the information they want. WP8 works well with other Microsoft products,
including Office and Exchange. For those who do a lot of calling, connecting to Facebook and
texting, WP8 may meet their needs. At first glance, experts say, Ubuntu 13.10 Touch might seem
like an ordinary operating system, but it's not. Experts say Ubuntu Touch is one of the easiest systems
to use, allowing seamless navigation with multiple scopes. There are no hardware buttons on the
bottom, for example. Instead, Ubuntu works from the edges. Developed by Canonical, the Ubuntu
Touch allows users to unlock the phone from the right edge. You can swipe down from the top edge
to access the phone's indicators, including date, time, messages (from a variety of sources, e.g. Skype
and Facebook) and wireless networks. The phone also makes it easy for people to organize and share
photos. Every shot is automatically uploaded to a personal cloud account, which makes it available
on all devices, including iOS, Android and Windows [sources: Ubuntu, Vaughan-Nichols].
Flexible Interfaces
The core services on smartphones all tie in to the idea of a multipurpose device that can effectively
multitask. A user can watch a video, field a phone call, then return to the video after the call, all
without closing each application. Or he or she can flip through the digital calendar and to-do list
applications without interrupting the voice call. All of the data stored on the phone can be
synchronized with outside applications or manipulated by third-party phone applications in
numerous ways. Here are a few systems that smartphones support.
Bluetooth
This short-range radio service allows phones to wirelessly link up with each other and with other
nearby devices that support it. Examples include printers, scanners, input devices, computers and
headsets.
Some varieties of Bluetooth only allow communication with one device at a time, but others allow
simultaneous connection with multiple devices. To learn more, check out How Bluetooth Works.
Data Synchronization
A phone that keeps track of your personal information, like appointments, to-do lists, addresses, and
phone numbers, needs to be able to communicate with all of the other devices you use to keep track
of those things. There are hundreds of possible platforms and applications you might use for this in
the course of a day. If you want to keep all of this data synchronized with what's on your phone,
then you generally have to look for a cell phone that speaks the languages of all of the devices and
applications you use. Or you can go out and buy new applications that speak the language of your
cell phone.
The Open Mobile Alliance (OMA) is a collaborative organization with the following mission: "Be the center of mobile service enabler specification work, helping the creation of interoperable services across countries, operators and mobile terminals that will meet the needs of the user."
The OMA formed a Data Synchronization Working Group, which continued the work begun by the SyncML Initiative. SyncML was an open-standards project designed to eliminate the trouble of worrying about whether your personal information manager tools sync up with your phone and vice versa. The project is designed so that any kind of data can be synchronized with any application on any piece of hardware, through any network, provided that they are all programmed to OMA standards. This includes synchronization over the Web, Bluetooth, mail protocols and TCP/IP networks. SyncML allows data to be synchronized from a phone to Windows, Mac and Linux applications using Bluetooth, infrared, HTTP or a USB cable. Visit the OMA Web site for more information.
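The reconciliation problem that data synchronization solves can be shown with a toy "last write wins" merge. This is a deliberately simplified sketch, not the SyncML protocol itself (which exchanges change logs over a session): each device timestamps its records, and for every record the newer copy wins on both sides.

```python
# Toy two-way sync: each device holds {record_id: (timestamp, value)}.
def sync(phone: dict, desktop: dict) -> dict:
    """Merge two copies of the same data set; the newer entry wins."""
    merged = dict(phone)
    for key, (ts, value) in desktop.items():
        if key not in merged or ts > merged[key][0]:
            merged[key] = (ts, value)
    return merged

phone   = {"alice": (2, "555-0101"), "bob": (1, "555-0199")}
desktop = {"alice": (1, "555-0100"), "carol": (3, "555-0177")}
merged = sync(phone, desktop)
print(merged["alice"][1])  # 555-0101 -- the newer phone entry wins
```

Real synchronization standards also handle deletions, conflicts edited on both sides, and partial transfers, which is why a common specification mattered so much.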
Java
A smartphone that's compatible with the Java programming language allows the user to load and run Java applications and MIDlets. MIDlets are applications that use a subset of Java and are specifically programmed to run on wireless devices. Java MIDlets include add-ons, games, applications and utilities. Since there are millions of Java developers worldwide, and the Java development tools are freely accessible, smartphone users can install thousands of third-party applications on their phones. Because of the way the OS architecture of most phones is built, these applications can access and use all of the data on the user's phone.
The Future of Smartphones
With data transmission rates reaching blistering speeds and the incorporation of WiFi technology,
the sky is the limit on what smartphones can do. Possibly the most exciting thing about smartphone
technology is that the field is still wide open. It's an idea that probably hasn't found its perfect, real-
world implementation yet. Every crop of phones brings new designs and new interface ideas. No
one developer or manufacturer has come up with the perfect shape, size or input method yet. The
next "killer app" smartphone could look like a flip phone, a tablet PC, a candy bar or something no
one has conceived of yet. Perhaps the most challenging consideration for the future is security.
Smartphones may be vulnerable to security breaches such as an Evil Twin attack. In one of these
attacks, a hacker sets a server’s service identifier to that of a legitimate hotspot or network while
simultaneously blocking traffic to the real server. When a user connects with the hacker’s server,
information can be intercepted and security is compromised. On the other side, some critics argue
that anti-virus software manufacturers greatly exaggerate the risks, harms and scope of phone viruses
in order to help sell their software. Read more in the article How Cell Phone Viruses Work. The
incredible diversity in smartphone hardware, software and network protocols inhibits practical, broad
security measures. Most security considerations either focus on particular operating systems or have
more to do with user behavior than network security. For lots more information on smartphones and
related topics, check out the links on the following page.
Issues in Telecommunications Development
Valletta, Malta
http://www.itu.int/newsarchive/press/WTDC98/Feature1.html
When the 850 or so delegates to the second World Telecommunication Development Conference
entered the hall of the Mediterranean Conference Centre, it was with a serious and urgent purpose:
to try to find new ways of rapidly advancing telecommunications development in the non-
industrialized world, in an attempt to bring developing nations into the fast-emerging Global
Information Infrastructure. It is an oft-cited fact that, whilst developments in telecommunications
are moving along in leaps and bounds throughout the world’s richer nations, some developing
countries are lagging ever further behind. More than half the people living in the developing world
still do not have access to a simple voice telephone. Communications capabilities that those of us
living in Europe, the Americas, or the wealthy nations of the Pacific Rim take for granted every day
– telephone, fax, voicemail, e-mail, mobile cellular and paging – are a world away from the everyday
lives of people living in the vast majority of countries around the world. In reality, the new
communications revolution has touched only a very few nations, which have the networks,
development capital and large user base necessary to support growth and deployment of new
services. This gap between North and South, affluent and poor, has long been a problem and source
of concern for those, like the ITU, who work to further the global provision of basic
telecommunications. Today, new gaps are emerging: between urban and rural areas, in value-added
services, in Internet access. As the United Nations specialized agency for telecommunications, the
ITU has as one of its primary mandates the fostering of telecoms network growth with the purpose
of extending access to communications services to as many of the world’s people as possible.
Global Telecoms Development
Back in 1984, the Union set about ascertaining the real state of the world’s telecoms networks,
commissioning a special report, known as the Maitland report. The report, prepared by the
Independent Commission for Worldwide Telecommunications Development, spoke of a 'missing
link' – the lack of a reliable telecommunications infrastructure – which was holding back countries
in the developing world from reaching their full economic potential. Teledensity figures, which
count the number of main telephone lines for every 100 people, continue to hover around 1.0 to 3.5
in most of the developing world, while wealthy nations enjoy rates around 50. Indeed, most carriers
in industrialized markets are now experiencing the most growth in demand for second or third phone
lines for family members, fax machines, or connection to the Internet. The findings of the Maitland
report, and the establishment of a specialized sector within the ITU (ITU-D) to deal with
development issues, led to the first World Telecommunication Development Conference in Buenos
Aires in 1994. The aim of this conference was to develop programmes specially targeted at the needs
of the developing world, which would translate into real and rapid improvements in
telecommunications infrastructure. These programmes have not all produced the expected
results. Hearteningly, quite a few developing nations, including many in Asia and several in Africa,
have begun to respond to development efforts and notch up impressive improvements in teledensity
and telecommunications access. Countries like Botswana, China, Chile, Thailand, Hungary, Ghana
and Mauritius have all made good progress in extending telecommunications access in the last two
to three years. However, there remain some nations, particularly many of the 48 UN-designated
Least Developed Countries, in which the situation has improved little, if at all, and in some cases
has worsened. While the number of new lines installed in these countries has frequently increased
by 20% to 30% over the last 10 years, this increase has often not been sufficient to keep up with
the rise in population, causing teledensity figures to fall in real terms. Furthermore,
since most lines in the developing world are concentrated in and around the major cities, the situation
for people living in rural areas is frequently much worse than even low teledensity figures would
indicate.
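The interaction between line growth and population growth described above can be illustrated with a back-of-the-envelope calculation. The sketch below uses hypothetical figures, chosen only to show how a 25% increase in lines can still produce a falling teledensity:

```python
def teledensity(main_lines, population):
    """Teledensity as defined in the text: main telephone lines per 100 people."""
    return 100.0 * main_lines / population

# Hypothetical country: 200,000 main lines serving 10 million people.
before = teledensity(200_000, 10_000_000)

# A decade later: lines grew by 25%, but population grew by 35%.
after = teledensity(200_000 * 1.25, 10_000_000 * 1.35)

print(f"teledensity before: {before:.2f}")  # 2.00 lines per 100 people
print(f"teledensity after:  {after:.2f}")   # 1.85 -- lower despite more lines
```

The absolute number of lines rises, yet the headline indicator falls, which is exactly the "fall in real terms" the article describes.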
The problem for some nations is now becoming acute. The growing use of electronic processing
systems in business throughout the industrialized world is making it harder and harder for countries
with poorly developed communications networks to win a share of trade and commerce with the
richer world, effectively cutting them off from a major source of hard currency and earnings. For people living in
much of the developing world, the new Global Information Infrastructure proposed by Vice
President Al Gore at the ITU’s first World Telecommunication Development Conference must seem
a long way off, and getting ever more distant. With changes in the telecommunications environment
taking place at lightning speed, it has become imperative that urgent methods be found to improve
the communications infrastructure in those countries which have failed to make significant progress
under the development schemes currently in place. If not, we risk these countries falling so far
behind that catch-up may well be almost an impossibility.
Signs of Change
Four years after the ITU’s first World Telecommunications Development Conference in Argentina,
there are nevertheless good signs of change on the horizon in much of the developing world. Indeed,
in parts of the African continent, which is home to 33 of the 48 LDCs, some are even beginning to
speak of an African renaissance generated by the opening up of formerly closed markets. According
to the International Monetary Fund, Africa should record growth of 4.7%, putting it just behind
world growth leader Asia. Furthermore, Uganda and Botswana are forecast to be among the ten
fastest growing economies in the world this year. The Telecommunication Development Bureau
(BDT) of the ITU has worked hard to create a paradigm shift in telecoms development, whether by
advising on appropriate institutional structures, networks and services for efficient
telecommunications or by assisting in mobilizing the financial and human resources necessary to
introduce technological innovations. Over the years, it has contributed greatly to a change of attitude
from an inward-focused approach to an outward-oriented route to telecoms development. In that new,
evolving environment, operators are taking advantage of foreign investment and partnerships to
build and expand their networks, with fast results in many countries. New technologies such as
mobile cellular telephony and wireless local loop are helping to increase access for urban
populations living in the developing world. In rural regions, where the cost of installing private lines
is often prohibitively high when compared with the revenue that these lines generate, the emphasis
is now on improving access rather than bringing a line into every home. Initiatives such as
community telecentres, which give rural dwellers access to a central telecommunications facility
where they can make and receive calls, send and receive faxes and e-mail, and even surf the Internet,
have great potential to bring the benefits of modern communications to formerly isolated towns and
villages in a way that is economically viable. A trial project of so-called Multipurpose Community
Telecentres is already underway in Uganda, through a partnership between the ITU, UNESCO and
the Canadian International Development Research Centre (IDRC). It is envisaged that this MCT,
based at Nakaseke in the country’s south-west, and equipped with a range of communications
equipment, could generate US$450 a month in revenue once fully operational, i.e. half of
the current annual revenue per line.
Market Liberalization
Further evidence of a shift in attitudes in parts of the African continent is the move to embrace the
free-trade principles of the World Trade Organization. Seven African countries have already joined
the WTO – Côte d’Ivoire, Ghana, Mauritius, Morocco, Senegal, South Africa and Tunisia – and a
number of other countries, including Cape Verde, Guinea, Guinea-Bissau and São Tomé and
Príncipe, have begun to liberalize their markets in the same spirit, but outside the framework of the
WTO. In addition, five African operators were privatized between 1996 and 1997, compared to only
one in the five years between 1990 and 1995. More than a dozen new private mobile cellular
companies have begun operations since 1995, while more than 100 Internet service providers have
started operations in sub-Saharan Africa. But regulatory regimes that reduce unpredictability will
be essential to attract investors and policies will need to be put into place to ensure a level playing
field.
Why a World Conference?
In addition to the Union’s considerable on-the-ground activities, its four-yearly global
development conferences bring the world’s telecommunications community together to focus
regional and national development efforts, and to serve as a think-tank for the development of
innovative strategies that could be used to improve telecommunications access in the world’s under-
served regions. The first World Telecommunication Development Conference in 1994 defined
twelve programme areas which required special efforts for the improvement of telecommunications
infrastructure: policies, strategies and financing; human resources management; business plan
development; maritime radiocommunication development; computer-aided network planning;
frequency management; maintenance; development of mobile cellular systems; integrated rural
development; broadcasting; information service development; and development of telematics and
electronic networks, such as the Internet. The plan that was developed at the first WTDC to address
each of these programme areas, the Buenos Aires Action Plan, will be reviewed at the forthcoming
WTDC, to see where it has succeeded, and in what respects it has failed to live up to expectations.
Despite overall good results, changes in the global telecommunications environment, including new
global trading arrangements ushered in by the World Trade Organization agreement on trade in
telecommunications services, will necessitate new approaches. The second WTDC will be looking
closely at policies which could improve access to telecommunications and information technologies
throughout the world’s underserved regions.
What’s On the Agenda?
First and foremost, the second WTDC will review the comprehensive Action Plan developed at
Buenos Aires. The conference will attempt to ascertain from delegates working on the ground in
countries targeted by the programme whether the strategies developed are bringing about real
improvements in network development, and what further steps could be taken to improve the
situation. This review will provide the input for the development of the next four-year plan for the
work of the ITU’s Development Sector, and give an indication as to whether policies are moving in
the right direction. Specific areas of discussion will centre around new ways of developing mutually
beneficial partnerships for development, innovative funding arrangements, new technologies and
their application in developing countries, and the need for new regulatory structures to meet the
needs of the evolving global telecommunications environment. Certainly, at the end of the day, every
delegate at the WTDC will hope that the conference can come up with a new four-year plan that is
not only idealistic, but also one that can be implemented at the grass-roots level. As the uptake of
telecommunications-based information services accelerates throughout the developed world, the
time for platitudes and lofty, unrealistic projects is long behind us. If we fail to make real progress
in the countries currently lagging behind, instead of creating a Global Information Infrastructure, we
will have merely cemented in place the old inequalities which have kept poor countries from
benefiting from prosperity and growth. For this reason, many are already citing the Malta
conference as a ‘make-or-break’ event for telecommunications development. Today, more than ever,
there are good reasons for optimism for the future of global telecommunications development.
Deregulation and liberalization of markets is already bringing an injection of foreign capital and
helping to jump-start the telecoms infrastructure of many developing nations. New technologies,
from community telecentres to satellite-based GMPCS phone systems, are offering great
possibilities for overcoming the tyranny of high costs and remote locations, which have plagued
operators and administrations in the developing world since they first began building their networks.
And information distribution systems, such as the Internet, do provide real opportunities for bringing
education and information within the reach of all. The key issue surrounding most of these
technologies and systems is price. At the moment, almost all of these options are well beyond the
price range that most inhabitants of developing nations could be expected to pay. Costs will have to come
down, and one of the tasks of the WTDC will be to consider how this might be feasible while at the
same time maintaining private sector interest and participation in local markets.
Some would say telecommunications development is at a crossroads. While the concern over the
need for rapid progress in the developing world is justified, the tools for change are in our hands.
The ITU’s Valletta conference just needs to work out how to use them.
Understanding the Telecommunications Revolution
By Lillian Goleniewski
http://www.informit.com/articles/article.aspx?p=24667&seqNum=3
In recent years, the word telecommunications has been used so often, and applied in so many
situations, that it has become part of our daily lexicon, yet its definition remains elusive. So, let's
start with a definition. The word telecommunications combines the Greek tele, meaning "over a
distance," with the Latin communicare, meaning "to share." Hence, telecommunications literally
means "the sharing of information over a distance." Telecommunications is more than a set of
technologies; it's more than an enormous global industry (estimated at US$2.5 trillion); it's more
than twenty-first-century business and law being re-created to accommodate a virtual world;
and it's more than a creator and destroyer of the state of the economy. Telecommunications is a way
of life. Telecommunications affects how and where you do everything—live, work, play, socialize,
entertain, serve, study, teach, rest, heal, and protect. Telecommunications has served a critical role
in shaping society and culture, as well as in shaping business and economics. It is important to
examine telecommunications from the broadest perspective possible to truly appreciate the depth
and complexity of this field and thereby understand the opportunities it affords. The best way to
learn to "think telecom" is to quickly examine how it is changing both business and lifestyle.
Throughout the 1980s and 1990s, much of the IT&T (information technologies and
telecommunications) industry's focus was on how to reengineer the likes of financial institutions,
manufacturing, retail, service, and government. These technology deployments were largely pursued
and justified on the grounds of reducing costs and enhancing competitiveness by speeding
communications. Today, we are shifting our focus to another set of objectives: Our technology
deployments are targeted at supporting not just the needs of a business enterprise, but also those of
the consumers. The revolution in integrated media is transforming all aspects of human activity
related to communication and information. We are moving to computer-based environments that
support the creation, sharing, and distribution of multimodal information. Whereas traditional
telecommunications networks have allowed us to cross barriers associated with time and distance,
the new multimedia realm is allowing us to include vital physical cues in the information stream,
introducing a physical reality into the world of electronic communications, goods, and services. Not
surprisingly, some of the industries that are being most radically revolutionized are those that deal
with the human senses, including entertainment, health care, education, advertising, and, sadly,
warfare. In each of these key sectors, there are telecommunications solutions that address the
business need, reduce costs, or enhance operations by speeding business processes and aiding
communications. These industries are also examining how to virtualize their products and/or
services—that is, how to apply telecommunications to support electronic services targeted at the
consumers of that industry's products. Not surprisingly, changing the way you attend a class, see a
doctor, watch a movie, get a date, shop for software, take a cruise, and stay in touch creates
significant changes in how you use your time and money. Simply put, technology changes your way
and pace of life. This chapter presents the big picture of the telecommunications revolution, and the
rest of the book gives greater detail about the specific technologies and applications that will
comprise the telecommunications future.
Changes in Telecommunications
A quick orientation to how emerging technologies are affecting industries and lifestyles highlights
the importance of understanding the principles of telecommunications and, hopefully, inspires you
to "think telecom." The changes discussed here are ultimately very important to how
telecommunications networks will evolve and to where the growth areas will be.
An enormous amount of the activity driving telecommunications has to do with the emergence of
advanced applications; likewise, advances in telecommunications capabilities spur developments in
computing platforms and capabilities. The two are intimately and forever intertwined. The following
sections discuss some of the changes that are occurring in both telecommunications and in
computing platforms and applications, as well as some of the changes expected in the next several
years.
Incorporating Human Senses in Telecommunications
Telecommunications has allowed a virtual world to emerge—one in which time and distance no
longer represent a barrier to doing business or communicating—but we're still lacking something
that is a critical part of the human information-processing realm. The human mind acts on physical
sensations in the course of its information processing; the senses of sight, sound, touch, and motion
are key to our perception and decision making. Developments in sensory technologies and networks
will allow a new genre of sensory reality to emerge, bridging the gap between humans and machines.
One of the most significant evolutions occurring in computing and communications is the
introduction of the human senses into electronic information streams. The following are a few of the
key developments in support of this more intuitive collaborative human–machine environment.
Computers are now capable of hearing and speaking, as demonstrated by Tellme, a popular U.S.
voice-activated service that responds to defined voice prompts and provides free stock quotes,
weather information, and entertainment guides to 35,000 U.S. cities.
The capability to produce three-dimensional sound through digital mastery—a technology called
"virtual miking"—is being developed at the University of Southern California's Integrated Media
Systems Center.
Virtual touch, or haptics, enables a user to reach in and physically interact with simulated computer
content, such as feeling the weight of the Hope diamond in your hand or feeling the fur of a lion.
Two companies producing technology in this area are SensAble Technologies and Immersion
Corporation. They are producing state-of-the-art force feedback, whole-hand sensing, and real-time
3D interaction technologies, and these hardware and software products have a wide range of
applications for the manufacturing and consumer markets, including virtual-reality job training,
computer-aided design, remote handling of hazardous materials, and "touch" museums.
The seduction of smell is also beginning to find its way into computers, allowing marketers to
capitalize on the many subtle psychological states that smell can induce. Studies show that aromas
can be used to trigger fear, excitement, and many other emotions. Smell can be used to attract visitors
to Web sites, to make them linger longer and buy more, to help them assimilate and retain
information, or to instill the most satisfying or terrifying of emotional states (now that's an interactive
game!). Three companies providing this technology today are Aromajet, DigiScents, and TriSenx.
Aromajet, for example, creates products that address video games, entertainment, medical, market
research, personal and home products, and marketing and point-of-sale applications.
The visual information stream provides the most rapid infusion of information, and a large portion of the human
brain is devoted to processing visual information. To help humans process visual information,
computers today can see; equipped with video cameras, computers can capture and send images,
and can display high-quality entertainment programming. The visual stream is incredibly demanding
in terms of network performance; thus, networks today are rapidly preparing to enable this most
meaningful of information streams to be easily distributed.
The Emergence of Wearables
How we engage in computing and communications will change dramatically in the next decade.
Portable computing devices have changed our notion of what and where a workplace is and
emphasized our desire for mobility and wireless communication; they are beginning to redefine the
phrase dressed for success. But the portable devices we know today are just a stepping stone on the
way to wearables. Context-aware wearable computing will be the ultimate in light, ergonomic,
reliable, flexible, and scalable platforms. Products that are available for use in industrial
environments today will soon lead to inexpensive, easy-to-use wearables appearing at your
neighborhood electronics store:
 Xybernaut's Mobile Assistant IV (MA-IV), a wearable computer, provides its wearer with a full-
fledged PC that has a 233MHz Pentium chip, 32MB memory, and upward of 3GB storage. A wrist
keyboard sports 60 keys. Headgear suspended in front of the eye provides a full-color VGA screen,
the size of a postage stamp but so close to the eye that images appear as on a 15-inch monitor. A
miniature video camera fits snugly in a shirt pocket. Bell Canada workers use MA-IVs in the field;
they replace the need to carry manuals and provide the ability to send images and video back to
confer with supervisors. The MA-IV is rather bulky, weighing in at 4.4 pounds (2 kilograms), but
the soon-to-be-released MA-V will be the first mass-market version, and it promises to be
lightweight.
 MIThril is the next-generation wearables research platform currently in development at MIT's Media
Lab. It is a functional, operational body-worn computing architecture for context-aware human-
computer interaction research and general-purpose wearable computing applications. The MIThril
architecture combines a multiprotocol body bus and body network, integrating a range of sensors,
interfaces, and computing cores. It is designed to be integrated into everyday clothing, and it is both
ergonomic and flexible. It combines small, light-weight RISC processors (including the
StrongARM), a single-cable power/data "body bus," and high-bandwidth wireless networking in a
package that is nearly as light, comfortable, and unobtrusive as ordinary street clothing. To be truly
useful, wearables will need to be aware of where you are and what you're doing. Armed with this
info, they will be able to give you information accordingly. (Location-based services are discussed
in Chapter 14, "Wireless Communications.")
Bandwidth
A term that you hear often when discussing telecommunications is bandwidth. Bandwidth is a
critical commodity. Historically, bandwidth has been very expensive, as it was based on the sharing
of limited physical resources, such as twisted-pair copper cables and coax. Bandwidth is largely
used today to refer to the capacity of a network or a telecom link, and it is generally measured in
bits per second (bps). Bandwidth actually refers to the range of frequencies involved—that is, the
difference between the lowest and highest frequencies supported—and the greater the range of
frequencies, the greater the bandwidth, and hence the greater the number of bits per second, or
information carried.
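The definition above can be made concrete with a short sketch. The voice-channel figures used here (roughly 300 Hz to 3,400 Hz for a standard analog telephone channel) are a well-known illustration, not taken from the text:

```python
def bandwidth_hz(f_low, f_high):
    """Bandwidth as the text defines it: the difference between the
    lowest and highest frequencies a link supports, in hertz."""
    return f_high - f_low

# Classic analog voice channel: roughly 300 Hz to 3,400 Hz.
voice_bw = bandwidth_hz(300, 3_400)
print(voice_bw)  # 3100 (Hz) -- a narrowband channel

# A much wider frequency range, e.g. the 88-108 MHz FM broadcast band,
# yields far more bandwidth, and hence can carry more bits per second.
broadcast_bw = bandwidth_hz(88_000_000, 108_000_000)
print(broadcast_bw)  # 20000000 (Hz)
```

This also shows why the word is used loosely: strictly, bandwidth is a frequency range in hertz, while the "bandwidth" of a digital link quoted in bits per second is the capacity that range supports.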
Moving Toward Pervasive Computing
As we distribute intelligence across a wider range of devices, we are experiencing pervasive
computing, also called ubiquitous computing. We are taking computers out of stand-alone boxes to
which we are tied and putting them into ordinary things, in everyday objects around us. These new
things, because they are smart, have a sense of self-awareness and are able to take care of themselves.
When we embed intelligence into a device, we create an interesting new opportunity for business.
That device has to have a reason for being, and it has to have a reason to continue evolving so that
you will spend more money and time on it. To address this challenge, device manufacturers are
beginning to bundle content and applications with their products. The result is smart refrigerators,
smart washing machines, smart ovens, smart cabinets, smart furniture, smart beds, smart televisions,
smart toothbrushes, and an endless list of other smart devices. (These smart devices are discussed in
detail in Chapter 15, "The Broadband Home and HANs.") Devices are becoming smaller and more
powerful all the time, and they're getting physically closer to our bodies, as well. The growing
amount of intelligence distributed throughout the network is causing changes in user profiles.
Moving Toward Machine-to-Machine Communications
We are moving away from human-to-human communications to an era of machine-to-machine
communications. Today, there are just over 6 billion human beings on the planet, yet the number of
microprocessors is reported to be more than 15 billion. Devices have become increasingly
intelligent, and one characteristic of an intelligent system is that it can communicate. As the universe
of communications-enabled devices grows, so does the traffic volume between them. As these smart
things begin to take on many of the tasks and communications that humans traditionally exchanged,
they will change the very fabric of our society. For example, your smart washing machine will
initiate a call to the service center to report a problem and schedule resolution with the help of an
intelligent Web agent long before you even realize that something is wrong! These developments
are predicted to result in the majority of traffic—up to 95% of it—being exchanged between
machines, with traditional human-to-human communications representing only 5% of the network
traffic by 2010.
Adapting to New Traffic Patterns
Sharing of information can occur in a number of ways—via smoke signals, by letters sent through
the postal service, or as transmissions through electrical or optical media, for example. Before we
get into the technical details of the technologies in the industry, it's important to understand the
driving forces behind computing and communications. You need to understand the impact these
forces have on network traffic and therefore on network infrastructure. In today's environment,
telecommunications embodies four main traffic types, each of which has different requirements in
terms of network capacity, tolerance for delays—and particularly variations in the delay—in the
network, and tolerance for potential congestion and therefore losses in the network:
Voice—Voice traffic has been strong in the developed world for years, and more subscriber lines
are being deployed all the time. However, some three billion people in the world haven't even used
a basic telephone yet, so there is still a huge market to be served. Voice communications are typically
referred to as being narrowband, meaning that they don't require a large amount of network capacity.
For voice services to be intelligible and easy to use, delays must be kept to a minimum, however, so
the delay factors in moving information from Point A to Point B have to be tightly controlled in
order to support real-time voice streams. (Concepts such as delay, latency, and error control are
discussed in Chapter 6, "Data Communications Basics.")
Data—Data communications refers to the
exchange of digitized information between two machines. Depending on the application supported,
the bandwidth or capacity requirements can range from medium to high. As more objects that are
visual in nature (such as images and video) are included with the data, that capacity demand
increases. Depending again on the type of application, data may be more or less tolerant of delays.
Text-based exchanges are generally quite tolerant of delays. But again, the more real-time nature
there is to the information type, as in video, the tighter the control you need over the latencies. Data
traffic is growing much faster than voice traffic; it has grown at an average rate of about 30% to
40% per year for the past decade. To accommodate data communication, network services have been
developed to address the need for greater capacity, cleaner transmission facilities, and smarter
network management tools. Data encompasses many different information types. In the past, we
saw these different types as being separate entities (for example, video and voice in a
videoconference), but in the future, we must be careful not to separate things this way because, after
all, in the digital age, all data is represented as ones and zeros.
Image—Image communications
requires medium to high bandwidth—the greater the resolution required, the greater the bandwidth
required. For example, many of the images taken in medical diagnostics require very high resolution.
Image traffic tolerates some delay because it includes no motion artifacts that would be affected by
any distortions in the network.
Video—Video communications, which are becoming increasingly
popular and are requiring ever-greater bandwidth, are extremely sensitive to delay. The future is
about visual communications. We need to figure out how to make video available over a network
infrastructure that can support it and at a price point that consumers are willing to pay. When our
infrastructures are capable of supporting the capacities and the delay limitations required by real-
time applications, video will grow by leaps and bounds. All this new voice, data, and video traffic
means that there is growth in backbone traffic levels as well. This is discussed further later in the
chapter, in the section "Increasing Backbone Bandwidth." The telecommunications revolution has
spawned great growth in the amount and types of traffic, and we'll see even more types of traffic as
we begin to incorporate human senses as part of the network. The coming chapters talk in detail
about what a network needs in order to handle the various traffic types.
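The four traffic types and their qualitative requirements can be summarized in a small lookup structure. The sketch below paraphrases the text's descriptions; the "low"/"medium"/"high" labels are informal summaries, not a formal taxonomy:

```python
# Qualitative requirements of the four traffic types described above.
# Labels condense the text's wording and are not a formal standard.
TRAFFIC_PROFILES = {
    "voice": {"capacity": "low",         "delay_tolerance": "low"},
    "data":  {"capacity": "medium-high", "delay_tolerance": "varies"},
    "image": {"capacity": "medium-high", "delay_tolerance": "high"},
    "video": {"capacity": "high",        "delay_tolerance": "low"},
}

def delay_sensitive(traffic_type):
    """True for traffic that needs tight control over network latency."""
    return TRAFFIC_PROFILES[traffic_type]["delay_tolerance"] == "low"

# Real-time streams are the ones a network must prioritize.
print([t for t in TRAFFIC_PROFILES if delay_sensitive(t)])  # ['voice', 'video']
```

A classification like this is the starting point for the quality-of-service decisions discussed in the coming chapters: delay-sensitive streams get tight latency control, while image and text-based data can tolerate queuing.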
Handling New Types of Applications
The new traffic patterns imply that the network will also host a new set of applications: not
just simple voice or text-based data, but new genres of applications that combine the various
media types. The ability to handle digital entertainment applications in a network is crucial. In some
parts of the world, such as Asia, education may be the primary focus, and that should tell us where
we can expect greater success going forward. But throughout much of the world, entertainment is
where people are willing to spend the limited number of dollars that they have for electronic
goods and services. The digital entertainment realm will include video editing, digital content
creation, digital imaging, 3D gaming, and virtual reality applications, and all these will drive the
evolution of the network. It's the chicken and the egg story: What comes first, the network or the
applications? Why would you want a fiber-optic broadband connection if there's nothing good to
draw over that connection? Why would you want to create a 3D virtual reality application when
there's no way to distribute it? The bottom line is that the applications and the infrastructures have
to evolve hand-in-hand to manifest the benefits and the dollars we associate with their future.
Another form of application that will be increasingly important is in the realm of streaming media.
A great focus is put on the real-time delivery of information, as in entertainment, education, training,
customer presentations, IPO trade shows, and telemedicine consultations. (Streaming media is
discussed in detail in Chapter 11, "Next-Generation Network Services.") E-commerce (electronic
commerce) and m-commerce (mobile commerce) introduce several new requirements for content
management, transaction platforms, and privacy and security tools, so they affect the types of
information that have to be encoded into the basic data stream and how the network deals with
knowledge of what's contained within those packets. (Security is discussed in detail in Chapter 11.)
Increasing Backbone Bandwidth
Many of the changes discussed so far, but primarily the changes in traffic patterns and applications,
will require immense amounts of backbone bandwidth. Table 1.1 lists a number of the requirements
that emerging applications are likely to make on backbone bandwidth.
In addition, advances in broadband access technologies will drive a demand for additional capacity
in network backbones. Once 100Gbps broadband residential access becomes available—and there
are developments on the horizon—the core networks will require capacities measured in exabits per
second (that is, 1 billion Gbps). These backbone bandwidth demands make the revolutionary forces
of optical networking critical to our future. (Optical networking is discussed in detail in Chapter 12,
"Optical Networking.")
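The exabit claim above is easy to sanity-check. The sketch below multiplies a hypothetical subscriber count (the 10 million figure is invented for illustration) by the 100 Gbps access rate mentioned in the text.

```python
# Back-of-the-envelope check of the backbone claim in the text: if many
# homes each had 100 Gbps access, aggregate demand reaches exabits per
# second (1 Ebps = 10**9 Gbps). The subscriber count is hypothetical.

GBPS = 10**9     # bits per second in one gigabit
EBPS = 10**18    # bits per second in one exabit

access_rate_bps = 100 * GBPS   # 100 Gbps per home, from the text
homes = 10_000_000             # hypothetical 10 million subscribers

aggregate_bps = access_rate_bps * homes
print(f"Aggregate demand: {aggregate_bps / EBPS:.1f} Ebps")  # 1.0 Ebps
```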
Responding to Political and Regulatory Forces
New developments always bring with them politics. Different groups vie for money, power, the
ability to bring new products to market first and alone, and the right to squash others' new ideas. A
prominent characteristic of the telecommunications sector is the extent to which it is influenced by
government policy and regulation. The forces these exert on the sector are inextricably tied to
technological and market forces. Because of the pervasive nature of information and communication
technologies and the services that derive from them, coupled with the large prizes to be won, the
telecommunications sector is subjected to a lot of attention from policymakers. Particularly over the
past 20 years or so, telecommunications policy and regulation have been prominent on the agendas
of governments around the world. This reflects the global trend toward liberalization, including, in
many countries, privatization of the former monopoly telcos. However, interest from policymakers
in telecommunications goes much deeper than this. A great deal of this interest stems from the
extended reach and wide impact that information and communication technologies have. Here are
some examples: Telephony, e-mail, and information services permit contact between friends and
families and offer convenience to people in running their day-to-day lives. Thus, they have major
economic and social implications. In the business arena, information and communication
technologies offer business efficiency and enable the creation of new business activities. Thus, they
have major employment and economic implications. Multimedia and the Internet offer new audio,
video, and data services that affect entertainment and education, among other areas. These new
services are overlapping with traditional radio and television broadcasting, and major cultural
implications are appearing. News delivery influences people's perceptions of governments and their
own well-being, thereby influencing voter attitudes. Telecommunications brings attention to cultural
trends. Therefore, telecommunications has major political as well as cultural implications.
Government applications of information and communication technologies affect the efficiency of
government. Defense, national security, and crime-fighting applications are bringing with them
major political implications. Given this background of the pervasive impact that information and
communication technologies have, it is hardly surprising that they receive heavy policy attention.
Regulatory Background
Although many national regulatory authorities today are separate from
central government, they are, nevertheless, built on foundations of government policy. Indeed, the
very act of creating an independent regulatory body is a key policy decision. Historically, before
telecommunications privatization and liberalization came to the fore, regulation was often carried
out within central government, which also controlled the state-run telcos. That has changed in recent
years in many, but not all, countries.
Given their policy foundation, and the fact that government policies vary from country to country
and from time to time, it is not surprising that regulatory environments evolve and differ from
country to country. These evolutions and international variations sometimes pose planning problems
for the industry, and these problems can lead to frustrations and tensions between companies and
regulatory agencies. They can also lead to disagreements between countries (for example, over trade
issues). Although moves to encourage international harmonization of regulatory regimes (for
example, by the International Telecommunications Union [ITU] and by the European Commission)
have been partially successful, differences remain in the ways in which countries interpret laws and
recommendations. Moreover, given that regulations need to reflect changing market conditions and
changing technological capabilities, it is inevitable that over time regulatory environments will
change, too. So regulation is best viewed as another of the variables, such as technological change,
that the telecommunications industry needs to take into account.
The Policy and Regulatory Players
At the global level, there are a number of international bodies that govern or make recommendations
about telecommunications policy and regulation. In addition to the ITU and the European
Commission, there are various standards bodies (for example, Institute of Electrical and Electronics
Engineers [IEEE], European Telecommunications Standards Institute [ETSI], American National
Standards Institute [ANSI], the Telecommunication Technology Committee [TTC]) and industry
associations (for example, the European Competitive Telecommunications Association [ECTA], the
Telecommunications Industry Association [TIA]). Representatives of national governments and
regulatory authorities meet formally (for example, ITU World Radio Conferences, where many
countries are represented) and informally (for example, Europe's National Regulatory Authorities
[NRAs] exchange views at Independent Regulators Group [IRG] meetings). Other organizations,
such as the World Trade Organization (WTO) and regional bodies, also influence
telecommunications policy and regulation at the international level. At the national level, several
parts of central government are generally involved, and there can sometimes be more than one
regulatory body for a nation. Some of these organizations are major players; others play less
prominent, but nevertheless influential, roles. In the United States, for example, the Federal
Communications Commission (FCC) is the national regulatory body, and public utility commissions
regulate at the state level. The U.S. State Department coordinates policy regarding international
bodies such as the ITU. The White House, the Department of Commerce, largely through the
National Telecommunications and Information Administration (NTIA), the Justice Department, the
Trade Representative, and the Department of Defense are among the various parts of the
administration that set or contribute to telecommunications policy. The U.S. Congress and the U.S.
government's legislative branch also play important roles. In addition, industry associations, policy
"think tanks," regulatory affairs departments within companies, telecommunications lawyers, and
lobbyists all contribute to policy debates and influence the shape of the regulatory environment.
Other countries organize their policy and regulatory activities differently from the United States.
For example, in the United Kingdom, the Office of Telecommunications (OFTEL) mainly regulates
what in the United States would be known as "common carrier" matters, whereas the
Radiocommunications Agency (RA) deals with radio and spectrum matters. However, at the time
of writing, it has been proposed that OFTEL and RA be combined into a new Office of
Communications (OFCOM). In Hong Kong, telecommunications regulation was previously dealt
with by the post office, but now the Office of the Telecommunications Authority (OFTA) is the
regulatory body. So, not only do regulatory environments change, but so, too, do the regulatory
players.
The Main Regulatory Issues
Let's look briefly at what regulators do. Again, this varies somewhat from country to country and
over time. In the early years of liberalization, much time would typically be spent in licensing new
entrants and in putting in place regulations designed to keep a former monopoly telco from abusing
its position by, for example, stifling its new competitors or by charging inappropriately high prices
to its customers. Here the regulator is acting as a proxy for market forces. As effective competition
takes root, the role of the regulator changes somewhat. Much of the work then typically involves
ensuring that all licensed operators or service providers meet their license obligations and taking
steps to encourage the development of the market such that consumers benefit. The focus of most
regulatory bodies is, or should be, primarily on looking after the interests of the various end users
of telecommunications. However, most regulators would recognize that this can be achieved only if
there is a healthy and vibrant industry to deliver the products and services. So while there are often
natural tensions between a regulator and the companies being regulated, it is at the same time
important for cooperation between the regulator and the industry to take place. In Ireland, for
example, the role of the regulator is encapsulated by the following mission statement: "The purpose
of the Office of the Director of Telecommunications Regulation is to regulate with integrity,
impartiality, and expertise to facilitate rapid development of a competitive leading-edge
telecommunications sector that provides the best in price, choice, and quality to the end user, attracts
business investment, and supports ongoing social and economic growth."
Flowing from regulators' high-level objectives are a range of activities such as licensing, price
control, service-level agreements, interconnection, radio spectrum management, and access to
infrastructure. Often, regulatory bodies consult formally with the industry, consumers, and other
interested parties on major issues before introducing regulatory changes. A more detailed
appreciation of what telecommunications regulators do and what their priorities are can be obtained
by looking at the various reports, consultation papers, and speeches at regulatory bodies' Web sites.
10 Disruptive Technologies You Use Every Day
BY BERNADETTE JOHNSON
http://electronics.howstuffworks.com/everyday-tech/10-disruptive-technologies-you-use-every-
day11.htm
The term disruptive innovation was brought into the lexicon by Clayton M. Christensen in his book
"The Innovator's Dilemma," in the context of businesses adopting technologies that eventually
completely surpass or replace previous technologies, possibly harming whichever business backed
the wrong technology. A disruptive technology is something new that disrupts an industry, and quite
often completely changes the way we all do things. The car disrupted the horse and carriage
industries. Small personal computers have given individuals computing power that only used to be
possible via the huge mainframes that crunched numbers exclusively in corporate, academic and
government institutions. Computers and all the things that have come along with them have wreaked
havoc in any number of industries. Even on a smaller scale, individual components of home
computers have gone through cycles of disruption, such as the evolution of the various sorts of
storage media (think floppy drives to CDs to flash drives), and the move from desktop computers to
more portable laptops to even smaller mobile devices. The new disruptive items aren't even
necessarily better or more powerful. They might cause disruption by being cheap or simple enough
for mass adoption, and then, as is the case with most computing devices, they grow faster, more
powerful and better over time. There is the common refrain of consumers not even knowing they
needed something until it was brought into being, and that's true of many disruptive technologies.
Here are 10 disruptive technologies that many people are now using on a regular basis. We may be
able to recall their predecessors fondly, but we probably don't really miss them very much.
E-mail
Sending digital messages from computer to computer began over ARPANET, the beginning of our
modern Internet, in the early 1970s. The average person didn't gain access until the 1990s or later.
But now pretty much everyone has an e-mail address, possibly several. It's a quick and easy method
of communication that's put a dent in personal letter writing, phone conversation and face-to-face
meetings. The nearly instantaneous nature of e-mail and other digital communication methods has
made communication over a distance far easier than it used to be and has led many people to call
traditional physical correspondence "snail mail." There's even an e-mail version of junk physical
mail: spam. E-mailing is cheaper and easier than handwriting letters in a lot of ways. There's no
postage, paper or ink to buy. Hitting the send button takes much less effort than stamping and mailing
a letter. The typical home in the U.S. apparently received a letter every two weeks in 1987, but it
was down to once every seven weeks in 2010, not counting greeting cards or invitations
[source: Schmid]. But even invitations and greeting cards have gone digital. According to a 2011
Pew Internet survey, at that time 92 percent of adults in the U.S. who got online used e-mail, 61
percent of them used it on a typical day and 70 percent of all Americans used e-mail to some extent
[source: Purcell]. And people aren't just sending personal e-mails. Just about anyone who works on
a computer has a work e-mail via which they correspond with coworkers or clients, send documents,
set up meetings and the like. Even at home, we're not just sending the equivalent of the long letters
of yore. We are sending quick questions, links to websites, and attaching documents, pictures, music
and video files. Much of the novelty of e-mail has diminished, and quick communication is now
increasingly taking place via phone text messages, instant messaging and social media.
Social Media
Social networking sites facilitate social interaction and information sharing among friends,
acquaintances or even strangers over the Internet. They usually allow you to post text statuses, links,
images or videos that are either accessible by anyone with access to the site or only to private groups
of friends. They often incorporate the ability to send private messages, and many now also include
instant messaging and video chat features. Major social media sites include Facebook, Google+,
MySpace, Tumblr, LinkedIn, and Twitter, among others. Facebook has more than 1 billion users,
making it the largest social networking site, but there are lots of others with millions or hundreds of
millions of users [sources: Adler, Berkman]. For some people, social networking has taken the place
of e-mail, texting, the phone and even face-to-face interaction. As of early 2013, more than half of
the people who use the Internet also regularly use social media [source: Berkman]. The numbers in
the U.S. are higher, at around 74 percent as of January 2014 [source: Pew Research Internet Project].
Social sites are becoming the main avenues of communication for some of us, or at least the ones
on which we spend the most time. A 2013 Experian study found that people in the U.S. were spending
16 minutes of every hour on social networking sites, on average, through both personal computers
and mobile devices [source: Gaudin]. And a study in the U.K. in 2010 found that a quarter of people
spent more time socializing via social networks than in person, and that 11 percent of adults would
eschew in-person social events in favor of social media, e-mail, texting and the like
[sources: Fowlkes, Telegraph]. Those numbers are likely to continue to grow. A major downside to
so much online socializing is that, at least according to some studies, roughly 7 percent of
communication is verbal and the other 93 percent is nonverbal [source: Tardanico]. In other words,
we're losing things like tone of voice and body language, at least when we communicate using only
text. This leaves us with lots of potential for miscommunication and even willful misrepresentation,
which is bad for building solid relationships with people. Social networking is reportedly also
altering journalism. We're getting more and more of our news via links posted on social networks
and some stories are even breaking online first. Everyday citizens sometimes capture news on their
cell phone cameras as it's happening or post eyewitness accounts of an event, and these get picked
up by more traditional media after the fact. News organizations have had to join Twitter and
Facebook and other sites, and they're now competing against bloggers and other amateur journalists
for users' attention online. Social networking sites are allowing us to reconnect with long lost friends,
raise money and awareness for charities, get involved in politics (or, maybe more often, get into
political arguments), share experiences and widen our real-world network of friends and
acquaintances. We just might want to put in some in-person face time with some of them, too.
Streaming Media Services
In 2013, Michael Powell, the head of the National Cable & Telecommunications Association, stated
in a Senate hearing that Netflix was the largest subscription video provider in the U.S., not cable or
satellite [sources: Eggerton, Komando]. Netflix actually began as a DVD-by-mail service, then
moved to streaming and still provides both services. Netflix is also often credited with driving video
rental giant Blockbuster to bankruptcy and closure. Blockbuster now exists as a streaming site and
an on-demand service of Dish Network. There are other major streaming sites, including Hulu and
Amazon Instant Video; video sharing site YouTube; TV channel sites such as HBO Go and Watch
ABC; services that allow you to rent movies and TV shows, such as Google Play, iTunes and Vudu;
and Redbox, which offers both online streaming and DVD rentals at special vending-machine-like
kiosks. Not only did these innovative streaming and rental sites severely disrupt the video store
market, but they're taking a toll on cable and satellite providers as more and more people are cutting
the cord and going with online subscriptions to streaming sites only. If you aren't tied to any shows
that require a cable or satellite subscription, you might be able to give up cable and choose from the
many thousands of hours of entertainment available online. Lots of people who stream
keep their cable or satellite subscriptions, however. Streaming services, including Netflix and
Amazon, are starting to develop and offer their own original programming, too. Music has gone
through a similar shift, with CDs being supplanted over time by digital downloads, starting with the
inception of MP3 compression and music sharing (or pirating) sites like Napster, then moving to
paid digital downloads from sites like iTunes and Google Play and now unlimited music streaming
through services such as Pandora and Spotify. You can consume streaming video and audio through
apps on your smart TV, DVD player, gaming console, computer and even your phone or tablet.
There are also dedicated streaming set-top boxes, like the Roku, which allows you to download more
than a thousand streaming apps, including most of the major players plus a lot of small niche
channels. Other choices include the Boxee Box and Apple TV, and small flash-drive-sized HDMI
sticks such as the Google Chromecast and the Roku Streaming Stick.
E-Readers and E-Reader Apps
E-Readers like the Amazon Kindle, the Barnes & Noble Nook and the Kobo Glo have taken a chunk
out of the market for paper books. Most of them feature high-resolution black text on a white or
slightly gray page for comfortable reading, and a few incorporate lighting so that you don't have to
read by daylight or lamplight. Some advantages of e-books are that they tend to be at least a little
bit cheaper than their paper counterparts, and you can carry dozens or hundreds of them with you
on an e-reader. Libraries are even offering e-book checkout in some cases. E-readers also allow you
to download and read newspapers, magazines and comics. And some of them will let you listen to
an audio version of a book while you are reading. But now that so many people carry smartphones
and have tablets, the dedicated e-readers themselves are not necessary for switching to e-books.
There are e-reader apps, like Kindle, Stanza and Apple's iBooks, through which you can order and
read digital books on your mobile device or computer. Some even allow you to bypass downloads
and read your books in the cloud. Kindle also makes its own multi-use color tablet, the Kindle Fire,
that makes it easy to buy and read books, but also to do anything else you can on just about any
other tablet. One downside to e-books is that not everything is available in digital form yet, so you
may still have to read some of your chosen books on paper. Some people prefer the look and feel of
a paper book and aren't likely to switch. Per a survey by Princeton Survey Research Associates
International in early 2014, 69 percent of adults read at least one paper book in the previous year,
28 percent read at least one e-book and 14 percent listened to at least one audiobook. Some people
use all three formats, although 4 percent of readers stick exclusively with e-books [source: Zickuhr
and Rainie].
Smart Mobile Devices
Smartphones and tablets allow us to access the Internet from nearly anywhere, essentially letting us
carry around the bulk of human knowledge in our pockets, or at least a tool to get to that knowledge.
They are beginning to replace a great many things we formerly used all the time, and are introducing
us to services that never existed before. Phone apps let you check email, play games, surf the net,
create text documents, access product reviews, find directions and identify music that is playing at
your location, among a great many other things. These highly portable devices can act as music
players, cameras, GPS devices, video viewers and e-readers. With built-in calendar, to-do list,
dictation and voice-activated personal assistant programs, they could begin to reduce the need for
live assistants. And how many people wear watches these days? Smartphones with cameras have
already taken a bite out of the consumer camera market. The higher-end smartphones have
resolutions and other capabilities that rival many digital cameras on the market. A study from 2011
found that even then, people were using their smartphones for more than half of their
spur-of-the-moment photos, although they were a little more likely to use a dedicated camera for vacation shots
[source: Donegan]. Smartphones also have the added bonus of allowing us to quickly share our
photos and videos on social networking sites, and there are even photo editing apps so you can tweak
and retouch your image before you upload it. Many people are cancelling their home phone service,
and, to a lesser extent, their home Internet in favor of the cellular data plans of their smartphones.
Mobile devices are also allowing the Internet to reach areas in developing nations where it would be
cost prohibitive to get traditional online service to the home. As their processors get more powerful
and cellular Internet connections get faster, both smartphones and tablets are replacing laptops and
desktop computers as people's day-to-day computing devices. They have already become more
powerful than a lot of our old laptops from years ago, they don't require as much power, and 3G
and 4G cellular networks and WiFi connectivity have brought them faster broadband speeds.
They can also be used for word processing and accessing business related sites on the road, although
their small screens and slower processors don't make them ideal for some business uses.
You can also use your smartphone or tablet as a remote control for a myriad of devices, such as
gaming systems and video streaming devices. You can even buy infrared (IR) devices that work
with mobile device apps so that you can control your TV and other hardware that usually requires
an IR remote. Smartphones, too, might share some of the blame for reducing in-person
communication. It's what many of us are using to check e-mail and surf social media sites, after all,
even sometimes when we're out with friends or family. Sales of smartphones surpassed sales of
laptops in 2007 [source:], and surpassed sales of personal computers in general in late 2010
[sources: Wingfield, Arthur]. Laptops and desktops are still necessary for certain types of
computing, but there could be a day when you're just as likely to plug your phone into a keyboard
and monitor as crack open a laptop. Smartphones and tablets are also playing a major role in our
next disruptive technology.
Mobile Payment Options
Mobile apps and services are coming along to disrupt the traditional cash register. The path for
moving away from the register was paved in part by the near abandonment of cash as our primary
payment method. Per a Nielsen survey released in January 2014, 54 percent of people around the
world and 71 percent in North America prefer plastic to cash for their daily spending
[source: Nielsen]. We are also increasingly willing to make online purchases with credit and debit
cards, including shopping on smartphones and tablets. These developments, along with the advent
of touchscreen mobile devices and relatively easy access to reliable broadband Internet
connectivity, have made in-store smartphone and mobile-based payment systems a reality.
Major contenders in the mobile payment arena are Square, Intuit GoPayment, Pay AnyWhere,
ShopKeep and even PayPal with PayPal Here (which lets you take credit cards and scan checks for
deposit into your PayPal account). Some simply consist of an app on your device and a tiny card
reader plugged into its audio jack. This gives even the smallest independent store or street vendor
the ability to take credit cards. Retailers are charged a percentage per transaction, a monthly fee or
both, along with the cost of the hardware, which is much cheaper than traditional registers and card
readers. Square also offers a stand that connects to an iPad, bar-code scanner, receipt printer and
related devices for a more robust cash register replacement. Mobile devices themselves have
cameras that allow them to scan barcodes. Some major retailers have been experimenting with
payment and product lookup via dedicated mobile devices, too, including Barneys New York, Urban
Outfitters, Gucci, Saks Fifth Avenue and Gaylord Hotels. Employees might be wandering the store
with mobile devices, able to help customers anywhere. Tablets are even appearing at tables in
restaurants to allow you to order items and pay with a swipe. Some companies, including Wal-Mart,
have experimented with letting people check out entirely on their own mobile devices using apps
that let them ring up merchandise. Such innovations could potentially be the death of waiting in line,
although it could also reduce jobs. Phones that incorporate Near Field Communications (NFC) allow
you to pay for things without your physical credit or debit cards at NFC-enabled payment stations
using apps such as Google Wallet. Online payment methods like PayPal are even being accepted at
some stores, and for places that can't process PayPal payments, the service can issue users a debit
card. NCR and other cash register manufacturers may not have to worry about extinction, however,
due to the rising popularity of the next item.
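The fee structures mentioned above (a percentage per transaction, a monthly fee, or both) lend themselves to a quick comparison. The sketch below uses invented rates for illustration; they are not any provider's actual pricing.

```python
# Hypothetical sketch of the two pricing models described in the text:
# a pure per-transaction percentage vs. a monthly fee plus a lower
# percentage. All rates here are invented for illustration.

def monthly_cost(volume_usd, pct_fee, flat_monthly=0.0):
    """Total monthly cost for a given card-sales volume."""
    return volume_usd * pct_fee + flat_monthly

volume = 5000.0  # $5,000/month in card sales (example figure)

plan_a = monthly_cost(volume, 0.0275)        # 2.75% per swipe, no monthly fee
plan_b = monthly_cost(volume, 0.015, 29.0)   # 1.5% per swipe + $29/month

print(f"Plan A: ${plan_a:.2f}, Plan B: ${plan_b:.2f}")
```

At low sales volumes the percentage-only plan tends to win; as volume grows, the flat-fee-plus-lower-percentage plan overtakes it, which is why providers often segment their offerings this way.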
Self-checkout Stations
One technology that's becoming quite common in grocery stores and other large retail establishments
is the self-checkout station. These usually consist of four or more kiosks, each with a scanner, a
touchscreen monitor, a card reader, slots for taking and dispensing cash and areas on which to bag
or place your purchases. There's generally one worker overseeing a few kiosks, and the customer
does the rest -- scanning items, typing in the codes of produce and bagging the groceries. If
something goes wrong, the worker will be signaled to come and help so that the transaction can go
forward. There's concern that self-checkout will lead to job losses for some of the millions of
cashiers in the U.S. -- there were around 3 million in 2013 [sources: Bureau of Labor
Statistics, Thibodeau]. This is an issue that comes up whenever something is automated, and only
time will tell whether these kiosks take a major toll on the job market or simply shift workers into other jobs. But
some stores are giving them up due to customer service and job loss concerns, including the grocery
chain Albertsons. Many people prefer to deal with a human being, although some prefer self-
checkout. Another disadvantage of self-checkout is the greater risk of theft. As an anti-theft measure,
many of the systems weigh or otherwise sense the items you've put in the bagging area and check
that against what you scanned. Of course, thieves will try tricks like weighing non-produce items as
produce in an effort to sneak away with product. Some theft risk can be reduced with video
monitoring software that can quickly alert a staff member to perceived wrongdoing, or conveyor
systems that scan items quickly and automatically, making it harder for would-be shoplifters to slip
things by the scanners. Advantages include shorter lines and quicker checkout times, at least
provided there are no complications. Ikea has actually opted to remove them from stores in the U.S.
because they were causing longer checkout times due to the difficulty most people had getting their
items to scan. But these kiosks are becoming more and more prevalent, and will likely improve over
time. One expected improvement is scanners that actually recognize your purchases rather than
relying on barcodes or keyed-in product codes.
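The weight-based anti-theft check described above can be sketched in a few lines: the kiosk compares the measured weight in the bagging area against the expected weights of the scanned items, within a tolerance. The product weights and tolerance below are hypothetical.

```python
# Minimal sketch of a self-checkout weight verification, as described in
# the text. Product weights (grams) and the tolerance are invented.

EXPECTED_WEIGHT_G = {"milk": 1030, "bread": 450, "apple": 180}
TOLERANCE_G = 25  # allowable discrepancy before flagging an attendant

def weight_check(scanned_items, measured_weight_g):
    """Return True if the bagging-area weight matches the scanned items."""
    expected = sum(EXPECTED_WEIGHT_G[item] for item in scanned_items)
    return abs(expected - measured_weight_g) <= TOLERANCE_G

print(weight_check(["milk", "bread"], 1475))  # True  (expected 1480 g)
print(weight_check(["apple"], 1030))          # False (milk weighed as an apple)
```

The second call illustrates the trick mentioned in the text: ringing up a heavy item as cheap produce fails the check because the measured weight far exceeds what an apple should weigh.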
Wearable Fitness Devices
Fitness and lifestyle trackers are all the rage, and there are many to choose from, including the Fitbit,
the Nike Fuelband, the Adidas Fit Smart, the Samsung Gear Fit, the Misfit Shine and the Jawbone
Up, among many others. They do things like track your workout time, steps (like a pedometer),
distance and calories burned, as well as measure your heart rate and monitor your sleep patterns.
Some work in conjunction with apps on your smartphone or an online portal where you can track
your data, set your goals and possibly do things like log dietary information.
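The step counting these trackers perform can be illustrated with a deliberately naive sketch: count one step each time the accelerometer magnitude rises past a threshold. The threshold and the sample readings are invented for illustration; real devices filter the signal and adapt the threshold to the wearer.

```python
import math

def count_steps(samples, threshold=11.0):
    """Naive pedometer: samples is a list of (ax, ay, az)
    accelerometer readings in m/s^2. Count a step on each
    rising edge of the magnitude past the threshold."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and not above:
            steps += 1          # rising edge: a new step
            above = True
        elif magnitude <= threshold:
            above = False       # wait for the signal to fall before counting again
    return steps

# Two simulated strides: the magnitude exceeds the threshold twice.
walk = [(0, 0, 9.8), (1, 1, 12.0), (0, 0, 9.8), (1, 1, 12.5), (0, 0, 9.8)]
print(count_steps(walk))  # -> 2
```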
The devices and apps can use the gathered data to cue you to increase or decrease your workout
intensity, let you share data with other users for accountability and motivation and, in the case of at
least one company (GOQii), get you in touch with an experienced fitness coach who monitors your
data, sends advice and responds to questions (for a recurring fee).
Some of these devices are worn on your wrist or ankles, some wrap around your chest and others
clip onto your clothing. They may have small screens, LED status lights or no display at all. Some
require plugging in to upload your data and some sync wirelessly and automatically. Some work
with only one operating system while others work with several. Many modern smartphones even
have sensors now that allow phone apps to perform some of these functions, like tracking your routes
or your steps. Some even have heart rate checking capabilities. These or similar innovations could
disrupt personal training and other fitness-related jobs, although there are some things a wearable
device or app is not going to be able to do, like make sure you're using good form -- at least for
now.
Cloud Computing
The cloud is made up of large groups of powerful computers called servers. They're usually housed
in data centers or computer rooms, and these centers are running software that can distribute
processing over their network across multiple servers. Users access cloud services remotely via their
own web browsers. In fact, you could consider anything you can get to on the Internet to be in the
cloud, since you're accessing the data on a remote server. And a lot of the media you're streaming
these days is in the cloud. The advent of cloud computing gives businesses the potential to quickly
increase their processing capabilities without having to buy equipment or hire and train new staff,
and often at lower cost than in-house IT expansion would require. Companies simply pay a host for
whatever type of access and services they need. These types of services also mean that smaller
businesses and startups that never would have been able to raise the capital to buy heavy-duty
equipment and the necessary staff can gain quick access to computing power. Types of cloud
services you can purchase vary from simple data storage to server space on which you have to do
most of the IT work yourself (albeit remotely) to fully realized software systems that you and other
users can simply log into and use. Cloud computing is altering the way we consume and purchase
business software and hardware. Having lots of your data in the cloud also allows data mining for
business analytics. You don't have to be a business to utilize the cloud. The wide availability of
inexpensive broadband connectivity means many of us are always online and able to access data
much more quickly than in the days of dial-up. You may already be using an online e-mail service
like Gmail or Hotmail, online office software like Google Docs, or storing your photos, videos or
documents on storage sites like DropBox, and you just didn't know to call it the cloud. Storage space
used to be expensive, but it's getting cheaper and cheaper. Many cloud services will give you several
gigabytes of storage for free and charge you annual or monthly fees if you need more space.
Anything that you store on them can be accessed from multiple devices and from any location with
an Internet connection. It also makes it easy for people to travel light, or to share data with others.
Our always-on connectivity, along with a shift to downloadable software (also made possible by
cloud storage), is allowing us to swap our heavier desktop and laptop computers for smaller, cheaper
devices with less storage like netbooks, tablets and even phones. We can get to all our data online.
Possible drawbacks to moving things off of a local hard drive are the security of personal
information, loss of data if something goes wrong (say your cloud provider goes under) and loss of
access when you have connectivity issues. But physical hard drives can be lost, as well. The best
solution is to keep anything you don't want to lose in more than one location, and that makes the
cloud a good backup solution. If a computer or physical hard drive fails, the information can be
downloaded from your cloud service to new devices. Many of our phones now automatically sync
our data to a cloud account so that we never have to worry about plugging them into our computers
to upload or download data. If your phone is lost or stolen, in many cases you can wipe its data
remotely and then reinstall everything easily onto a replacement.
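The "sync only what changed" behavior behind many cloud backup clients can be sketched like this. The dictionary stands in for a remote provider's storage API, which differs per service; hashing the file contents lets the client skip uploads of unchanged files.

```python
import hashlib

# Minimal sketch of incremental cloud sync. The dict is a stand-in
# for a real provider's remote store; no actual service API is used.
remote_store = {}   # path -> (content_hash, data)

def sync_file(path, data: bytes):
    """Upload a file only if its content differs from the remote copy."""
    digest = hashlib.sha256(data).hexdigest()
    stored = remote_store.get(path)
    if stored and stored[0] == digest:
        return "skipped"            # unchanged: no bandwidth spent
    remote_store[path] = (digest, data)
    return "uploaded"

print(sync_file("photos/cat.jpg", b"JPGDATA"))  # first copy -> uploaded
print(sync_file("photos/cat.jpg", b"JPGDATA"))  # unchanged -> skipped
print(sync_file("photos/cat.jpg", b"NEWDATA"))  # edited -> uploaded
```

This is also why phone backups can run constantly in the background without much data cost: most files have not changed since the last sync.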
The Internet
The Internet began in 1969 as ARPANET, a method to link computers together for data sharing
developed by the Department of Defense's Advanced Research Projects Agency (ARPA). The
development of the World Wide Web (the Internet as we know it) began in 1989 as a project by
British scientist Tim Berners-Lee. In 1994, not long after the birth of the web, only 11 million people
were online. By 2014, billions of people were on the Internet [sources: Berkman, Pew Internet
Research Project]. With high-speed broadband connections at home and cellular data on a lot of our
phones, many people are essentially always online. The Internet has made most of the previously
mentioned disruptions possible and then some. We have access to vast amounts of human knowledge
through web browsers and search engines, along with incredible communication and information
sharing tools. We can make Voice over IP (VoIP) calls, do video chat, instant message and send e-
mail, all nearly instantaneously. The Internet is transforming retail with online purchasing and
mobile payment, education with online classes and our consumption of entertainment with media
streaming, online gaming and downloadable e-books. There are even some upsides that have job
loss related downsides. People are booking travel themselves instead of using travel agents, doing
their banking online instead of going to the bank, buying stocks online instead of consulting a broker
and sending e-mail instead of posting letters. Brick-and-mortar retail store sales have suffered and
now most also have their own online retail presences. People get most of their news online, leading
to declines in newspaper and magazine subscriptions. Even serious news sites have blogs and social
media accounts that link back to their articles. You can pay your bills all at once from a single
financial institution's website. The percentage of bills paid by traditional mail dropped below 50
percent in 2010, according to the US Postal Service [source: Schmid]. You can even pay for print
postage from an online site for any physical items you do have to send through the mail and just
drop them off at the post office or hand them to your mail carrier. Our online access allows some of
us to telecommute instead of driving to work, leading to changes in traditional workspaces and work
practices. Video conferencing at work is more and more common. On the flip side, the Internet also
makes it easy to check personal e-mail, peruse retail sites, post on social networking sites and
otherwise goof off while we're at the office. The Internet has transformed advertising, as well as
charitable and political fundraising. Any company or non-profit entity that wants to be taken
seriously needs to have a website and a presence on the major social media sites, and even in TV and
print ads, you might see Twitter, Facebook and other social network logos. The so-called Internet
of things, involving lots of gadgets that can wirelessly send data, is making the connected home a
real thing. We're beginning to be able to control home appliances and monitoring devices from our
phones while we're away from our domiciles. At this point, there's no limit to the possibilities that
the Internet will bring in the future.
10 Gadgets That Really Should Be Obsolete By Now
BY NATHAN CHANDLER
https://electronics.howstuffworks.com/gadgets/other-gadgets/10-gadgets-should-be-
obsolete11.htm
All of our electronic gadgets get smaller, faster and more powerful. Along the way, older versions
start to sprout gray hairs. When that happens, they either continue to hone their capabilities with a
technical comb over or they fall uselessly out of favor, dropping from retail stores and into history.
Inertia and human nature are funny things, though. Often, devices that find mass market penetration
linger in our homes, cars and pockets, simply because we're too lazy or too cheap to invest in newer
and better alternatives.
In other cases, the older stuff provides nostalgia or may even outperform supposedly superior
descendants. For example, vinyl records aren't as user-friendly or portable as their MP3
counterparts. But proponents argue for vinyl's warmer sound, groovy tactile feedback and artistic
allure.
Most devices aren't so fortunate, especially when it comes to pocket-sized electronics. A whole slew
of products are falling victim following the advent of fully capable smartphones. When one little
phone/computer can do hundreds of tasks pretty well, it suddenly seems ridiculous to carry around
a camera, camcorder, tape recorder, MP3 player, paper maps and dozens of other objects that a
smartphone could reasonably stand in for.
Like no-smoking signs on airplanes, though, there's still a long list of tech that marks the businesses
and people still using them as socially outdated and technologically tone deaf. Keep reading to see
our top 10 technologies that just won't die, even though they probably should.
Origami Navigational Nightmares
For centuries, paper maps guided our ancestors across the continents and seas, ensuring safer travels,
economic booms and depressing family vacations. Yes, traditional maps have many virtues. But
even Christopher Columbus couldn't get his to fold properly.
Since 2003, or about the time GPS units and smartphones began their march across the globe, paper
map sales have been slumping [source: Rodriguez]. Washington State even stopped printing its state
maps altogether at one point (although this was mostly due to budget problems).
It's tough for paper to compete with a gadget that provides turn-by-turn instructions on the fly. Of
course, if a solar flare knocks out GPS satellite service and cell towers for months and the apocalypse
begins, you may wish you'd held onto that old, worn out, badly folded paper map.
Alarming Extinction
Alarm clocks, the ringing, exasperating bane of night owls everywhere, are going the way of the
pterodactyl and T-rex. They've been wiped out by the smartphone swarm. Why, exactly, would a
person shell out $10 to $20 for a device that a smartphone duplicates so easily? Many smartphones
even have speakers whose sound quality rivals that of an average clock radio.
Furthermore, your smartphone doesn't have a light-up face that stares at you all night long,
stoking insomnia.
Your smartphone doesn't start blinking when the power goes out. And your smartphone doesn't need
a 9-volt battery backup. If you're in the business of manufacturing these clocks, best to bail, soon,
before extinction is official.
Gone in a Flash
CD-ROMs are slow. Direct cable connections are hard to set up. USB flash drives, though, are
speedy, rewritable and nearly instantly recognizable between all sorts of devices. And now that
they're ubiquitous, they are also becoming irrelevant.
Flash drives deliver enormous convenience. They store many megabytes or gigabytes of data for a
few minutes or for many months. File transfer speeds are fast. These capabilities come in a chunk
of plastic small enough to dangle from your keychain.
There's just one problem. Now you can access numerous free cloud-storage services such
as DropBox. Instead of dealing with a physical (and easily lost) object like a USB drive, you can
just snag your files from the web anywhere you have Internet access.
Flash drives were a wonderful thing. Their short stay atop the technological mountain flashed by far
too fast.
Kill CDs ASAP
The ever-present optical disc, whether it stores music, movies or computer software, like a
reanimated corpse from our worst nightmares ... refuses to die. Sure, optical discs can store gigabytes
of data. Whoopee.
They're also easily destroyed. Have butterfingers? Drop that $20 Bob Dylan disc, even once, and it
may never play, play, play, play properly again without skipping and stuttering.
They're slow. Insert a disc into your computer or game console drive. Wait. Wait some more. Make
coffee. Perhaps by then the disc will be ready for you to actually use it.
iPods and other digital music players helped drive a stake into CD music players. Flash media and
the cloud are helping push out computer CD-ROM drives, too. These days, a CD drive is an add-on
accessory and not a computing necessity. Streaming availability of all those shows and movies we
used to buy in physical form is chipping away at the DVD and Blu-ray market.
Goodbye discs. And good riddance.
PDAs are DOA
It all seems so quaint now. There was a time when incorporating note taking, voice memos and
calendars into one handheld device seemed cutting edge and innovative. These wonderful devices
were called PDAs (personal digital assistants), and they were The Next Big Thing.
And for a short while, they were. Now, in the common vernacular PDA stands for something else
(public display of affection) ... and the digital devices themselves? They need to go away, pronto.
The exception here may be warehouses, hospitals and similar businesses where workers need a way
to track inventory, collect data or manage products. In this case, PDAs serve as more affordable,
limited purpose tools without all the unnecessary bells and whistles of smartphones or tablets.
Still, we'll never miss the stylus. And in another 10 years, we'll probably figure out where we lost it.
Dial-up Internet
In the United States, around 3 percent of people still use dial-up services to access the Internet. By
contrast, nearly 70 percent have high-speed broadband access [source: Kessler]. Their online
experiences are, shall we say, profoundly different.
When you're connected via a quality broadband connection, you're able to stream audio and video
to multiple HDTVs and your computer at the same time, while you're also surfing on your tablet and
smartphone. By contrast, if you're stuck on a dial-up connection, sometimes it takes several seconds
for a single web page to load in your browser.
Holding onto dial-up may even cost you potential income. More than one set of researchers has
found higher earning potential in households that have broadband Internet access as compared with
those that still use a whistling, buzzing modem [source: Garside].
Dial-up users tend to be older, less educated and living in homes with lower-than-average incomes
[source: Pew]. They also may be in areas (like rural hideaways) where access options are extremely
limited. Regardless of circumstances, dial-up Internet is a throwback, and not one we want to use
again.
Paging, Paging ... Your Old Technology
There was a time when a belt-clip pager was the equivalent of a Rolex. You were somebody who
always had to be connected. You were a doctor or a lawyer (or, less impressively, a drug dealer.) In
any case, a pager implied importance. Nowadays, they probably just mean you're out of touch.
And yet, just a couple of years ago, Americans were buying millions of dollars worth of pagers
[source: Piltch]. And in some limited situations, pagers might work a bit more reliably than phone
systems or Internet access.
That's especially true in health care settings. Smartphone batteries need recharging. Pagers just need
new alkalines. And in facilities where cell reception is spotty, pagers often work more reliably.
Those traits aren't enough to save pagers. Like everyone else, doctors prefer their multifunction
smartphones and they don't want to carry multiple gadgets. Soon, pagers will fall by the wayside,
forgotten and irrelevant, like a dope dealer convicted of multiple felonies.
Miles of Tape
"The familiar VHS tape is rapidly going the way of the obsolete 8-track." That dusty snippet is culled
directly from a Washington Post article in 2005 [source: Chediak]. And more than a decade later,
guess what? The VHS tape? No, it's not quite dead.
Americans continue buying VHS format tapes by the millions. It's difficult to ascertain the exact
purpose of these tapes. Perhaps they make good bookends. Maybe the tape is unwound and used
for Halloween decorations.
All snarky tones aside, VHS has proved surprisingly resilient in the age of DVD and streaming
media, not to mention all sorts of tape formats with superior characteristics. In 2005, nearly 90
percent of Americans owned VHS players. Nowadays, that number is 30 percent lower
[source: Garber], but considering the age of VHS, still an impressive sign of this technology's
longevity.
You really should let go of VHS. Let streaming media take over the ribbons of tape you're hoarding
in your closet. The time has come.
Compact Cameras
At a time when pocket-sized cameras are peaking in quality, they're also becoming spectacularly
unnecessary. Like so many other consumer gadgets, these cameras will fall victim to smartphones,
if only for a single reason -- no one wants to carry both a smartphone and a camera.
It's not that smartphones always take great pictures. Most models only do a serviceable job of image
making. The simple convenience of the camera-in-your-phone concept, though, is just easier than
cramming yet another device into your pockets or purse. That's why in 2010, manufacturers sold
about 132 million compact cameras, and three years later, about 50 million fewer
[source: Wakabayashi].
There are still plenty of smartphone holdouts, of course, and those people will still enjoy the
increasing quality and power of small cameras. For everyone else, though, pocket cameras will fade
like Polaroid prints hanging in direct sunlight.
Trading Air for Land
It's like a modern air force versus ancient foot soldiers. There's not really a competition. It's just
a matter of how long until the airborne forces of cellular technology finally stamp out ground-based
landline phones for good.
For years, many families carried both services. They'd have cell phones for convenience and then
use their traditional landlines as emergency backup (or for Internet service at home). Most younger
consumers under the age of 30 don't see the point of paying for both landline and cell service, so
nearly 70 percent don't have landlines at all. In an age when at least 90 percent of Americans have
cell phones, landline numbers will only continue to dwindle [source: Sparshott].
All technologies change and evolve, even those like land-based telephone service, which has
connected humans all over the planet for decades. No matter how important they may have been,
they'll all eventually be replaced by something newer and more convenient.
I first accessed the Internet using a 300 baud dial-up modem. Sometimes it took several minutes to
connect to a particular online bulletin board. Data transfer was so slow, even for simple text, that I
could actually read the lines as they appeared, one by one, on my monochrome computer screen.
When I finally got my hands on a 4800 baud modem, my life changed. Suddenly, entire pages of
text flashed to my computer at once. I could download thousands of bytes per hour. I am not
nostalgic for obsolete technologies. I drop-kick them into the recycling bin and say thanks for
speedier, better gadgets at every turn.