Introduction

Day by day, new computer technologies emerge, take shape and transform the entire information technology world. Discoveries are made daily in every area of computing. New computer technology has become such an integral part of our lives that it is often taken for granted, and it can seem impossible to keep up with the latest advances. PCs have more capacity, processing power and features than ever before. This report collects recent stories on the latest advances in computer technology and asks: will computers reach top speed by 2020?

The era in which computing power doubles every two years is drawing to a close, according to the man behind Moore's Law.

Moore’s Law:

The maxim, which states that computers double in speed roughly every two years, has come under threat from none other than the man who coined it.

Gordon Moore, the retired co-founder of Intel, wrote an influential paper in 1965 called 'Cramming more components onto integrated circuits', in which he theorised that the number of transistors on a computer chip would double at a constant rate. Silicon Valley has kept up with his widely accepted maxim for more than 40 years, to the point where a new generation of chips, which Intel will begin to produce next year, will have transistors so tiny that four million of them could fit on the head of a pin.
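
Moore's maxim can be written as a simple formula: a chip made t years after a reference chip holds roughly base_count x 2^(t/2) transistors. The short Python sketch below is only an illustration of that projection; the starting point of about 2,300 transistors on the 1971 Intel 4004 is a widely cited figure used here as a baseline, not something taken from the article above.

# Illustrative sketch of Moore's Law as a doubling rule.
# Baseline: the 1971 Intel 4004 with roughly 2,300 transistors.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    """Estimate transistor count assuming a doubling every `doubling_years` years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1985, 2000, 2007, 2020):
    print(f"{year}: ~{transistors(year):,.0f} transistors per chip")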

In an interview, however, Mr Moore said that by about 2020 his law would come up against a rather intractable stumbling block: the laws of physics. "Another decade, a decade and a half, I think we'll hit something fairly fundamental," Mr Moore said at Intel's twice-annual technology conference. Then Moore's Law will be no more. Mr Moore was speaking as Intel gave its first demonstration of a new family of processors, to be introduced in November, which contain circuitry 45 nanometres - billionths of a metre - wide.

The 'Penryn' processors, 15 of which will be introduced this year, with another 20 to follow in the first quarter of 2008, will be so advanced that a single chip will contain as many as 820 million transistors. Computer experts said today that a failure to live up to Moore's Law would not limit the ultimate speed at which computers could run. Instead, the technology used to manufacture chips would shift.

The current method of silicon-based manufacturing is known as "bulk CMOS", which is essentially a 'top-down' approach, where the maker starts with a piece of silicon and 'etches out' the parts that aren't needed. "The technology which will replace this is a bottom-up approach, where chips will be assembled using individual atoms or molecules, a type of nanotechnology," Jim Tully, chief of research for semiconductors at Gartner, the analyst firm, said. "It's not standardised yet - people are still experimenting - but you might refer to this new breed of chips as 'molecular devices'."

Anthony Finkelstein, head of computer science at University College London, said, however, that a more pressing problem in the meantime was to write programs which took full advantage of existing technologies. "It's all very well having multicore chips in desktop machines, but if the software does not take advantage of them, you gain no benefit. We are hitting the software barrier before we hit the physical barrier," he said. Mr Moore, who is 78, pioneered the design of the integrated circuit and went on to co-found Intel in 1968, where he served as chief executive between 1975 and 1987. Asked what area he would research if he were a graduate today, he told the conference: "I'd probably look at something more in the biology mould. The interface between computers and biology is a very interesting area."
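
Finkelstein's point about the software barrier can be made concrete with a small sketch. The Python fragment below (an illustrative example, not taken from any of the sources quoted here) runs the same CPU-bound workload once serially and once across a pool of worker processes; only the parallel version can benefit from a multicore chip.

# Illustrative sketch: serial vs. multi-process execution of a CPU-bound task.
import time
from multiprocessing import Pool

def busy_work(n):
    """A CPU-bound stand-in workload: sum of squares up to n."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [5_000_000] * 8

    start = time.perf_counter()
    serial = [busy_work(n) for n in jobs]      # runs on a single core
    print(f"serial:   {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with Pool() as pool:                       # spreads the jobs across cores
        parallel = pool.map(busy_work, jobs)
    print(f"parallel: {time.perf_counter() - start:.2f}s")

    assert serial == parallel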

Also in Tech & Web


• 10 games for 2010
• 2010: the year of the mobile
• Apple not to blame for hearing loss from playing iPods too loud, says
judge

Also in News

• Plain English Campaign says regular coffee is not its cup of tea
• Answers to the 2010 quiz
• Chinese show little sympathy for Akmal Shaikh

Explore Tech & Web

• Personal Tech

• The Web

• Gadgets & Gaming

Future Of Computers

A number of commentaries on the future of computing, written by leading scientists, have appeared in Nature. In the last two decades, advances in computing technology, from processing speed to network capacity and the internet, have revolutionized the way scientists work. From sequencing genomes to monitoring the Earth's climate, many recent scientific advances would not have been possible without a parallel increase in computing power - and with revolutionary technologies such as the quantum computer edging towards reality, what will the relationship between computing and science bring us over the next 15 years?

The list of freely accessible commentaries:


1. Champing at the bits (about quantum computers)
2. Milestones in scientific computing
3. Everything, everywhere
4. Exceeding human limits
5. The creativity machine
6. Science in an exponential world
7. Can computers help explain biology?
8. A two-way street to science's future

The titles are pretty much self-explanatory, with the exception of the first one, which is about quantum computers. Here's a short quote from that article: Five years ago, if you'd asked anyone working in quantum computing how long it would take to make a genuinely useful machine, they'd probably have said it was too far off even to guess. But not any longer.

"A useful computer by 2020 is realistic," says Andrew


Steane of the quantum-computing group at the University of Oxford,
UK. David Deutsch, the Oxford physicist who more or less came up with
the idea of quantum computation, agrees. Given recent theoretical advances,
he is optimistic that a practical quantum computer "may well be achieved
within the next decade".A quantum simulator would describe and predict the
structure and reactivity of molecules and materials by accurately capturing
their fundamental quantum nature. This is the sort of employment the early
machines are likely to find: doing calculations of interest to chemists,
materials scientists and possibly molecular biologists, says Steane.

Since quantum computers could, for certain classes of problems, perform in seconds calculations that would take a conventional computer billions of years, they would be especially useful for simulating molecular interactions such as the ones going on in our bodies. This is very important: simulations are one of the holy grails of medicinal science. The more accurate and faster our simulations become, the easier it will be to come up with new drugs, solve health problems, and find useful genetic modifications to upgrade the human body a bit.
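
Part of the appeal of a quantum simulator is that simulating quantum systems exactly on conventional hardware scales exponentially: describing n qubits requires 2^n complex amplitudes. The Python sketch below illustrates only that scaling argument, not any particular quantum algorithm; the memory figures are for a plain statevector stored as complex128 numbers.

# Illustrative sketch: memory needed for an exact statevector of n qubits.
import numpy as np

def statevector_bytes(n_qubits):
    """Memory for a dense statevector of n qubits (2**n complex128 amplitudes)."""
    return (2 ** n_qubits) * np.dtype(np.complex128).itemsize

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 1e9:,.3f} GB")

# A single Hadamard gate on one qubit, the kind of elementary operation
# such simulations are built from: it puts |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1, 0], dtype=complex)   # the |0> state
print("H|0> =", H @ zero)
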
Human-Computer Interaction in 2020

That's the focus of a Microsoft Research report, "Being Human: Human-Computer Interaction in the year 2020," just released.

Among its conclusions for the year 2020 and beyond:

• We're in the Mobility Era now, but we'll be in the Ubiquity Era in 2020 and beyond, with thousands of computers per user.
• Silicon and biological material will be knitted in new ways, enabling new forms of direct inputs and outputs implantable in our bodies.
• We will increasingly be able to use mobile devices to interact with objects in the real world, acting more as if they are extensions of our own hands, by pointing and gesturing with them.
• Robots will become autonomous machines that learn.
• We will create a more customized, personalized digital world for ourselves.
• The vision of one computer for every child worldwide will be more of a reality.
• There will be very few people left on the planet who do not have access to a mobile phone.

Brain implants could control computers by 2020, Intel says

The end of the keyboard, mouse and button is near. Users will be able to control computers using only their brains, thanks to chips embedded in them, researchers said. Scientists at an Intel research lab in Pittsburgh, Pa., are working to read human brain waves to operate electronics using sensors implanted in people’s brains, according to a Computerworld report. The move could eventually lead to the ability to manipulate your computer, television and mobile phone without lifting a finger.

“We’re trying to prove you can do interesting things with brain waves,” Intel research scientist Dean Pomerleau told Computerworld. “Eventually people may be willing to be more committed … to brain implants. Imagine being able to surf the Web with the power of your thoughts.” Intel researchers are teaming up with scientists from Carnegie Mellon University and the University of Pittsburgh to figure out how to decode human brain waves. The team is using functional magnetic resonance imaging (fMRI) machines to map brain patterns by monitoring the blood flow to areas of the brain that results from thinking about certain things.
The brain exhibits distinct response patterns for words and images. The researchers are attempting to build technology that can detect and interpret brain waves and, in turn, be used to manipulate an electronic device such as a computer. For now, that technology is in the form of a headset, but a sensor meant to be implanted into the brain will soon replace it.
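
The decoding idea described above can be illustrated in miniature: record many responses to two kinds of stimuli and train a classifier to tell the resulting activation patterns apart. The Python sketch below is a toy illustration of that general approach, not the Intel, Carnegie Mellon or Pittsburgh teams' actual pipeline; the "voxel" data is entirely synthetic and the classifier is ordinary logistic regression from scikit-learn.

# Toy illustration of stimulus decoding from synthetic activation patterns.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50

# Two stimulus classes whose (made-up) mean activation patterns differ slightly.
pattern_a = rng.normal(0.0, 1.0, n_voxels)
pattern_b = pattern_a + rng.normal(0.0, 0.4, n_voxels)

X = np.vstack([pattern_a + rng.normal(0, 1, (n_trials // 2, n_voxels)),
               pattern_b + rng.normal(0, 1, (n_trials // 2, n_voxels))])
y = np.array([0] * (n_trials // 2) + [1] * (n_trials // 2))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("decoding accuracy:", clf.score(X_test, y_test))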

It’s not the first time scientists have attempted to tap the brain for information - two years ago, U.S. and Japanese scientists used a monkey brain to control a robot, in the hope that such advances would help paralyzed people walk again. But deciphering the complexities of human brain waves is the biggest hurdle of all. It’s a bit like digging around a 1-acre supercomputer with hundreds of thousands of processors and trying to figure out what it’s doing at every turn.

Computing in 2020: Erasing the boundary between human and PC

It's easy to view the computer interface as nearly static. Since the advent of mouse-driven, windowed interfaces over 20 years ago, much of human-computer interaction (HCI) has gone the same route. But a proliferation of mobile devices is beginning to change that and, even if that weren't the case, important differences are developing in what information is available to computers, and how we access it. In March 2007, Microsoft Research invited 45 leading researchers to discuss where HCI would be in 2020; a report summarizing their conclusions has now been made available.

The report covers more ground than can possibly be covered in a cogent summary. Fortunately, it hits some obvious points that are easily summarized. For example, it concludes that speech and gestures would play a larger role in HCI, and suggests that nerve impulses themselves would start to be used for controlling computers, especially for the disabled. It also predicts that the pervasive connectivity that enables computers to act as surrogates for human memory would, when combined with enhanced processing power, begin to allow them to supplement human reasoning.

But a number of its conclusions are less obvious. Computers are just now starting to identify us by RFID and facial recognition, and track us through GPS and closed-circuit monitoring. In essence, we're now involuntarily "interfacing" with computer systems every time we go through an airport. The experts predict that these trends will accelerate and expand, raising serious privacy issues. In some cases, such as implanted medical devices, the boundary between human and computer is nearly erased - is there really a "human interface" between a device that can monitor and manipulate heartbeats and the heart itself?

A second trend will interact and combine with this one. The
report suggests that we're just entering the age of mobile computing, but, by
2020, we'll be in an era of ubiquitous computing. Instead of a few computers
and devices, each user, by leveraging pervasive networking, will have access
to thousands of computers, with various information and capabilities
available through each.

With everything about a person being recorded, imaged, or twittered, and all of that information constantly available, the report claims that we're about to reach the end of the ephemeral. By having medical information, personal photos, and even minute-by-minute thoughts permanently stored online, people will voluntarily provide access to more information than government spies or advertising agencies could ever succeed in gathering.

To cope with these changes, the researchers suggest that careful thought will have to go into allowing people to be notified of, and to opt out of, pervasive recording. Referring to GPS navigation systems, the authors note, "if people are prepared to stupidly obey instructions given out by simple computers, this should make us even more concerned about the relationship between people and ever more complex computers as we move toward 2020."

HCI 2020: The evolution of GUI and human-computer interaction – 10 years after Windows 7

Microsoft is looking to the future of user-computer interaction and of the graphical user interface well beyond its next iteration of the Windows operating system, until 2020. Scheduled for availability in 2010, Windows 7 is bound to have an increased role in the Redmond company's vision of the natural user interface at the level of desktop, notebook, tabletop surface computer, tablet PC, etc. But Microsoft's vision of the evolution of the GUI and ultimately of the human-computer interaction model is much broader than Windows 7. The HCI 2020: Human Values in a Digital Age forum, held in Sanlúcar la Mayor, Spain, in mid-March, was where the next generation of human-computer interaction took centre stage. The Redmond company even made a report available offering a complex perspective on the evolution of user-computer interaction in the next 12 years. "Being Human - Human-Computer Interaction in the year 2020" can be downloaded from here.

"Computers have shaped so many aspects of the modern world that we


wanted to explore how today's emerging technologies might shape our lives in 2020.
Computing has the potential to enhance the lives of billions of people around the world.
We believe that if technology is to truly bring benefit to humanity, then human values
and the impact of technology must be considered at the earliest possible opportunity in
the technology design process," revealed Abigail Sellen, senior researcher at Microsoft.

Without a doubt, the graphical user interface delivered by computers and operating systems today will be obsolete in 2020. Already, emphasis is placed on multi-touch, gesturing, object recognition, as well as speech and even brain-computer interfaces. It all falls under the umbrella of natural user interfaces, a segment on which Microsoft chairman Bill Gates will focus after he completes the transition out of his day-to-day role at Microsoft.

"This report makes important recommendations that will help us to decide


collectively when, how, why and where technology impacts upon humanity, rather than
reacting to unforeseen change. The final recommendation is something towards which we
should all aspire: by 2020 HCI will be able to design for and support differences in
human value, irrespective of the economic means of those seeking those values. In this
way, the future can be different and diverse because people want it to be," added Sellen.

Conclusion

With the dawn of a new millennium, computers seem to have taken over the world. We have surveyed these developments with the viewpoint that a single breakthrough in one area of computer technology may have an impact on our lives in a totally different dimension, because all of these technologies are becoming more intertwined and one area often affects another.
