
ICT TRENDS

1. 5G Networks
Spain’s National 5G Plan for 2018-2020 stipulates that throughout 2019, pilot projects
based on 5G will be developed, resulting in the release of the second digital dividend.
The groundwork is thus being laid so that in 2020 we will be able to browse the Internet
on a smartphone at speeds of up to 10 gigabits per second. Data from Statista, a
provider of market and consumer data, indicates that by 2024, 5G mobile network
technology will have reached more than 40 percent of the global population, with close to
1.5 billion users.
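
To put that headline figure in perspective, here is a quick back-of-the-envelope calculation (a Python sketch; the file size is an illustrative assumption, and real-world throughput will sit well below the theoretical peak) showing what 10 gigabits per second would mean for an ordinary download:

    # Back-of-the-envelope: downloading a 5 GB file at a peak 5G rate of 10 Gbit/s.
    # Illustrative only; real-world 5G throughput is far below the theoretical peak.
    file_size_bits = 5 * 8 * 10**9   # 5 gigabytes expressed in bits
    peak_rate_bps = 10 * 10**9       # 10 gigabits per second
    print(file_size_bits / peak_rate_bps, "seconds")  # 4.0 seconds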

2. Artificial Intelligence (AI)


This trend has appeared in every such lineup for a few years now, but everything indicates that
this will be the year it takes off definitively. This is the year we will see its
democratization, and it has even made its way onto the political agenda. At the beginning of
December, the European Commission released a communication on AI directing the
member states to define a national strategy addressing this topic by mid-2019.

3. Autonomous Devices
With respect to the previous point, robots, drones, and autonomous vehicles are some of
the innovations in the category the consulting firm Gartner labels “Autonomous Things”
defined as the use of artificial intelligence to automate functions that were previously
performed by people. This trend goes further than mere automation using rigid
programming models, because AI is now being implemented to develop advanced behavior,
interacting in a more natural way with the environment and its users.

4. Blockchain
Blockchain technology is another topic that frequently appears on these end-of-year lists. It
has now broken free from an exclusive association with the world of cryptocurrencies; its
usefulness has been proven in other areas. In 2019 we will witness many blockchain
projects get off the ground as they try to address challenges that still face the technology in
different fields, such as banking and insurance. It will also be a decisive year for the roll-out
of decentralized organizations that work with smart contracts.
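
At its core, a blockchain is a list of records in which each block commits to the one before it through a cryptographic hash, which is what makes past records tamper-evident. The Python sketch below illustrates only that chaining idea; the block fields and the example payment are invented for illustration and do not reflect any particular blockchain or smart-contract platform.

    # Minimal illustration of how blocks are linked by hashes (illustrative fields only).
    import hashlib
    import json
    import time

    def make_block(data, prev_hash):
        """Create a block whose hash depends on its content and on its predecessor."""
        block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
        return block

    genesis = make_block("genesis", prev_hash="0" * 64)
    payment = make_block({"from": "insurer", "to": "client", "amount": 120}, genesis["hash"])

    # Altering an earlier block would change its hash and break the link to every later block.
    print(payment["prev_hash"] == genesis["hash"])  # True
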
5. Augmented Analytics
This trend represents another stride for big data, by combining it with artificial intelligence.
Using machine learning (automated learning), it will transform the development, sharing,
and consumption of data analysis. It is anticipated that the capabilities of augmented
analytics will soon be commonly adopted not only to work with data, but also to
implement in-house business applications related to human resources, finance, sales,
marketing, and customer support, all with the aim of optimizing decisions through deep
data analysis.
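
In miniature, the idea is that software scans the data and surfaces the most relevant finding on its own, rather than waiting for an analyst to ask the right question. The Python sketch below uses a tiny invented dataset and a plain Pearson correlation purely to illustrate that "automatically surface the strongest driver" pattern; it is not any vendor's augmented-analytics product.

    # Toy "augmented analytics": automatically report which metric tracks an outcome best.
    # The dataset and metric names are invented for illustration.
    from statistics import mean

    def pearson(xs, ys):
        """Pearson correlation coefficient between two equally long lists of numbers."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    sales = [10, 12, 15, 14, 20, 22]
    metrics = {
        "ad_spend":      [1, 2, 3, 2, 5, 6],
        "support_calls": [9, 8, 9, 7, 8, 9],
    }

    # Rank candidate metrics by strength of association and report the strongest one.
    best = max(metrics, key=lambda name: abs(pearson(metrics[name], sales)))
    print("Strongest driver of sales in this sample:", best)  # ad_spend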

6. Digital Twins
A digital twin is a virtual replica of a real-world system or entity. Gartner predicts that there
will be more than 20 billion connected sensors and endpoints by 2020, and the consulting firm
goes on to point out that there will also be digital twins for thousands upon thousands of
these solutions, with the express purpose of monitoring their behavior. Initially,
organizations will implement these replicas, which will continue to be developed over time,
improving their ability to compile and visualize the right data, make improvements, and
respond effectively to business objectives.
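
As a minimal sketch of the concept, the Python class below mirrors the latest readings of a hypothetical pump and flags when the physical asset drifts outside its design limit; the class, field names, and threshold are illustrative assumptions, not a real product's data model.

    # A minimal digital-twin sketch: a software object mirroring a physical asset's state.
    from dataclasses import dataclass, field

    @dataclass
    class PumpTwin:
        asset_id: str
        max_temp_c: float = 80.0                       # design limit of the real pump
        readings: list = field(default_factory=list)   # mirrored sensor history

        def ingest(self, temp_c: float) -> None:
            """Update the twin with the latest temperature reading from the real pump."""
            self.readings.append(temp_c)

        def needs_attention(self) -> bool:
            """Flag the asset if its most recent reading exceeds the design limit."""
            return bool(self.readings) and self.readings[-1] > self.max_temp_c

    twin = PumpTwin(asset_id="pump-07")
    twin.ingest(76.2)
    twin.ingest(83.5)
    print(twin.needs_attention())  # True: the physical pump should be inspected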

7. Enhanced Edge Computing


Edge computing is a trend that relates most directly to the Internet of Things. It consists
of placing intermediate processing points between connected objects and central servers. Data
can be processed at these intermediate points, so tasks are performed closer to where the data
is generated, reducing traffic and latency when responses are sent. With this
approach, processing is kept closer to the end point rather than having the data sent to a
centralized server in the cloud. Still, instead of creating a totally new architecture, cloud
computing and edge computing will be developed as complementary models with
solutions in the cloud, administered as a centralized service that runs not only on
centralized servers but also on distributed servers and in the edge devices themselves.
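
A small Python sketch can make the traffic-reduction argument concrete: raw readings are filtered and aggregated on a hypothetical gateway near the devices, and only a compact summary is forwarded to the cloud. The function names, the validity range, and the alert threshold are all illustrative assumptions.

    # Sketch of the edge idea: process raw readings locally, upload only a small summary.
    from statistics import mean

    def process_at_edge(raw_readings, threshold=50.0):
        """Runs on the gateway: drop obvious sensor glitches, then aggregate."""
        valid = [r for r in raw_readings if 0 <= r <= 200]
        return {
            "count": len(valid),
            "avg": round(mean(valid), 2) if valid else None,
            "alerts": sum(1 for r in valid if r > threshold),  # events worth reporting
        }  # a few bytes instead of thousands of raw samples

    def send_to_cloud(summary):
        """Stand-in for an upload to the centralized cloud service."""
        print("uploading:", summary)

    send_to_cloud(process_at_edge([42.1, 47.9, 300.0, 55.3, 51.0]))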

8. Immersive Experiences in Smart Spaces


Chatbots integrated into different chat and voice-assistant platforms are changing the way
people interact with the digital world, as are virtual reality (VR), augmented reality (AR),
and mixed reality (MR). The combination of these technologies will dramatically change our
perception of the world that surrounds us by creating smart spaces where more immersive,
interactive, and automated experiences can occur for a specific group of people or for
defined industry cases.

9. Digital Ethics and Privacy


Digital ethics and privacy are topics that are receiving more and more attention from
private individuals, associations, and government organizations alike. For good reason,
people are increasingly concerned about how their personal data is being used by public
and private sector organizations. Therefore, we conclude that the winning organizations will
be those that proactively address these concerns and are able to earn their customers’
trust.

The Core Rules of Netiquette — Summary


Rule 1. Remember the human.

Never forget that the person reading your mail or posting is, indeed, a person, with
feelings that can be hurt.

Corollary 1 to Rule #1: It's not nice to hurt other people's feelings.

Corollary 2: Never mail or post anything you wouldn't say to your reader's face.

Corollary 3: Notify your readers when flaming.

Rule 2. Adhere to the same standards of behavior online that you follow in real life.

Corollary 1: Be ethical.

Corollary 2: Breaking the law is bad Netiquette.

Rule 3. Know where you are in cyberspace.

Corollary 1: Netiquette varies from domain to domain.

Corollary 2: Lurk before you leap.


Rule 4. Respect other people's time and bandwidth.

Corollary 1: It's OK to think that what you're doing at the moment is the most
important thing in the universe, but don't expect anyone else to agree with you.

Corollary 2: Post messages to the appropriate discussion group.

Corollary 3: Try not to ask stupid questions on discussion groups.

Corollary 4: Read the FAQ (Frequently Asked Questions) document.

Corollary 5: When appropriate, use private email instead of posting to the group.

Corollary 6: Don't post subscribe, unsubscribe, or FAQ requests.

Corollary 7: Don't waste expert readers' time by posting basic information.

Corollary 8: If you disagree with the premise of a particular discussion group, don't waste the
time and bandwidth of the members by telling them how stupid they are. Just stay away.

Corollary 9: Conserve bandwidth when you retrieve information from a host or server.

Rule 5. Make yourself look good online.

Corollary 1: Check grammar and spelling before you post.

Corollary 2: Know what you're talking about and make sense.

Corollary 3: Don't post flame-bait.

Rule 6. Share expert knowledge.

Corollary 1: Offer answers and help to people who ask questions on discussion groups.

Corollary 2: If you've received email answers to a posted question, summarize them and post the
summary to the discussion group.
Rule 7. Help keep flame wars under control.

Corollary 1: Don't respond to flame-bait.

Corollary 2: Don't post spelling or grammar flames.

Corollary 3: If you've posted flame-bait or perpetuated a flame war, apologize.

Rule 8. Respect other people's privacy.

Don't read other people's private email.

Rule 9. Don't abuse your power.

The more power you have, the more important it is that you use it well.

Rule 10. Be forgiving of other people's mistakes.


You were a network newbie once too!

Definition

ICT is an acronym that stands for Information and Communications Technology.

ICT is the integration of information processing, computing, and communication technologies. ICT is
changing the way we learn, work, and live in society, and it is often spoken of in a particular context,
such as education, health care, or libraries. A good way to think about ICT is to consider all the
uses of digital technology that already exist to help individuals, businesses, and organizations use
information. ICT covers any product that stores, retrieves, manipulates, transmits, or receives
information electronically in a digital form. Importantly, it is
also concerned with the way these different uses can work with each other. Examples include personal
computers, digital television, email, and robots.

A look at what we use at home, in the office, in school, or at any business or social function finds
many devices equipped with computer chips. They include access cards, mobile phones,
point-of-sale scanners, medical instruments, TV remote controls, microwave ovens, DVD players,
digital cameras, PDAs, etc.


History

IT stands for Information Technology and consists of the study, design, development,
implementation, support, and administration of computer-based information systems, mostly
software applications and computer hardware. Information technology uses
electronic computers and computer software to convert, store, protect, process, transmit, and
retrieve information.

Information technology has expanded to cover many aspects of computing and technology, and the
term is more familiar than ever before. Information technology as a subject can be quite large,
encompassing many fields. IT professionals perform different types of responsibilities that range
from installing applications to designing complex computer networks.

An IT professional's responsibilities include data management, networking, databases, software design,
computer hardware, and the management and administration of whole systems. IT (Information Technology)
combines computing and communications and is also known as "InfoTech". Information Technology
describes any technology that helps to produce, manipulate, store, communicate, or
disseminate information.

Recently it has become popular to broaden the term to explicitly include the field of electronic
communication so that people tend to use the abbreviation ICT (Information and Communications
Technology).

The term "information technology" evolved in the 1970s. Its basic concept, however, can be traced to
the World War II alliance of the military and industry in the development of electronics, computers,
and information theory. After the 1940s, the military remained the major source of research and
development funding for the expansion of automation to replace manpower with machine power.
Since the 1950s, four generations of computers have evolved. Each generation reflected hardware of
decreased size but increased capability to control computer operations. The first
generation used vacuum tubes, the second used transistors, the third used integrated circuits, and
the fourth used integrated circuits on a single computer chip. Advances in artificial intelligence that
will minimize the need for complex programming characterize the fifth generation of computers, still
in the experimental stage.

The first commercial computer was the UNIVAC I, developed by J. Presper Eckert and John W. Mauchly in
1951. The first unit was delivered to the Census Bureau, and the machine was famously used to predict
the outcome of the 1952 presidential election. For
the next twenty-five years, mainframe computers were used in large corporations to do calculations
and manipulate large amounts of information stored in databases. Supercomputers were used in
science and engineering, for designing aircraft and nuclear reactors, and for predicting worldwide
weather patterns. Minicomputers came on to the scene in the early 1980s in small businesses,
manufacturing plants, and factories.

In 1975, MITS (Micro Instrumentation and Telemetry Systems) introduced one of the first
microcomputers, the Altair 8800. Tandy Corporation's first Radio Shack microcomputer followed, and
the Apple II microcomputer was introduced in 1977. The market for microcomputers increased
dramatically when IBM introduced the first
personal computer in the fall of 1981. Because of dramatic improvements in computer components
and manufacturing, personal computers today do more than the largest computers of the mid-1960s
at about a thousandth of the cost.
Computers today are divided into four categories by size, cost, and processing ability. They are
supercomputer, mainframe, minicomputer, and microcomputer, more commonly known as a
personal computer. Personal computer categories include desktop, network, laptop, and handheld.

A brief history of ICT


Computers
The term “computer” comes from the Latin “computus” and “computare”. Both
Latin words mean to determine by mathematical means or by numerical
methods. The English verb “compute” has the same meaning.

Basically, a computer is a programmable electronic device that performs
mathematical calculations and logical operations, especially one that can
process, store and retrieve large amounts of information very quickly.
Personal computers are also employed for manipulating text or graphics,
accessing the Internet, or playing games or media.

The main components of a computer are:

1. a Central Processing Unit (CPU),
2. a monitor,
3. a keyboard,
4. and a mouse.
Originally the first computers were the size of a large room, consuming as
much power as several hundred modern personal computers.

The US-built ENIAC (Electronic Numerical Integrator and Computer) was one of the first electronic
programmable computers.
Since then, computers have become smaller and much more powerful.
The Internet
The Internet was invented as a result of research conducted in the early
1960s by visionary people like J.C.R. Licklider of MIT, who saw the
added value of allowing computers to share information on research and
development in scientific and military fields. He therefore proposed a global
network of computers in 1962, and moved over to the Defense Advanced
Research Projects Agency (DARPA) in late 1962 to head the work to develop
it.
The Web
While many people use the terms Internet and the Web interchangeably, they
are in fact not synonymous. The Internet is a huge network that connects
millions of computers together worldwide. Computers in this network can
communicate with any other computer as long as they are connected to the
Internet. The Web or the World Wide Web (WWW), however, is a way of
accessing information over the medium of the Internet. It is an information
space or a model that is built on top of the Internet where documents and
other web resources are identified by URLs (Uniform Resource Locators),
informally termed web addresses. This space is interlinked by hypertext links
and can be accessed via the Internet.
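
To make the distinction concrete, a URL is simply a structured name for a resource that rides on top of the Internet's connectivity. The short Python sketch below parses an example address (the URL itself is a placeholder) into the parts a browser uses: the protocol, the host reached over the Internet, and the document requested from that host.

    # How a web address names a resource on top of the Internet (example URL is a placeholder).
    from urllib.parse import urlparse

    parts = urlparse("https://www.example.com/articles/ict-trends?year=2019")

    print(parts.scheme)  # 'https'                 -> protocol used to talk to the server
    print(parts.netloc)  # 'www.example.com'       -> the host reached over the Internet
    print(parts.path)    # '/articles/ict-trends'  -> which document on that host
    print(parts.query)   # 'year=2019'             -> extra parameters for the resource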

IMPORTANCE OF INFORMATION AND
COMMUNICATIONS TECHNOLOGY (ICT)
IN OUR DAILY LIFE

Information and Communications Technology (ICT) has an important role in
the world, since we are now in the information age. With ICT, companies
can do business more easily with their clients, suppliers, and
distributors. It is also very important in our daily lives. The lack of appropriate
information at the right time results in low productivity, low-quality research
work, and time wasted pursuing information or even repeating research that
others have already done elsewhere. Nowadays ICT cannot be
separated from our daily needs.

ICT has a great impact on our daily lives. For example, we can read our local
newspaper in its online edition. Another example: we can stay
connected with our family, relatives, or colleagues even if we are abroad by
using email, instant messaging tools such as Yahoo Messenger, conference
calls, or video conferencing.

Digital computers and networking have changed our concept of the economy to one
with no boundaries in time and space. ICT brings many
advantages for economic development, enabling millions of transactions to
happen easily and quickly.

ICT is one of the pillars of economic development and national competitive
advantage. It can improve the quality of human life because it can be used as
a medium for learning and education and as a mass communication medium for promoting
and campaigning on practical and important issues, such as health and social
matters. It provides wider knowledge and can help in gaining and accessing
information.

ICT has become an integral part of everyday life for many people. Its
importance in people’s lives keeps increasing, and this trend is expected to continue,
to the extent that ICT literacy will become a functional requirement for
people’s work, social, and personal lives.

The use of ICT in education adds value to teaching and learning, by enhancing
the effectiveness of learning or by adding a dimension to learning that was
not previously available. ICT may also be a significant motivational factor in
students’ learning, and can support students’ engagement with collaborative
learning.

Information and Communications Technology (ICT) is basically our society’s
efforts to teach its current and emerging citizens valuable knowledge and
skills around computing and communications devices, software that operates
them, applications that run on them and systems that are built with them.

As a matter of fact, we are living in a constantly evolving digital world. ICT has
an impact on nearly every aspect of our lives – from working to socializing,
learning to playing. The digital age has transformed the way young people
communicate, network, seek help, access information and learn. We must
recognize that young people are now an online population and access is
through a variety of means such as computers, TV and mobile phones.

It is on this premise that educational technology and e-learning are taught in
and out of the classroom, since educational technology is used by learners and
educators in homes, schools, businesses, and other settings.


List of Computer Hardware

Here are some common individual computer hardware components that you'll often
find inside a modern computer. These parts are almost always found inside
the computer's housing, so you won't see them unless you open the computer:

• Motherboard
• Central Processing Unit (CPU)
• Random Access Memory (RAM)
• Power Supply
• Video Card
• Hard Drive (HDD)
• Solid-State Drive (SSD)
• Optical Drive (e.g., BD/DVD/CD drive)
• Card Reader (SD/SDHC, CF, etc.)

Here is some common hardware that you might find connected to the outside of a
computer, although many tablets, laptops, and netbooks integrate some of these items
into their housings:

• Monitor
• Keyboard
• Mouse
• Battery Backup (UPS)
• Flash Drive
• Printer
• Speakers
• External Hard Drive
• Pen Tablet

Here are some less common individual computer hardware devices, either because
these pieces are now usually integrated into other devices or because they've been
replaced with newer technology:
• Sound Card
• Network Interface Card (NIC)
• Expansion Card (Firewire, USB, etc.)
• Hard Drive Controller Card
• Analog Modem
• Scanner
• Projector
• Floppy Disk Drive
• Joystick
• Webcam
• Microphone
• Tape Drive
• Zip Drive
