
August/September 2017
datacenterdynamics.com

MORE GHz
EPYC 32 CORES

VR TAKES A RINGSIDE SEAT
Simulations and simulacra in the world of data centers

A RACE AGAINST THE CLOCK
Every nanosecond counts as satellites show their age

THE GRID VS DATA CENTERS
Don't be power hungry, become energy smart
Contents
August/September 2017

ON THE COVER
18  Chip wars: AMD gets Epyc, Intel strikes back, ARM hangs on

NEWS
07  Top Story: China forces Apple to build
07  In Brief: Asgardia's orbiting SSD

FEATURES
24  Tape stays strong
30  Virtual reality arrives
34  Do we curb energy demand?
37  Smart city foundations
41  Containers unlock the cloud

OPINION
46  Max Smolaks knows what really drives the Internet

REGIONAL FEATURES
14  LATAM: Data centers help Colombia
16  APAC: Asia's security experts speak

CALENDAR
44  Events and training: DCD's global happenings

INTERVIEW
19  AMD on the offense: An exclusive sit down with Dr Lisa Su, AMD's CEO, on her company's fight to win back the server market

EDITOR'S PICK
27  A wrinkle in time: The Internet has well-developed techniques for synchronization. But they may not be good enough for emerging applications and regulations

Issue 23 August/September 2017 3
From the Editor

Fight Club

Fight metaphors aren't normally my cup of tea, but you can't avoid it when you look at the conflict in server chips, between the grizzled reigning champion Intel, and AMD, a fan-favorite contender, making a late bid for glory.

With tempers rising, I assigned my colleagues Max Smolaks and Sebastian Moss to the ringside, each covering one corner. Max got inside information on Intel, while Seb got privileged access to AMD, including an interview with the CEO, Dr Lisa Su (p18). Between them, they also cover ARM and other players. My only worry is - after getting roped into this bruising contest, will Max and Seb still be friends?

Cooling tech is more standard fare for DCD readers, so Tanwen took a virtual reality look at its history (p30). Through a combination of VR headsets and computational fluid dynamics modeling, Future Facilities gave her a magical historical tour of data center efficiency - all the stages of data center construction reproduced in virtual space. As so many people have told us, whatever power and racks you may have, it's all about the air flow.

More adventures in time for Seb: distributed cloud services can only work when the services have an agreed and shared reference system time. How is that time delivered? It turns out there's a whole sub-industry of time servers, using esoteric techniques like cesium fountain clocks and rubidium crystal oscillators. That may sound abstruse, but new financial rules mean you will need to know about this (p27).

Capacity demands are growing, but our Energy Smart summit in San Francisco says this is not necessarily an environmental problem (p34). Data centers use energy steadily, which is good news for the utility grid, which likes to deliver power that way. The bad news is that both utilities and data centers have set up a system where services delivered from data centers appear to be free. Demand grows without limit, and the global infrastructure has so far been scarily good at meeting that demand.

It turns out that the best way to exit this cycle may be to intervene in human behavior, and create feedback which helps people to realize and limit their own environmental demands. That sounds a lot like blaming the users. But put it another way: if curbing demand were a fight, it would be a fight between humans and an inanimate global system they created. In dystopian sci-fi, the global system might win. In the real world, I'd say the human race could still score a knockout.

Peter Judge
DCD Global Editor
bit.ly/DCDmagazine

98%: Share of the x86 server market owned by Intel. AMD's portion "rounds up to 1%," AMD CEO Lisa Su told DCD (p19)

"Curbing demand is a fight between humans and an inanimate system we created"

MEET THE TEAM
Peter Judge, Global Editor, @Judgecorp
Max Smolaks, News Editor, @MaxSmolax
Sebastian Moss, Reporter, @SebMoss
Tanwen Dawn-Hiscox, Reporter, @Tanwendh
David Chernicoff, US Correspondent, @DavidChernicoff
Virginia Toledo, Editor LATAM, @DCDNoticias
Celia Villarrubia, Assistant Editor LATAM, @DCDNoticias
Paul Mah, SEA Correspondent, @PaulMah
Tatiane Aquim, Brazil Correspondent, @DCDFocuspt

DESIGN
Chris Perrins, Head of Design
Holly Tillier, Designer
Mar Perez, Designer

ADVERTISING
Yash Puwar, Head of Sales
Aiden Powell, Global Account Manager

HEAD OFFICE
102-108 Clifton Street
London EC2A 4HW
+44 (0) 207 377 1907

FIND US ONLINE
datacenterdynamics.com datacenterdynamics.es datacenterdynamics.com.br twitter.com/DCDnews | Join DatacenterDynamics Global Discussion group at linkedin.com
SUBSCRIPTIONS
datacenterdynamics.com/magazine
TO EMAIL ONE OF OUR TEAM
firstname.surname@datacenterdynamics.com

© 2017 Data Centre Dynamics Limited. All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, or be stored in any retrieval system of any nature, without prior written permission of Data Centre Dynamics Limited. Applications for written permission should be directed to Jon McGowan, jon.mcgowan@datacenterdynamics.com. Any views or opinions expressed do not necessarily represent the views or opinions of Data Centre Dynamics Limited or its affiliates. Disclaimer of liability: Whilst every effort has been made to ensure the quality and accuracy of the information contained in this publication at the time of going to press, Data Centre Dynamics Limited and its affiliates assume no responsibility as to the accuracy or completeness of and, to the extent permitted by law, shall not be liable for any errors or omissions or any loss, damage or expense incurred by reliance on information or any statement contained in this publication. Advertisers are solely responsible for the content of the advertising material which they submit to us and for ensuring that the material complies with applicable laws. Data Centre Dynamics Limited and its affiliates are not responsible for any error, omission or material. Inclusion of any advertisement is not intended to endorse any views expressed, nor products or services offered, nor the organisations sponsoring the advertisement.

PEFC Certified: This product is from sustainably managed forests and controlled sources. PEFC/16-33-254 www.pefc.org

4 DCD Magazine datacenterdynamics.com


News Roundup

Vapor IO puts edge colo in cell towers
Cylindrical rack enclosure company Vapor IO announced at DCD Webscale that it will put its gear into the base of cell towers to deliver colocation services at the edge of the network, backed by wireless infrastructure provider Crown Castle.

Sri Lanka gets first Tier III certified data center
Sri Lankan telecoms company Dialog Broadband Networks has been awarded a Tier III Constructed Facility certificate for the first phase of its Malabe Data Center #2.

Asgardia to put SSD into low earth orbit
Self-proclaimed space nation Asgardia is to launch a small satellite into space with 512GB of memory in an effort to test long term data storage in space. It appears that the total storage available to end users is only just over 200MB.

Schneider extends Galaxy VX UPSs
Schneider Electric has launched new scalable 500kW, 1000kW and 1500kW UPSs in its Galaxy V-Series range. Launched at DCD Webscale in San Francisco, these models can be used to shave demand during peak periods.

New regulations force Apple to build first Chinese data center
Apple has announced plans to open its first data center in China, in the southern province of Guizhou, in partnership with data management firm Guizhou-Cloud Big Data Industry Co Ltd (GCBD).

The new facility comes as China is strengthening its controversial cybersecurity laws, mandating that foreign firms have to store user data within the country. China also said that businesses transferring over 1,000 gigabytes of data outside of the country will have to undergo yearly security reviews, in language that has been criticized as overly vague and a threat to proprietary data.

"The addition of this data center will allow us to improve the speed and reliability of our products and services while also complying with newly passed regulations," Apple told Reuters. "These regulations require cloud services be operated by Chinese companies so we're partnering with GCBD to offer iCloud."

GCBD was co-founded by the provincial government, which wants Guizhou to become a data hub. Other projects in the province include a US$23 million facility to process and store data from the world's largest radio telescope.

Apple pointed out the data would be secure. It said: "Apple has strong data privacy and security protections in place and no backdoors will be created into any of our systems."

The company added that this facility, like Apple's other sites, will use renewable energy, and that its investments in Guizhou would pass $1 billion. Elsewhere, Apple announced plans to build a second data center in Denmark for 6 billion Danish crowns ($921 million). In Ireland, meanwhile, the company must wait until October 12 for a judgment on its 2015 application to build a data center in Athenry.

bit.ly/littletroubleinbigchina

Vox Box

Dale Sartor, Scientist/engineer, LBNL
Can the pace of efficiency match the increase in demand?
For the last five years, efficiency has kept up, because of Moore's Law, virtualization and the increasing efficiency of the infrastructure measured by PUE. There are a lot of people questioning whether Moore's Law can continue; you can't get better than a PUE of 1, and you can't get more than 100 percent utilization. There could be some issues beyond 2020.
bit.ly/energysmartsartor

Donald Paul, Director, USC Energy Institute
What is different about energy policy in the US?
Energy policy in the US is unique. The states are sovereign entities with certain powers under the constitution - and those states have been the primary drivers of energy policy for centuries. As well as this, virtually all the energy assets are privately owned. In most countries, natural resources are owned by the government, not individuals.
bit.ly/energysmartpaul


IBM opens four cloud data centers
IBM has opened four new data centers: two in London, one in San Jose (California) and one in Sydney.
The new facilities bring the company's total to 59 data centers in 19 countries, including four in Australia, five in the UK and 23 in the US.
In its second quarter results, net income decreased by two percent since Q2 last year, the 21st consecutive quarter of revenue decline. But IBM's cloud revenue grew 17 percent year-on-year with adjusted currency. In 2016, its cloud revenue grew by 35 percent.
IBM has stated that it expects revenues from its so-called strategic imperatives (such as AI and cloud computing) to increase in the second half of 2017 as a result of having signed new cloud contracts, including a 10 year cloud outsourcing deal with Lloyds Bank and a joint venture with China's Dalian Wanda.

bit.ly/eyebeeemm

Digital Realty and DuPont Fabros announce merger
Digital Realty, one of the largest data center operators in the world, is set to merge with another American colocation provider, DuPont Fabros (DFT), in a deal valued at approximately $7.6 billion.
This will create a global giant with 157 facilities under management, and considerably increase Digital Realty's capacity across core US metro markets, where DFT usually locates its facilities.
The all-stock transaction will see DuPont Fabros shareholders receive a fixed exchange ratio of 0.545 Digital Realty shares per DuPont Fabros share.
The transaction has been unanimously approved by the boards of directors, and is awaiting approval by the shareholders of both businesses. It is expected to close in the second half of 2017.

bit.ly/megamerger

Vertiv to sell ASCO to Schneider Electric for $1.25bn
Vertiv plans to sell ASCO, its automatic transfer switch business, to Schneider Electric for $1.25 billion.
Formerly known as Emerson Network Power, Vertiv was last year rebranded and acquired by Platinum Equity for $4 billion. The company said that the sale, which is expected to close by Q4 2017, will allow Vertiv to focus on its core data center, telecoms, and commercial and industrial markets.
"This sale is a significant step forward in our evolution as the premier provider of digital critical infrastructure solutions," Vertiv CEO Rob Johnson said.
ASCO's Automatic Transfer Switches (ATSs) continually monitor the available current from two different sources and switch between them when one drops below a desirable level. In a data center, an ATS would switch to a generator in the event of a power grid failure.
"The market for ATS presents attractive long-term growth opportunities as more commercial and industrial customers move towards autonomous or multi-source power management," Schneider said in a statement.
ASCO first introduced an ATS in 1925, after operating as an elevator, compressor and generator control company for nearly 40 years. It was acquired by Emerson back in 1985.

bit.ly/switchingswitches

Google launches appliance to upload data by FedEx
Google has launched a storage module designed to let users ship data to its cloud by courier, a process referred to as Sneakernet. The product, which echoes similar moves by Amazon, holds up to 480TB, and avoids network uploads which could take years.
Google Transfer Appliance (GTA) is a 19in rackmount storage appliance, which can be installed in a customer's rack, filled with data, and then shipped to Google for upload to the Google Cloud Platform (GCP).
Two sizes are available: a 4U-high 480TB unit or a 2U-high 100TB module. The product, currently only available in the US, follows Amazon Web Services' similar 50TB/80TB Snowball device launched in 2015, and the truck-sized 100-petabyte Snowmobile launched in 2016.

bit.ly/googlesneakernet

Peter's Sneakernet factoid: An Airbus full of microSD cards could achieve 5 petabytes per second between New York and Los Angeles
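The factoid is easy to sanity-check with a back-of-envelope calculation. The figures below (cargo payload, card weight, card capacity and flight time) are illustrative assumptions of ours, not from the article:

```python
# Back-of-envelope: effective bandwidth of an aircraft full of microSD cards.
# All figures are rough assumptions for illustration only.
CARGO_KG = 60_000          # assumed freighter payload, ~60 tonnes
CARD_WEIGHT_G = 0.5        # assumed weight of one microSD card
CARD_CAPACITY_TB = 0.256   # 256GB per card, high-end in 2017
FLIGHT_HOURS = 5.5         # New York to Los Angeles, roughly

cards = CARGO_KG * 1000 / CARD_WEIGHT_G             # number of cards carried
total_pb = cards * CARD_CAPACITY_TB / 1000          # petabytes on board
bandwidth_pb_s = total_pb / (FLIGHT_HOURS * 3600)   # petabytes per second

print(f"{cards:.0f} cards, {total_pb:.0f} PB, {bandwidth_pb_s:.2f} PB/s")
```

With these assumptions the answer comes out around 1.5 PB/s; a heavier payload or denser cards pushes it toward the quoted figure, so 5 PB/s is the right order of magnitude.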


Microsoft to cut salesforce by 10 percent in major reshuffle
Microsoft is planning a significant round of job cuts, as it restructures the company around cloud computing.
The change will mostly impact its sales operations, with less than 10 percent of the sales staff (still numbering in the thousands) set to lose their jobs.
The company told CNBC that this is not a cost-cutting exercise, but rather an attempt to refine its sales process: Microsoft will make changes to its sales departments, dividing their staff by industry verticals and target company sizes.
A spokesperson said: "We are taking steps to notify some employees that their jobs are under consideration or that their positions will be eliminated.
"Like all companies, we evaluate our business on a regular basis. This can result in increased investment in some places and, from time-to-time, redeployment in others."

bit.ly/cloudyfuture

Cloud and SaaS revenues to grow by a quarter every year for five years
Research collated by Synergy suggests that revenues stemming from cloud and software-as-a-service (SaaS) will grow between 23 and 29 percent every year, through to 2022.
The study predicts that the Asia-Pacific region will experience the highest growth in cloud and SaaS revenues, followed by Europe and the Middle East, and North America.
Specifically, Synergy suggests that public IaaS and PaaS will grow by an average of 29 percent every year, managed and hosted private cloud by 26 percent, enterprise SaaS by 23 percent, and that infrastructure sales to major cloud providers will increase by 11 percent annually for five consecutive years.
As far as Internet-based virtualized computing and managed platform growth are concerned, database and IoT-specific services are expected to feature most prominently. Enterprise resource planning (ERP) software is predicted to drive SaaS growth.
At the same time, Synergy expects that hardware and software sales to enterprises will dwindle, decreasing by two percent on average every year, reflecting the shift from privately-owned infrastructure to running workloads in the cloud.

bit.ly/DCDgrowbabygrow
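Compounding makes those annual rates add up quickly. A quick sketch of the five-year effect, using the growth rates Synergy quotes (the constant-rate assumption is a simplification of ours):

```python
# Compound a constant annual growth rate into an overall five-year multiple.
def five_year_multiple(annual_growth_pct: float, years: int = 5) -> float:
    """Revenue multiple after compounding a constant annual growth rate."""
    return (1 + annual_growth_pct / 100) ** years

# Synergy's segment forecasts, percent per year
segments = [("public IaaS/PaaS", 29), ("hosted private cloud", 26),
            ("enterprise SaaS", 23), ("enterprise hardware/software", -2)]
for name, growth in segments:
    print(f"{name}: x{five_year_multiple(growth):.2f} over five years")
```

At 29 percent a year, revenues more than triple in five years, while a two percent annual decline compounds to roughly a ten percent drop.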


Naver to build $420m data center in Yongin, South Korea
South Korean cloud giant Naver is planning to spend 480 billion won ($420 million) on a massive data center in Yongin, just south of Seoul. According to SK publication Platum, the facility will start offering public cloud services in the second half of 2020.
Naver, often referred to as the Google of South Korea, was established in 1999 as the first Korean web portal to develop its own search engine. The company has been successful in keeping its competitors at bay by fusing latest technologies with decisively Korean aesthetics. It also enjoys informal support from the Korean government.
Naver was the first Internet company in Korea to build and operate its own data center, the Gak facility in Chuncheon, Gangwon Province. It was designed to resemble traditional rice terrace farms and is listed in the Top 10 beautiful data centers feature from our April/May issue.
Now, the company is expanding with a new facility built to power the public cloud platform it launched in April. Naver Cloud Platform already offers 30 basic infrastructure services related to computing, data, security and network, and promises to be price competitive with both AWS and Microsoft Azure.

bit.ly/anaverdatacenter


Estonia to create Data Embassy in Luxembourg
Estonia will be backing up its government data in Luxembourg's first foreign data embassy, storing copies of its most important records to protect them in case of outages, equipment failures or cyber attacks.
Last year, the country revealed that it was considering the UK to host the virtual embassy, but uncertainty surrounding Brexit negotiations led it to turn to Luxembourg instead.
The launch was announced at an annual celebration held by Digital Lëtzebuerg (Luxembourg) - an initiative set out in 2014 to promote the country's ICT sector and develop its international presence beyond the financial world - which was attended by the heads of government of Estonia and Luxembourg, Jüri Ratas and Xavier Bettel.
Peter Sodermans, senior advisor at Digital Lëtzebuerg, said that potential Russian invasion was merely a contributing factor in Estonia's decision to use Luxembourg-based infrastructure to hold a copy of its official records abroad, despite suspicions that Russia may have launched multiple cyber attacks on neighboring Baltic states in recent years.
For one, he said, Luxembourg is one of the safest countries in the world.

bit.ly/estonesthrowaway

US Air Force and IBM to build brain-inspired supercomputer
The US Air Force Research Laboratory (AFRL) will collaborate with IBM on a brain-inspired HPC system powered by a 64-chip array of the IBM TrueNorth Neurosynaptic System - a low energy chip designed to mimic brain activity.
The company claims that the system's pattern recognition and sensory processing power will be the equivalent of 64 million neurons and 16 billion synapses, while the processor will require just 10 watts of power.
The system fits in a 4U-high (7in) space in standard server racks, with eight of the systems enabling 512 million neurons per rack.
One processor consists of 5.4 billion transistors organized into 4,096 neural cores forming an array of one million digital neurons that communicate with each other via 256 million electrical synapses. The project was originally funded by DARPA under its SyNAPSE program.

bit.ly/planebrain
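The per-chip and per-system figures quoted in the story are internally consistent, as a quick arithmetic check using only the article's own numbers shows:

```python
# Consistency check of the TrueNorth figures quoted in the story.
NEURONS_PER_CHIP = 1_000_000      # one million digital neurons per processor
SYNAPSES_PER_CHIP = 256_000_000   # 256 million electrical synapses per chip
CHIPS_PER_SYSTEM = 64             # the AFRL system's 64-chip array
SYSTEMS_PER_RACK = 8

system_neurons = CHIPS_PER_SYSTEM * NEURONS_PER_CHIP    # 64 million neurons
system_synapses = CHIPS_PER_SYSTEM * SYNAPSES_PER_CHIP  # ~16.4 billion synapses
rack_neurons = SYSTEMS_PER_RACK * system_neurons        # 512 million per rack

print(system_neurons, system_synapses, rack_neurons)
```

The 64-chip array gives exactly 64 million neurons and 16.384 billion synapses (the article rounds to 16 billion), and eight systems give the quoted 512 million neurons per rack.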

Atos set to deliver the first commercial ARM-based supercomputer
French IT services giant Atos has introduced the first ever commercial supercomputer to rely on processor cores designed by ARM.
The Bull Sequana X1310 will use the familiar Sequana X1000 architecture and Cavium's ThunderX2 chips to offer a system suitable for the most compute-intensive tasks.
"Atos designed the Bull Sequana X1000 system as an open platform, to offer HPC users a large choice of computer architectures from which to choose, and to support the future processor technologies that will make it possible to reach the exaflops level," explained Agnès Boudot, group VP and head of HPC at Atos.
"It is therefore a natural step to extend the range to include ARM processors, together with the existing CPUs, core processors and co-processors."
Processor cores designed by ARM are based on a reduced instruction set computing (RISC) architecture. This approach delivers less computational power per core, but can support more cores than a typical chip from Intel or AMD.
Such processors are cheaper, more energy efficient and require less cooling; however, they have been struggling to find their niche in the data center (for more on ARM see page 23).

bit.ly/armingup


NEC is building a supercomputer for Czech Hydrometeorological Institute
Czech Hydrometeorological Institute (CHMI) is deploying a new supercomputer, designed by Japanese IT giant NEC.
The LX-Series machine consists of more than 300 nodes connected through a high-speed Mellanox EDR InfiniBand fabric. It will be used to improve accuracy of weather forecasts, track air pollution and provide early warning in case of extreme weather events. The new system is scheduled to be put into operation in early 2018.
"For years, we have been successfully collaborating with meteorological institutes, and we look forward to cultivating these partnerships further," said Andreas Göttlicher, senior sales manager at NEC Deutschland.
The new supercomputer at CHMI will use 3,500 of the latest Xeon E5-2600 v4 cores to observe weather and make more accurate predictions. Storage will be provided by an NEC LXFS-z parallel file system appliance, based on Lustre, with more than 1PB of capacity and a bandwidth of more than 30GBps.

bit.ly/rainraingoawaysimulateanotherday

Kenya opens two data centers to ensure electoral integrity
Ahead of presidential elections in Kenya in August, the country's electoral agency has coordinated the opening of two data centers.
This follows an audit of the Independent Electoral and Boundaries Commission (IEBC) by KPMG that found major security gaps in the country's register. The report identified more than ten security flaws in IEBC's existing data center that could be used to flout the democratic process on election day.
The integrity of the process and the ability to demonstrate it to the citizens of Kenya is all the more important given the events of the past two elections: in both instances, local and foreign observers found irregularities which cast doubt on the legitimacy of the elected candidate.
The audit of the electoral commission's current data center discovered that two active administrator accounts with access to the voters' register had default passwords that had never been changed, and preventative measures against cyber attacks were found to be lacking.
Furthermore, the data center at IEBC's headquarters had an unreliable UPS, a faulty fire alarm system that was last serviced in 2005, and its air conditioning system had failed, raising temperatures by 25 degrees Celsius (45°F).
Until now, there had been no disaster recovery site to ensure that the register was backed up and available in case of a technical failure or an attack on the IT systems. In fact, the register was backed up on tapes stored at the IEBC's head offices, where its data center was located.

bit.ly/Iwontheelectoralcollege

$3.2 million: Amazon's US lobbying spend in Q2 2017 (bit.ly/alexalobbygov)

US DOE awards $258m in HPC contracts
The US Department of Energy has awarded six American tech companies with contracts for high performance computing (HPC) research.
The $258 million will be shared between AMD, Cray, HPE, IBM, Intel and Nvidia over three years as part of the new PathForward program, itself part of DOE's Exascale Computing Project (ECP). The companies themselves will provide additional funding amounting to at least 40 percent of the total project cost, which the DOE says will bring the total investment to at least $430 million.
The announcement comes after continuing concern over DOE funding levels under the Trump administration. The bipartisan 2017 fiscal year budget agreement slightly increased DOE funding, but Trump's own budget proposed to reduce it by up to $900 million and completely eliminate the Advanced Research Projects Agency-Energy.
Prior to this announcement, when asked about the reports over DOE funding, Cray's VP of business operations in EMEA, Dominik Ulmer, told DCD: "It is always a concern to us what the Department of Energy does, it's a very important partner for us - not only because they are a large customer at various places, but also because we're also developing our systems oftentimes in a co-designed fashion together with them, where we work together on technologies and try to match software and hardware technologies in a better way. This is a concern to us, of course."

bit.ly/racetoexascale


Other Colombian providers

GlobeNet
The carrier subsidiary of Brazilian investment bank BTG, GlobeNet is building a 200 sq m (2,152 sq ft) data center in Barranquilla, designed to Tier III levels, which should be ready in August. Near GlobeNet's submarine cable landing station, the site supports IaaS services. The firm has a system of fiber-optic cables spanning more than 23,500 kilometers in America.

Internexa
Owned by Colombia ISA holding, Internexa has opened its newest data center in the Bogota Free Trade Zone, built by the developer of Free Zones of ZFB Group. The facility has a Tier III certification for design and construction. Internexa has other data centers in the region, including one in Medellin, along with a fiber-optic network of over 49,000km.

Helping Colombia in a time of crisis
Despite the difficult economic times, data center service providers are betting on the country, reports Virginia Toledo, Editor LATAM

Monthly data is still mixed, but analysts believe 2017 has brought a significant improvement for Colombia compared to 2016. Felipe Gómez, Level 3's head of data centers in Colombia, agrees. He told DCD that projects which stalled in the past year could now be reactivated.
"2016 was very hard for the IT industry because of the devaluation of the peso against the dollar," he said. "Investments became difficult because suppliers bought technology in dollars, but sold services in pesos."
This year there is a positive change for the Colombian economy, according to the LatinFocus forecast from FocusEconomics. With a strong oil sector (a key component of the economy), a low corporate tax rate, and an expansionary monetary policy, Colombia's GDP is expected to grow by 2.2 percent in 2017 and by 2.9 percent in 2018.
Gómez believes the peso's troubles are over and expects 2017 will be quiet: "The dollar will remain in the same range and organizations must accommodate the new reality and slightly increase their IT investment budgets."
Meanwhile, service providers have continued to invest in Colombia's maturing market. Cable & Wireless, for example, reached a phased agreement to buy 500 sq m (5,381 sq ft) of data center space in BT's Colombian data center over ten years. The deal is estimated at $20 million, and C&W also put



Latin America

$4 million into infrastructure such as racks, servers and cleanroom equipment.
The purchase decision was due to time to market, Marcelo Oliveira, director of data centers at C&W, told DCD. "With [BT's] Tier IV data center and solutions we have developed for IaaS, PaaS and DRaaS, we can provide services that the competition cannot offer."
BT's Naos data center, located in the Zona Franca de Tocancipá, opened two years ago. It was the first site in Colombia to obtain Tier IV design and construction certifications from the Uptime Institute, and received the DatacenterDynamics Latam Award for Best Service Provider Data Center in 2015.
"This data center is part of our strategic plan, not only for Colombia, but also for the region, being instrumental in the sustained growth of our company," Ricardo Imbacuán, C&W's vice president and country manager for Colombia, told DCD.
Another company that has recently entered the Colombian market with its own data center in Bogota is colocation giant Equinix, as part of its $3.6 billion purchase of 24 Verizon facilities. Víctor Arnaud, Equinix's director for Latin America, said that the company sees Colombia as a strategic country, not only in and of itself, but also as a bridge from South America to Central America and the United States, via Equinix's facilities in Miami.
These providers' decisions to enter the country are guided by a belief in digital transformation: they expect the slowdown in the Colombian market is temporary, and the demand for information storage and processing will continue to increase rapidly.
Level 3 has three data centers in the country, two in Bogotá and one in Cali, but it expects demand to grow there. The Cali facility was built as a backup site for customers, Felipe Gómez told us. Now, there are many customers in eastern Colombia who are migrating their data centers and outsourcing, creating new opportunities.
"At first, the idea was to build a contingency site, thinking that rules from the Financial Superintendence of Colombia would require backup data centers in Colombia to be more than 100km away, and at that time we had the two DCs in Bogota," he said. Level 3 wanted to be prepared for the regulation, but today is finding other uses for the data center, since "we do not know whether the law will be a reality or not."
In Bogotá, Level 3 has a data center which is nearly 80 percent occupied: of four halls, three are almost full, with very little space available in the fourth room. "We believe that by 2018, we should start a new room of about 600 sq m (6,458 sq ft) in the Colombia XV data center in Bogotá," says Gómez.
Equinix already had a data center in Brazil, so Colombia was the next logical step. Thanks to its connectivity through submarine cables, Víctor Arnaud believes it has potential to become a regional hub.
Connectivity in Colombia has come a long way, says Juan Manuel González, director of data & Internet products for Colombia at Level 3: "Six years ago there were just four or five significant telecoms operators; now there are eleven or twelve companies, so the market is fairly complete and this is reflected in both coverage and cost, and even connectivity to the world through submarine cables." Level 3 implemented a Pacific cable two years ago, and there are six additional cables across the Atlantic.
A sign of the momentum of digital transformation is the volume of data and Level 3's network expansion plans. Level 3's Pacific cable was initially set up with 400Gbps but can reach a capacity of 4.5Tbps, and the company is planning to enable new equipment to expand beyond that. The company is involved in another project which seeks to take infrastructure directly to Quito, the capital of Ecuador.
For Level 3, Colombia, Chile and Argentina are the three major economies marking growth in the region. "If you look at the investments that operators are making, one of the most important in terms of deployment of connectivity is Colombia."
Issue 23 August/September 2017 15


Don't be held hostage
Asian experts shared their security tips for data centers
at a Singapore summit. Paul Mah reports

The data center industry should not build security in as an afterthought, but should up its game and be alert to the latest threats to digital infrastructure, security and data center experts said at a DCD Security Summit held in Singapore as part of Interpol World in July.

A data center can be held hostage if the digital controls to the UPS and chiller systems are compromised by hackers, said Ka Vin Wong, an independent consultant with experience helming colocation providers in Southeast Asia. With control over the mechanical and electrical systems, attackers can issue blackmail demands threatening an induced outage.

Mechanical and electrical systems can be isolated from the network, but Wong's point illustrates the need to harden modern data centers against digital threat vectors.

Everyone needs to play a part in security, and corporations can no longer be insular in their data management, said Chris Church, a senior mobile forensic specialist at Interpol (the International Police Organization). The irony, said Church, is that executives often have irrational fears about the cloud, unaware of the extent to which they are using it.

Yet many damaging security attacks are not related to the cloud at all. One bank hack was traced to an outsourced IT support team from a neighboring country, said Church. One support staff member's laptop was infected with malware for an entire year before hackers pulled a digital heist of $10 million within the span of an hour.

Although overstated, the threat from data leakage through the cloud is real, and hackers love the cloud as it requires just the user account and password to access the data, said Church. Almost all (90 percent) of consumers are not aware of what they are storing in the cloud; some apps save a lot more files than they expect.

Some users consider themselves unattractive targets, believing that attacks only happen to other people and large organizations, but this is simply not true, said Church. Data is a commodity like gold, and usernames, passwords and email accounts can change hands to the tune of thousands of dollars, he said.

16 DCD Magazine datacenterdynamics.com


Asia Pacific

To identify genuine threats in modern infrastructure, users may need to create a local threat intelligence base, because traditional defensive measures such as proxy servers, intrusion prevention systems (IPS) and antivirus software have repeatedly failed against targeted attacks, said Florian Lukavsky, the director of SEC Consult Singapore.

Security information and event management (SIEM) software products and services can generate a large volume of alerts, which are impossible to review manually. In the infamous Target hack of 2013, the hackers' activities were flagged, but ignored by the security team.

A honey pot, a fake environment with intentionally leaked false information, can entrap attackers, said Lukavsky. The local threat intelligence gathered is unique to the data center environment, allowing security personnel to know with certainty that a security breach has occurred. When one attacker installed ransomware and APT (advanced persistent threat) tools in a honey pot, the security team was able to collect the fingerprints of customized malware, usernames and passwords for backdoor tools, and attack patterns. These were checked against other parts of the environment to see if hackers had got in elsewhere.

Finally, a properly secured data center should incorporate capabilities including highly controllable personal access, and safeguards against actions that degrade security such as buddy punching and tailgating, says Phoon Wai Leong of ERS Group. Physical access control protects against accidents as well as malice, said Phoon. A systems engineer who runs out of power sockets in one rack could reach to the next rack to find a vacant slot, and then promptly trigger a power trip as an already maxed-out circuit is pushed over the edge.

So how should modern data centers be secured? For a start, authentication systems should have the right data to identify logins from current or former employees, or blacklisted personnel. And systems should have the ability to search, locate and track people in a near real-time manner to strengthen control for assets and personnel.

Phoon offered practical tips on securing physical security with the use of two-factor authentication (2FA). An encrypted proximity beacon could be issued upon validation and registration, which should be time synchronized to thwart duplication. Active RFID or Bluetooth dongles can allow personnel to be tracked for incident response and auditing in the wake of security incidents.

Keypad and pin locks should be implemented at checkpoints within the facility, though facial recognition is increasingly seen as a reliable and cost-effective method of implementing a second factor control. In fact, three-factor authentication should be considered too, suggested Phoon, and can be implemented with a mix of physical tokens, personalized passcodes and biometric authentication.

It will take some time to bring all data centers up to scratch on the security front. But as more systems within the data centers are digitized and networked, this is an area that can no longer be ignored.

Top tips:
- Isolate mechanical systems
- Create a local threat intelligence base
- Handle the cloud with care
- Use physical access controls
- Apply two-factor authentication, or more
- Trust NO ONE

The DCD Security Summit was held at Interpol World on 6 July in Singapore
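Phoon's advice that access tokens be time synchronized to thwart duplication is the same principle behind time-based one-time passwords (TOTP, RFC 6238): a stolen code is useless once its time window rolls over. A minimal sketch (illustrative only, not the beacon system described above; the shared secret is a made-up example):

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password: an HMAC over the current
    time window, so a replayed code fails once the window passes."""
    counter = int(at // step)                       # 30-second time window
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"badge-provisioning-key"                  # hypothetical shared secret
print(totp(secret, time.time()))                    # six digits, valid for ~30s
```

Badge and reader derive the same code only while their clocks agree, which is why the time synchronization Phoon stresses matters: a beacon captured in an earlier window simply fails validation.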




The war for the data center will likely never end.
As an old challenger returns, a new battle begins

While data centers are a complex combination of various pieces of computing equipment and industrial machinery, all working in tandem like a great, living creature, at the heart of most lies the Central Processing Unit (CPU).

Servers form the center of a data center, and the chips within them the core of those servers, most of which are x86 servers, that is, servers which are backward-compatible with the instruction set which developed from Intel's original 8086 and 8088 CPUs, starting in 1978. With most of the 8086's successors ending in '86,' the name stuck, as did the architecture.

In Q3 2016, x86 servers brought in 90 percent of the server market revenue, according to IDC. Outside the x86 sector, vendors like IBM and ARM fight for scraps.

"x86 servers continue to be the predominant platform used for large-scale data center build-outs across the globe," Gartner's research vice president Jeffrey Hewitt said.

Within the x86 sector, the vast majority of processors come from Intel; AMD leads a tiny wedge of compatible manufacturers. In 2003, AMD's Opteron line looked like it offered serious x86-compatible competition for Intel, but its position has waned.

"AMD got a little unfocused and, perhaps because of that lack of focus, had some execution issues," AMD's data center head Forrest Norrod told DCD. It lost its share of the market, handing Intel a near monopoly that it has only tightened its grip on.

But now things could once again change. AMD is spoiling for a comeback fight. Under the stewardship of CEO Dr Lisa Su, the company has launched Zen, its first new microarchitecture in years. Zen is an x86 microarchitecture, and the first processor family to implement it is Epyc. We talked to Su and profile the chip on the next two pages.

Intel, however, is keen to keep its dominant position. In an earnings call after its most recent quarterly results, CEO Brian Krzanich said: "AMD has raised up a bit with their more recent products, but you see us responding.

"This is a traditional performance battle that we're very accustomed to, and we're comfortable in reacting and competing very aggressively in. And so you see us coming out with our Xeon Scalable."

We look at the Xeon Scalable Platform and what it could mean on pages 28-29.

We may not know the outcome of this battle for some time, but one thing is for sure: the war for the heart of the data center is back on.



Cover Feature

RETURN TO SPLENDOR

"We know how to win in the data center." Dr Lisa Su, AMD's CEO, appears confident. We're backstage, sitting in a small room just after the official launch of the Epyc server chip, where journalists, analysts and potential customers were presented with a bevy of impressive stats, selling points and partnerships.

"I think we've shown there's a huge pent-up demand for different solutions. The current offerings in the CPU space, frankly, aren't very flexible for all of these different workloads that are there. It's more than just technology, it's giving the customer the choice."

Choice has indeed been lacking for some time in the x86 server market, currently dominated by Intel, which maintains a roughly 98 percent market share.

"At our peak we were at 25 percent, right now we're at let's call it less than one percent," Su said. "It's all upside. I think a very reasonable goal for us, hopefully sooner than 2020, is to get back to double digit share. If we get back to 10 percent share plus, think about it, just the CPU market is a $16 billion market. That's a significant opportunity for us, and that's an intermediate goal, not the final goal."

That 25 percent peak, achieved over a decade ago with the Opteron processor line, feels like a distant memory. "We became distracted and we were doing many different things, server was not the first priority," Su told DCD in her first interview since Epyc's launch.

"As I look forward, I see the data center space as a driver of high performance technologies and a showcase of what we can do. AMD has the potential, it has always been a company of potential, and my job with this management team has been to unlock that potential with strong execution."

For Epyc, that journey started four years ago, when the company realized that pushing forward with the Bulldozer microarchitecture would not prove fruitful, and that it had to go back to the drawing board and design an entirely new microarchitecture: Zen. "That was a big decision. But the corollary with that is that it would take us years. We were okay with that."

The result is, by all accounts, a processor family that can contend with Intel's best. Depending on the workload, or the budget, each vendor offers a compelling argument, but the fact that AMD is even considered in such discussions is a major improvement.

"I feel really good about our end," Su said. "The differences that we're talking about in terms of price performance, I/O flexibility, memory capability, these are not small differences. This is not about a price game, about buying sockets, this is about enabling someone to do something that they can't do today, or something that used to be much, much more expensive."

Processors in the Epyc 7000 series range from the eight-core, 16-thread Epyc 7251 running at 2.1 to 2.9GHz in a 120W power envelope, to the 32-core, 64-thread Epyc 7601 running at 2.2 to 3.2GHz, with a 180W TDP.

But while AMD has certainly upped its game, Intel has not rested on its laurels. Since Epyc's launch in late June, the reigning champion launched the next generation of Xeon processors (for more on this, read Max Smolaks' deep dive on the following pages).

Perhaps sensing that Epyc represents a greater threat than any of AMD's recent efforts, Intel has launched an unprecedented assault on the rival processor family. Until recently, the company rarely referred to its main competitor by name; this time, it invited journalists to a special event, just before Epyc's launch, to explain that the product was inferior.

Key to its argument was the fact that Epyc processors consist of four Zen-based dies in a single package, which Intel slammed

Strange bedfellows

AMD and Intel may be fighting tooth and nail over the x86 market, but they are actually strangely joined. Since 1976, the two companies have shared numerous cross-licensing agreements to allow each partner to use the other's patented x86 innovations after a certain time period.

While the agreement has changed over the years, with the 1996 deal ensuring AMD could no longer make direct copies of Intel chips beyond the 486 processor, or use the same bus as Intel's chips, it has been fundamental in allowing both companies to work in the sector on compatible, infringement-free products. The agreement is, however, canceled if AMD declares bankruptcy or is taken over.

The real reason for this partnership? Lawsuits. In 1991, AMD filed an antitrust lawsuit against Intel claiming it was trying to maintain an x86 monopoly. The court ruled against Intel, awarding AMD $10m and patent licenses.


as a "repurposed desktop product for server," claiming it delivered "inconsistent performance from four glued-together desktop dies."

When asked about the decision to go with four individual dies, Su told DCD: "Honestly, it was one of the best decisions that we've made, I love it. The biggest issues that we have in advanced technologies is that, as you get much larger die sizes, your yields and your costs go up.

"The way we're able to offer all of this I/O and all of this capability up and down the stack is because of that decision we made to go with the multi-chip module approach. I think this idea of breaking up large die is absolutely here to stay, so I am very pleased we made that bed."

As with all commercial products, the true arbiter of whether the right decision was made will be the market. Early customers announced by AMD include Microsoft and Baidu in cloud computing, and the likes of HPE, Dell EMC and SuperMicro in the server space.

"I think you will hear about other cloud providers deploying AMD in the second half of the year," Su said. In some cases, the main challenge has been to convince customers to put their trust in AMD, a company that may know about winning in the data center, but certainly knows about losing in it. Two years ago, with large debts and no profits, its share price hovered below $2, and its future was being openly questioned.

"At the end, it always comes back to 'are we reliable, do we have a good price-performance advantage, can they expect to use us for multiple years?'

"For many of these customers, if we only showed them Epyc, it would not have been enough, they've seen the second and the third generation, and that gives them confidence in investing in AMD."

She added: "I view this as a long term investment. [Intel] are very capable, but I think we have shown that we can be very, very capable as well."

After the interview, I wandered outside to the Epyc launch party. As the sun set on another blistering Texan summer day, a few AMD employees were suddenly chanting: "Twenty dollars! Twenty dollars! Twenty dollars!"

The company's share price had rocketed over the course of the week, gaining nearly 40 percent and standing at roughly $14. The dark times of near-bankruptcy seemed behind them, but as I contemplated the rise and fall of AMD's fortunes, I knew they still had a way to go.

Sebastian Moss, Reporter

In the left corner: AMD
- Weighing in at a market cap of $12.5bn
- Best known for performance per dollar
- Founding member of The Green Grid
- In November 2012, set the record for the highest frequency of a computer processor: 8.794GHz
- Is a fabless semiconductor manufacturer, turning to for-hire foundries
- Is behind Intel in CPUs and Nvidia in GPUs in market share
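Su's point, that yields fall and costs rise as die area grows, follows from standard defect-density yield models. A rough sketch using a Poisson yield model (the defect density and die sizes here are purely illustrative, not AMD's or Intel's actual figures):

```python
import math

def poisson_yield(defect_density: float, die_area_cm2: float) -> float:
    """Fraction of dies with zero random defects: Y = exp(-D * A)."""
    return math.exp(-defect_density * die_area_cm2)

D = 0.5                                    # defects per cm^2 (illustrative)
monolithic = poisson_yield(D, 7.0)         # one large 7 cm^2 die
chiplet = poisson_yield(D, 7.0 / 4)        # one of four 1.75 cm^2 dies

# Good chiplets can be picked from anywhere on the wafer and binned into
# four-die packages, so far less silicon is thrown away per package.
area = 100.0                               # cm^2 of wafer (illustrative)
big_packages = (area / 7.0) * monolithic
mcm_packages = (area / 1.75) * chiplet / 4
print(f"yield: {monolithic:.1%} monolithic vs {chiplet:.1%} per chiplet")
print(f"packages per {area:.0f} cm^2: {big_packages:.2f} vs {mcm_packages:.2f}")
```

In this toy model, the per-chiplet yield is an order of magnitude better than the monolithic yield, which is the economics behind the multi-chip module approach Su defends.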




Max Smolaks, News Editor

In the right corner: INTEL
- Weighing in at a market cap of $172bn
- Best known for performance per watt
- Claims to be conflict mineral free
- Acquired McAfee for $7.8bn, Altera for $16.7bn, Saffron for an unknown sum and Nervana for $400m
- Accused by the European Commission of anti-competitive practices in 2007
- Settled with AMD for $1.2bn after claims of unfair business practices in 2009

WE HATE SURPRISES

Intel sometimes refers to itself as the guardian of Moore's Law, an observation made by one of its founders, Gordon Moore, who stated that the number of transistors that can be packed into an integrated circuit will increase twofold approximately every two years.

This has been a double-edged sword: on one hand, it highlighted Intel's importance as the dominant supplier of CPUs, responsible for roughly 98 percent of the x86 server market today, and as a result, the direction of the industry as a whole. On the other hand, it made Intel's product launches as predictable as the rising and the setting of the sun: we know that the performance of a new generation of chips will conform to the well-established tradition.

The fact that the Xeon Scalable Platform (Xeon SP) processor family, launched in June, delivers approximately 1.65 times more performance than last year's Xeon E5 v4 will surprise absolutely no one. The new CPUs feature more cores, more inter-CPU bandwidth, and support more memory channels. But they also introduce never-before-seen features that should, in theory, enable IT equipment to do new and exciting things.

According to Navin Shenoy, who assumed responsibilities for Intel's data center division in May, Xeon SP was designed to tackle three mega trends that are changing the face of the industry: cloud computing, artificial intelligence and 5G connectivity.

"Enterprises will need to think about how they handle these new data workloads, seamlessly move them between public and private cloud," he said during the launch event. "Cloud service providers will need to think about how they improve performance, security, agility, utilization across their infrastructure."



"Communications service providers will need to deliver a completely new level of network agility and efficiency, in the face of the new service demands as 5G is deployed. And in the world of artificial intelligence, we will see a broad range of algorithms develop, and this will require a broad range of solutions."

All new Xeon SKUs are now divided into four categories, based on their relative level of performance and target workloads: from Bronze for low-power applications to Platinum for high-performance computing. Theoretically, the same hardware could be made to serve very different purposes, depending on the choice of silicon.

The processor family is the first to replace the tried and tested ring architecture with a mesh that adds more interconnects between cores. It also supports a larger number of cores on the same die, up to 28, whereas Broadwell would max out at 22.

"It's a fundamental change to the processor design that allows us to increase performance by optimizing data sharing and memory access between all CPU cores," Lisa Spelman, the director of product marketing for Intel's data center group, explained.

"We realized that Xeon Scalable wouldn't actually scale the performance that we wanted to see from the product, if we didn't increase the bandwidth for the interconnect between the cores, the cache, the memory and the I/O. Without change, the interconnect, which adds a lot of value, suddenly becomes the limiter.

"That's why I'm excited about this innovation: it provides a more direct data path for traveling between cores than the previous ring architecture did."

One of the clear benefits of Xeon SP is increased server density: a single motherboard can now support up to eight sockets without the need for additional node controllers. This means system builders can squeeze more compute into a single rack unit, and for data center operators, this means an increase in demand for power and cooling per square meter. Average power densities in data centers are growing all the time, so this is nothing new, but a rapid increase could catch some providers off-guard.

"With PowerEdge FX, we can have four dual-socket Xeon SPs in 2U; that gives me 224 cores in 2U, with a large memory footprint, 1.5TB worth of RAM per sled," Mark Maclean, server product manager at Dell EMC, told DCD. "It's not just a speed bump, it's an architectural change. Estimates will always vary, but in certain workloads, we are seeing up to 50 percent performance increase."

4.2: the number of times more virtual machines that data center operators will be able to host, compared to systems from four years ago (Intel)

Xeon SP places more emphasis on security, with Intel's Quick Assist Technology, previously available as an optional extra, now bundled as standard. It involves a dedicated chip for tasks like compression, encryption and key management, and according to Intel, the new Xeons encrypt data twice as fast as the previous generation's processors.

Another interesting feature that aims to cash in on a change in server hardware is Intel Volume Management Device (VMD), which brings RAID-like features to your expensive, high performance NVMe solid state drives, making them hot-swappable.

"What this means is we can take advantage of some of the capabilities in the new CPUs, with enhanced bandwidth around PCIe, to have better NVMe capability," James Leach, director of platform strategy for Cisco UCS, told DCD.

"Because that's the real difference: when we were switching between SAS and SATA, it was very simple because we were routing the same kind of connectivity. NVMe depends on the PCIe bus, and we're just seeing the tip of the iceberg with some of the performance that it can offer.

"As the CPUs become more capable, and as we see more cores added through FPGAs and GPUs, we need to be able to feed those cores, and NVMe is one way that we can really crank up the performance on the storage side."

With Xeon SP, Intel is indeed improving its HPC credentials: besides increased performance, the new Platinum chips are the first ever to offer support for AVX-512 instruction set extensions and integrated OmniPath networking. Both will be useful in the design of systems that deal with machine learning, analytics and 3D rendering.

Before the official launch, Intel supplied Lenovo with several thousand Xeon SP Platinum 8160 chips, which were then integrated into Mare Nostrum 4, a beast of a machine hosted at the Barcelona Supercomputing Center. This immediately propelled it to the position of the 13th most powerful computer on the planet, with 11.1 petaflops of raw processing power.

But Mare Nostrum 4 is currently outperformed by supercomputers based on CPUs from IBM, Oracle and even AMD. That's got to hurt if you are positioning yourself as the world's foremost authority on chip design.

Lenovo and Intel plan to get a Xeon SP supercomputer into the Top 10 in November, and we'll see a lot more Xeons in the Top500 going forward, but whether the company can unseat the reigning champion, China's Sunway TaihuLight (based on processors from The National High Performance IC Design Center, Shanghai), is anyone's guess.

As for Intel's references to 5G: while there has been plenty of industry buzz around next generation wireless, the first networks based on this standard are not expected to appear before 2020, and will be limited to China. Meanwhile, major manufacturers still have no idea how to build this technology into handsets without having to double their size.

Intel will release several generations of processors before it needs to contend with 5G; there's simply no market for it at this stage. But there's plenty of market for servers that can run existing telecommunications networks, as more and more operators experiment with SDN and NFV in order to cut their costs.

Taking all of this into account, we can conclude that Xeon SP is much more than a marketing exercise. It offers real solutions to actual challenges in server design, with Moore's Law gradually losing its role as the force driving Intel's agenda.




ARM BIDES ITS TIME

While the big boys of the CPU market are slugging it out on the ring, a number of smaller, more agile chip vendors are quietly making inroads into the data center using cores designed by ARM (which was recently acquired by SoftBank). Despite the venerable (and vulnerable) Opteron branding, AMD's A1100, launched in the beginning of 2016, has failed to set the world on fire, and no one, including AMD itself, is even mentioning it these days; the Zen architecture looks like a much stronger contender.

Su told DCD: "I think ARM is a good architecture, and an important architecture. We partner with ARM, we use ARM in some of our semi-custom products. But I think relative to the data center, x86 is the dominant ecosystem and will continue to be the dominant ecosystem."

That hasn't stopped another two American companies, Qualcomm and Cavium, from having their own shot at the title of the welterweight champion. The former is an expert in mobile devices, and wants to apply its knowledge in the enterprise IT market. The latter used to specialize in networking, before deciding to try its luck with servers.

Qualcomm's Centriq 2400 is the world's first server processor based on the 10 nanometer process. For comparison, Xeon SP is still using 14nm, which was also used in last year's Xeon E5. The number of nanometers defines the resolution of the chip lithography equipment; smaller numbers mean more transistors on the same size of die, increased speed and reduced power requirements.

Qualcomm's first foray into server chips offers up to 64 cores and will be able to run not just Linux, but Windows Server too, a testament to its lofty ambitions. It should ship in the second half of 2017.

Meanwhile, Cavium's ThunderX2 is a refined, updated version of the silicon launched in 2015, with new cores and I/O. The original chip was among the first to implement 64-bit on ARM, and the latest version continues the legacy of technical innovation, with up to 54 cores running at up to 3GHz. Each ThunderX2 supports up to six memory channels and up to 1.5TB of total memory, just like the latest Xeon SP.

The platform promises to be extremely workload-specific, with hundreds of integrated hardware accelerators for security, storage, networking and virtualization applications, across four main varieties. Just like Centriq 2400, ThunderX2 is expected to ship in the second half of 2017.

This will be a fight to remember.



Why tape is still strong

Tape storage has been written off in the past, but it turns out the cloud era needs it more than ever, says Dan Robinson

Tape storage is one of those technological hangovers from the early days of computing, associated in the minds of many with rooms full of massive mainframe cabinets. Somewhat like the mainframe, tape shows no signs of going away just yet, and ironically, could even be handed a new lease of life thanks to the burgeoning volumes of data that are being accumulated in modern data centers.

Modern tape storage is a world away from the movie cliché of huge tape reels spinning backwards and forwards as the computer chomps its way through some complex computation. Today's tape drives use cartridges capable of holding up to 15TB of data, and are more often used for backup or archiving purposes than for online storage.

However, few in the data center industry can have failed to notice the rapid changes that have been taking place at the storage layer of the infrastructure. Flash-based solid state storage devices have enabled new tiers of low latency storage with higher IOPS, while hard drive makers have responded by continuing to push the storage density of rotating media, driving the cost per gigabyte ever lower.

The end result is that the cost of disk storage has fallen to a level where many businesses have begun to use disk-based backup systems where once they would have used tape drives or tape library systems. In addition, cloud-based storage services such as Amazon Glacier have proven attractive to businesses of all sizes for off-site storage of backup or archive data because of the low cost per gigabyte and the ease of transferring data over the Internet.

This does not mean that tape storage is about to vanish. For one thing, many regulated industries such as the finance or legal sectors have strict regulations which require providers to retain data and to be able to prove that its content has not been altered. Modern tape systems offer a write-once-read-many (WORM) capability that delivers this, and for this reason, tape is often mandatory for archiving data.

There are other reasons why tape is likely to be around for some time, according to Clive Longbottom, service director at analyst firm Quocirca. "The biggest one is still investment protection: the cost of large tape libraries and robotic retrieval systems is high, and just dumping these because disks are now cheap (but failure-prone) is just not a good financial argument," he said.

"Then there is the ongoing cost. Sure, spinning disks are becoming cheaper and cheaper to acquire. However, keeping the disks spinning has a large ongoing operational cost due to the required power for spinning. A tape, once written, is close to zero cost: it holds its data until it is needed again. Hard disks can be spun down, but rarely are," he added.

Meanwhile, the shift towards cloud-based services for storage has simply moved the problem from the business to the cloud service providers. While the enterprise tape market has declined each year, cloud service providers are turning to tape as the optimal solution for backing up the ever expanding volumes of customer data they are storing, or for actually delivering archive services to customers.

Cloud providers have a bit of a problem: they have put heavy focus on the incredible scale of their storage capabilities.

Keeping tabs on your tapes with a tape library

Tape drives, like disk drives, can scale in capacity by simply adding more of them. While disk drives might be combined into an array or multiple arrays, tape systems can also expand capacity by simply using multiple tape cartridges for each drive.

Doing this on any kind of scale calls for careful management of the tape cartridges, and this is where tape library systems come into the picture. These combine one or more tape drives with storage slots for multiple tapes, which can be loaded as necessary to write data or read back data that has previously been stored.

Individual tape cartridges are identified using barcodes or RFID tags on the cartridge itself, and an automated mechanism loads the tapes into the drives as required, then removes them and places them into a storage slot in the library when not in use.

Tape libraries come in a variety of sizes, from 2U rack-mount systems that can hold 8 or 16 tape cartridges, up to monsters such as Spectra Logic's TFinity that takes up three data center rack frames and can expand to 40 frames containing over 50,000 tape cartridges, for a total storage capacity in the region of 1.6 exabytes of data.
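The library mechanics described in the sidebar (barcoded cartridges, storage slots, and a picker that loads and unloads drives) map naturally onto a small state machine. A toy model (a hypothetical class for illustration, not any vendor's actual API):

```python
class TapeLibrary:
    """Toy tape library: cartridges live in storage slots, identified by
    barcode, and are moved into a drive to read or write, then returned."""

    def __init__(self, num_drives: int, num_slots: int):
        self.drives = {i: None for i in range(num_drives)}   # drive -> barcode
        self.slots = {}                                      # barcode -> slot number
        self.free_slots = list(range(num_slots))

    def ingest(self, barcode: str) -> int:
        """Place a cartridge into the first free storage slot."""
        slot = self.free_slots.pop(0)
        self.slots[barcode] = slot
        return slot

    def load(self, barcode: str, drive: int) -> None:
        """Picker moves a cartridge from its slot into an idle drive."""
        assert self.drives[drive] is None, "drive busy"
        self.free_slots.append(self.slots.pop(barcode))      # slot freed up
        self.drives[drive] = barcode

    def unload(self, drive: int) -> int:
        """Return the cartridge in a drive to a free storage slot."""
        barcode = self.drives[drive]
        self.drives[drive] = None
        return self.ingest(barcode)

lib = TapeLibrary(num_drives=2, num_slots=16)
lib.ingest("LTO123L7")            # hypothetical barcode
lib.load("LTO123L7", drive=0)     # mount for a restore
lib.unload(0)                     # back to the shelf when done
```

Real libraries layer inventory audits, import/export mailslots and drive cleaning on top, but the slot/drive bookkeeping is the core of it.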



Servers + Storage

"The trouble is that customers have fallen for the message. Therefore, the big players are looking at a need for zettabytes of storage capability to meet customer expectations," said Longbottom. "Fortunately, a large proportion of this data is unlikely to be accessed ever again, so if the service provider can figure out what data is likely to be accessed, that can go onto disk, while the bulk of it can be written to tape, with SLAs stipulating that some data may take an hour or more to be recovered."

Amazon does not say what technology its Glacier service uses, but it is widely believed that it is based on tape storage, simply because the retrieval times quoted to customers are as much as several hours.

Tape is well suited for archiving or long-term storage as it offers by far the lowest price points of any storage medium, with a raw storage cost of around $0.02 per gigabyte, and also boasts a potential longevity of several decades if stored under conditions of low temperature and humidity.

In the past, there were many competing tape formats, but most of these have largely given way to Linear Tape Open (LTO), which was developed as an open standard not controlled by any single vendor. IBM and Oracle still have their own proprietary formats while also supporting LTO.

LTO has been through multiple iterations, with LTO-7 introduced in 2015 delivering a native capacity of 6TB per cartridge, or up to 15TB with data compression. The next generation, LTO-8, is expected later this year or early next year, and is anticipated to boost native capacity to 16TB, with up to 32TB possible using compression.

IBM's 3592 series of tape systems has likewise been through multiple iterations, but the firm has recently introduced the sixth generation in the shape of the TS1155 Tape Drive, which offers a native capacity of 15TB, or up to 45TB using the 3:1 compression ratio that IBM quotes for the technology.

There is no sign yet of an end to increased tape capacities. Most recently (July 2017) IBM and Sony pushed the record tape density to 200Gbits per square inch in an experimental process which uses sputtering, new tape heads and lubricant. This could lead to a theoretical maximum of 330TB in a single standard palm-sized tape cartridge, half the size of a 60TB SSD.

Compatibility is a key concern for technologies that will be used for long-term archival of information. For this reason, the LTO Consortium enforces strict rules to ensure that any LTO drive can read cartridges from the two preceding generations as well as its own, and can write data to cartridges from the previous cartridge generation. IBM's TS1155, for instance, supports existing JD and JC tape cartridges.

If tape vendors can continue to boost storage density, and keep the price per gigabyte of tape at rock-bottom levels, there is no reason why the old medium should not continue for several more decades for backup and archive.

"An enterprise with just less than a petabyte of data should focus on disk-based backup and archive. Greater than that, and I'd be looking at how and where tape could possibly play," said Longbottom.
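The cost and capacity figures above lend themselves to a quick back-of-the-envelope check. In this sketch, the $0.02 per gigabyte raw tape cost is the article's figure and the 2.5:1 ratio is implied by LTO-7's 6TB native / 15TB compressed numbers; the one-petabyte workload is an arbitrary example:

```python
# Sanity-checking the tape figures quoted above.
TAPE_COST_PER_GB = 0.02  # USD, the raw tape storage cost cited in the article

def raw_media_cost(terabytes, cost_per_gb=TAPE_COST_PER_GB):
    """Raw media cost in USD, using 1 TB = 1,000 GB."""
    return terabytes * 1000 * cost_per_gb

def compressed_capacity(native_tb, ratio):
    """Effective cartridge capacity at a given compression ratio."""
    return native_tb * ratio

# One petabyte (1,000 TB) archived on raw tape media:
print(raw_media_cost(1000))         # 20000.0 -> about $20,000
# LTO-7's 6TB native cartridge at a 2.5:1 compression ratio:
print(compressed_capacity(6, 2.5))  # 15.0 TB, matching the article
```

The same arithmetic makes it clear why tape keeps winning for cold data: the media cost scales linearly, and compression ratios only widen the gap.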

> Colo + Cloud | Dallas

THE FUTURE OF DIGITAL INFRASTRUCTURE FOR HOSTING, COLO, TELCO, CLOUD, AND MSP

September 26 2017 // Fairmont Dallas Hotel
Pre-conference programs September 25

#DCDColoCloud | Visit www.DCD.events

Issue 23 August/September 2017 25


>Awards | 2017

The Data Center Awards are Open for Entries!
Until September 29th

Category 1: Living at the Edge Award
Category 2: The Infrastructure Scale-Out Award
Category 3: The Smart Data Center Award
Category 4: The Data Center Eco-Sustainability Award
Category 5: Energy Efficiency Improvers Award
Category 6: Mission Critical Innovation Award
Category 7: The Open Data Center Project Award
Category 8: Cloud Journey of the Year
Category 9: Data Center Operations Team of the Year - Enterprise
Category 10: Data Center Operations Team of the Year - Colo+Cloud
Category 11: Design Team of the Year
Category 12: Best Data Center Initiative of the Year
Category 13: Young Mission Critical Engineer of the Year
Category 14: Business Leader of the Year Award
Category 15: Outstanding Contribution to the Data Center Industry
Category 16: World's Most Beautiful Data Center - Open for Public Voting from October 16 2017

For sponsorship and table booking information, please contact Dan Scarbrough at global.awards@datacenterdynamics.com

www.dcdawards.global

A wrinkle in time
The Internet has ways to keep time, but they may not
be good enough for a new breed of applications and
regulations, reports Sebastian Moss

Time synchronization solutions

In centralized systems, the solution for clock asynchronization is simple, as the main server dictates time to its counterparts. Examples of such solutions are Cristian's algorithm and the Berkeley algorithm. The former relies on the existence of an accurate time source connected to the centralized server, and variations of this algorithm exist to account for the network propagation time. The Berkeley algorithm, on the other hand, works in systems where a time measuring device is not present. As such, a time server will periodically ask clients for their time measurements, average them out and send the adjusted time back.

In distributed systems, clock asynchronization is more visible, as parts of such a system might reside in different time zones and within different environments. For this, the most widely used solution is NTP (Network Time Protocol). Tried and tested over the years, this is now the norm for time synchronization across the Internet.

Because of network latency this protocol can still fail in certain cases where an offset of a few milliseconds is not acceptable, in which case PTP (IEEE 1588 Precision Time Protocol) can be used, which, when coupled with Synchronous Ethernet, can deliver sub-nanosecond synchronization.

On January 26, 2016, time went wrong. As the Space Command division of the US Air Force began to decommission SVN 23, a satellite in the GPS constellation, things went awry by a whole 13 microseconds.

Fifteen GPS satellites broadcast the wrong time, immediately causing issues on the Earth below. In a post-mortem, time-monitoring company Chronos detailed numerous anonymous telecoms customers who suffered 12 hours of system errors, while BBC Radio also experienced disruptions.

As those affected by the error can attest, accurate timekeeping has become increasingly important to computer networks. Distributed systems need to be synchronized, and some require traceable and accurate timestamps for everything from financial transactions, to alarm events, to the time and duration of phone calls.

Network Time Protocol (NTP) servers do some of the work, but mission-critical or large-scale applications often run their own network time servers for increased security and redundancy. For more accurate time synchronization, Precision Time Protocol (PTP) is used (see boxout for more).

With new regulations around the corner, understanding how to keep time, and how to prepare for when clocks go bad, is becoming crucial for an increasing number of businesses.

For example, the financial sector will be subject to the EU's Markets in Financial Instruments Directive II (MiFID II) from January 2018, in which subsection RTS 25 requires timestamping to be 1,000 times more accurate than is required by current legislation.

"MiFID II has been put forward in a very, very robust way," the National Physical Laboratory's strategic business development manager Dr Leon Lobo told DCD.



"They've specified the traceability requirements, and the figures that they've put forward necessitate infrastructural upgrades. IT systems need to be upgraded."

NPL acts as the UK's National Measurement Institute, looking after Coordinated Universal Time (UTC), the world's primary clock standard.

Another regulation to watch out for is the Payment Card Industry Data Security Standard. Jeremy Onyan, director of time sensitive networks at precision time and frequency instruments manufacturer Spectracom, said: "There's a section there, 10.4, that's all about accurate and traceable time. Traceable is the key word, because if you're using time off the Internet, it may seem sufficient for your application, but it's in no way traceable.

"If there is ever an event, especially in a distributed system, and you wanted to correlate how hackers got in and what they did, it's almost impossible to do, because the disparity in the timestamps if you're using Internet-based time is so big that it's not even usable in a lot of cases."

80-120/month: the number of GPS jamming instances in the City of London (Chronos)

Most organizations trying to track time turn to satellite receivers, but are then reliant on the satellites being accurate, predominantly via GPS, but also via other Global Navigation Satellite System (GNSS) networks such as GLONASS, Galileo and Beidou.

"A lot of the receivers used for this task are repurposed navigational receivers, that are dumb or naive. They're just taking the time from the satellite stream, and they're not qualifying or error checking it," Ron Holm, sales and marketing manager of time and frequency company EndRun, said.

In the case of the January 2016 error, a smart receiver would have looked at that, done some integrity checks, not only on that part of the GPS data, but on other aspects of the GPS data, to reject that input.

Even with a smart receiver, however, the GPS signal itself can be susceptible to jamming or spoofing.

"Jamming is becoming more prevalent, particularly because the devices are very cheap to buy and the jamming is often inadvertent," Dr Lobo said. "It's a van driver taking out the tracker in his vehicle to prevent his boss knowing where he is. So he inadvertently is taking out the city block. That sort of an event is occurring on a regular basis."

Onyan concurred: "Look at most data centers, where are they typically located? They're in warehousing districts or manufacturing districts, places where they've got lots of big buildings, and you've got trucks driving by all the time. All it takes is somebody driving by with a jammer."

Much of the conversation around GPS jamming is anecdotal, with Equinix's LD4 data center in Slough being one of the few to publicly discuss GPS interference. In that case, the issue was with a 20 year old 'rogue GPS antenna' that was reflecting a powerful GPS signal and accidentally jamming those around it.

Outside of data centers, one example frequently cited is that of Newark Airport, which in 2009 opened a new GNSS-based landing approach system. "For two years they were struggling to get it to work. Sometimes it'd work great, sometimes it wouldn't, and they just couldn't figure it out," Onyan said.

"So they called in the FCC, who came in with some special equipment and discovered that there was a red Ford pickup truck that one of the contractors working the site was using, which had one of these jammers installed. Every time he was on site, he was taking down their landing approaches without meaning to."

In a 2014 report on such GNSS interference and jamming, Chronos looked at how widespread the issue was at the time. "Jamming is getting worse," the paper by Professor Charles Curry said. "Some probes are now detecting five to ten events per day; over 50 websites are actively selling jammers; and the devices being seized by law enforcement agencies are now more powerful and so have considerably greater jamming ranges.

"Whereas in 2008 GPS was the only satellite [Positioning, Navigation & Timing] system under threat, and jamming targeted its L1 frequency alone, now all frequencies of all GNSS are under attack."




When a signal is jammed, or otherwise interrupted, time servers have a backup ready to take over. They go into holdover and turn to what are essentially uninterruptible power supplies for time: crystal oscillators. These electronic circuits use the mechanical resonance of a vibrating crystal of piezoelectric material to create an electrical signal with a precise frequency.

Holm explained: "The base level oscillator is typically a TCXO, a temperature compensated crystal oscillator, and that'll provide acceptable performance for 24 hours. The next step up is an ovenized oscillator, that will provide acceptable time for approximately 30 days. An atomic oscillator provides time for several months."

Depending on how much one spends, these oscillators are meant to keep time while the jamming, or more likely broken cable or roof antenna, are mitigated or fixed.

That still leaves spoofing, which is particularly difficult to detect, Dr Lobo said, "because it's about somebody drifting the time rather than it being an out and out denial of service."

The worry is that spoofing devices are becoming easier to own and to manage. They are becoming software-defined radios, as opposed to very expensive hardware solutions.

The University of Texas at Austin's Todd Humphreys looked into the possibility of spoofing data centers in the financial sector. In a research paper, he noted that "traders could use GPS spoofing as a weapon against each other. A trader could manipulate, via GPS spoofing, the timing of a competitor's trading engines, driving the competitor out of the marketplace during a crucial trading interval. After the attack, the rogue trader covers his tracks by bringing his competitor's timing back into proper alignment with true time."

The solution proposed by Spectracom is to turn to a different satellite system. Instead of, or in addition to, a GNSS signal from a satellite system 16-21,000 km (10-13,000 miles) above Earth, the company offers access to the Iridium constellation launched by Motorola in the '90s, in low earth orbit at 800 km (500 miles).

"This is encrypted," Onyan said. "Which makes it harder to spoof, because you need to know both the encryption key, which is RSA-level encryption, and you also need to know the serial number of the specific receiver in question.

"The other advantage is that the signal is roughly a thousand times stronger, which means it can work indoors in most environments."

Dr Lobo has another idea: to do away with satellites as the primary timekeeping method altogether. NPL has started to offer a precise time service, NPLTime, over fiber in the UK, with plans to expand access to Europe and beyond.

"We are delivering over fiber infrastructure in a completely managed way, so we are monitoring the latency to every endpoint on the network and effectively offsetting the time at those endpoints. As a result, what's coming out of those end points is exactly the same as what's going in," Lobo said.

"We see it as the replacement, where GPS could form a second priority input as a backup system, but the primary traceability requirement for MiFID II is easily satisfied by this managed service that we are providing, where we provide certification of the accuracy at that point of ingress in a customer's infrastructure."

The time measurement is provided directly by NPL's cesium fountain clock, NPL CsF2. In it, a gas of cesium atoms is introduced into the clock's vacuum chamber. Six opposing infrared laser beams then cool the atoms down to near absolute zero temperatures and stop them vibrating. The two vertical lasers gently toss the ball upward, before letting it fall back down in a round trip time of about one second.

"Effectively if it was considered to be a clock, it would lose or gain a second over 158 million years," Lobo said.

But NPL is not content to stay at that level of accuracy: it is currently working on a clock where "you're looking at losing or gaining a second over the lifetime of the universe, 13.8 billion years. So there's some pretty significant capabilities there."

Questions remain over whether data centers will need that level of temporal accuracy, but one day we will find out. After all, it's just a matter of time.

All the time off the world

The GPS project was started in the United States in 1973, with the first satellite launching five years later. While initially developed for the US military, it was opened up to civilian use in the 1980s, quickly becoming the predominant navigation system and way to synchronize time around the world. But telling time from space is not an easy process, with matters of relativity coming into play.

The atomic clocks on the satellites need an accuracy of 20-30 nanoseconds, as observed by those on the ground. However, with the satellites not in geosynchronous or geostationary orbits, they move in relation to Earth-based observers, which Special Relativity predicts makes the clocks appear to tick more slowly, from our perspective; in this case to the tune of about seven microseconds per day.

Meanwhile, as the satellites orbit at 12,550 miles (20,200 km), spacetime is less curved by Earth's mass than on the surface, which General Relativity predicts means the clocks will tick faster (from the perspective of the Earth); this time to the tune of about 45 microseconds per day.

Together, if left unchecked, the satellites' clocks would be 38 microseconds faster than our time on our planet, quickly rendering the system useless.

Thankfully, the satellites were designed with this in mind. The atomic clocks are slightly slower than those on the ground to accommodate for General Relativistic time differences. At the same time, an onboard microcomputer computes Special Relativistic timing calculations as required.
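Holm's holdover tiers, and the boxout's relativity figures, can both be sanity-checked with simple arithmetic. In this sketch the oscillator stability values are assumed, order-of-magnitude numbers for illustration only, not EndRun specifications:

```python
def holdover_error_ms(stability_ppm, hours):
    """Worst-case accumulated time error, in milliseconds, for a
    free-running oscillator with a constant fractional frequency offset.
    1 ppm of frequency error accumulates 1 microsecond per second."""
    return stability_ppm * hours * 3600 * 1e-3  # microseconds -> ms

# Assumed stabilities: TCXO ~0.1 ppm, ovenized oscillator ~0.005 ppm.
print(round(holdover_error_ms(0.1, 24), 2))     # 8.64 ms after a day
print(round(holdover_error_ms(0.005, 720), 2))  # 12.96 ms after 30 days

# The boxout's GPS figures: -7 us/day (Special Relativity) plus
# +45 us/day (General Relativity) nets to the quoted 38 us/day.
print(45 - 7)  # 38
```

The linear-drift model is deliberately crude; real oscillators also age and respond to temperature, which is exactly why each tier buys a different holdover window.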



Virtually there?
Will virtual reality simulation become an
indispensable tool in the data center industry?
Tanwen Dawn-Hiscox immerses herself

Simulation is becoming integral
to data centers: as architects
ponder building plans, and sales
teams show off a virtual facility
to customers unwilling to dive
into a contract without visiting
the site. Even engineers and technicians
increasingly use simulations to improve data
center efficiency, redundancy and capacity.

Future Facilities, a London-based software


company which specializes in tools for the
design and operation of data centers, is
testing out what it believes could be the next
step in data center operations: VR simulation.
Since its inception in 2004, the company has been refining its modeling software, which is based on computational fluid dynamics (CFD), the study of fluid and gas flows using numerical analysis and data structures, to obtain quantitative predictions and simulate the interaction between gases and fluids in the data center.

To do so, it collects live operational data from the facility's monitoring and DCIM tools; these are then informed by guidelines and standards, such as measurements for data center resource efficiency as defined by the Green Grid, and ASHRAE's thermal and humidity guidelines.




Power + Cooling

Initially, the company mostly dealt with clients seeking to understand operational failures, but eventually customers turned to it for general planning and pre-emptive purposes.

"That is the number one benefit of simulation: whatever the change, you can do it upfront, it's a kind of virtual playground. You've got a model of your room and you can do whatever you want to do, whether it is maintenance on a cooling unit, or installing new IT equipment," said Mark Fenton, chartered engineer and product manager at Future Facilities.

As well as whitespace modeling, the company models generator units and cooling plants, and its simulations take into account internal and external factors that can affect design and operations, like the weather: "You could build a beautiful whitespace and not get your cooling stuff quite right, and end up with a really awful performance, and actually overheating, even if you've designed that whitespace well."

Future Facilities' team was initially skeptical about the importance of virtual reality, but after toying with the technology it is seeing much potential.

The company has developed a demonstration using Oculus Rift which allows one to wander through a series of simulations of data centers throughout the ages: from the 1950s, when a data center was effectively just a single low powered, uncooled mainframe, through the '80s, a time of blue carpets, glass door racks, monitors on shelves and untidy cables, and the 2000s, when operators discovered the joys of raised flooring and contained aisles.

In the simulation, one can overlay all sorts of data: airflows and their temperature are represented by arrows in a gradient of colors ranging from deep blue to red. One can, for example, check the operational status of the cooling equipment and simulate its reaction to different actions (helpfully color coded in green for ideal temperature, red for overheated, and blue for overcooled), or test the effect of a new piece of IT.

The immersive demonstration gives an idea of the industry's progress in understanding how to run a data center, from zero planning to a higher level of complexity in design and execution. The development makes sense in that it follows the industry's learning curve (and by extension, the company's).

The final virtual room, an edge data center containing a Vapor IO chamber, a self-sufficient cylindrical block containing six racks and an integrated adiabatic cooling system, could bring one of the possible use cases for VR in the data center:

If, as is predicted by some with the emergence of 5G, we bring data centers closer to us to power our equally hypothetical autonomous cars, so-called smart cities and virtual reality social media, then offices, former phone boxes, rooftops and cell towers may contain a handful of racks.

In this scenario, our IT would either need to be fully self-healing, or technicians would need to have access to multiple facilities at once. So, theoretically, were everything to be software-defined, a single person could sit in a control room with a mask on, and remotely operate dozens of edge sites at once without having to make the strenuous journey to every one of them.
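The traffic-light coding described above reduces to comparing each measured inlet temperature against a recommended band. A minimal sketch of the idea, using ASHRAE's recommended 18-27°C inlet range as the green band (illustrative thresholds, not Future Facilities' actual values):

```python
def status_color(inlet_temp_c, low=18.0, high=27.0):
    """Map a rack inlet temperature to an overlay color: blue for
    overcooled, green for the recommended band, red for overheated.
    The 18-27C defaults follow ASHRAE's recommended envelope; any
    given tool's actual thresholds may differ."""
    if inlet_temp_c < low:
        return "blue"
    if inlet_temp_c > high:
        return "red"
    return "green"

print(status_color(22.0))  # green
print(status_color(31.5))  # red
```

In a real overlay the same classification would run per sensor, per timestep, fed by the CFD solver rather than a single number.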



But before this can happen, the next step in improving existing VR simulation technology has to be the ability to interact with the model and the real world. "This is the early stage beta, but really the next thing we want to be able to do, because at the moment you can control where you move and where you look but you can't interact with the model in the same way that we eventually see it happening, would be that you could pick a device, get some information about it, move it, decommission it, whatever."

Such advances could prove useful for training purposes. With the data center industry growing steadily all the time, there is a massive shortage of specialized staff. An Irish university recently launched the first ever Bachelor's degree in data center engineering, an online course that will require its students to visit the north western Irish town of Sligo, or Le Mons in Belgium, for practical sessions. In theory, with interactive VR, they could do these modules remotely.

Future Facilities even sees itself taking the technology one step further, and using augmented reality in the data center. One could wear a Google Glass-like optical head-mounted display (OHMD) device whilst walking around a facility, and overlay DCIM tools and simulations to get information streamed directly in front of one's eyes.

For Mark Fenton, "that's kind of where we see the next stage for VR and then the final bit we really want to get to." But for now, the benefits of a lot of these new technologies have yet to reach their full potential.

An important factor in that process, according to Future Facilities director Jonathan Leppard, is a lack of courage (and of course resources) in the industry.

"It's almost like no-one wants to put their foot forward first. The financial sector never does it, it's going to wait until generation two or generation three. It takes someone to go 'this is it, this is how it goes' and it's going to be the Google and the Facebook, the big carriers. And whoever it will be is going to take over that area."

As it stands, taking modeling into virtual reality for marketing, pre-sales and education feels like a small step up from its less immersive predecessor. But being able to interact with a VR model of a data center would open a whole new page of potential for engineers and technicians.

Before this happens, however, DCIM and monitoring tools, network functions virtualization (NFV) and software-defined networking (SDN) will need to become ubiquitous.

"We're sitting on a plateau of technology at the moment; I don't see how data centers are moving forward quickly at this point in time. But just as soon as that next level goes, you'll see a whole new raft of data centers, big, small, in water, in space, doesn't matter. Once it becomes automated and self-healing and managed, then you'll see an explosion in the industry, that's when we'll take off again to another step of progress."

For now, though, we can only speculate. We're sitting on the verge of something that feels important, but isn't quite there yet.


Advertorial: Sify Technologies

Keeping You Ahead with Sify
Hyper Connected Data Centers

Keeping your business on cloud nine with futuristic Data Center services is what we excel at. We are Sify. We design and establish Data Centers, and are one of India's leading Data Center service providers. Our enviable track record of providing uninterrupted services extends to over 9,000 organizations across sectors and scale.

Sify Technologies provides data center services like colocation, managed hosting, DC IT services and DC migration. Through our evolved infrastructure and refreshed technology, we offer state-of-the-art ICT solutions that keep you ahead. As your technology partner, we empower you with 100% uptime and transform your business by opening innovative possibilities. Our Data Centers are ideally suited to be both the primary Data Center and DR. These are strategically located in different seismic zones across India, with highly redundant power and cooling systems that meet and even exceed the industry's highest standards.

Our impressive portfolio of over 425,000 sq ft of server farm is spread across 6 Tier III Data Centers, 15 Tier II Data Centers, 6 State Data Centers and several more for private clients, all built to exacting specifications and best-in-class global standards.

Hyper Connected Data Center Features
43 Data Centers connected by high capacity fiber access
Internet Exchange present in Data Center in Mumbai
Zero Latency access to AWS Direct Connect in Mumbai
Google POP enables peering within Data Center in Noida
Carrier neutral with presence of major telecom operators within Data Centers

With over 17 years of operational and technical expertise, Sify serves over 300 DC customers spread across BFSI, telecom, pharma, retail, manufacturing, media and other sectors. As a cloud and network services provider, Sify Technologies understands that reliable and affordable connectivity is key to leveraging data center and cloud investments made by organizations. Sify's cloud cover connects over 43 data centers across the country on a high speed network. The company provides high capacity, multi-protocol, low latency networks across multiple cities in India, to address the unique requirements of data center to data center traffic as compared to traditional data center to branch traffic. Sify Technologies has marquee customers who leverage this network to connect their disaster recovery and near disaster recovery facilities to their primary data center.

Sify Technologies has evolved from a network and data center service provider to a full-fledged converged ICT player, with capabilities for data center transformation, application integration and transformation integration services. It has demonstrated its prowess in many DC transformation projects by implementing best-of-breed solutions, which enabled customers to experience the best-fit solution to address their current and future requirements and de-risk technology adoption. This transition was played on the premise that the brand had built a sound infrastructure foundation and aligned services on it, in sync with the ever evolving needs of enterprises. Therefore, it believes that it is able to serve the requisite services with the same level of SLAs across the board.

Sify Technologies is also well positioned to address the increasing market opportunities on the upcoming platforms of Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), as enterprises move from build to subscribe and from control to visibility.

It is positioned as one of the leading vendors in the prestigious Gartner Magic Quadrant for cloud-enabled managed hosting, Asia Pacific, for the past three years. Its cloud platform has scale-out capabilities to compete more effectively for public cloud requirements, and it is also extending its support to AWS and Microsoft Azure, including offering managed services on top of third-party cloud providers.

As a full scale ICT player, Sify Technologies is in the best position to align these expectations to the market place for any enterprise.

Contact Details
Mark Ryder
Managing Director Europe
Phone: +447429169426
Email: mark.ryder@sifycorp.com



Do we curb demand - or feed it?

Data centers' insatiable demand for power may actually be just what the utility grids need, and the thing that has to change is not the infrastructure but the people. Peter Judge reports from DCD's Energy Smart event in San Francisco

Many people are asking if data centers use too much energy. The real issue is bigger: can data centers and the energy supply industry work together to deliver electricity and online services which meet our needs... without costing the earth?

Ten years ago, a report from Lawrence Berkeley National Laboratory (LBNL) warned that data center power usage was out of control. Last year an updated report found that energy use by US data centers has actually leveled off. But there is a lot more to the story than that, according to senior executives from both the data center and the power industries, who gathered for DCD's first Energy Smart Summit in San Francisco in June.

Online services are expanding and developing rapidly, making data centers a fundamental part of the infrastructure supporting human society. According to Jim Connaughton, a former presidential environmental adviser now working for Nautilus Data Technologies: "They are the foundation of the new economy."

But data centers are intrinsically fast moving. The IT equipment is replaced every three to five years, and the cloud applications running on it can have a lifetime of months. This Protean package ironically depends on a much less flexible life support system: the electricity grid.

"Energy investments are intrinsically generational in length, or you can never make them pay for themselves," explained Don Paul of the University of Southern California Energy Institute. But the digital world, which is wrapped around that now, like two strands of DNA, is cycling around at ten times the rate.

The two worlds are very different, said David Rinard, director of sustainability at Equinix: "You have an entrepreneurial approach [in the cloud] versus a blue-chip, older approach to the network grids." And the consequences of their failure are different too: "If you lose the Internet you go back to the 1980s. If you lose the grid, you go back to the 1880s."

But regardless of the timescales, there are big changes happening in both worlds, which have an impact on the way both services are delivered. Energy grids are moving (unevenly) towards renewable sources. And data centers are changing the way they handle reliability. Both of these changes are interlocking.

David Murray | Hydro-Quebec
Dean Nelson | Infrastructure Masons




It seems that while we may want to curb the data center sector's energy demands, their very size could actually help the utility sector in its move to renewables, by being good, very large and demanding customers.

Gary Demasi, Google's director of data center energy and location strategy, said: "We have always locked horns with the utilities. They are not structured to deliver us the product that we want, and we've really got to challenge that. We've been successful, but it's been a very rocky road and we've had to do a lot of what I would consider to be unnatural things."

At first, Google found utilities reluctant to sell it renewable energy, but as a champion of power purchase agreements, Google is now the world's largest renewable power customer.

In Nevada, Switch helped change state policy so renewable power contracts are available to all, because it has purchasing muscle, according to Adam Kramer, vice president of strategy: "In 2016, in Las Vegas, Switch accounted for the entire growth in a large rate class which includes a lot of casinos. The growth rate was one percent."

"We are the utilities' best customers, because we have a consistent baseline of demand," pointed out Dean Nelson, CTO of Uber and founder of Infrastructure Masons. "That demand is going to increase, which will help stabilize the demand for the grid."

Connaughton believes it's perfectly OK to have growth: "If data centers had more demand, we could actually refurbish some of the sagging infrastructure faster, because there's more money flowing to the utilities."

Beyond that, data centers have backup power systems, which could be used to help the grid out, perhaps by using them so data centers can power themselves at peak times in so-called demand response schemes. Data centers could in this way effectively erase themselves from the grid at critical moments, said David Murray, president of distribution at Hydro-Québec, Canada's largest utility. Alternatively, they could even sometimes contribute power from their backup sources to the grid.

There is "a dire need for energy efficiency to be married with demand response," according to Priscilla Johnson, data center water and energy strategist at utility PG&E.

Between 1955 and 2005, computations per Watt increased by 1bn

The idea of sharing precious backup systems has usually been anathema to data center operators focused on delivering reliable service, but that could change if they deliver reliability in different ways, according to Peter Gross, vice president of mission critical systems at fuel cell vendor Bloom Energy.

"Historically there was always a buffer between the utility and the server," said Gross. "We had this very complex infrastructure support system consisting of UPSs and generators and transfer switches. It was expensive, but resiliency and reliability drove pretty much every decision in the design and operation of data centers."

Things are different in 2017, where the cloud is increasingly using a distributed resiliency concept in which the reliability of individual data centers is not quite so vital, he said.

So is everything going smoothly? Don Paul isn't so sure. The leveling of data center energy demands found by LBNL could just be a temporary hiatus caused by a period when efficiency gains canceled out the uncontrolled growth in digital demand.

"People are underestimating the digital growth rate," warned Paul. "Energy demand is flat because there have been massive efficiency gains, but they have been consumed by increased demands. We've consumed our efficiency gains, just like we always have. History shows that as economies grow and civilizations advance, they always consume their efficiency gains." Energy demand will rise again, he says, unless "the efficiency per byte continues to have dramatic changes."

"The demand for digital services is limitless as long as they appear to be free," said Nelson: "We're not going to change human behavior."

Gamification might be one approach to try to change behavior, said Dr Julie Albright of USC's department of applied psychology. Utilities could feed energy usage figures back to consumers and apply social pressures so they compete with friends to use less. "We could make it fun," she said, though the word "fun" and the word "utility" in the same sentence could be a problem.

David Rinard of Equinix agreed, taking us back to the culture clash between utilities and tech. The tech world makes easy-to-use smartphones, while the tools provided by utilities "are not of the same consumer grade quality. They are not intuitive and easy to use. They are not something you would find at Best Buy."

In the end, data center operators may have to change people's behavior, rather than themselves.

DCD is planning more Energy Smart events. For news and coverage, check out features and videos at bit.ly/DCDEnergySmart

Dr Julie Albright | USC

Issue 23 August/September 2017 35
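The demand response schemes described above boil down to a simple control rule: when the grid's price signal spikes, the facility shifts its load onto on-site backup generation, effectively taking itself off the grid at critical moments. A minimal illustrative sketch in Python; the price threshold is an invented figure, not any utility's real tariff or API:

```python
# Illustrative demand-response rule: when the grid price signal crosses a
# threshold, the facility runs on its backup plant instead of grid power.
# The threshold below is invented for this sketch.

PRICE_THRESHOLD = 120.0  # $/MWh above which on-site backup becomes worthwhile

def supply_plan(hourly_prices):
    """Return 'grid' or 'backup' for each hour of price data."""
    return ["backup" if p > PRICE_THRESHOLD else "grid" for p in hourly_prices]

# A day with an evening peak: cheap hours stay on the grid,
# the three peak hours are ridden out on backup power.
day = [60, 58, 70, 95, 140, 180, 130, 90]
print(supply_plan(day))
```

A real scheme would of course weigh fuel cost, battery state and contractual commitments, but the erase-yourself-at-peak logic is this simple at heart.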


> Webinars | 2017

ON DEMAND
Are you Hybrid Ready?
It's Where IT is Going
Speakers:
Chris Ortbals, Executive Vice President of Product
Management, QTS Data Centers
Andrew Boardman, Director of Sales, North East and
Mid-Atlantic, QTS Data Centers
Moderator:
Stephen Worn, CTO, DCD
This on-demand webinar examines how a hybrid
methodology is impacting IT management, data center
and cloud deployments and the ways that providers have
evolved to meet a growing demand for fully-integrated,
hybrid solutions.
Watch and listen, as our subject experts address four key
issues, including the challenges IT managers face when
implementing a holistic hybrid strategy.

u Watch Now: bit.ly/HybridITWebinar

ON DEMAND
4 Steps to a Successful High-Speed
Migration Strategy
Speakers:
John Schmidt, Vice President Global Data Centre
Solutions, CommScope
Peter Judge, Global Editor, DCD
This recent one-hour webinar is packed full of insight and
provides clarity on the subject of data center migration.
Learn how the shift to collaborative, parallel processing is
impacting infrastructure design.

u Watch Now: bit.ly/MigrationWebinar

For more information please contact webinars@datacenterdynamics.com


Foundations of the smart city

Data centers that support the smart grid, but are independent from it, will be the first step towards building smart cities, says Chris MacKinnon

Chris MacKinnon
Canada Correspondent

The smart city, built on and around the Internet of Things (IoT), is an interconnected network of devices that gather, store, and share data while communicating with one another to improve efficiencies across a variety of functions. Smart grids, on the other hand, utilize the same IoT concepts, but in the energy grid space, where two-way communications between the utility company and the end customer can improve provision of energy where and when it is needed. Fred Tanzella, CTO of Green House Data, defines the smart concept well.

By providing power to cities via a smart grid, Tanzella told DCD, cities can communicate data loads, minimize costs, restore power faster after a disruption, integrate renewable energy resources, and provide reliable operations in multi-carrier networks.

In Tanzella's opinion, the smart grid promises to let data centers take advantage of energy price arbitrage by moving loads between geographical areas served by different utilities. He says energy costs fluctuate across countries and time zones. Running mission critical applications in time zones that are not in peak power times provides a methodology for saving cost and energy. Renewable energy sources that the smart grid will have access to will allow data centers to supplement peak power requirements efficiently.

Neetika Sathe, director of advanced planning at Alectra Energy Solutions, agrees that the smart grid can transform data centers. IT is changing quickly, and many data centers around the world are outgrowing their existing capacity to power and cool IT systems. When connected to a legacy distribution system, these data centers need to upgrade or

replace their power networks, the transformers, switches, UPSs, cabling and connectors. This requires power shutdown for the data centers, which poses added cost.

Sathe elaborates: "Energy costs represent the single largest component of operating expense, and a potential barrier to future expansion. Also, while the level of power quality and reliability offered by legacy distribution systems is acceptable for the majority of customers, voltage and frequency fluctuations, harmonics and short-term power outages can have costly and disruptive effects on data centers."

Distributed energy storage and generation, and microgrids offered by smart grids, are seen as complementary or alternative solutions to meet data center energy requirements, Sathe says. "Smart grid technology can enhance reliability and also offer local, distributed energy solutions to keep a data center running even during outages. Using a smart grid allows for the diversification of energy resources that make up a data center's onsite generation. They also allow data centers to improve the ROI on assets that traditionally have been sunk costs (that is, most data centers relied on natural gas generators for resiliency). Using energy storage allows for the increased utilization of the asset for other functions like ancillary services, disaster recovery, etc."

Eric LaFrance, senior trade commissioner at Hydro-Québec, says: "We are slowly replacing the human brain with machine decisions, which will require far more electricity." He suggests that blockchain, the distributed ledger technology behind the Bitcoin currency and other systems, can replace human validation in finance, because it is more efficient and more hacker-proof.

LaFrance says self-driving cars are another example of how we are replacing human decisions by machine intelligence because, in his opinion, it is more reliable. This new way of doing things will require a lot more energy, because we will need places to store massive amounts of information, and servers to make these decisions for us.

What's more, LaFrance says power utilities and data centers can meet in strategic partnership on integrated microgrids and demand response, but mainly during peak periods. The utility will be able to save a lot of money when, during peak periods, it can rely on customer generators, batteries, or power production facilities, instead of buying expensive energy on the spot market. Hydro-Québec is already a leader in this field and will seek nearly 1,000MW from its industrial and commercial customers during the winter peaks. LaFrance says decentralized energy production, coupled with large battery packs, will ensure data centers are online when their utilities are down.

But smart energy does not come without its challenges, as Tanzella pointed out. There are indeed current concerns, or hot topics, for data centers when it comes to smart grids. Data center managers have historically been tasked to maintain power and cooling for mission critical applications regardless of the power bill. Dynamically migrating these loads across geographic regions is simple in concept, but can be difficult in practice.

Tanzella says the more risk-averse data center managers will likely be slow to adopt, but as the technology and software become more prevalent, data center managers will embrace the smart grid's functionality.

LaFrance says in the future we will see more small edge computing facilities responding to the always-growing proximity demand for IoT, mobile devices and other low-latency needs. He added: "Large hyperscale facilities should be located further away from dense areas to avoid the growing demand and pressure on utilities that are located near dense areas, which minimizes a utility's investment due to the lack of space and power. Large hyperscale operators should, and hopefully will, also base site selection decisions mainly on criteria that consider the use of renewable energy."

On the power network side of data centers, Sathe says the trend could be towards meeting the energy requirements of data centers in a decentralized fashion. The data center of the future could be fully integrated with the grid in a way that enhances the local resiliency and reliability. For those data centers where reliability is the number one priority, being connected to the bulk grid while having their own distributed energy resource (DER) would be advised. They could also participate in transactional energy markets to stack up values of the smart grid assets they own.

As DCD has found in its own events, data centers will be fundamental to moves towards smart cities, as well as smart grids.

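Tanzella's price arbitrage, and the dynamic load migration discussed above, reduce to a simple decision: place a deferrable workload in whichever region's power is currently cheapest. A minimal sketch in Python; the region names and spot prices are invented for illustration, not real market data:

```python
# Illustrative sketch of energy price arbitrage: a deferrable workload goes
# to whichever region currently has the cheapest spot power.
# Regions and $/MWh prices are invented for this example.

def cheapest_region(spot_prices):
    """Return the region with the lowest current spot price ($/MWh)."""
    return min(spot_prices, key=spot_prices.get)

# Regions sitting in off-peak time zones tend to be cheaper at any moment.
prices = {"virginia": 92.0, "dublin": 61.5, "singapore": 47.0}
print(cheapest_region(prices))  # prints "singapore"
```

The hard part, as the article notes, is everything around this one line: moving the data, meeting latency targets, and doing it without breaking reliability commitments.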
DCPRO Mission Critical Training Solutions

New website
launching Q3

A complete solution for


the data center industry.
For professional development and entire workforce programmes

Online Classroom Academy


An industry leading suite of online DCPRO offers a wide range of Reduce per-person training costs
courses for knowledge top-ups or classroom courses taught by subject and ensure continuing professional
remote learning. matter experts in an interactive manner. development for your workforce.

DCProfessional Development provides scalable eLearning and face-to-face data center training for the global mission critical sector. Courses are
developed and delivered by an international network of industry experts using the latest educational techniques. Globally recognized and accredited,
our training addresses the real world tactical demands faced within the data center industry.

www.dcpro.training | info@dc-professional.com
Containers
unlock the cloud
Dan Robinson
Correspondent

They told us the cloud would allow workloads to move freely between in-house facilities and public services, but so far it hasn't happened. Dan Robinson believes that containers might finally get the cloud moving

Cloud computing is no longer viewed as some exotic new technology and has become an accepted part of an organization's IT toolkit for meeting business needs. At the same time, cloud is less mature than other IT sectors, and can prove to be more complex than unwary adopters may expect.

One of the promises of cloud computing has long been that users should be able to move workloads from their own data center to that of a cloud service provider and back again, if required, or even between public clouds. This might be because an application or service calls for more resources than the organization has available, or simply because it is more cost-effective to run it off-premises at that time.

Despite this, cloud adoption has broadly followed the pattern of private clouds being used to operate traditional enterprise workloads, while new-build applications and services are developed and deployed in a public cloud platform such as AWS. Even where traditional workloads have been farmed out to a service provider, this has typically been a case of colocation or of a hosted private cloud arrangement.

According to Ovum principal analyst Roy Illsley, this is because many organizations are still at an early stage of cloud adoption, and are just looking to get their foot on the first rung of the ladder.

"We are not really seeing companies have [workload mobility] as their first concern. What they are really looking to do is lift and shift to the cloud, so they might have this three-tier app or database based on legacy hardware, and they want to know how to get that to the cloud as a first step. Only once they've done that, they start looking at how to transpose that workload, change that workload, and think about moving it somewhere else," he said.



Colo + Cloud

There are good reasons why workload mobility has never really taken off. VMware with its vMotion feature has supported live migration of virtual machines from one host server to another for over a decade, but it is one thing to perform this inside a data center and quite another to do it from one data center to another.

Things get more complicated if you want to move a workload from your own cloud to another one that you do not control. It is not practical to move a virtual machine to a cloud based on a different platform, because of differences in the hypervisor, the APIs and the management tools used. Even if it is based on the same platform, you may not have the same level of management oversight and control as you do over workloads running on your own infrastructure.

Then there is the fact that enterprise workloads seldom operate in a vacuum; they depend on other resources to function, such as shared storage, an SQL database server or directory service. Unless such services are also replicated to the cloud, traffic between the workload and these resources will have to be routed to and fro across a WAN connection instead of across the data center itself.

Perhaps it is for this reason that VMware changed direction and ditched its vCloud Air service for Cloud Foundation, a platform that it describes as a self-contained software-defined data center (SDDC) that can be provisioned onto commonly used public clouds such as AWS and IBM SoftLayer.

However, the prospect of workload mobility could move nearer to reality thanks to containers. Most people are familiar with containers thanks to the efforts of Docker, but the ecosystem is made up of various technologies and implementations.

What all container platforms share is that they enable a piece of software and its dependencies (such as code libraries) to be wrapped together into an isolated space: the container. Multiple containers can run on the same host, like virtual machines, but containers are more lightweight on resources, and a given server can operate more containers than virtual machines.

"The containers approach gives developers the opportunity of writing a new cloud-native app that, provided you've got support for containers and you're on Linux, you can then operate that container on a platform that runs in various other cloud environments," said Illsley.

Containers are a less mature technology than virtual machines, and the various platforms are thus still in the process of developing and perfecting associated components such as orchestration, monitoring, persistent storage support and lifecycle management tools. These are largely essential for operating container-based application frameworks at any kind of scale, as the Internet giants such as Google or Facebook do.

Docker has established its platform as the leading format for packaging and running containers, but there are several tools available for orchestration, such as the Kubernetes project, which originated at Google, or the Mesos project from the Apache Foundation, as well as Docker's Swarm tool.

Kubernetes is integrated into several enterprise platforms, such as VMware's Photon, Red Hat's OpenShift application platform and even Microsoft's Azure Container Service. Meanwhile, Mesos is used by many large web companies, such as Twitter, Airbnb and eBay, as it can scale to manage tens of thousands of nodes running containers.

The elephant in the room is that containers are tied to a particular operating system kernel. This is typically Linux, as container platforms such as Docker build on features in the Linux kernel, but Microsoft has recently begun adding support for containers into Windows Server and its Azure cloud service.

While a container image created on a Linux system should run on any other Linux system, you cannot take a container image created for Windows and run it on Linux, and vice versa. Having said that, Microsoft announced at DockerCon in April 2017 that it will support Linux container images in Hyper-V Containers, under which the container lives inside a small virtual machine.

Containers are unlikely to completely replace virtual machines, but for many workloads, especially scale-out distributed workloads, they are becoming the preferred method of deployment. This is not just because they are easier and quicker to provision, but also because they are easier to migrate from one system to another, finally beginning to fulfill more of the promise of the cloud.

What is Docker?
Docker is the company and platform that everyone thinks of when containers are mentioned, but it did not create the concept. Instead, it took some capabilities (known as LXC) built into the Linux kernel and developed them into a finished product that made it easy for users to create and operate a workload using one or more containers, making it popular with developers for use as a dev-and-test platform.
Part of Docker's success is that it not only created a set of APIs with which to control the Docker host, but also a package format for storing and distributing applications as container images. The Docker architecture also calls for a registry to act as an image library, and users can maintain their own registry or fetch ready-made images from a public one such as the Docker Hub.
A container under Docker's platform is therefore basically a wrapper for delivering and running an application or service along with any code libraries it depends on, all inside its own little bubble.
However, the underlying LXC capabilities it built on were more geared towards partitioning a Linux server into multiple separate user spaces, similar to Zones in Oracle's Solaris operating system or even older mainframe concepts.



> Zettastructure | London

OF DRIVERLESS CARS
AND SERVERLESS
DATA CENTERS
Europe Summit: November 8

Free passes for qualified end users

November 7-8 2017 // Old Billingsgate Market, London


Headline Sponsor Principal Sponsors Lead Sponsors

For more information visit www.DCD.events

@DCDConverged #DCDZettastructure Datacenter Dynamics DCD Global Discussions


Calendar
Events
DCD> Edge Roundtables
25 September, Dallas
Data center leaders explore the impact of IoT, mobility and edge
A Colo + Cloud pre-event special
200 places only

> Colo + Cloud | Dallas
September 26 2017

> Canada 4.0 | Toronto
December 14 2017

> Zettastructure | London
November 7-8 2017 // Old Billingsgate Market

Open Compute Project workshops


on servers, racks, power and more
at Dallas, Toronto and London events

Events
> México | Mexico City
September 26-27 2017

> Peru | Lima


October 18 2017

> Brasil | São Paulo


October 30-31 2017

> Chile | Santiago


November 15 2017

Training
Data Center Cooling Professional:
Lima, Peru
September 25-27 2017

Energy Efficiency Best Practice:
México, D.F., Mexico
September 28-29 2017, Hotel Novit

Data Center Cooling Professional:
Santiago de Chile, Chile
October 2-4 2017

Data Center Cooling Professional:
México, D.F., Mexico
October 9-11 2017, Hotel Novit

Energy and Cost Management:
Buenos Aires, Argentina
October 16-18 2017

Critical Operations Professional:
México, D.F., Mexico
October 23-25 2017

Data Center Design Awareness:
Lima, Peru
October 23-25 2017

Data Center Design Awareness:
Madrid, Spain
October 23-25 2017, Hotel Zenit Abeba

Data Center Design Awareness:
Barcelona, Spain
October 23-25 2017

Data Center Power Professional:
Porto Alegre, Brazil
October 25-27 2017

Energy Efficiency Best Practice:
Madrid, Spain
October 26-27 2017, Hotel Zenit Abeba

Critical Operations Professional:
Lima, Peru
October 30-November 1 2017



DCD Calendar

Training

Data Center Design Awareness:
Stockholm, Sweden
September 11-13 2017

Energy Efficiency Best Practice:
Stockholm, Sweden
September 14-15 2017

Data Center Design Awareness:
London, United Kingdom
September 18-20 2017

Energy Efficiency Best Practice:
London, United Kingdom
September 21-22 2017

Data Center Power Professional:
London, United Kingdom
October 2-4 2017

Data Center Cooling Professional:
London, United Kingdom
October 9-11 2017

Events

> Zettastructure | Singapore
September 20-21 2017

> Hyperscale China | Beijing
November 2 2017

> Enterprise | Mumbai
November 9 2017

> Converged | Hong Kong
November 10 2017

Training
Data Center Design Awareness:
Osaka, Japan
August 7-9 2017

Data Center Design Awareness:
Beijing, China
August 29-30 2017

Data Center Design Awareness:
Sydney, Australia
August 30 - September 1 2017

Data Center Design Awareness:
Hong Kong
September 4-5 2017

Critical Operations Professional:
Hong Kong
September 6-8 2017

Data Center Design Awareness:
Shanghai, China
September 8-9 2017

Data Center Design Awareness:
Singapore
September 11-13 2017

Data Center Design Awareness:
Tokyo, Japan
September 11-13 2017

Energy Efficiency Best Practice:
Singapore
September 14-15 2017

Critical Operations Professional:
Singapore
September 18-20 2017

Energy and Cost Management:
Singapore
September 25-27 2017

Events

> Türkiye | Istanbul
December 5 2017
Wyndham Grand Istanbul Levent

SE Asia Data Center Week
15-21 September, Singapore
DCD>Zettastructure conference & expo
CxO and cyber security summits
DCPRO training workshops
Data Center Week Awards
F1 Grand Prix dinner

DCD>Events // www.dcd.events
Sales: chris.hugall@datacenterdynamics.com
Marketing: jake.mcnulty@datacenterdynamics.com
Speaking: rebecca.davison@datacenterdynamics.com

DCD>Online // www.datacenterdynamics.com
Sales: yash.puwar@datacenterdynamics.com
Marketing: ryan.hadden@datacenterdynamics.com

DCPRO>Training // www.dc-professional.com
Contact: liam.moore@datacenterdynamics.com



Viewpoint

The unsung hero

People say data centers are not sexy, but they are wrong:
in fact, this industry is filled to the brim with all kinds of
smut. Data centers have been shaped by pornography
as much as they have been shaped by developments in
military technology or mobile communications.
According to conservative estimates, around 10
percent of all data transferred across the Internet features naked people.
During 2016, the world's most popular adult website was responsible for
streaming 99GB of data every second.
Some of the advancements in technology that were directly
influenced by our penchant for X-rated material include video codecs
and compression, online advertising and payments security.
They say YouTube democratized access to online video, but they are
wrong: after all, no matter what content you post on YouTube, you will
have to use the platform's proprietary tools. Porn gave rise to thousands
of different video engines and hosting platforms, has kept an army of
web developers in work and created a shadow economy worth billions.
Adult websites were also some of the first to support Bitcoin at scale
as a way to avoid embarrassing items in customers' bank statements.

Carnal pleasures are a big business: the porn industry is worth


around $97 billion, more money than Major League Baseball, the
National Football League and the National Basketball Association
combined.
Interestingly, most of the largest online porn websites you might have heard about, YouPorn or RedTube, are all owned by a Luxembourg-based company with a fairly neutral name: MindGeek (formerly Manwin).
MindGeek describes itself as a leader in web design, IT, web
development and SEO. It employs more than 1,000 people, and serves
more than 115 million users every single day. It uploads 15TB of new
content every 24 hours.
Initially, MindGeek didn't even produce porn, instead focusing on analytics and content delivery. But the company played its cards so well it eventually acquired some of America's largest studios, and became one of the top bandwidth consumers in the world.
Whatever you think about the subject, sex has always been a great motivator, driving technological progress forward even as it made people blush. So thank you, Internet pornography, for making our infrastructure better.
Max Smolaks
News Editor

