ON THE COVER
18 Chip wars: AMD gets Epyc, Intel strikes back, ARM hangs on

NEWS
07 Top Story: China forces Apple to build
07 In Brief

FEATURES
24 Tape stays strong
30 Virtual reality arrives
34 Do we curb energy demand?
37 Smart city foundations
41 Containers unlock the cloud

OPINION
46 Max Smolaks knows what really drives the Internet

REGIONAL FEATURES
Simulations and simulacra in the world of data centers
Every nanosecond counts as satellites show their age
Don't be power hungry, become energy smart
Asia's security experts speak

CALENDAR
44 Events and training: DCD's global happenings

INTERVIEW
19 AMD on the offense: An exclusive sit down with Dr Lisa Su, AMD's CEO, on her company's fight to win back the server market

EDITOR'S PICK
27 A wrinkle in time: The Internet has well-developed techniques for synchronization. But they may not be good enough for emerging applications and regulations
Issue 23, August/September 2017
From the Editor

White Space
Fight Club

Fight metaphors aren't normally my cup of tea, but you can't avoid it when you look at the conflict in server chips, between the grizzled reigning champion Intel, and AMD, a fan-favorite contender, making a late bid for glory.

With tempers rising, I assigned my colleagues Max Smolaks and Sebastian Moss to the ringside, each covering one corner. Max got inside information on Intel, while Seb got privileged access to AMD, including an interview with the CEO, Dr Lisa Su (p18). Between them, they also cover ARM and other players. My only worry is - after getting roped into this bruising contest, will Max and Seb still be friends?

More adventures in time for Seb: distributed cloud services can only work when the services have an agreed and shared reference system time. How is that time delivered? It turns out there's a whole sub-industry of time servers, using esoteric techniques like cesium fountain clocks and rubidium crystal oscillators. That may sound abstruse, but new financial rules mean you will need to know about this (p27).

Cooling tech is more standard fare for DCD readers, so Tanwen took a virtual reality look at its history (p30). Through a combination of VR headsets and computational fluid dynamics modeling, Future Facilities gave her a magical historical tour of data center efficiency, with all the stages of data center construction reproduced in virtual space. As so many people have told us, whatever power and racks you may have, it's all about the air flow.

Capacity demands are growing, but our Energy Smart summit in San Francisco says this is not necessarily an environmental problem (p34). Data centers use energy steadily, which is good news for the utility grid, which likes to deliver power that way. The bad news is that both utilities and data centers have set up a system where services delivered from data centers appear to be free. Demand grows without limit, and the global infrastructure has so far been scarily good at meeting that demand.

It turns out that the best way to exit this cycle may be to intervene in human behavior, and create feedback which helps people to realize and limit their own environmental demands. That sounds a lot like blaming the users. But put it another way. If curbing demand were a fight, it would be a fight between humans and an inanimate global system they created. In dystopian sci-fi, the global system might win. In the real world, I'd say the human race could still score a knockout.

"Curbing demand is a fight between humans and an inanimate global system we created"

98% - Share of the x86 server market owned by Intel. AMD's portion "rounds up to 1%," AMD CEO Lisa Su told DCD (p19)

HEAD OFFICE
102-108 Clifton Street
London EC2A 4HW
+44 (0) 207 377 1907

Peter Judge, Global Editor, @Judgecorp
Max Smolaks, News Editor, @MaxSmolax
Sebastian Moss, Reporter, @SebMoss
Tanwen Dawn-Hiscox, Reporter, @Tanwendh
David Chernicoff, US Correspondent, @DavidChernicoff
Virginia Toledo, Editor LATAM, @DCDNoticias
Celia Villarrubia, Assistant Editor LATAM, @DCDNoticias
Paul Mah, SEA Correspondent, @PaulMah
Tatiane Aquim, Brazil Correspondent, @DCDFocuspt

DESIGN
Chris Perrins, Head of Design
Holly Tillier, Designer
Mar Perez, Designer

ADVERTISING
Yash Puwar, Head of Sales
Aiden Powell, Global Account Manager

Peter Judge
DCD Global Editor
bit.ly/DCDmagazine
FIND US ONLINE
datacenterdynamics.com datacenterdynamics.es datacenterdynamics.com.br twitter.com/DCDnews | Join DatacenterDynamics Global Discussion group at linkedin.com
SUBSCRIPTIONS
datacenterdynamics.com/magazine
TO EMAIL ONE OF OUR TEAM
firstname.surname@datacenterdynamics.com
© 2017 Data Centre Dynamics Limited. All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, or be stored in any retrieval system of any nature, without prior written permission of Data Centre Dynamics Limited. Applications for written permission should be directed to Jon McGowan, jon.mcgowan@datacenterdynamics.com. Any views or opinions expressed do not necessarily represent the views or opinions of Data Centre Dynamics Limited or its affiliates. Disclaimer of liability: Whilst every effort has been made to ensure the quality and accuracy of the information contained in this publication at the time of going to press, Data Centre Dynamics Limited and its affiliates assume no responsibility as to the accuracy or completeness of and, to the extent permitted by law, shall not be liable for any errors or omissions or any loss, damage or expense incurred by reliance on information or any statement contained in this publication. Advertisers are solely responsible for the content of the advertising material which they submit to us and for ensuring that the material complies with applicable laws. Data Centre Dynamics Limited and its affiliates are not responsible for any error, omission or material. Inclusion of any advertisement is not intended to endorse any views expressed, nor products or services offered, nor the organisations sponsoring the advertisement.

PEFC Certified: This product is from sustainably managed forests and controlled sources. PEFC/16-33-254 www.pefc.org
Vox Box

Dale Sartor, Scientist/engineer, LBNL
Can the pace of efficiency match the increase in demand?
"For the last five years, efficiency has kept up, because of Moore's Law, virtualization and the increasing efficiency of the infrastructure measured by PUE. There are a lot of people questioning whether Moore's Law can continue, you can't get better than a PUE of 1, and you can't get more than 100 percent utilization. There could be some issues beyond 2020." bit.ly/energysmartsartor

Donald Paul, Director, USC Energy Institute
What is different about energy policy in the US?
"Energy policy in the US is unique. The states are sovereign entities with certain powers under the constitution - and those states have been the primary drivers of energy policy for centuries. As well as this, virtually all the energy assets are privately owned. In most countries, natural resources are owned by the government, not individuals." bit.ly/energysmartpaul
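Sartor's floor of "a PUE of 1" is simple arithmetic: Power Usage Effectiveness divides total facility power by the power that reaches the IT equipment, and since cooling and distribution overheads cannot be negative, the ratio cannot drop below 1. A minimal sketch (the kW figures are made up for illustration):

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    Overheads (cooling, UPS losses, lighting) can approach zero but never
    go negative, so a real facility always has PUE >= 1.0.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    if total_facility_kw < it_load_kw:
        raise ValueError("total facility power includes the IT load")
    return total_facility_kw / it_load_kw

# A facility drawing 1,500 kW in total to run a 1,000 kW IT load:
print(pue(1500, 1000))  # 1.5
```

A perfectly efficient facility, where every watt drawn goes to IT load, would score exactly 1.0, which is why Sartor treats it as a hard limit on further infrastructure gains.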
Naver to build $420m data center in Yongin, South Korea

South Korean cloud giant Naver is planning to spend 480 billion won ($420 million) on a massive data center in Yongin, just south of Seoul. According to SK publication Platum, the facility will start offering public cloud services in the second half of 2020.

Naver, often referred to as the Google of South Korea, was established in 1999 as the first Korean web portal to develop its own search engine. The company has been successful in keeping its competitors at bay by fusing the latest technologies with decisively Korean aesthetics. It also enjoys informal support from the Korean government.

Naver was the first Internet company in Korea to build and operate its own data center, the Gak facility in Chuncheon, Gangwon Province. It was designed to resemble traditional rice terrace farms and is listed in the Top 10 beautiful data centers feature from our April/May issue. Now, the company is expanding with a new facility built to power the public cloud platform it launched in April. Naver Cloud Platform already offers 30 basic infrastructure services related to computing, data, security and network, and promises to be price competitive with both AWS and Microsoft Azure.

bit.ly/anaverdatacenter
$3.2 million - Amazon's US lobbying spend in Q2 2017 (bit.ly/alexalobbygov)

…serviced in 2005, and its air conditioning system had failed, raising temperatures by 25 degrees Celsius (45°F). Until now, there had been no disaster recovery site to ensure that the register was backed up and available in case of a technical failure or an attack on the IT systems. In fact, the register was backed up on tapes stored at the IEBC's head offices, where its data center was located.

bit.ly/Iwontheelectoralcollege
US DOE awards $258m in HPC contracts

The US Department of Energy has awarded six American tech companies with contracts for high performance computing (HPC) research. The $258 million will be shared between AMD, Cray, HPE, IBM, Intel and Nvidia over three years as part of the new PathForward program, itself part of DOE's Exascale Computing Project (ECP). The companies themselves will provide additional funding amounting to at least 40 percent of the total project cost, which the DOE says will bring the total investment to at least $430 million.

The announcement comes after continuing concern over DOE funding levels under the Trump administration. The bipartisan 2017 fiscal year budget agreement slightly increased DOE funding, but Trump's own budget proposed to reduce it by up to $900 million and completely eliminate the Advanced Research Projects Agency-Energy.

Prior to this announcement, when asked about the reports over DOE funding, Cray's VP of business operations in EMEA, Dominik Ulmer, told DCD: "It is always a concern to us what the Department of Energy does, it's a very important partner for us - not only because they are a large customer at various places, but also because we're also developing our systems oftentimes in a co-designed fashion together with them, where we work together on technologies and try to match software and hardware technologies in a better way. This is a concern to us, of course."

bit.ly/racetoexascale
Virginia Toledo
Editor LATAM

Monthly data is still mixed, but analysts believe 2017 has brought a significant improvement for Colombia compared to 2016. Felipe Gómez, Level 3's head of data centers in Colombia, agrees. He told DCD that projects which stalled in the past year could now be reactivated.

"2016 was very hard for the IT industry because of the devaluation of the peso against the dollar," he said. "Investments became difficult because suppliers bought technology in dollars, but sold services in pesos."

This year there is a positive change for the Colombian economy, according to the LatinFocus forecast from FocusEconomics. With a strong oil sector (a key component of the economy), a low corporate tax rate, and an expansionary monetary policy, Colombia's GDP is expected to grow by 2.2 percent in 2017 and by 2.9 percent in 2018.

Gómez believes the peso's troubles are over and expects 2017 will be quiet: "The dollar will remain in the same range and organizations must accommodate the new reality and slightly increase their IT investment budgets."

Meanwhile, service providers have continued to invest in Colombia's maturing market. Cable & Wireless, for example, reached a phased agreement to buy 500 sq m (5,381 sq ft) of data center space in BT's Colombian data center over ten years. The deal is estimated at $20 million, and C&W also put…

Internexa
Owned by Colombian holding ISA, Internexa has opened its newest data center in the Bogota Free Trade Zone, built by the developer of Free Zones of ZFB Group. The facility has a Tier III certification for design and construction. Internexa has other data centers in the region, including one in Medellin, along with a fiber-optic network of over 49,000km - a system of fiber-optic cables spanning more than 23,500 kilometers in America.
The data center industry should not build security in as an afterthought, but should up its game and be alert to the latest threats to digital infrastructure, security and data center experts said at a DCD Security Summit held in Singapore as part of Interpol World in July.

A data center can be held hostage if the digital controls to the UPS and chiller systems are compromised by hackers, said Ka Vin Wong, an independent consultant with experience helming colocation providers in Southeast Asia. With control over the mechanical and electrical systems, attackers can issue blackmail demands threatening an induced outage. Mechanical and electrical systems can be isolated from the network, but Wong's point illustrates the need to harden modern data centers against digital threat vectors.

Everyone needs to play a part in security, and corporations can no longer be insular in their data management, said Chris Church, a senior mobile forensic specialist at Interpol (the International Police Organization). The irony, said Church, is that executives often have irrational fears about the cloud, unaware of the extent to which they are using it.

Yet many damaging security attacks are not related to the cloud at all. One bank hack was traced to an outsourced IT support team from a neighboring country, said Church. One support staff member's laptop was infected with malware for an entire year before hackers pulled a digital heist of $10 million within the span of an hour.

Although overstated, the threat from data leakage through the cloud is real, and hackers love the cloud as it requires just the user account and password to access the data, said Church. Almost all (90 percent) of consumers are not aware of what they are storing in the cloud; some apps save a lot more files than they expect.

Some users consider themselves unattractive targets, believing that attacks only happen to other people and large organizations, but this is simply not true, said Church. Data is a commodity like gold, and usernames, passwords and email accounts can change hands to the tune of thousands of dollars, he said.

Paul Mah
SEA Correspondent
To identify genuine threats in modern infrastructure, users may need to create a local threat intelligence base, because traditional defensive measures such as proxy servers, intrusion prevention systems (IPS) and antivirus software have repeatedly failed against targeted attacks, said Florian Lukavsky, the director of SEC Consult Singapore.

Security information and event management (SIEM) software products and services can generate a large volume of alerts, which are impossible to review manually. In the infamous Target hack of 2013, the hackers' activities were flagged, but ignored by the security team.

A honey pot - a fake environment with intentionally leaked false information - can entrap attackers, said Lukavsky. The local threat intelligence gathered is unique to the data center environment, allowing security personnel to know with certainty that a security breach has occurred. When one attacker installed ransomware and APT (advanced persistent threat) tools in a honey pot, the security team was able to collect the fingerprints of customized malware, usernames and passwords for backdoor tools, and attack patterns. These were checked against other parts of the environment to see if hackers had got in elsewhere.

Finally, a properly secured data center should incorporate capabilities including highly controllable personal access, and safeguards against actions that degrade security, such as buddy punching and tailgating, says Phoon Wai Leong of ERS Group. Physical access control protects against accidents as well as malice, said Phoon. A systems engineer who runs out of power sockets in one rack could reach to the next rack to find a vacant slot, and then promptly trigger a power trip as an already maxed-out circuit is pushed over the edge.

So how should modern data centers be secured? For a start, authentication systems should have the right data to identify logins from current or former employees, or blacklisted personnel. And systems should have the ability to search, locate and track people in a near real-time manner to strengthen control for assets and personnel.

Phoon offered practical tips on physical security with the use of two-factor authentication (2FA). An encrypted proximity beacon could be issued upon validation and registration, which should be time synchronized to thwart duplication. Active RFID or Bluetooth dongles can allow personnel to be tracked for incident response and auditing in the wake of security incidents. Keypad and pin locks should be implemented at checkpoints within the facility, though facial recognition is increasingly seen as a reliable and cost-effective method of implementing a second factor control. In fact, three-factor authentication should be considered too, suggested Phoon, and can be implemented with a mix of physical tokens, personalized passcodes and biometric authentication.

It will take some time to bring all data centers up to scratch on the security front. But as more systems within the data centers are digitized and networked, this is an area that can no longer be ignored.

"A laptop was infected with malware for a whole year, before hackers pulled a $10m digital heist"

Top tips:
Isolate mechanical systems
Create a local threat intelligence base
Handle the cloud with care
Use physical access controls
Apply two-factor authentication, or more
Trust NO ONE

The DCD Security Summit was held at Interpol World on 6 July in Singapore
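Phoon's requirement that access credentials be time synchronized to thwart duplication is the same principle behind standard time-based one-time passwords (TOTP, RFC 6238): issuer and verifier derive a short-lived code from a shared secret and the current clock, so a captured code is useless once its time window passes. A minimal Python sketch of the principle (the secret and window size here are illustrative, not anything from the ERS Group system):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: float, step: int = 30, digits: int = 6) -> str:
    """RFC 6238-style time-based one-time password.

    The moving factor is the count of `step`-second windows since the
    Unix epoch, so issuer and verifier must keep synchronized clocks.
    """
    counter = int(unix_time) // step
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"badge-provisioning-secret"           # illustrative shared secret

# Codes agree anywhere inside the same 30-second window...
assert totp(secret, 1_000_000_000) == totp(secret, 1_000_000_015)
# ...while a code replayed in a later window almost certainly fails to match:
print(totp(secret, 1_000_000_000), totp(secret, 1_000_000_060))
```

The same counter-plus-HMAC construction works whether the code travels as digits on a keypad or inside an encrypted proximity beacon; what matters is that both sides agree on the clock.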
The war for the data center will likely never end. As an old challenger returns, a new battle begins.
While data centers are a complex combination of various pieces of computing equipment and industrial machinery all working in tandem like a great, living creature, at the heart of most lies the Central Processing Unit (CPU).

Servers form the center of a data center, and the chips within them the core of those servers - most of which are x86 servers, that is, servers which are backward-compatible with the instruction set which developed from Intel's original 8086 and 8088 CPUs, starting in 1978. With most of the 8086's successors ending in '86,' the name stuck, as did the architecture.

In Q3 2016, x86 servers brought in 90 percent of the server market revenue, according to IDC. Outside the x86 sector, vendors like IBM and ARM fight for scraps. "x86 servers continue to be the predominant platform used for large-scale data center build-outs across the globe," Gartner's research vice president Jeffrey Hewitt said.

Within the x86 sector, the vast majority of processors come from Intel; AMD leads a tiny wedge of compatible manufacturers. In 2003, AMD's Opteron line looked like it offered serious x86-compatible competition for Intel, but its position has waned. "AMD got a little unfocused and, perhaps because of that lack of focus, had some execution issues," AMD's data center head Forrest Norrod told DCD. It lost its share of the market, handing Intel a near monopoly that it has only tightened its grip on.

But now things could once again change. AMD is spoiling for a comeback fight. Under the stewardship of CEO Dr Lisa Su, the company has launched Zen, its first new microarchitecture in years. Zen is an x86 microarchitecture, and the first processor family to implement it is Epyc. We talked to Su and profile the chip on the next two pages.

Intel, however, is keen to keep its dominant position. In an earnings call after its most recent quarterly results, CEO Brian Krzanich said: "AMD has raised up a bit with their more recent products, but you see us responding. This is a traditional performance battle that we're very accustomed to, and we're comfortable in reacting and competing very aggressively in. And so you see us coming out with our Xeon Scalable." We look at the Xeon Scalable Platform and what it could mean on pages 28-29.

We may not know the outcome of this battle for some time, but one thing is for sure - the war for the heart of the data center is back on.
…data center, but certainly knows about losing in it. Two years ago, with large debts and no profits, its share price hovered below $2, and its future was being openly questioned.

"At the end, it always comes back to 'are we reliable, do we have a good price-performance advantage, can they expect to use us for multiple years?'"

AMD:
Is a fabless semiconductor manufacturer, turning to for-hire foundries
Is behind Intel in CPUs and Nvidia in GPUs in market share

Max Smolaks
News Editor
…inter-CPU bandwidth, and support more memory channels. But they also introduce never-before-seen features that should, in theory, enable IT equipment to do new and exciting things. According to Navin Shenoy, who assumed responsibilities for Intel's data center division in May, Xeon SP was designed to tackle three mega trends that are changing the face of the industry: cloud…

Intel:
Accused by the European Commission of anti-competitive practices in 2007
Settled with AMD for $1.2bn after claims of unfair business practices in 2009
ARM BIDES ITS TIME

While the big boys of the CPU market are slugging it out in the ring, a number of smaller, more agile chip vendors are quietly making inroads into the data center using cores designed by ARM (which was recently acquired by SoftBank). Despite the venerable (and vulnerable) Opteron branding, AMD's A1100, launched in the beginning of 2016, has failed to set the world on fire, and no one, including AMD itself, is even mentioning it these days - the Zen architecture looks like a much stronger contender.

Su told DCD: "I think ARM is a good architecture, and an important architecture. We partner with ARM, we use ARM in some of our semi-custom products. But I think relative to the data center, x86 is the dominant ecosystem and will continue to be the dominant ecosystem."

That hasn't stopped another two American companies - Qualcomm and Cavium - from having their own shot at the title of the welterweight champion. The former is an expert in mobile devices, and wants to apply its knowledge in the enterprise IT market. The latter used to specialize in networking, before deciding to try its luck with servers.

Qualcomm's Centriq 2400 is the world's first server processor based on the 10 nanometer process. For comparison, Xeon SP is still using 14nm, which was also used in last year's Xeon E5. The number of nanometers defines the resolution of the chip lithography equipment - smaller numbers mean more transistors on the same size of die, increased speed and reduced power requirements. Qualcomm's first foray into server chips offers up to 64 cores and will be able to run not just Linux, but Windows Server too - a testament to its lofty ambitions. It should ship in the second half of 2017.

Meanwhile, Cavium's ThunderX2 is a refined, updated version of the silicon launched in 2015, with new cores and I/O. The original chip was among the first to implement 64-bit on ARM, and the latest version continues the legacy of technical innovation, with up to 54 cores running at up to 3GHz. Each ThunderX2 supports up to six memory channels and up to 1.5TB of total memory - just like the latest Xeon SP. The platform promises to be extremely workload-specific, with hundreds of integrated hardware accelerators for security, storage, networking and virtualization applications, across four main varieties. Just like Centriq 2400, ThunderX2 is expected to ship in the second half of 2017.

This will be a fight to remember.
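The claim that smaller numbers mean more transistors on the same size of die can be put to rough numbers: in the idealized case, transistor density scales with the inverse square of the feature size, so a move from 14nm to 10nm allows roughly twice the transistors per unit area. A back-of-the-envelope sketch (real process node names no longer map cleanly onto physical feature sizes, so treat this as an upper bound):

```python
def ideal_density_gain(old_nm: float, new_nm: float) -> float:
    """Idealized density gain from a process shrink: transistors per
    unit area scale with 1 / (feature size squared)."""
    return (old_nm / new_nm) ** 2

# Xeon SP's 14nm process versus Centriq 2400's 10nm process:
print(round(ideal_density_gain(14, 10), 2))  # 1.96
```

That near-doubling of density is what buys the "increased speed and reduced power requirements" the article mentions, since shorter wires switch faster and leak less.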
Tape storage is one of those technological hangovers from the early days of computing, associated in the minds of many with rooms full of massive mainframe cabinets. Somewhat like the mainframe, tape shows no signs of going away just yet and, ironically, could even be handed a new lease of life thanks to the burgeoning volumes of data that are being accumulated in modern data centers.

Modern tape storage is a world away from the movie cliché of huge tape reels spinning backwards and forwards as the computer chomps its way through some complex computation. Today's tape drives use cartridges capable of holding up to 15TB of data, and are more often used for backup or archiving purposes than for online storage.

However, few in the data center industry can have failed to notice the rapid changes that have been taking place at the storage layer of the infrastructure. Flash-based solid state storage devices have enabled new tiers of low latency storage with higher IOPS, while hard drive makers have responded by continuing to push the storage density of rotating media, driving the cost per gigabyte ever lower. The end result is that the cost of disk storage has fallen to a level where many businesses have begun to use disk-based backup systems where once they would have used tape drives or tape library systems. In addition, cloud-based storage services such as…

…able to prove that its content has not been altered. Modern tape systems offer a write-once-read-many (WORM) capability that delivers this, and for this reason, tape is often mandatory for archiving data.

There are other reasons why tape is likely to be around for some time, according to Clive Longbottom, service director at analyst firm Quocirca. "The biggest one is still investment protection: the cost of large tape libraries or robotic retrieval systems is high, and just dumping these because disks are now cheap (but failure-prone) is just not a good financial argument," he said. "Then there is the ongoing cost. Sure, spinning disks are becoming cheaper and cheaper to acquire. However, keeping the disks spinning has a large ongoing operational cost due to the power required for spinning. A tape, once written, is close to zero cost - it holds its data until it is needed again. Hard disks can be spun down, but rarely are," he added.

Meanwhile, the shift towards cloud-based services for storage has simply moved the problem from the business to the cloud service providers. While the enterprise tape market has declined each year, cloud service providers are turning to tape as the optimal solution for backing up the ever expanding volumes of customer data they are storing, or for actually delivering archive services to customers. Cloud providers have a bit of a problem: they have put heavy focus on the incredible scale of their storage capabilities. "The trouble is that customers have fallen for the message. Therefore, the big players are looking at a need for zettabytes of storage capability to meet customer expectations," said Longbottom. Fortunately, a large proportion of this data is unlikely to be accessed ever again, so if the service provider can figure out what data is likely to be accessed, that can go onto disk while the bulk of it can be written to tape, with SLAs stipulating that some data may take an hour or more to be recovered. Amazon does not say what technology its Glacier service uses, but it is widely believed that it is based on tape storage, simply because the retrieval times quoted to customers are as much as several hours.

Tape is well suited for archiving or long-term storage as it offers by far the lowest price points of any storage medium, with a raw storage cost of around $0.02 per gigabyte, and also boasts a potential longevity of several decades if stored under conditions of low temperature and humidity.

In the past, there were many competing tape formats, but most of these have largely given way to Linear Tape Open (LTO), which was developed as an open standard not controlled by any single vendor. IBM and Oracle still have their own proprietary formats while also supporting LTO. LTO has been through multiple iterations, with LTO-7, introduced in 2015, delivering a native capacity of 6TB per cartridge, or up to 15TB with data compression. The next generation, LTO-8, is expected later this year or early next year, and is anticipated to boost native capacity to 16TB, with up to 32TB possible using compression. IBM's 3592 series of tape systems has likewise been through multiple iterations, but the firm has recently introduced the sixth generation in the shape of the TS1155 Tape Drive, which offers a native capacity of 15TB, or up to 45TB using the 3:1 compression ratio that IBM quotes for the technology.

There is no sign yet of an end to increased tape capacities. Most recently (July 2017) IBM and Sony pushed the record tape density to 200Gbits per square inch in an experimental process which uses sputtering, new tape heads and lubricant. This could lead to a theoretical maximum of 330TB in a single standard palm-sized tape cartridge, half the size of a 60TB SSD.

Compatibility is a key concern for technologies that will be used for long-term archival of information. For this reason, the LTO Consortium enforces strict rules to ensure that any LTO drive can read cartridges from the two preceding generations as well as its own, and can write data to cartridges from the previous generation. IBM's TS1155, for instance, supports existing JD and JC tape cartridges.

If tape vendors can continue to boost storage density, and keep the price per gigabyte of tape at rock-bottom levels, there is no reason why the old medium should not continue for several more decades for backup and archive. "An enterprise with just less than a petabyte of data should focus on disk-based backup and archive. Greater than that, and I'd be looking at how and where tape could possibly play," said Longbottom.

330TB - tape cartridge promised by IBM and Sony

…tape systems can also expand capacity by simply using multiple tape cartridges for each drive. Doing this on any kind of scale calls for careful management of the tape cartridges, and this is where tape library systems come into the picture. These combine one or more tape drives with storage slots for multiple tapes, which can be loaded as necessary to write data or read back data that has previously been stored. Individual tape cartridges are identified using barcodes or RFID tags on the cartridge itself, and an automated mechanism loads the tapes into the drives as required, then removes them and places them into a storage slot in the library when not in use. Tape libraries come in a variety of sizes, from 2U rack-mount systems that can hold 8 or 16 tape cartridges, up to monsters such as Spectra Logic's TFinity, which takes up three data center rack frames and can expand to 40 frames containing over 50,000 tape cartridges, for a total storage capacity in the region of 1.6 exabytes of data.
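The LTO Consortium rule quoted above - any drive reads its own generation plus the two before it, and writes its own plus the one before - is mechanical enough to encode directly. A sketch in Python, using the LTO-7 figures from the article (the earlier-generation capacities are published LTO figures, added here only for context):

```python
def can_read(drive_gen: int, cartridge_gen: int) -> bool:
    """An LTO drive reads its own generation and the two preceding ones."""
    return drive_gen - 2 <= cartridge_gen <= drive_gen

def can_write(drive_gen: int, cartridge_gen: int) -> bool:
    """An LTO drive writes its own generation and the one preceding it."""
    return drive_gen - 1 <= cartridge_gen <= drive_gen

# Native capacity per cartridge generation, in TB. LTO-7's 6TB native
# (15TB at the quoted 2.5:1 compression) comes from the article.
NATIVE_TB = {5: 1.5, 6: 2.5, 7: 6.0}

drive = 7
for gen in (4, 5, 6, 7):
    print(f"LTO-{gen} cartridge in LTO-{drive} drive: "
          f"read={can_read(drive, gen)}, write={can_write(drive, gen)}")
# LTO-4: read=False, write=False; LTO-5: read=True, write=False;
# LTO-6 and LTO-7: read=True, write=True

print(NATIVE_TB[7] * 2.5)  # 15.0 TB compressed, matching the article
```

This compatibility window is what makes a rolling migration strategy work: an archive written on LTO-5 stays readable through the LTO-7 refresh, giving operators two full generations to copy data forward.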
Servers + Storage

A wrinkle in time

The Internet has ways to keep time, but they may not be good enough for a new breed of applications and regulations, reports Sebastian Moss
Time synchronization
solutions
In centralized systems, the
solutions for asynchronization
are simple as the main server
dictates time to its counterparts.
Examples of such solutions
are Cristians algorithm and the
Berkeley algorithm. The former
relies on the existence of an
accurate time source connected
to the centralized server and
variations of this algorithm
exist to account for the network
propagation time. The Berkeley
algorithm, on the other hand,
works in systems where a
time measuring device is not
present. As such, a time server
will periodically ask clients their
time measurements, average
them out and send the adjusted
time back.
In distributed systems, clock synchronization problems are more visible, as parts of such a system might reside in different time zones and within different environments. For this, the most widely used solution is NTP (Network Time Protocol). Tried and tested over the years, this is now the norm for time synchronization across the Internet. Because of network latency this protocol can still fail in certain cases where an offset of a few milliseconds is not acceptable, in which case PTP (IEEE 1588 Precision Time Protocol) can be used, which, when coupled with Synchronous Ethernet, can deliver sub-nanosecond synchronization.

On January 26, 2016, time went wrong. As the Space Command division of the US Air Force began to decommission SVN 23, a satellite in the GPS constellation, things went awry by a whole 13 microseconds. Fifteen GPS satellites broadcast the wrong time, immediately causing issues on the Earth below. In a post-mortem, time-monitoring company Chronos detailed numerous anonymous telecoms customers who suffered 12 hours of system errors, while BBC Radio also experienced disruptions.

As those affected by the error can attest, accurate timekeeping has become increasingly important to computer networks. Distributed systems need to be synchronized, and some require traceable and accurate timestamps for everything from financial transactions, to alarm events, to the time and duration of phone calls. Network Time Protocol (NTP) servers do some of the work, but mission-critical or large-scale applications often run their own network time servers for increased security and redundancy. For more accurate time synchronization, Precision Time Protocol (PTP) is used (see boxout for more).

With new regulations around the corner, understanding how to keep time, and how to prepare for when clocks go bad, is becoming crucial for an increasing number of businesses. For example, the financial sector will be subject to the EU's Markets in Financial Instruments Directive II (MiFID II) from January 2018, in which subsection RTS 25 requires timestamping to be 1,000 times more accurate than is required by current legislation. "MiFID II has been put forward in a very, very robust way," the National Physical Laboratory's strategic business development manager Dr Leon Lobo told DCD.
Simulation is becoming integral to data centers: architects ponder building plans, sales teams show off a virtual facility to customers unwilling to dive into a contract without visiting the site, and even engineers and technicians increasingly use simulations to improve data center efficiency, redundancy and capacity.

Tanwen Dawn-Hiscox
Reporter
Initially, the company mostly dealt with clients seeking to understand operational failures, but eventually customers turned to it for general planning and pre-emptive purposes.

"That is the number one benefit of simulation: whatever the change, you can do it upfront, it's a kind of virtual playground. You've got a model of your room and you can do whatever you want to do, whether it is maintenance on a cooling unit, or installing new IT equipment," said Mark Fenton, chartered engineer and product manager at Future Facilities.

As well as whitespace modeling, the company models generator units and cooling plants, and its simulations take into account internal and external factors that can affect design and operations, like the weather: "You could build a beautiful whitespace and not get your cooling stuff quite right and end up with a really awful performance, and actually overheating, even if you've designed that whitespace well."

Future Facilities' team was initially skeptical about the importance of virtual reality, but after toying with the technology it is seeing much potential. The company has developed a demonstration using Oculus Rift which allows one to wander through a series of simulations of data centers throughout the ages: from the 1950s, when a data center was effectively just a single low powered, uncooled mainframe, through the '80s, a time of blue carpets, glass door racks, monitors on shelves and untidy cables, to the 2000s, when operators discovered the joys of raised flooring and contained aisles.

In the simulation, one can overlay all sorts of data: airflows and their temperature are represented by arrows in a gradient of colors ranging from deep blue to red. One can, for example, check the operational status of the cooling equipment and simulate its reaction to different actions (helpfully color coded in green for ideal temperature, red for overheated, and blue for overcooled), or test the effect of a new piece of IT.

The immersive demonstration gives an idea of the industry's progress in understanding how to run a data center, from zero planning to a higher level of complexity in design and execution. The development makes sense in that it follows the industry's learning curve (and by extension, the company's).

The final virtual room, an edge data center containing a Vapor IO chamber (a self-sufficient cylindrical block containing six racks and an integrated adiabatic cooling system), points to one of the possible use cases for VR in the data center. If, as some predict with the emergence of 5G, we bring data centers closer to us to power our equally hypothetical autonomous cars, so-called smart cities and virtual reality social media, then offices, former phone boxes, rooftops and cell towers may contain a handful of racks.

In this scenario, our IT would either need to be fully self-healing, or technicians would need to have access to multiple facilities at once. So, theoretically, were everything to be software-defined, a single person could sit in a control room with a mask on and remotely operate dozens of edge sites at once, without having to make the strenuous journey to every one of them.
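The green/red/blue status coding described in the demo amounts to a simple threshold mapping. A minimal sketch, assuming an 18-27°C "ideal" inlet band; that band is my assumption, loosely based on common recommended ranges, not a figure from the article:

```python
def rack_status_color(inlet_temp_c, low=18.0, high=27.0):
    """Map a rack inlet temperature to the color coding used in the
    demo described above: blue = overcooled, green = ideal,
    red = overheated. The 18-27C band is an assumed ideal range."""
    if inlet_temp_c < low:
        return "blue"    # overcooled
    if inlet_temp_c > high:
        return "red"     # overheated
    return "green"       # within the ideal band
```

In a real CFD overlay the same idea would be applied per cell of the airflow model rather than per rack, but the thresholding principle is the same.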
Sify Technologies provides data center services like colocation, managed hosting, DC IT services and DC migration. Through our evolved infrastructure and refreshed technology, we offer state-of-the-art ICT solutions that keep you ahead. As your technology partner, we empower you with 100% uptime and transform your business by opening innovative possibilities. Our Data Centers are ideally suited to be both the primary Data Center and DR. They are strategically located in different seismic zones across India, with highly redundant power and cooling systems that meet and even exceed the industry's highest standards.

Our impressive portfolio of over 425,000 sq.ft. of server farm is spread across 6 Tier III Data Centers, 15 Tier II Data Centers, 6 State Data Centers and several more for private clients, all built to exacting specifications and best-in-class global standards.

Hyper Connected Data Center Features

43 Data Centers connected by high capacity fiber access
Internet Exchange present in Data Center in Mumbai
Zero Latency access to AWS Direct Connect in Mumbai
Google POP enables peering within Data Center in Noida
Carrier neutral with presence of major telecom operators within Data Centers

With over 17 years of operational and technical expertise, Sify serves over 300 DC customers spread across BFSI, telecom, pharma, retail, manufacturing, media and other sectors. As a cloud and network services provider, Sify Technologies understands that reliable and affordable connectivity is key to leveraging data center and cloud investments made by organizations. Sify's cloud cover connects over 43 data centers across the country on a high speed network. The company provides high capacity multi-protocol low latency networks across multiple cities in India, with solutions that address the unique requirements of data center to data center traffic as compared to traditional data center to branch traffic. Sify Technologies has marquee customers who leverage this network to connect their disaster recovery and near disaster recovery facilities to their primary data center.

Sify Technologies has evolved from a network and data center service provider to a full-fledged converged ICT player with capabilities for data center transformation, application integration and transformation integration services. It has demonstrated its prowess in many DC transformation projects by implementing best-of-breed solutions which enabled customers to experience the best-fit solution for their current and future requirements and de-risk technology adoption. This transition was built on the premise that the brand had a sound infrastructure foundation and aligned services on it, in sync with the ever evolving needs of enterprises. Therefore, it believes that it is able to serve the requisite services with the same level of SLAs across the board.

Sify Technologies is also well positioned to address the increasing market opportunities on upcoming platforms of Software as a service (SaaS), Platform as a service (PaaS) and Infrastructure as a service (IaaS), as enterprises move from build to subscribe and from control to visibility.

It is positioned as one of the leading vendors in Gartner's Magic Quadrant for cloud-enabled managed hosting, Asia Pacific, for the past three years. Its cloud platform has scale-out capabilities to compete more effectively for public cloud requirements and is also extending its support to AWS and Microsoft Azure, including offering managed services on top of third-party cloud providers. As a full scale ICT player, Sify Technologies is in the best position to align these expectations to the market place for any enterprise.

Contact Details
Mark Ryder
Managing Director Europe
Phone: +447429169426
Email: mark.ryder@sifycorp.com
Many people are asking if data centers use too much energy. The real issue is bigger: can data centers and the energy supply industry work together to deliver electricity and online services which meet our needs... without costing the earth?

Ten years ago, a report from Lawrence Berkeley National Laboratory (LBNL) warned that data center power usage was out of control. Last year an updated report found that energy use by US data centers has actually leveled off. But there is a lot more to the story than that, according to senior executives from both the data center and the power industries, who gathered for DCD's first Energy Smart Summit in San Francisco in June.

Online services are expanding and developing rapidly, making data centers a fundamental part of the infrastructure supporting human society. According to the executives, the digital world is fast moving: the IT equipment is replaced every three to five years, and the cloud applications running on it can have a lifetime of months. This Protean package ironically depends on a much less flexible life support system: the electricity grid.

"Energy investments are intrinsically generational in length, or you can never make them pay for themselves," explained Don Paul of the University of Southern California Energy Institute. "But the digital world, which is wrapped around that now, like two strands of DNA, is cycling around at ten times the rate."

The two worlds are very different, said David Rinard, director of sustainability at Equinix: "You have an entrepreneurial approach [in the cloud] versus a blue-chip, older approach to the network grids." And the consequences of their failure are different too: "If you lose the Internet you go back to the 1980s. If you lose the grid..." Both industries are changing the way they handle reliability, and both of these changes are interlocking.
It seems that while we may want to curb the data center sector's energy demands, their very size could actually help the utility sector in its move to renewables, by being good, very large and demanding customers.

Gary Demasi, Google's director of data center energy and location strategy, said: "We have always locked horns with the utilities. They are not structured to deliver us the product that we want, and we've really got to challenge that. We've been successful, but it's been a very rocky road and we've had to do a lot of what I would consider to be unnatural things."

At first, Google found utilities reluctant to sell it renewable energy, but as a champion of power purchase agreements, Google is now the world's largest renewable power customer.

Between 1955 and 2005, computations per Watt increased by 1bn

In Nevada, Switch helped change state policy so renewable power contracts are available to all, because it has purchasing muscle, according to Adam Kramer, vice president of strategy: "In 2016, in Las Vegas, Switch accounted for the entire growth in a large rate class which includes a lot of casinos. The growth rate was one percent."

"We are the utilities' best customers, because we have a consistent baseline of demand," pointed out Dean Nelson, CTO of Uber and founder of Infrastructure Masons. "That demand is going to increase, which will help stabilize the demand for the grid."

Connaughton believes it's perfectly ok to have growth: "If data centers had more demand, we could actually refurbish some of the sagging infrastructure faster, because there's more money flowing to the utilities."

Beyond that, data centers have backup power systems, which could be used to help the grid out, perhaps by using them so data centers can power themselves at peak times in so-called demand response schemes. Data centers could in this way "effectively erase themselves from the grid at critical moments," said David Murray, president of distribution at Hydro-Québec, Canada's largest utility. Alternatively, they could even sometimes contribute power from their backup sources to the grid.

There is a dire need for energy efficiency to be married with demand response, according to Priscilla Johnson, data center water and energy strategist at utility PG&E.

The idea of sharing precious backup systems has usually been anathema to data center operators focused on delivering reliable service, but that could change if they deliver reliability in different ways, according to Peter Gross, vice president of mission critical systems at fuel cell vendor Bloom Energy.

"Historically there was always a buffer between the utility and the server," said Gross. "We had this very complex infrastructure support system consisting of UPSs and generators and transfer switches. It was expensive, but resiliency and reliability drove pretty much every decision in the design and operation of data centers."

Things are different in 2017, where the cloud is increasingly using a distributed resiliency concept in which the reliability of individual data centers is not quite so vital, he said.

So is everything going smoothly? Don Paul isn't so sure. The leveling of data center energy demands found by LBNL could just be a temporary hiatus caused by a period when efficiency gains canceled out the uncontrolled growth in digital demand.

"People are underestimating the digital growth rate," warned Paul. "Energy demand is flat because there have been massive efficiency gains, but they have been consumed by increased demands. We've consumed our efficiency gains, just like we always have. History shows that as economies grow and civilizations advance, they always consume their efficiency gains." Energy demand will rise again, he says, "unless the efficiency per byte continues to have dramatic changes."

"The demand for digital services is limitless as long as they appear to be free," said Nelson: "We're not going to change human behavior."

Gamification might be one approach to try to change behavior, said Dr Julie Albright of USC's department of applied psychology. Utilities could feed energy usage figures back to consumers and apply social pressures so they compete with friends to use less. "We could make it fun," she said, though the word 'fun' and the word 'utility' in the same sentence could be a problem.

David Rinard of Equinix agreed, taking us back to the culture clash between utilities and tech. The tech world makes easy-to-use smartphones, while the tools provided by utilities are not of the same consumer grade quality. "They are not intuitive and easy to use. They are not something you would find at Best Buy."

In the end, data center operators may have to change people's behavior, rather than themselves.

DCD is planning more Energy Smart events. For news and coverage, check out features and videos at bit.ly/DCDEnergySmart
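A demand response scheme of the kind Murray describes boils down to a policy decision: during a grid event, run on backup power only if enough reserve is held. A toy sketch of that logic; the function name, return values and 60% threshold are all illustrative assumptions, not any utility's actual scheme:

```python
def demand_response_action(grid_event, battery_soc, min_soc=0.6):
    """Toy demand-response policy: during a grid event, a data center
    'erases itself from the grid' by running on backup power, provided
    enough battery state of charge (SoC) is held in reserve.
    All names and thresholds here are illustrative."""
    if grid_event and battery_soc >= min_soc:
        return "run_on_backup"     # drop off the grid at the critical moment
    if grid_event:
        return "remain_on_grid"    # not enough reserve to disconnect safely
    return "normal_operation"      # no event: draw from the grid as usual
```

A real scheme would also weigh generator fuel, contracted curtailment windows and the penalty for failing to respond, but the shape of the decision is the same.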
ON DEMAND
Are you Hybrid Ready?
It's Where IT is Going
Speakers:
Chris Ortbals, Executive Vice President of Product
Management, QTS Data Centers
Andrew Boardman, Director of Sales, North East and
Mid-Atlantic, QTS Data Centers
Moderator:
Stephen Worn, CTO, DCD
This on-demand webinar examines how a hybrid
methodology is impacting IT management, data center
and cloud deployments and the ways that providers have
evolved to meet a growing demand for fully-integrated,
hybrid solutions.
Watch and listen, as our subject experts address four key
issues, including the challenges IT managers face when
implementing a holistic hybrid strategy.
ON DEMAND
4 Steps to a Successful High-Speed
Migration Strategy
Speakers:
John Schmidt, Vice President Global Data Centre
Solutions, CommScope
Peter Judge, Global Editor, DCD
This recent one-hour webinar is packed full of insight and
provides clarity on the subject of data center migration.
Learn how the shift to collaborative, parallel processing is
impacting infrastructure design.
Smart city foundations

The smart city, built around the Internet of Things (IoT), is an interconnected network of devices that gather, store, and share data while communicating with one another to improve efficiencies across a variety of functions. Smart grids, on the other hand, utilize the same IoT concepts, but in the energy grid space, where two-way communications between the utility company and the end customer can improve the provision of energy where and when it is needed.

There is a widely-circulated estimate that data centers use about two percent of the electrical power of the US, and the figure is thought to be similar elsewhere. The National Resources Defense Council (NRDC) broke down that amount of power used within different sectors of the industry, and its pie chart tells an interesting story. Google, Amazon, Facebook, Apple and that crowd together make up the hyper-scale cloud computing sector, which consumes less than five percent of the power used by US data centers. Then the financial crash hit. There was no budget for greenwash. Cuts were needed. But this didn't kill the green shoots: cutting energy use in data centers also cut energy costs, so Green IT lived on in the movement to improve the data center.

Fred Tanzella, CTO of Green House Data, defines the smart concept well. By providing power to cities via the smart grid, Tanzella told DCD, cities can communicate loads, minimize costs, restore power faster after a disruption, and integrate renewable energy resources, while providing reliable operations in multi-carrier networks.

In Tanzella's opinion, the smart grid promises to let data centers take advantage of energy price arbitrage by moving loads between geographical areas served by different utilities. He says energy costs fluctuate across countries and time zones; running mission critical applications in time zones that are not at peak power times provides a methodology for saving cost and energy. Renewable energy sources that the smart grid will have access to will allow data centers to supplement peak power requirements efficiently.

Neetika Sathe, director of advanced planning at Alectra Energy Solutions, agrees that the smart grid can transform data centers. IT is changing quickly, and many data centers around the world are outgrowing their existing capacity to power and cool IT systems. When connected to a legacy distribution system, these data centers need to upgrade or
replace their power networks, the transformers, switches, UPSs, cabling and connectors. This requires a power shutdown for the data centers, which poses added cost.

Sathe elaborates: "Energy costs represent the single largest component of operating expense, and a potential barrier to future expansion. Also, while the levels of power quality and reliability offered by legacy distribution systems are acceptable for the majority of customers, voltage and frequency fluctuations, harmonics and short-term power outages can have costly and disruptive effects on data centers."

"Distributed energy storage and generation, and the microgrids offered by smart grids, are seen as complementary or alternative solutions to meet data center energy requirements," Sathe says. "Smart grid technology can enhance reliability and also offer local, distributed energy solutions to keep a data center running even during outages. Using a smart grid allows for the diversification of energy resources that make up a data center's onsite generation. They also allow data centers to improve the ROI on assets that traditionally have been sunk costs (that is, most data centers relied on natural gas generators for resiliency). Using energy storage allows for the increased utilization of the asset for other functions like ancillary services, disaster recovery, etc."

Eric LaFrance, senior trade commissioner at Hydro-Québec, says: "We are slowly replacing the human brain with machine decisions, which will require far more electricity." He suggests that blockchain, the distributed ledger technology behind the Bitcoin currency and other systems, can replace human validation in finance, because it is more efficient and more hacker-proof.

LaFrance says self-driving cars are another example of how we are replacing human decisions with machine intelligence because, in his opinion, it is more reliable. This new way of doing things will require a lot more energy, because we will need places to store massive amounts of information, and servers to make these decisions for us.

What's more, LaFrance says utilities and data centers can meet in strategic partnership on integrated microgrids and demand response, mainly during peak periods. The utility will be able to save a lot of money when, during peak periods, it can rely on customer generators, batteries, or power production facilities, instead of buying expensive energy on the spot market. Hydro-Québec is already a leader in this field and will seek nearly 1,000MW from its industrial and commercial customers during the winter peaks. LaFrance says decentralized energy production, coupled with large battery packs, will ensure data centers are online when their utilities are down.

But smart energy does not come without its challenges, as Tanzella pointed out. There are indeed current concerns for data centers when it comes to smart grids. Data center managers have historically been tasked to maintain power and cooling for mission critical applications regardless of the power bill. Dynamically migrating these loads across geographic regions is simple in concept, but can be difficult in practice. Tanzella says the more risk-averse data center managers will likely be slow to adopt, but as the technology and software become more prevalent, data center managers will embrace the smart grid's functionality.

LaFrance says in the future we will see more small edge computing facilities responding to the ever-growing proximity demand from IoT, mobile devices and other low-latency needs. He added: "Large hyperscale facilities should be located further away from dense areas, to avoid the growing power demand and pressure on utilities located near dense areas, which minimizes a utility's investment due to the lack of space and power. Large hyperscale operators should, and hopefully will, also base site selection decisions on criteria that mainly consider the use of renewable energy."

On the power network side of data centers, Sathe says the trend could be towards meeting the energy requirements of data centers in a decentralized fashion. The data center of the future could be fully integrated with the grid in a way that enhances local resiliency and reliability. For those data centers where reliability is the number one priority, being connected to the bulk grid while having their own distributed energy resource (DER) would be advised. They could also participate in transactional energy markets to stack up the value of the smart grid assets they own.

As DCD has found in its own events, data centers will be fundamental to moves towards smart cities, as well as smart grids.

Peter Judge
Global Editor
38 DCD Magazine datacenterdynamics.com
PRO Mission Critical Training Solutions
New website
launching Q3
DCProfessional Development provides scalable eLearning and face-to-face data center training for the global mission critical sector. Courses are
developed and delivered by an international network of industry experts using the latest educational techniques. Globally recognized and accredited,
our training addresses the real world tactical demands faced within the data center industry.
www.dcpro.training | info@dc-professional.com
Containers unlock the cloud

They told us the cloud would allow workloads to move freely

Dan Robinson
Correspondent

Cloud computing is no longer viewed as some exotic new technology, and has become an accepted part of an organization's IT toolkit for meeting business needs. At the same time, cloud is less mature than other IT sectors, and can prove to be more complex than unwary adopters may expect.

One of the promises of cloud computing has long been that users should be able to move workloads from their own data center to that of a cloud service provider and back again, if required, or even between public clouds. This might be because an application or service calls for more resources than the organization has available, or simply because it is more cost-effective to run it off-premises at that time.

Despite this, cloud adoption has broadly followed the pattern of private clouds being used to operate traditional enterprise workloads, while new-build applications and services are developed and deployed in a public cloud platform such as AWS. Even where traditional workloads have been farmed out to a service provider, this has typically been a case of colocation or of a hosted private cloud arrangement.

According to Ovum principal analyst Roy Illsley, this is because many organizations are still at an early stage of cloud adoption, and are just looking to get their foot on the first rung of the ladder.

"We are not really seeing companies have [workload mobility] as their first concern. What they are really looking to do is lift and shift to the cloud, so they might have this three-tier app or database based on legacy hardware, and they want to know how to get that to the cloud as a first step. Only once they've done that, they start looking at how to transpose that workload, change that workload, and think about moving it somewhere else," he said.
u There are good reasons why workload operate that container on a platform that
mobility has never really taken off. VMware runs in various other cloud environments,
with its vMotion feature has supported live saidIllsley. What is Docker?
migration of virtual machines from one host Containers are a less mature technology Docker is the company and
server to another for over a decade, but it is than virtual machines, and the various platform that everyone thinks of
one thing to perform this inside a data center platforms are thus still in the process of when containers are mentioned,
and quite another to do it from one data developing and perfecting associated but it did not create the concept.
center to another. components such as orchestration, Instead, it took some capabilities
monitoring, persistent storage support and (known as LXC) built into the Linux
Things get more complicated if you lifecycle management tools. These are largely kernel and developed them into a
want to move a workload from your own essential for operating container-based finished product that made it easy
cloud to another one that you do not application frameworks at any kind of scale, for users to create and operate
control. It is not practical to move a virtual as the Internet giants such as Google or a workload using one or more
machine to a cloud based on a different Facebook do. containers, making it popular with
platform, because of differences in the Docker has established its platform developers for use as a dev-and-test
hypervisor, the APIs and the management as the leading format for packaging and platform.
tools used. Even if it is based on the same running containers, but there are several Part of Dockers success is
platform, you may not have the same level of management oversight and control as you do over workloads running on your own infrastructure.

Then there is the fact that enterprise workloads seldom operate in a vacuum; they depend on other resources to function, such as shared storage, an SQL database server or directory service. Unless such services are also replicated to the cloud, traffic between the workload and these resources will have to be routed to and fro across a WAN connection instead of across the data center itself.

Perhaps it is this reason that led VMware to change direction and ditch its vCloud Air service for Cloud Foundation, a platform that it describes as a self-contained software-defined data center (SDDC) that can be provisioned onto commonly used public clouds such as AWS and IBM SoftLayer.

However, the prospect of workload mobility could move nearer to reality thanks to containers. Most people are familiar with containers thanks to the efforts of Docker, but the ecosystem is made up of various technologies and implementations.

What all container platforms share is that they enable a piece of software and its dependencies (such as code libraries) to be wrapped together into an isolated space: the container. Multiple containers can run on the same host, like virtual machines, but containers are more lightweight on resources, and a given server can operate more containers than virtual machines.

The container approach gives developers the opportunity of writing a new cloud-native app. Provided you've got support for containers and you're on Linux, you will have tools available for orchestration, such as the Kubernetes project, which originated at Google, or the Mesos project from the Apache Foundation, as well as Docker's Swarm tool.

Kubernetes is integrated into several enterprise platforms, such as VMware's Photon, Red Hat's OpenShift application platform and even Microsoft's Azure Container Service. Meanwhile, Mesos is used by many large web companies, such as Twitter, Airbnb and eBay, as it can scale to manage tens of thousands of nodes running containers.

The elephant in the room is that containers are tied to a particular operating system kernel. This is typically Linux, as container platforms such as Docker build on features in the Linux kernel, but Microsoft has recently begun adding support for containers into Windows Server and its Azure cloud service.

While a container image created on a Linux system should run on any other Linux system, you cannot take a container image created for Windows and run it on Linux, and vice versa. Having said that, Microsoft announced at DockerCon in April 2017 that it will support Linux container images in Hyper-V Containers, under which the container lives inside a small virtual machine.

Containers are unlikely to completely replace virtual machines, but for many workloads, especially scale-out distributed workloads, they are becoming the preferred method of deployment. This is not just because they are easier and quicker to provision, but also because they are easier to migrate from one system to another, finally beginning to fulfill more of the promise of the cloud.

Part of Docker's contribution was that it not only created a set of APIs with which to control the Docker host, but also a package format for storing and distributing applications as container images. The Docker architecture also calls for a registry to act as an image library, and users can maintain their own registry or fetch ready-made images from a public one such as the Docker Hub. A container under Docker's platform is therefore basically a wrapper for delivering and running an application or service along with any code libraries it depends on, all inside its own little bubble. However, the underlying LXC capabilities it built on were more geared towards partitioning a Linux server into multiple separate user spaces, similar to Zones in Oracle's Solaris operating system or even older mainframe concepts.
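As a rough illustration of that package format (the base image, file names and application here are hypothetical, not taken from any specific product), a Dockerfile declares everything the image should contain:

```dockerfile
# Illustrative sketch only: bundle an app and its dependencies into one image.
FROM python:3.6-slim                        # base image pulled from a registry
COPY requirements.txt app.py /app/          # copy the app into the image
RUN pip install -r /app/requirements.txt    # bake dependencies into the image
CMD ["python", "/app/app.py"]               # what runs when the container starts
```

An image built from this (for example with `docker build -t registry.example.com/myapp:1.0 .`, where the registry name is hypothetical) can be pushed to a private registry or the Docker Hub with `docker push` and then pulled onto any Linux host running Docker.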
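Orchestrators such as Kubernetes then take images like these and keep a declared number of containers running across a cluster. A minimal sketch, assuming a hypothetical application and image name, might look like this Deployment manifest:

```yaml
# Illustrative Kubernetes Deployment; names and port are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webapp
spec:
  replicas: 3                  # the orchestrator keeps three containers running
  selector:
    matchLabels:
      app: webapp
  template:
    metadata:
      labels:
        app: webapp
    spec:
      containers:
      - name: webapp
        image: registry.example.com/myapp:1.0   # hypothetical image in a private registry
        ports:
        - containerPort: 8080
```

Applied with `kubectl apply -f`, a manifest like this tells the cluster the desired state; if a node or container fails, the scheduler restarts containers elsewhere to restore the declared replica count.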
Events
> México | Mexico City: September 26-27 2017
> Europe | Z Summit: November 8
Training
- Data Center Cooling Professional: Lima, Peru, September 25-27 2017
- Energy Efficiency Best Practice: México D.F., Mexico, September 28-29 2017, Hotel Novit
- Data Center Cooling Professional: Santiago de Chile, Chile, October 2-4 2017
- Data Center Cooling Professional: México D.F., Mexico, October 9-11 2017, Hotel Novit
- Energy and Cost Management: Buenos Aires, Argentina, October 16-18 2017
- Critical Operations Professional: México D.F., Mexico, October 23-25 2017
- Data Center Design Awareness: Lima, Peru, October 23-25 2017
- Data Center Design Awareness: Madrid, Spain, October 23-25 2017, Hotel Zenit Abeba
- Data Center Design Awareness: Barcelona, Spain, October 23-25 2017
- Data Center Power Professional: Porto Alegre, Brazil, October 25-27 2017
- Energy Efficiency Best Practice: Madrid, Spain, October 26-27 2017, Hotel Zenit Abeba
- Critical Operations Professional: Lima, Peru, October 30-November 1 2017
Training
- Data Center Design Awareness: Osaka, Japan, August 7-9 2017

SE Asia Data Center Week: 15-21 September, Singapore
- DCD>Zettastructure conference & expo
- CxO and cyber security summits
- DCPRO training workshops

DCD>Events // www.dcd.events
Sales: chris.hugall@datacenterdynamics.com
Marketing: jake.mcnulty@datacenterdynamics.com
Speaking: rebecca.davison@datacenterdynamics.com

DCD>Online // www.datacenterdynamics.com
Sales: yash.puwar@datacenterdynamics.com
Marketing: ryan.hadden@datacenterdynamics.com
People say data centers are not sexy, but they are wrong:
in fact, this industry is filled to the brim with all kinds of
smut. Data centers have been shaped by pornography
as much as they have been shaped by developments in
military technology or mobile communications.
According to conservative estimates, around 10
percent of all data transferred across the Internet features naked people.
During 2016, the world's most popular adult website was responsible for
streaming 99GB of data every second.
Some of the advancements in technology that were directly
influenced by our penchant for X-rated material include video codecs
and compression, online advertising and payments security.
They say YouTube democratized access to online video, but they are
wrong: after all, no matter what content you post on YouTube, you will
have to use the platform's proprietary tools. Porn gave rise to thousands
of different video engines and hosting platforms, kept an army of
web developers in work, and created a shadow economy worth billions.
Adult websites were also some of the first to support Bitcoin at scale
as a way to avoid embarrassing items in customers' bank statements.