Latest Data Centre Opinions
Over the past few decades, technological advances have enabled telecom service providers to consolidate their network infrastructure. The reduction in equipment size, coupled with an improved ability to bridge larger distances, has also allowed service providers to reduce their data centre footprint.
But with the reduction in data centre sites, the modern facility has also become comparatively supersized and energy hungry, mainly due to the never-ending increase of network traffic. To mitigate this challenge, companies such as Verizon are turning to machine learning and data analytics to improve the energy efficiency of facilities.
When I first heard about KaoData, I was immediately intrigued. Having been born and raised in North London, I found the idea of a facility popping up in Harlow that boasted of being one of the largest developments in the UK, with the potential to support an IT load of 35MW across 150,000sq ft of space, almost unthinkable. Naturally, when I was invited to have a look around, I jumped at the chance.
The first thing that hit me was just how easy it was to find the data centre. Kao Park is situated just minutes off the M25, but the location itself wasn’t a fluke. Gerard Thibault, chief technical officer, explained that the site was selected based on a number of factors. Accessibility was crucial: the facility needed to be easy for staff and customers to reach by road, train or plane. Positioned in the heart of the London-Stansted-Cambridge corridor, with the M1, M11 and M25 all in close proximity, Harlow train station just 10 minutes away by taxi, and Luton, Stansted and City airports all in close vicinity, they were able to pinpoint the perfect spot.
Cyber-Physical Systems (CPS) are a mix of computation, networking and physical processes, in which the embedded computational algorithms and networks have the power to monitor and control the physical components.
By using a combination of machines, sensory devices, embedded computational intelligence and various communication mechanisms, CPS monitor physical elements with computer-based algorithms tied to the internet. This means they are capable of autonomously functioning based on their physical surroundings.
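The monitor-and-control loop described above can be illustrated with a minimal sense-decide-actuate sketch. Everything here is hypothetical: the sensor readings are simulated, and the function names and thresholds are invented for illustration, not drawn from any real CPS platform.

```python
# Minimal sketch of a CPS feedback loop: a simulated thermostat that
# senses a physical value, applies a computational rule, and actuates.
# All names, readings and thresholds are illustrative assumptions.

def read_sensor(step):
    """Simulated temperature reading (stand-in for real hardware I/O)."""
    return 18.0 + step * 1.5  # the room warms over time

def control(temperature, setpoint=22.0):
    """Embedded decision logic: choose the actuator state from the sensed value."""
    return "cooling_on" if temperature > setpoint else "cooling_off"

def run(steps=5):
    log = []
    for step in range(steps):
        temp = read_sensor(step)              # monitor the physical process
        action = control(temp)                # computational algorithm decides
        log.append((round(temp, 1), action))  # actuate / record the outcome
    return log

print(run())
```

The point of the sketch is the shape, not the numbers: a real CPS closes the same loop continuously, with the decision logic networked rather than local.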
In light of advancements in analytics, artificial intelligence (AI) and communications, there is increased demand for intelligent machines that can interact with the environment around them, such as driverless cars which monitor and communicate with their surroundings, and smart appliances that optimise energy consumption. CPS are stimulating significant changes in quality of life and forming the basis of smart infrastructure, products, and services.
As this kind of technology continues to become more integrated into our everyday lives, here are four areas of CPS we can expect to come to the fore.
Given their many advantages over alternative technologies, lithium-ion batteries are gaining in popularity as a power backup option for data centre Uninterruptible Power Supply (UPS) systems. A 2018 Bloomberg New Energy Finance report forecasted that Li-ion technology will comprise 40 percent of all data centre backup batteries by 2025, and that in the hyperscale sector, Li-ion will become the predominant battery technology, accounting for 55 percent of UPS batteries.
Compared with traditional valve-regulated lead-acid (VRLA) alternatives, Li-ion batteries offer greater power density, smaller size, less weight and longer operating life. They can also withstand many more charge/discharge cycles, typically more than 1,000 compared with 200-400, before losing their ability to provide effective backup power.
As a result, they occupy less space, incur lower maintenance costs and require less frequent replacement than VRLA batteries, offering the user a lower total cost of ownership (TCO) over the lifecycle. This helps to offset their chief disadvantage, an initial cost premium, but even that is steadily diminishing thanks to ongoing technology development and increased manufacturing volumes.
Additionally, recent studies conducted by Schneider Electric’s Data Centre Science Centre, detailed in White Paper #229: ‘Battery Technology for Data Centers’, found that over a 10-year period, Li-ion delivered a TCO that is between 10 percent and 40 percent lower than equivalent UPS systems based on VRLA batteries.
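The shape of that TCO argument can be made concrete with a back-of-envelope model. To be clear, every cost figure below is an invented assumption for illustration, not data from White Paper #229; only the structure (initial purchase, plus replacements driven by service life, plus yearly maintenance) follows the reasoning above.

```python
# Back-of-envelope 10-year TCO comparison between VRLA and Li-ion UPS
# batteries. All figures are illustrative assumptions, not data from
# Schneider Electric's White Paper #229.

def tco(capex, service_life_years, annual_maintenance, horizon=10):
    """Total cost over the horizon: initial purchase, replacements
    at end of each service life, and yearly maintenance."""
    replacements = (horizon - 1) // service_life_years
    return capex * (1 + replacements) + annual_maintenance * horizon

# Assumed: VRLA is cheaper up front but needs replacing mid-horizon
# and costs more to maintain; Li-ion carries a capex premium.
vrla = tco(capex=10_000, service_life_years=5, annual_maintenance=800)
li_ion = tco(capex=17_500, service_life_years=10, annual_maintenance=300)

print(vrla, li_ion)
print(f"Li-ion saving: {100 * (vrla - li_ion) / vrla:.0f}%")
```

With these assumed numbers the saving lands around 27 percent, comfortably inside the 10-40 percent band the study reports; the mechanism is that the VRLA string's mid-life replacement and higher maintenance outweigh Li-ion's capex premium.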
Hyperconverged infrastructure (HCI) has gone mainstream, yet myths still remain that lead to misconceptions and confusion, even among those who already have various HCI solutions deployed. Here are five of the most prevalent myths, debunked.
First of all, the acquisition price of an HCI solution varies by vendor and often by the brand of hypervisor used in the solution. Secondly, while purchasing the individual components needed to create a virtualisation infrastructure may often be less expensive than purchasing an HCI solution, that is only part of the cost. The true and total cost of infrastructure goes far beyond the initial purchase.
Not a lot of 55 year-olds look as good as the mainframe.
Back in 1964, along with the release of The Beatles’ legendary A Hard Day’s Night and cinematic classic Mary Poppins, came the release of IBM’s groundbreaking IBM System/360.
Since then, mainframes have established themselves in the vast majority of business-critical applications as the beating heart of technical infrastructure. So much so that the mainframe market is expected to grow by a further 4.3% by 2025, bringing it to a worth of nearly $3bn per annum.
With the proliferation of 5G, AI, Big Data, IoT and social media, all of which reside in the cloud, there is an ever-increasing demand for energy in data centres.
Take popular consumer data storage platforms, which for much of society constitute the cloud. At the Data Centre Re-transformation Conference held last month in Manchester, Uptime Institute revealed that storing the UK’s selfies in the cloud requires around 666GWh per year (assuming each photo is 2.5MB, annual storage requires 6.5kWh/GB, and each of the UK’s 41M cloud users stores 1,000 photos per year).
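The Uptime figure follows directly from the stated assumptions, as a quick back-of-envelope check shows (using decimal units, i.e. 1GB = 1,000MB):

```python
# Reproducing the Uptime Institute back-of-envelope figure quoted above,
# from the stated assumptions (decimal units: 1 GB = 1000 MB).
users = 41_000_000          # UK cloud users
photos_per_user = 1_000     # photos stored per user per year
photo_size_gb = 2.5 / 1000  # 2.5 MB per photo
kwh_per_gb_year = 6.5       # annual storage energy per GB

total_gb = users * photos_per_user * photo_size_gb
gwh_per_year = total_gb * kwh_per_gb_year / 1e6  # kWh -> GWh
print(round(gwh_per_year))  # ~666
```

That works out to roughly 102.5 million GB of photos, or about 666GWh a year to keep them stored.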
In an environment where consumption of all things appears to be heading skywards, data use, bandwidth and storage requirements are right on trend. With levels of data creation and use now surpassing zettabyte volumes per year, the data centre technology suites that facilitate this compute power need to be carefully controlled.
From the standpoint of an infrastructure supplier, today’s discussions with data centre operators and corporate end users, with owned data centre capability, are framed around increased bandwidth, reduced latency and energy concerns including lower power usage and effective climate control to optimise the compute environment.
Technology is rightly hailed as a force for good in making the world a fairer, more sustainable place. However, meeting the needs of an increasingly digitalised society has a material cost.
With more and more transactions, processes and communication transferring to the online world, energy usage and greenhouse gases are increasing faster in this sector than in most.
A significant proportion of this is embodied energy in the hardware, which accounts for roughly 50 percent of overall impact. While this is far from exclusive to the data centre sector, it is something we will need to be increasingly aware of going forward. The good news is that the industry is responding with new research into more sustainable solutions.
Not so many years ago, our demands of networks were more or less limited to instant dial tone and a static-free telephony connection, while the internet seemed like something out of a science fiction movie. Now there are billions of data users who expect instant response from their devices and applications.
This evolution continues to accelerate with a constant stream of new developments making us even more dependent on IT and internet infrastructure.
For many of today’s cloud and colocation service providers, a primary consideration is whether to invest in an entirely new facility or to upgrade and retrofit an existing building. Such a decision will involve various technical and financial considerations to determine which solution best solves the organisation’s challenges.
An alternative to building an entirely new facility is to opt for a prefabricated data centre solution, allowing the user to add capacity, whether in power, cooling or IT increments, to spaces both inside and out of a building. For a colocation provider, this presents a unique opportunity to scale up quickly, in some cases in as little as 12-24 weeks, or indeed add power provision and free up new white space to increase revenue generation.
With the Internet of Things (IoT) generating more data than ever before, organisations must seriously consider what edge computing has to offer. According to a study from the International Data Corporation (IDC), 45 percent of all data created by IoT devices will be stored, processed, analysed and acted upon close to or at the edge of a network by 2020.
In a world that is increasingly data-driven, a large amount of data is being generated outside of the traditional data centre. Edge computing places the physical computing infrastructure at the edges of the network where the data is being generated, and in many cases, this is where the data is needed most.
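One common pattern implied by this paragraph is to process data where it is generated and send only a compact summary upstream, rather than shipping every raw sample to a central data centre. A minimal sketch of that idea, with hypothetical names and readings invented for illustration:

```python
# Illustrative edge-computing pattern: aggregate raw sensor readings
# locally and forward only a small summary upstream. All names and
# sample values here are hypothetical.

def summarise(readings):
    """Reduce a batch of raw readings to count/min/max/mean at the edge."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.2, 21.4, 21.1, 23.8, 21.3]  # e.g. one minute of sensor samples
summary = summarise(raw)              # computed close to where data is generated
print(summary)                        # only this small payload leaves the edge
```

The design choice is bandwidth for locality: five floats become one dictionary here, and at IoT scale that reduction is the core economic argument for pushing compute to the edge.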