By Vimal Kaw, Associate Vice President, Products & Services, NTT – Netmagic
The year 2019 was an eventful one for the data center space, and the years ahead promise to be even more so.
Looking at the rapid technological developments, we believe the following technology trends will shape the future of data centers:
#1 Hyperscale Data Centers
Hyperscale refers to the capability of an IT system or architecture to scale rapidly and massively in response to steeply increasing demand. A report by Markets & Markets estimates that the hyperscale data center market will grow from USD 25.08 billion in 2017 to USD 80.65 billion by 2022, at a CAGR of 26.32%. Cisco estimates that traffic within hyperscale data centers will quadruple by 2021, when such facilities will account for 55% of all data center traffic.
In a hyperscale data center, enterprises can replace individual physical components rather than entire servers, a traditional approach that increases both cost and downtime. Because components can be added modularly, this approach also gives extreme flexibility in scaling at the physical level. If a server fails, for example, an application can be moved to another server without downtime. Hyperscale data centers are expected to change every aspect of the data center, from the way hardware components are sourced to the way they are designed.
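The failover behavior described above can be sketched in a few lines. This is a hypothetical illustration, not a real orchestration API: the cluster model, server names, and least-loaded placement policy are all assumptions.

```python
# Illustrative sketch: when a server fails, move its applications to the
# least-loaded healthy server so they keep running without downtime.
# All names and the placement policy are assumptions for illustration.

def migrate_apps(cluster, failed_server):
    """Move every app on failed_server to the least-loaded healthy server."""
    apps = cluster.pop(failed_server, [])
    for app in apps:
        # Pick the healthy server currently hosting the fewest apps.
        target = min(cluster, key=lambda s: len(cluster[s]))
        cluster[target].append(app)
    return cluster

cluster = {"srv-1": ["web"], "srv-2": ["db", "cache"], "srv-3": []}
migrate_apps(cluster, "srv-1")   # "web" lands on the empty srv-3
```

A real scheduler would also weigh CPU, memory, and affinity constraints; the point is that workloads, not hardware, are the unit of recovery.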
#2 Artificial Intelligence
Ever since Google published research showing that it used AI to improve the power efficiency of its data centers, many firms have followed suit to explore the transformational potential of AI. For example, in a span of just 18 months, Google used its AI-powered Google DeepMind system to bring about a 40% reduction in the amount of energy required for cooling, equivalent to a 15% reduction in overall PUE overhead.
Hiring people with the right skill sets is a massive challenge in the digital era. Gartner, for instance, predicts that by 2020, 75% of organizations will experience visible business disruptions due to I&O skills gaps (up from less than 20% in 2016). AI can play a big role in automating many of the tasks that human agents perform today. Similarly, AI can be used to great effect in a data center's security operations center (SOC). AI can complement current Security Information and Event Management (SIEM) systems by analyzing incidents and inputs from multiple systems and devising an appropriate incident response. AI-based systems can improve SOC monitoring and reduce basic L1 workloads. When more than 10,000 events per second are logged, for example, it becomes difficult for human analysts to monitor them all. AI-based systems can help separate genuinely malicious traffic from false positives and help data center administrators handle cyber security threats more efficiently.
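The triage pattern described above can be sketched as follows. A production SOC would use a trained model fed by the SIEM; the scoring heuristic, event fields, and threshold here are purely illustrative assumptions.

```python
# Hypothetical sketch of automated SOC event triage: score each logged
# event with a simple heuristic and surface only likely-malicious events
# to human analysts, dismissing probable false positives automatically.

SUSPICIOUS_PORTS = {23, 3389, 4444}   # assumed watchlist for illustration

def score_event(event):
    score = 0
    if event.get("failed_logins", 0) > 5:       # brute-force indicator
        score += 2
    if event.get("dest_port") in SUSPICIOUS_PORTS:
        score += 2
    if event.get("bytes_out", 0) > 10_000_000:  # possible exfiltration
        score += 1
    return score

def triage(events, threshold=3):
    """Split events into (needs_review, auto_dismissed)."""
    flagged = [e for e in events if score_event(e) >= threshold]
    dismissed = [e for e in events if score_event(e) < threshold]
    return flagged, dismissed

events = [
    {"id": 1, "failed_logins": 8, "dest_port": 4444},  # likely malicious
    {"id": 2, "failed_logins": 0, "dest_port": 443},   # routine traffic
]
flagged, dismissed = triage(events)
```

Even this crude filter shows the leverage: at 10,000 events per second, anything that reliably dismisses the routine bulk lets L1 analysts focus on the handful of genuine threats.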
Researchers from MIT found that AI can help data center owners save millions by automating the scheduling of data-processing operations across thousands of servers. The system the researchers developed completes jobs about 20-30% faster, and twice as fast during periods of high traffic.
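The MIT system learns its scheduling policy; as a much simpler illustration of why smarter job ordering finishes work sooner, the sketch below compares first-in-first-out execution with shortest-job-first, which minimizes mean completion time on a single server. The job durations are made-up numbers.

```python
# Illustrative comparison (not the MIT algorithm): running short jobs
# first lowers the average time a job waits to complete.

def mean_completion_time(durations):
    """Mean completion time when jobs run back-to-back in the given order."""
    finished, clock = [], 0
    for d in durations:
        clock += d
        finished.append(clock)
    return sum(finished) / len(finished)

jobs = [8, 1, 4, 2]                       # assumed job durations
fifo = mean_completion_time(jobs)         # arrival order: 11.25
sjf = mean_completion_time(sorted(jobs))  # shortest-job-first: 6.5
```

A learned scheduler goes much further, accounting for job dependencies and cluster state, but the gain comes from the same principle: ordering decisions compound across thousands of servers.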
#3 Edge Computing
We live in a connected world, and every connected device produces data. As more devices get connected, it will become increasingly uneconomical to transfer all of that data to a centralized location. A Gartner study, for example, forecasts that 14.2 billion connected things will be in use in 2019, and that this total will reach 25 billion by 2021, producing immense volumes of data. A McKinsey study claims that 127 new IoT devices connect to the internet every second. The rise in the number of connected devices calls for building localized, or edge, data centers to process local traffic.
Gartner defines edge computing as an approach that enables and optimizes extreme decentralization, placing nodes as close as possible to the sources of data and content. Unlike the traditional model, which sends every bit of data to a centralized cloud, edge data centers keep the heaviest traffic and data close to end-user applications. In the future, expect more adoption of edge data centers as IoT devices grow exponentially.
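The pattern can be made concrete with a small sketch: an edge node keeps raw sensor readings local and forwards only a compact summary per interval. The data and summary fields are illustrative assumptions, not a specific edge platform's API.

```python
# Hypothetical edge-aggregation sketch: reduce a window of raw readings
# to one summary record so only the summary crosses the WAN to the cloud.

def summarize(readings):
    """Reduce a window of raw sensor readings to one summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.0, 21.4, 22.1, 35.9, 21.2]  # e.g. one interval of temperatures
payload = summarize(raw)              # only this is sent upstream
```

Shipping one record instead of five is a trivial saving; at billions of devices emitting readings every second, the same reduction is what keeps centralized backhaul economically viable.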
Gartner, for instance, predicts that while currently around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud, by 2025, this figure will reach 75%. An IDC FutureScape report states that by 2022, 40% of enterprises will have doubled their IT asset spending in edge locations. IDC also believes that 45% of all data created by IoT devices will be stored, processed, analyzed, and acted upon close to or at the edge of a network by 2020.
#4 Security at the Chip Level
With attacks against data centers growing in scale and complexity, global firms such as Google are trying to embed security at the chip level. Google's OpenTitan is a collaborative open-source project to build trustworthy chip designs for use in data centers and other infrastructure. Google believes that security at the chip level will help ensure that the hardware infrastructure, and the software that runs on it, remain in their intended, trustworthy state, by verifying that critical system components boot securely using authorized and verifiable code. This ensures that a server or device boots with the correct firmware and has not been infected by low-level malware.
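The verified-boot idea above can be sketched in miniature. Real silicon verifies a cryptographic signature over the firmware with a key anchored in the chip; the plain digest comparison below is a deliberately simplified stand-in, and the firmware strings are invented.

```python
# Simplified sketch of verified boot: before handing control to firmware,
# the root of trust checks the image against a digest it holds in
# tamper-resistant storage. Any modified image fails the check.

import hashlib

# Digest the root of trust would hold (illustrative value).
TRUSTED_DIGEST = hashlib.sha256(b"firmware v1.0").hexdigest()

def verify_firmware(image: bytes, trusted_digest: str) -> bool:
    """Return True only if the firmware image matches the trusted digest."""
    return hashlib.sha256(image).hexdigest() == trusted_digest

ok = verify_firmware(b"firmware v1.0", TRUSTED_DIGEST)
tampered = verify_firmware(b"firmware v1.0 + malware", TRUSTED_DIGEST)
```

The security of the scheme rests on where the trusted digest (or verification key) lives: if it sits in silicon the attacker cannot rewrite, low-level malware cannot forge a passing check.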
While there have been similar attempts in the past (Intel's Software Guard Extensions, Arm's TrustZone, AMD's Secure Encrypted Virtualization), Google's initiative is the only one today that is not proprietary. By going the open-source route, Google hopes to lay a foundation for secure chip design across the industry.
If you have an interesting article / experience / case study to share, please get in touch with us at [email protected]