The power of AI for data centers

Data centers contain many components (cooling, workloads, servers, storage, networking), and experts believe AI can help firms unlock huge efficiencies by constantly learning from past patterns. Google showed the potential of AI in the data center when it used its DeepMind system to significantly improve the power efficiency of its data centers. In a span of just 18 months, the system helped Google achieve a 40% reduction in the energy used for cooling and a 15% reduction in overall energy consumption. Considering that energy costs account for a majority of a data center’s overall cost, Google’s example showed that enterprises can significantly reduce their energy-related costs using AI.

Google is not the only company trying to use AI in the data center. Siemens, for example, has partnered with a company called Vigilent to jointly provide customers with an AI-based thermal optimization solution that addresses data center cooling challenges. Using a combination of IoT and machine learning, the solution collects data from thousands of sensors and analyses it with advanced algorithms to determine what changes are required to maintain desired temperatures with the least amount of cooling spend. Vigilent believes that close to 40% of data center cooling capacity is typically underutilized, as most data center managers lack the knowledge or tools required to improve the efficiency of their cooling systems.

Nlyte has formed a similar partnership, collaborating with IBM to integrate IBM Watson into its data center product. While Nlyte’s software collects data from various power and cooling systems, IBM Watson analyses this information to create a predictive model that can, for example, predict which machines or processors are likely to overheat or break down.
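As a rough illustration of this kind of predictive model (not the Watson implementation; the function name, the linear-trend approach, and the thresholds are assumptions), one could fit a temperature trend per machine and flag those projected to cross a danger threshold within a short horizon:

```python
def predict_at_risk(machine_temps, threshold_c=80.0, horizon=3):
    """Flag machines whose linear temperature trend crosses threshold within `horizon` steps.

    machine_temps maps a machine name to a list of recent temperature readings.
    Uses a pure-Python least-squares slope as a stand-in for a real predictive model.
    """
    at_risk = []
    for name, temps in machine_temps.items():
        n = len(temps)
        xs = range(n)
        mean_x = sum(xs) / n
        mean_y = sum(temps) / n
        denom = sum((x - mean_x) ** 2 for x in xs) or 1.0
        slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, temps)) / denom
        projected = temps[-1] + slope * horizon  # extrapolate the trend forward
        if projected >= threshold_c:
            at_risk.append(name)
    return at_risk
```

A production model would draw on far richer signals (power draw, fan speeds, error counters), but the shape of the output is the same: a ranked or flagged list of machines likely to get hot or fail.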

Besides the giants, there are many exciting startups building interesting products with AI. The startup LitBit, for example, has built what it calls ‘Dac’, the world’s first AI-powered data center operator. In the data center, the product can detect loose electrical terminations before they cause problems, help prevent server or network hardware failures by pre-emptively detecting the sound of failing power supplies, and even distinguish normal from abnormal operating conditions of equipment based on their precise sound and vibration patterns.
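The sound- and vibration-based detection described above boils down to learning a signature of "normal" and flagging deviations. A minimal sketch, assuming a hypothetical RMS-amplitude signature (LitBit's actual models are far more sophisticated):

```python
import math


def _rms(signal):
    """Root-mean-square amplitude of one audio/vibration sample."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))


def learn_baseline(good_samples):
    """Average RMS amplitude across known-good samples of a machine's sound."""
    return sum(_rms(s) for s in good_samples) / len(good_samples)


def is_abnormal(signal, baseline_rms, tolerance=0.25):
    """Flag a signal whose RMS deviates from the learned baseline by more than `tolerance`."""
    return abs(_rms(signal) - baseline_rms) / baseline_rms > tolerance
```

In practice an acoustic model would compare spectral features rather than raw amplitude, but the workflow is the same: learn each piece of equipment's normal signature, then alert when a new sample drifts from it.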

Virtual Power Systems has created a product it calls Software Defined Power, an approach similar to software-defined approaches in the compute, storage and network segments, for automating and optimising the distribution of power and cooling resources. The firm uses a machine learning algorithm that helps data centers recuperate stranded power allocated for long-duration peaks.
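"Stranded power" is capacity provisioned for worst-case peaks that rarely occur. A simple sketch of the reclaiming idea (the function, per-rack data shape, and safety margin are all illustrative assumptions, not VPS's method) budgets each rack against its observed peak plus a margin, and treats the remainder as reclaimable:

```python
def stranded_power(provisioned_kw, observed_peaks_kw, safety_margin=0.1):
    """Estimate reclaimable power per rack.

    provisioned_kw: nameplate power allocated to each rack (kW).
    observed_peaks_kw: historical peak draw of each rack (kW).
    Budget each rack at its observed peak plus a safety margin;
    anything provisioned above that is stranded and reclaimable.
    """
    reclaimable = {}
    for rack, peak in observed_peaks_kw.items():
        budget = peak * (1 + safety_margin)
        reclaimable[rack] = max(provisioned_kw[rack] - budget, 0.0)
    return reclaimable
```

The machine learning comes in when estimating those peaks: a model that forecasts load per rack lets the operator shrink the safety margin with confidence, freeing more capacity.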

While these are still early days, the potential is huge, and the supporting ecosystem is ready. With the combination of cloud (for storing and analysing huge amounts of data) and IoT (for capturing data across multiple points in the data center), a large number of critical functions can be automated and data center outages proactively prevented.
