The rapid acceleration of artificial intelligence (AI), driven by GenAI, is redefining the role of data centres. As AI begins to change industries from healthcare to finance, the demand on data centres to support intensive machine learning processes is expected to be unprecedented. According to analyst firm Gartner, spending on data centre systems is expected to increase by 24% in 2024, due in large part to increased planning for GenAI.
The International Energy Agency (IEA) says that data centres are already responsible for around 1% of global electricity use, and it is expected that energy demands will grow exponentially as AI adoption increases. This highlights the increasing need for energy-efficient solutions and has prompted regulatory bodies like the European Commission to set stringent energy-efficiency targets such as the 2023 ‘Digital Decade’ policy, which aims to reduce the carbon footprint of the ICT sector by 40% by 2030.
From Stability to Agility: The New Data Centre Paradigm
Traditionally, data centres were designed for stability, focusing on consistent uptime and reliable performance for relatively predictable workloads. This model works well for traditional IT workloads but may fall short for AI, where workloads are highly variable and resource-intensive.
Training large language models (LLMs) requires immense computational power and energy, while inference tasks can fluctuate based on real-time data demands. With the requirements of the digital space set to escalate, it’s crucial for data centre operators to adapt continuously, leveraging innovative solutions and operational efficiencies to meet the future head-on.
Enhancing Energy Efficiency: A Critical Imperative
The rising energy consumption associated with AI workloads is an operational challenge as well as an environmental one.
Data centres are already significant consumers of electricity, and the projected doubling of energy use by 2026 will place even greater strain on both operators and the grid. This makes energy efficiency and availability a top priority for operators.
Battery energy storage systems (BESS) can help to improve energy efficiency. They can store excess electricity and make it available when needed. This is critical in countries like Denmark, where the EU’s ‘Energy Efficiency Directive’ mandates that operators integrate at least 10% renewable energy into their power mix by 2025.
BESS have the potential to give data centres more control over their connection to the grid, providing greater autonomy.
BESS can also alleviate grid infrastructure constraints, offering equipment owners the potential to provide grid services, generate new revenue streams and save on electricity costs. By providing grid-balancing services, these systems enable energy independence and bolster sustainability efforts at mission-critical facilities; they add flexibility in the use of utility power and are a critical step in the deployment of a dynamic power architecture. BESS solutions also allow organisations to fully leverage hybrid power systems that combine solar, wind, hydrogen fuel cells and other forms of alternative energy.
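The grid-balancing idea above can be illustrated with a deliberately simplified dispatch rule: charge the battery when grid electricity is cheap (typically when renewable supply is high) and discharge it during expensive peak periods. This is a minimal sketch, not a production energy-management system; the capacities, power limits and price thresholds are hypothetical.

```python
# Minimal sketch of price-driven BESS dispatch. All figures are
# hypothetical; a real system would also model round-trip efficiency,
# degradation, grid-service commitments and forecasted demand.
from dataclasses import dataclass

@dataclass
class Battery:
    capacity_kwh: float         # total usable storage
    charge_kwh: float = 0.0     # current state of charge
    max_rate_kw: float = 250.0  # charge/discharge limit per hourly step

    def step(self, price_eur_mwh: float,
             low: float = 40.0, high: float = 90.0) -> str:
        """Charge below `low`, discharge above `high`, otherwise hold."""
        if price_eur_mwh < low and self.charge_kwh < self.capacity_kwh:
            self.charge_kwh = min(self.capacity_kwh,
                                  self.charge_kwh + self.max_rate_kw)
            return "charge"
        if price_eur_mwh > high and self.charge_kwh > 0:
            self.charge_kwh = max(0.0, self.charge_kwh - self.max_rate_kw)
            return "discharge"
        return "hold"

bess = Battery(capacity_kwh=1000.0)
hourly_prices = [35, 30, 55, 110, 120, 60]  # EUR/MWh, hypothetical profile
actions = [bess.step(p) for p in hourly_prices]
print(actions)  # ['charge', 'charge', 'hold', 'discharge', 'discharge', 'hold']
```

Even this toy rule captures the commercial logic: energy bought (or stored from on-site renewables) in cheap hours is sold back or consumed in peak hours, which is the basis of the grid services and cost savings described above.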
According to Omdia’s Market Landscape: Battery Energy Storage Systems report, “Enabling the BESS to interact with the smart electric grid is an innovative way of contributing to the grid through the balance of energy supply and demand, the integration of renewable energy resources into the power equation, the reduction or deferral of grid infrastructure investment, and the creation of new revenue streams for stakeholders.”
Preparing for the AI Future: Strategic Investments in Infrastructure
As AI continues to change industries, the infrastructure that supports it needs to evolve too. This requires strategic investments not only in physical hardware but also in management systems that can optimise performance and energy use.
AI-driven automation within data centres can play a pivotal role, enabling predictive maintenance, dynamic resource allocation, and even automated responses to security threats. For example, the continuous exchange of data with critical equipment, combined with a monitoring system, allows the identification of potential threats and anomalies that could impact business or service continuity. Identifying patterns and anomalies across large volumes of collected data permits faster and more accurate problem discovery, diagnosis and resolution. This monitoring of critical equipment adds an important layer of protection to the continuity, and therefore the availability, of the infrastructure.
Investment in innovative cooling solutions is also becoming essential as traditional air-cooling systems struggle to keep up with the heat generated by high-density computing environments. Although air-cooling solutions will be part of the data centre infrastructure for some time to come, liquid cooling and direct-to-chip cooling technologies offer promising additions, allowing data centres to maintain optimal temperatures without compromising performance. According to industry analyst Dell’Oro Group, the market for liquid cooling could grow to more than $15bn over the next five years.
Investing in the Edge
Edge computing is another area of infrastructure that is likely to need further investment in the AI era. Edge data centres can significantly reduce latency and bandwidth usage by processing data closer to its source, which is crucial for applications like autonomous vehicles and smart cities. This distributed approach to data management allows for more efficient processing of AI workloads, reducing the burden on centralised data centres. IDC predicts that worldwide spending on edge computing will reach $378bn in 2028, driven by demand for real-time analytics, automation, and enhanced customer experiences.
Collaboration Across the Ecosystem: The Path to Innovation
The future of AI-driven data centres will depend on collaboration across the technology ecosystem. Operators, IT hardware manufacturers, chip designers, software developers and AI researchers must work together to develop solutions that meet the unique demands of AI. This collaborative approach is essential for driving innovation and enabling data centres to support the next generation of AI applications.
For instance, the integration of AI-specific processors and accelerators requires close coordination between IT hardware manufacturers and data centre operators. Similarly, the development of specialised software environments that efficiently manage data and resources will depend on ongoing collaboration between data centre operators and software developers.
Embracing the Future: A New Role for Data Centres
With increasing AI demands, power consumption challenges, and sustainability goals, the data centre industry is at a critical juncture. Implementing practical solutions like liquid cooling and battery energy storage systems (BESS) is key to addressing these issues. By investing in agile, energy-efficient infrastructures and fostering collaboration across the ecosystem, data centres can position themselves at the heart of this transformation. In doing so, they will not only support today’s AI applications but also pave the way for future innovations, helping to shape the digital landscape of tomorrow.