As AI adoption accelerates across Indian enterprises, the real challenge is shifting from models and algorithms to the fundamentals that make AI viable at scale: cybersecurity, infrastructure modernisation, and power and cooling. What was once a technology upgrade is now a boardroom-level rethink of how IT environments are built, secured and sustained.
In this interaction, Venkat Sitaram, Senior Director and Country Head, Infrastructure Solutions Group, Dell Technologies India, explains how AI is forcing organisations, from MSMEs to large enterprises, to refresh legacy systems, prioritise cybersecurity, and rethink data centre design. He also outlines why liquid cooling is rapidly moving into the mainstream, how Dell is tailoring AI adoption for different customer segments, and what sovereign AI means in practical terms for India’s innovation journey.
Dell Technologies works across both large enterprises and the MSME segment. With the rapid rise of AI and evolving computing patterns, what key trends did you observe in 2025, and how do you see these shaping up in 2026?
AI has clearly been the biggest catalyst, and in many ways it has amplified trends that already existed. Cybersecurity is a classic example. It was always important, but with AI being integrated into core workloads, it has become the number one topic in boardroom conversations. Today, CIOs and CISOs are being told, “If the budget isn’t enough, ask for more, but make sure you’ve done enough.” That mindset reflects both urgency and responsibility.
The second big trend is infrastructure modernisation. You simply cannot run AI on legacy systems. That’s why we’re seeing a strong refresh cycle, whether it’s AI PCs, AI-optimised compute, AI-ready networking, or storage built specifically for AI workloads. All of these components have to come together, and they must be deployed in a way that consumes less power and increasingly relies on renewable energy.
We’ve invested heavily in engineering around power and cooling, moving beyond traditional air cooling to liquid cooling and other advanced methods. Our philosophy has always been simple: how do we help customers pack more performance while consuming less energy? At smaller scales, this may not seem critical, but at scale, it becomes a massive differentiator.
This shift is also driving significant disruption in data centre infrastructure. That’s why we keep hearing about gigawatts and megawatts in the context of new data centre investments. The trend will only accelerate into 2026. In fact, we’re going even deeper into the power side of the equation. Liquid cooling, which was about 10 percent of our portfolio mix two years ago, now accounts for nearly 40 percent. By 2026, it could very well move to a 60–40 split in favour of liquid cooling.
On liquid cooling specifically, are you looking at water-based systems or oil immersion as well?
We are exploring all options, including oil immersion, but availability and accessibility of resources play a key role. Oil-based cooling is still not mainstream for data centres in India. We were among the first to introduce immersion-based oil cooling, and we continue to support it. However, over time, we realised that while oil dissipates heat very effectively, the overall ROI for customers wasn’t always compelling.
That led us to diversify into liquid cooling approaches that leverage natural and readily available resources such as water. When combined with advanced cooling designs, these models are delivering better ROI for data centre operators. When customers run AI environments as a service, they closely track metrics like return per gigabyte per hour. Everything, from compute, storage and networking to rack density, power envelopes and software-based power management, plays a critical role.
We anticipated early on that sustainability would become a priority, and that belief continues to hold true. Cybersecurity, sustainability and infrastructure modernisation are now deeply intertwined trends. Alternative cooling methods in data centres will only grow as AI adoption increases. Today, 65–70 percent of CIOs are already using AI, and our conversations with CXOs reflect the same reality.
When you work with both enterprise and SME customers, their challenges are often very different. What are the unique issues you see across these segments, and how do you address them?
For SMEs, even before security, the first challenge is simply where to start. Many organisations know AI is important but don’t want to experiment blindly. That’s where our advisory and consulting capabilities come in. Our pre-sales architects and services teams conduct accelerator workshops, either at the customer’s premises or at our briefing centres, depending on what works best for them.
The key is helping customers understand that AI doesn’t have to be prohibitively expensive. It becomes expensive only when it’s not assessed correctly. AI requires a resilient architecture, and someone has to design that from the ground up. That’s where we step in.
We’ve built our portfolio with different customer segments in mind: small offices, mid-sized businesses, large enterprises and even government. With over three decades globally and nearly 25 years in India, we’ve applied these learnings directly to product design. You’ll see different variants and bundles tailored to specific needs.
For an SME starting out, we might design a ready-to-deploy solution that requires only basic data centre infrastructure: rack, power, cooling, compute, storage and networking, validated for specific open-source or commercial language models. We can deploy it as a plug-and-play solution, manage it on their behalf, or even run it through a cloud service provider if the customer doesn’t want to invest in on-prem infrastructure.
This flexibility is our strength. We don’t believe in one-size-fits-all. Whether a customer is just starting, scaling rapidly, or already retraining models at scale, we can support them. AI loves GPUs, and GPUs love AI; it’s that simple. From small two-CPU servers to large GPU clusters, air-cooled or liquid-cooled, uniform or mixed clusters, we cover the entire spectrum. That’s why we believe we’re serving one of the largest AI and GenAI customer bases in India and globally.
And how do these challenges evolve for large enterprise customers?
The fundamentals remain the same, but scale introduces complexity. Large enterprises must re-evaluate their entire IT landscape before introducing AI: dependencies, integrations, governance models and cybersecurity frameworks all come into play.
In highly regulated sectors like banking, pharma and telecom, risk, governance and compliance become central considerations. Enterprises need to decide whether AI workloads should be integrated with existing systems or isolated, how governance should be enforced, and how cybersecurity controls should be strengthened.
Data management is another critical challenge. AI requires clean, well-prepared data, and many organisations haven’t fully addressed this yet. That’s where we begin, by establishing the data layer using our Dell AI Data Platform. From there, we help customers move into inferencing, retrieval-augmented generation, training and eventually model retraining, based on their use cases.
Dell has been talking extensively about AI factories and your partnership with NVIDIA. How does this collaboration help customers build responsible and compliant AI models?
Our partnership with NVIDIA is something we truly value. It’s been several years now, and we continue to launch newer versions of AI factories as GPU architectures and processors evolve. Each generation brings more performance and capability, which requires us to continuously refresh and future-proof our portfolio.
In India, this partnership is gaining strong momentum. Our teams work closely with NVIDIA to identify industry-specific use cases and help customers achieve better ROI. We also help customers decide which use cases to prioritise and how to validate them through pilots and proof-of-concept engagements, either in customer labs or our own facilities.
Importantly, while NVIDIA plays a critical role in GenAI, we also offer alternatives using AMD and Intel platforms. Our long-standing partnerships across the ecosystem allow us to recommend the best-fit solution rather than a single approach.
Finally, what does Dell’s vision of AI that empowers people and nations mean for India’s innovation journey, especially in the context of sovereign AI?
Our vision is centred on making AI easy and accessible. As part of the Infrastructure Solutions Group, our mandate is to help customers adopt AI faster and with confidence. This extends from large enterprises right down to individual users through AI-enabled client devices.
We see AI adoption taking many forms: sovereign AI, purpose-built AI, and private and hybrid AI models, and we enable all of them. We work closely with the India AI Mission, collaborate with cloud service providers, and support startups through joint initiatives with partners like NVIDIA.
Our goal is to build an ecosystem where customers can experiment, validate use cases, achieve the right total cost of ownership, and then scale confidently. Sovereign AI, in that sense, is about accessibility, control and long-term value, and that’s what we’re committed to delivering.