Our open-source and cloud-native AI accelerators are helping GCCs fast-track transformation journeys: Srinivasa Rao Kattuboina, EPAM India

As GCCs shift from cost centres to innovation hubs, EPAM is helping them make that leap through co-created AI frameworks, cloud-native accelerators, and open-source innovation. In this conversation, Srinivasa Rao Kattuboina, Head of Data and Analytics Practice, EPAM India, shares how the company is enabling GCCs to scale AI and data platforms with solutions like Responsible AI, AI/RUN, Empathy Lab, and real-time agentic architectures. He also discusses the growing role of marketing use cases, the challenge of “dark data,” and why inference cost will be key to future-ready GCCs.

How is EPAM partnering with GCCs to move beyond transactional delivery and co-create high-value AI and cloud-native solutions, and what does this mean for GCCs evolving into innovation centres?

As you know, GCCs are literally transforming into innovation hubs these days. This means everyone is trying to maximise the value of their investments in GCCs. Some are at advanced stages of becoming true innovation hubs, while others are still focused on traditional, cost-centric execution. We’ve observed both scenarios.

For the more advanced, innovation-hub GCCs, EPAM, as a leader in modern engineering, brings a great deal of innovation, technical frameworks, and accelerators. Fundamentally, we’ve co-created quite a few frameworks in collaboration with some of these GCC innovation hubs. Examples include Responsible AI, AI Security, and AI Factory. When I say AI Factory, it’s not just a theoretical framework; it’s a technical framework that captures the know-how of AI implementation, from building AI pipelines and models through to productionisation and cost optimisation, while creating value. It’s quite comprehensive, more of a guideline that genuinely helps GCCs scale. This is crucial because GCCs, despite having talent, may lack the initial kick-start expertise as the industry and technology transform rapidly, which makes it challenging for them to upskill quickly. Hence, we build these frameworks to help them scale their AI journey, and we have many successful examples of this.

Of course, we also provide accelerators, all of which are open source. For instance, AI DIAL is an open-source accelerator available in the market as a co-pilot; people can download it and start using it. At times, we even build such things directly for customers, as there’s nothing proprietary about it. We believe in building open-source frameworks, which genuinely helps GCCs scale their AI journey. I’d also mention AI/RUN, which covers the cloud-native SDLC (software development life cycle) implementations everyone is discussing these days; EPAM has already built accelerators for this.

We’ve also developed the Empathy Lab, which focuses on AI-driven customer experience. AI is fundamentally changing marketing and branding, and that demands a great deal of empathy. Empathy Lab is all about reinventing the customer experience: the user experience itself, how you build chatbots, and how you create applications that deliver differentiated experiences. It’s a significant innovation, a lab created within EPAM, and we’re helping customers implement similar solutions. These are just some examples of how we co-create with customers, bringing these engineering frameworks and accelerators to help GCCs scale their AI innovation. We’re excited about it, and customers are quite receptive and comfortable collaborating with us, as this gives them clarity.

Before implementing AI solutions, many organisations first focus on foundational integration efforts. In your experience working with GCCs and enterprises, have you observed any common challenges, particularly related to skill gaps or readiness, that these major players typically face?

There are quite a few challenges. The reason is, before we offer anything externally, we “eat our own dog food,” meaning we thoroughly test it internally. Whether it’s AI/RUN or Empathy Lab, EPAM extensively uses internal, custom-developed products for everything we do, and all of them incorporate AI. Therefore, when we build an AI framework, AI/RUN, or AI DIAL, it’s all incubated internally.

I would say that more than 50% of EPAMers are deeply involved in these technologies, so we understand what it takes to build a great, production-scale AI product. Hence, when customers face business challenges, we proactively help them understand, nurture, and execute the implementation. This often involves coaching, training, advice, and collaboration. I’d say that slowly, about 70% of our Proofs of Concept (POCs) are now progressing to production scale. This indicates that we are no longer just focusing on POCs, but also that customer maturity is increasing, and they see tangible value in these implementations. Consequently, many projects are moving into production.

EPAM has used its BOT (build-operate-transfer) model to help clients like Zema Global launch their India GCC, with an AI-enabled workforce and a full governance setup. Could you share how your team contributes to EPAM’s BOT engagements, particularly in data and AI setup, and how this accelerates the launch and maturity of GCCs with AI-first capabilities from day one?

In a BOT model, a significant portion of the control resides with EPAM. We collaborate closely with the customer to establish the ecosystem and implement joint use cases or products. Simultaneously, EPAM undertakes the hiring, ensuring that the individuals are on par with EPAM’s standards of engineering excellence. Since they operate within the EPAM ecosystem during the BOT phase, they have access to it and are trained in an AI-native, AI-driven way of working, incorporating all the best practices and frameworks I mentioned earlier.

Therefore, in a BOT model, customers see substantial advancement and a rapid kick-start to their journey. Because we train all employees in EPAM’s best practices and methodologies, when they start with the BOT model and subsequently move to the GCC implementation, they realise tremendous value. Zema is a classic example of our success in this area. Even when it isn’t a BOT model, we help customers train for these technological nuances, as some of our teams work directly alongside GCCs and help them advance.

Currently, we have about 1,700 GCCs in India, employing over 1.9 million people, and this number is expected to grow to about 2,400 GCCs by 2030. How do you see these GCCs evolving in the next few years, especially around data platform engineering and AI, and how are you positioning yourself to support this shift?

In the next five years, I see a significant shift. Traditionally, GCCs focused heavily on back-office functions, shared services, managed services, and support. While some GCCs already focus exclusively on AI and analytics, the thinking in the next few years will be less about merely “doing” and more about innovating, building, and leading. My hunch is that it’s not just about using AI, but about building data products around AI. I believe that’s the directional shift for the coming years.

I will also emphasise that “dark data” is becoming a problem in the era of GenAI. When I refer to dark data, I mean the vast amounts of historical data enterprises possess, which they deem valuable but haven’t thoroughly examined for privacy, quality, or recency. This dark data will hinder the innovation of data products built around AI. Hence, good data management disciplines, including data quality, metadata, and governance, are crucial for them to put in place.
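As a rough illustration of the kind of discipline described above, the sketch below profiles a candidate “dark data” table for missing values, staleness, and undocumented columns before it is fed into an AI data product. This is a minimal, hypothetical example, not EPAM tooling: the column names, freshness threshold, and the idea of a “documented columns” catalogue are all assumptions made for the sake of the illustration.

```python
# Minimal sketch (illustrative only): basic quality, recency, and governance
# signals for a "dark data" table. All names and thresholds are assumptions.
import pandas as pd

def profile_dark_data(df: pd.DataFrame, documented_columns: set,
                      timestamp_col: str = "last_updated",
                      max_age_days: int = 365) -> dict:
    """Return simple data-quality, recency, and metadata-coverage signals."""
    report = {}

    # Data quality: share of missing values per column.
    report["missing_ratio"] = df.isna().mean().round(3).to_dict()

    # Recency: fraction of rows older than the freshness threshold.
    ages = pd.Timestamp.now(tz="UTC") - pd.to_datetime(df[timestamp_col], utc=True)
    report["stale_fraction"] = float((ages.dt.days > max_age_days).mean())

    # Metadata/governance: columns present in the data but absent from the catalogue.
    report["undocumented_columns"] = sorted(set(df.columns) - set(documented_columns))

    return report

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 3],
        "email": ["a@example.com", None, "c@example.com"],  # possible PII, partly missing
        "last_updated": ["2019-01-01", "2024-06-01", "2023-03-15"],
    })
    print(profile_dark_data(sample, documented_columns={"customer_id", "last_updated"}))
```

Checks like these are deliberately simple; the point is that quality, recency, and metadata coverage need to be measured before dark data is trusted as input to AI-driven data products.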

The industry is moving towards a few key areas. To build data products around AI, you need real-time capability. Today, much of what we do, even solutions like RAG (Retrieval-Augmented Generation), is largely batch-oriented. With agentic AI becoming more widely accepted and numerous use cases emerging, having robust real-time data is essential. Therefore, building intelligent real-time data platforms is extremely important. Ultimately, GCCs are expected to create significant value and demonstrate thought leadership. That’s how I envision GCC innovation hubs transforming in the next five years. By the way, some of our current implementations involve building real-time data platforms and real-time agentic solutions, and with the many frameworks and accelerators we’ve developed, I believe they will help us scale in this direction.

More importantly, inference cost is vital. I hope GCCs will focus on the inference cost of running these models, and we are actively streamlining and building advanced solutions to help customers achieve their AI goals efficiently.
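To make the inference-cost point concrete, here is a rough back-of-the-envelope sketch of why agentic workloads can be much more expensive to run than single-call RAG answers. The request volumes, token counts, and per-1,000-token prices are hypothetical placeholders, not actual model pricing or EPAM figures.

```python
# Illustrative only: estimating monthly inference cost for one use case.
# Prices, token counts, and call volumes are hypothetical placeholders.

def monthly_inference_cost(requests_per_day: int,
                           input_tokens: int,
                           output_tokens: int,
                           price_in_per_1k: float,
                           price_out_per_1k: float,
                           llm_calls_per_request: int = 1) -> float:
    """Cost (in currency units) over a 30-day month."""
    cost_per_call = (input_tokens / 1000) * price_in_per_1k \
                  + (output_tokens / 1000) * price_out_per_1k
    return requests_per_day * llm_calls_per_request * cost_per_call * 30

if __name__ == "__main__":
    # One model call per request (RAG-style answer) vs. an agent that plans,
    # retrieves, and verifies with several model calls per user request.
    rag = monthly_inference_cost(10_000, 2_000, 400, 0.0005, 0.0015, llm_calls_per_request=1)
    agent = monthly_inference_cost(10_000, 2_000, 400, 0.0005, 0.0015, llm_calls_per_request=5)
    print(f"Single-call RAG:      ~{rag:,.0f} per month")
    print(f"Agentic (5 calls/req): ~{agent:,.0f} per month")
```

The multiplier on calls per request is what tends to dominate: an agent that loops through planning and verification steps pays the per-token price several times over for every user interaction, which is why inference cost deserves early attention.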

You mentioned deploying agentic AI solutions in some domains. Which domains have you implemented these agents in?

I would say that many of our agentic solutions are in retail and CPG (consumer packaged goods); there’s quite a bit of acceptance there. We’ve also seen some use, though on a smaller scale, within financial institutions. However, that is a highly regulated industry, so they remain cautious. We’ve observed a few GCCs implementing these, but it’s not yet at scale; it’s a slow and controlled rollout.

We’ve seen that major organisations typically incorporate agentic solutions mainly for revenue management, HRMS, or customer-facing roles like initial customer interactions. Based on your day-to-day interactions with GCCs and other enterprises, what are the common or major areas where they incorporate agentic solutions?

I would say a couple of use cases stand out. One is marketing. Marketing use cases are gaining acceptance across the industry. Organisations are comfortable using agentic solutions for brand evolution, creating new brands, developing new ideas, and generating insights. This is widely accepted and is, in fact, becoming a primary production use case for many. They are also open to integrating external data, such as customer feedback and competitor information, including Nielsen data.

Marketing is growing exponentially in enterprises. For instance, one of our customers has over 2,500 people in their marketing team, and it’s expanding annually. A solution like this definitely helps them concentrate more on product evolution and new product launches. That’s one significant aspect.

Some regulated industries are also exploring internal applications, such as analysing documents and contracts to understand how their pricing models or products have evolved over the years. These are internal applications, not external customer-facing ones. However, ever since we launched our marketing-side solutions, there has been strong, relevant acceptance from customers for implementing these agentic solutions.
