‘Confluent plans to become the data streaming central nervous system for enterprises’: Rubal Sahni, AVP, India & Emerging Markets, Confluent.

Confluent today is transforming how organizations such as Jio Platforms, Swiggy, Meesho, and Viacom18 harness real-time data. In an exclusive conversation with Rajneesh De, Group Editor, CXO Media & APAC Media, Rubal Sahni, AVP – India and Emerging Markets, Confluent, explains why legacy modernisation has become both the biggest enabler of and the biggest bottleneck for enterprise AI in India, and how CIOs are balancing AI ambition with governance, compliance, and risk management.

How will Confluent help Indian enterprises modernize complex legacy environments while ensuring business continuity for mission-critical systems?

Confluent helps Indian enterprises modernise legacy estates by putting a real-time data streaming layer around existing systems, so they can evolve safely instead of “rip and replace,” while building in continuity for mission-critical workloads.

From the technology perspective, we help enterprises connect to mainframes and legacy databases using certified connectors and CDC (change data capture), so data is streamed out in real time without taking core systems offline. With our offering, Indian enterprises can unlock legacy data, modernise incrementally, and run a resilient, hybrid streaming backbone so mission-critical systems stay up while the organisation moves to modern, AI-ready architectures.
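As a minimal sketch of what this looks like in practice, assuming a Debezium-style MySQL source and a local Kafka Connect worker: a CDC connector can be registered over the Kafka Connect REST API so change events stream out of the legacy database without downtime. All host names, credentials, and table names below are illustrative placeholders, not a customer configuration.

```python
# Hedged sketch: registering a CDC source connector via the Kafka Connect
# REST API so a legacy database streams changes out without downtime.
# Hosts, credentials, and table names are illustrative placeholders;
# config keys follow Debezium's MySQL connector conventions.
import requests

connector = {
    "name": "legacy-orders-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "legacy-db.internal",  # placeholder host
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "********",
        "database.server.id": "184054",
        "topic.prefix": "legacy",           # CDC topics: legacy.sales.orders, ...
        "table.include.list": "sales.orders",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector, timeout=10)
resp.raise_for_status()
print(resp.json()["name"], "registered")
```

Once the connector is running, every insert, update, and delete on the included tables arrives as an event on a Kafka topic, while the source system itself remains untouched.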

How will the Deloitte–Confluent partnership accelerate AI and Generative AI adoption and help enterprises move from pilots to production-scale deployments?

Across industries, we see a clear pattern emerging despite significant investment in AI: nearly 80-90% of AI initiatives stall at the pilot stage. This “pilot purgatory” happens because experiments are often built for a controlled proof-of-concept environment rather than operational reality.

Many initiatives begin with a “shiny object” mindset instead of clearly defined business outcomes. Without well-defined KPIs, measurable ROI, and access to unified data across systems, AI pilots struggle to translate into operational impact.

From a technology perspective, most pilots rely on static, point-in-time datasets to prove that a model works in isolation. Production environments are very different; they require integrating multiple data sources, adapting to changes, and responding to real-time events instead of relying on batch updates.

Confluent’s partnerships with Infosys, TCS, and Deloitte address this gap. These partnerships bring deep domain expertise, use-case identification, and transformation capabilities, while Confluent provides the real-time data infrastructure needed to operationalize AI. Together, we are helping organizations align AI with high-impact business outcomes, industrialize data pipelines, and embed governance from the outset.

The ultimate goal for enterprises is to move beyond pilots and deploy AI and Generative AI systems that scale reliably and deliver measurable business outcomes, and through these partnerships we play a key role in that journey for Indian enterprises.

How can enterprises structure AI partnerships to ensure measurable ROI, operational resilience, and long-term scalability rather than experimental outcomes?

AI must be treated as a capital allocation decision, not an IT experiment; this is the aspect decision makers most often get wrong. Successful partnerships define “North Star” metrics before implementation: revenue lift, margin expansion, cost reduction, and risk mitigation.

I often see 25-40% reductions in manual effort, 30-50% faster decision cycles, and 10-30% lower downtime in predictive use cases when AI is deployed correctly.

Structurally, partnerships should anchor to board-level KPIs, adopt modular architectures for flexibility, and establish governance-first operating models. Shifting from technology-first to outcome-first thinking ensures AI becomes an enterprise capability that is scalable, resilient, and accountable.

How are CIOs redesigning their data infrastructure to ensure AI systems remain compliant, secure and auditable while still enabling innovation?

CIOs are doing this by embedding governance, security, and compliance directly into the data architecture rather than treating them as afterthoughts.

One important shift we are seeing is a “shift-left” approach to data governance, where quality checks, schema validation, and data contracts are enforced closer to the point where data is created. By validating structure, lineage, and access policies earlier in the data lifecycle, organisations can ensure that the data powering AI systems is trusted, compliant, and auditable from the start.
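As a minimal sketch of shift-left enforcement, assuming a schema registry and an illustrative “Order” contract: the producer serialises against a registered Avro schema, so malformed events are rejected before they ever reach a topic. The URLs, topic name, and schema below are assumptions for illustration.

```python
# Hedged sketch of "shift-left" governance: the producer serialises against
# a registered Avro schema, so events that violate the data contract fail
# at the point of creation. URLs, topic, and schema are illustrative.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

ORDER_SCHEMA = """
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount", "type": "double"}
  ]
}
"""

schema_registry = SchemaRegistryClient({"url": "http://localhost:8081"})
serializer = AvroSerializer(schema_registry, ORDER_SCHEMA)
producer = Producer({"bootstrap.servers": "localhost:9092"})

order = {"order_id": "o-123", "amount": 499.0}
# Serialisation fails here, before the event reaches the topic,
# if the record does not match the registered schema (the contract).
producer.produce(
    "orders",
    value=serializer(order, SerializationContext("orders", MessageField.VALUE)),
)
producer.flush()
```

Because the contract is enforced at produce time, every downstream consumer, including AI pipelines, inherits data that already matches the agreed schema.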

Compliance with regulations such as GDPR, DPDP Act, and CCPA is being architected into the infrastructure itself through capabilities like data minimization, consent management, and retention policies. At the same time, modern architectures like secure data lakes and federated environments allow teams to innovate without compromising control.

While evaluating AI partnerships, ensure they can provide transparency on training datasets, bias mitigation approaches, and retention policies. AI cannot operate as a black box. In a regulated world, security, explainability, and governance are competitive advantages. For responsible AI, look for guardrails, content filters, and human-in-the-loop workflows for high-stakes decisions. Most importantly, validate the “shared responsibility” model in writing: who owns what when something breaks.

How is real-time data streaming addressing the challenge of scaling AI across fragmented data infrastructure and unlocking measurable business outcomes?

Most AI challenges are fundamentally data challenges. For AI systems to deliver meaningful outcomes, they require continuous access to fresh, contextual, and trustworthy data.

However, in many organisations, data remains fragmented across legacy systems, cloud applications, and operational databases. As a result, AI models often struggle to access the timely and relevant information they need to operate effectively. The real barrier here is the underlying data infrastructure that delivers the data to them.

Real-time data streaming addresses this challenge by creating a continuous flow of data across the enterprise. Instead of relying on stale data from batch pipelines and disconnected systems, organisations can connect legacy platforms, cloud services, and operational databases into a unified data layer where information flows as events happen.

This allows enterprises to unlock value from existing systems without replacing them, while ensuring that AI applications always operate on the most current and reliable data.
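As a minimal sketch of the downstream view of such a unified layer, assuming illustrative topic names fed by CDC, a SaaS connector, and operational services: a single consumer can subscribe to all of them and treat them as one flow of events.

```python
# Hedged sketch: one consumer over topics fed by different sources
# (legacy CDC, a cloud app connector, operational services). Topic names
# and the broker address are illustrative.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "unified-layer-demo",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["legacy.sales.orders", "cloud.crm.contacts", "ops.inventory"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Regardless of origin, every event arrives through one interface.
        print(f"{msg.topic()}: {msg.value()}")
finally:
    consumer.close()
```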

During AI POCs, multiple customers have given AI agents access to enterprise data by letting the agents connect directly to databases and source systems using MCP. This works well for POCs, but it becomes a major risk and operational burden in production when hundreds or thousands of agents try to do the same.

On the other hand, some organizations are addressing this by letting AI agents access data far downstream, on analytical systems. This is usually slow, context is often lost through staged pipelines and data cleansing, and it lacks governance, so lineage becomes less trustworthy. Using Confluent’s mainframe offload and CDC approach, you are neither burdening the source databases nor waiting hours only to receive ungoverned data. You get real-time data on a scalable infrastructure, governed and ready for an AI agent’s use.
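As a minimal sketch of that pattern, assuming a hypothetical governed topic and agent hook: the agent consumes schema-checked events in real time instead of querying the source database directly.

```python
# Hedged sketch: an AI agent consumes governed CDC events from a topic,
# deserialised against the registered schema, instead of querying the
# source database. Topic, URLs, and the agent hook are illustrative.
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

registry = SchemaRegistryClient({"url": "http://localhost:8081"})
deserializer = AvroDeserializer(registry)  # resolves writer schema per message

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "agent-context-feed",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["legacy.sales.orders"])

def update_agent_context(record: dict) -> None:
    # Hypothetical hook: hand fresh, schema-validated state to the agent.
    print("agent context update:", record)

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        record = deserializer(
            msg.value(), SerializationContext(msg.topic(), MessageField.VALUE)
        )
        update_agent_context(record)
finally:
    consumer.close()
```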

What tangible impact are enterprises across sectors like BFSI, retail, manufacturing and digital platforms seeing from adopting real-time data architectures?

Enterprises across sectors such as BFSI, retail, manufacturing, telecom, GCC, and digital platforms are increasingly adopting real-time data architectures to move from delayed insights to event-driven, real-time decision-making.

For example, Swiggy uses Confluent’s data streaming platform to manage the complex flow of events, from order placement and restaurant confirmation to delivery partner allocation and real-time tracking. By streaming operational data across systems as events occur, Swiggy can coordinate logistics more efficiently, reduce delays, and deliver a more reliable real-time experience for customers.

Similarly, Meesho has built its real-time data infrastructure with Confluent to address the challenge of scaling its marketplace while supporting millions of buyers and sellers. By streaming data across its platform in real time, Meesho can react instantly to changes in inventory, orders, and customer activity, allowing the company to launch new features faster while maintaining platform stability during periods of rapid growth.

At the scale of digital media platforms, JioCinema leverages Kafka-based streaming architectures with Confluent technologies to handle extremely high volumes of user activity during live events and peak traffic periods. Real-time data streaming allows the platform to process millions of user interactions and system events continuously, helping maintain service reliability while delivering a smooth viewing experience at scale.

Across sectors, the impact of real-time data architectures is clear: faster operational decisions, improved platform reliability, and the ability to scale digital services without increasing infrastructure complexity.

Given India’s rapid digital transformation, how large is the opportunity for AI-driven, real-time data infrastructure in shaping the country’s next phase of digital growth?

India’s opportunity is massive because AI success ultimately depends on the quality, accessibility, and timeliness of data. As enterprises move from AI experimentation to real deployment, real-time data infrastructure becomes foundational. The fact that 95% of Indian IT leaders say data streaming simplifies AI adoption and 96% plan to increase investments in it shows that organizations already recognize this shift.

In the Indian market, where digital payments, e-commerce, telecom, and public digital infrastructure operate at massive scale, AI systems must process data continuously, decisions must be driven by real-time data, experiences must be personalized, and risk must be detected instantly. Real-time data platforms enable organizations to move from reactive analytics to live, operational intelligence, which is essential for AI-led innovation.

As India’s digital economy continues to expand, the ability to capture, govern, and act on streaming data will shape the next phase of growth by unlocking faster innovation cycles, better customer experiences, and entirely new AI-driven business models.

What are the key pillars of Confluent’s GTM strategy in India?

Our GTM strategy is simple: win more new customers across industries, because we feel real-time data is the key for new-age businesses, where customer experience depends on speed, precision, and hyper-personalization.

We are also committed to helping existing customers by expanding into more teams and workloads until Confluent becomes the data streaming “central nervous system” for the enterprise. And on the back of last year’s partner ecosystem investment, we will leverage a strong ecosystem of GSIs, RSIs, ISVs, and cloud partners in India, backed by global partner investments, to co-build AI and real-time solutions and run account-based GTM in priority sectors.
