Our enterprise AI strategy is built on model integration, hardware acceleration, and enterprise search: Abhas Ricky, Chief Strategy Officer, Cloudera

During an exclusive interaction, Abhas Ricky, Chief Strategy Officer, Cloudera, speaks about the evolving landscape of data management and the significant challenges and opportunities it presents. Ricky shares insights into how Cloudera is navigating the complexities of AI integration, multi-cloud optimisation, and sustainability, while also emphasising the importance of talent retention in a competitive market. As Cloudera continues to lead in enterprise AI and open-source innovation, its focus on reducing costs, enhancing data security, and fostering a robust, sustainable ecosystem positions the company as a key player in the future of data technology.

Given the rapid evolution of data technologies and increasing competition, what do you see as the biggest challenges and opportunities facing the data management industry today?

There are three core areas of data management to focus on here. First, enterprise AI has democratised the analytics and AI market. Previously, application developers needed data scientists to adapt models; now, LLMs can be applied directly to enterprise contexts, yielding high-fidelity outputs tailored to specific use cases. This makes data, particularly enterprise-specific data, more crucial than ever.

Second, cost remains a key factor in IT investments and project timelines. In the enterprise AI world, compute costs are skyrocketing, especially for GPUs and model training, with some customers spending millions per month. Thus, finding ways to reduce total cost of ownership (TCO) is both an opportunity and a challenge.

We are working with NVIDIA to drive hardware acceleration, allowing customers to run large workloads on GPUs and private clouds. We’re also launching an inferencing service to optimise performance.

Lastly, as data structures evolve, data sprawl has led to new architectural concepts like the Lakehouse. The ideal Lakehouse can read from any data store and write with any engine. Since 2020, Cloudera has invested in Iceberg to meet customer demands for flexible deployment, role-based access control, data federation, and cataloguing capabilities. We remain committed to supporting and contributing to the Iceberg community.
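
To make the engine-agnostic idea concrete, below is a minimal sketch of working with an Apache Iceberg table from PySpark. It assumes a Spark session already configured with the Iceberg runtime and a catalog named demo; the catalog, namespace, table, and column names are illustrative and not Cloudera-specific.

    from pyspark.sql import SparkSession

    # Assumes the Spark session is launched with the Iceberg runtime and a
    # catalog named "demo" configured (e.g. via spark.sql.catalog.demo settings).
    # All catalog, namespace, table, and column names here are illustrative.
    spark = SparkSession.builder.appName("iceberg-sketch").getOrCreate()

    spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.sales")

    # The Iceberg table format, not any single engine, owns the table metadata,
    # so any Iceberg-aware engine (Spark, Impala, Trino, Flink, ...) can read
    # and write the same table.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS demo.sales.orders (
            order_id BIGINT,
            region   STRING,
            amount   DOUBLE
        ) USING iceberg
    """)

    spark.sql("INSERT INTO demo.sales.orders VALUES (1, 'APAC', 120.50)")
    spark.sql(
        "SELECT region, SUM(amount) AS total FROM demo.sales.orders GROUP BY region"
    ).show()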

Given the rapid shift to cloud, how is Cloudera ensuring its platform is optimised for a multi-cloud environment while maintaining data sovereignty and security?

Cloudera is one of the first vendors to enable bidirectional movement of data, metadata, users, and applications between public and private clouds without needing costly application refactoring, which can range from $3 million to $5 million per application. This capability is especially beneficial for large enterprises with numerous applications, reducing the high costs associated with moving workloads.

Our goal is to help customers perform analytics and AI at the point of data origination and residency, bringing models to the data rather than the other way around. This requires a robust public and private cloud platform, which Cloudera provides across Amazon Web Services, Microsoft Azure, and Google Cloud. This flexibility prevents vendor lock-in, allowing customers to move workloads seamlessly between clouds and efficiently use their prepaid credits with large hyperscalers.

As cloud adoption grows, it’s essential for customers and partners to control their data, particularly in regulated industries. In India, for instance, we’re working with government stakeholders, including Digital India and the Ministry of IT, to meet sovereign cloud requirements. Cloudera’s solutions address data residency, sovereignty, and regulatory compliance, helping customers navigate both current and future regulations. 

The Indian Government, especially the public sector, is a significant customer, and we work across various departments to support compliance and cloud adoption in this hybrid, multi-cloud world.

How is Cloudera embedding AI and machine learning capabilities into its core platform to deliver more intelligent and automated data insights?

Cloudera’s strategy focuses on two main areas: helping customers build AI applications with the Cloudera Data Platform (CDP) and integrating AI into CDP itself. I believe your question is about the latter, so let’s focus on that.

Our enterprise AI strategy is built on three pillars:

  1. Model integration: We ensure customers can use any AI model—whether open-source or closed-source—from providers like Cohere or Anthropic with just one click via Cloudera Machine Learning and our pre-built connectors.
  2. Hardware acceleration: We’re partnering with companies like NVIDIA to make AI more efficient, reducing total cost of ownership (TCO) and ensuring competitive performance.
  3. Enterprise search: We enable enterprise and semantic search queries through partnerships with Pinecone for public cloud and other providers for private cloud, while also expanding our capabilities (a minimal illustration of the idea follows this list).
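
For readers less familiar with semantic search, here is a small, self-contained sketch of the core mechanic that a vector store such as Pinecone manages at enterprise scale: text is embedded as vectors and queries are matched by similarity. The toy hashing embedder and sample documents below are purely illustrative and are not Cloudera's or Pinecone's APIs.

    import numpy as np

    # Toy embedding: hash tokens into a fixed-size vector. A real deployment
    # would use a trained embedding model and a vector store such as Pinecone;
    # this only shows the mechanics of vector (semantic) search.
    def embed(text: str, dim: int = 64) -> np.ndarray:
        vec = np.zeros(dim)
        for token in text.lower().split():
            vec[hash(token) % dim] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    documents = [
        "quarterly revenue report for the retail division",
        "GPU cluster utilisation and model training costs",
        "customer churn analysis for the insurance business",
    ]
    doc_vectors = np.stack([embed(d) for d in documents])

    query = "how much are we spending on model training hardware?"
    scores = doc_vectors @ embed(query)  # cosine similarity (unit-length vectors)
    best = int(np.argmax(scores))
    print(f"Top match: {documents[best]!r} (score={scores[best]:.2f})")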

To support these pillars, we’re building data products that are accessible across platforms. Specifically in AI, this means both “building AI with Cloudera” and “running AI in Cloudera”: creating and running powerful AI applications with integrated assistants such as the SQL AI Assistant, AI chatbots, and the Cloudera Machine Learning co-pilot.

Our aim is to simplify one-click deployment of AI models, reducing the time to value by providing high-quality examples and leveraging cutting-edge ML techniques.

On the “running AI in Cloudera” side, we offer a Machine Learning Workspace where you can develop AI solutions end to end: data science, model training, fine-tuning, prompt engineering, packaging, deployment, and serving, helping you build AI applications faster.
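
As an illustration of the train, package, and serve loop such a workspace supports, here is a generic sketch using scikit-learn and FastAPI. The model, artefact name, module name, and endpoint are hypothetical and not tied to Cloudera Machine Learning's own APIs.

    import joblib
    from fastapi import FastAPI
    from pydantic import BaseModel
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    # 1. Train and package: fit a small model and persist it as an artefact
    #    (artefact name is illustrative).
    X, y = load_iris(return_X_y=True)
    joblib.dump(LogisticRegression(max_iter=1000).fit(X, y), "model.joblib")

    # 2. Serve: expose the packaged model behind a prediction endpoint
    #    (run with: uvicorn this_module:app --port 8080, module name hypothetical).
    app = FastAPI()
    model = joblib.load("model.joblib")

    class Features(BaseModel):
        values: list[float]  # the four iris measurements

    @app.post("/predict")
    def predict(features: Features) -> dict:
        label = int(model.predict([features.values])[0])
        return {"prediction": label}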

Given the fierce competition for data talent, what is Cloudera’s strategy for attracting, developing, and retaining top talent?

There are three key points to highlight. First, as the largest open-source company, we lead innovation in areas like cybersecurity and enterprise AI, where open-source development is often on par with, or ahead of, closed-source alternatives. For next-gen engineers, working with the latest open-source technologies is essential, and we provide those opportunities. We also offer the chance to work with the world’s largest enterprises, including the top 10 banks, manufacturers, and insurers globally, which is a rare and valuable opportunity.

Second, we ensure a well-balanced and competitive compensation structure that includes financial competitiveness, comprehensive benefits, and a focus on work-life balance. For example, our “Think-All-as-Unplug” initiative gives employees 22 additional days off each year to disconnect from daily work, which supports personal well-being and familial commitments.

Third, through our partnerships, hackathons, and advanced technological initiatives, we offer access to a full end-to-end tech stack and the opportunity to work with top-tier partners. This allows our employees to make a real impact, which is a rare chance in the industry.

As a private company, we are committed to creating enterprise value for our stakeholders—employees, shareholders, and partners. We are a well-balanced growth company with a strong P&L structure and high profitability, making us unique in our ability to generate value.
