
Scaling AI in the Enterprise – 3 Key Focus Areas To Drive Success - Six Five In The Booth

Dan Waibel, Global Chief Data Officer at HPE, joins hosts David Nicholson and Keith Townsend to share insights on scaling AI in the enterprise, focusing on key areas including enterprise AI program design, agentic AI, and the critical role of Enterprise Knowledge Management (EKM).

Within the intricate enterprise landscape, scaling AI effectively demands more than a one-size-fits-all approach.

HPE experts offer proven guidance and definitive strategies for navigating the core challenges of designing and deploying enterprise AI programs that truly deliver. Join hosts David Nicholson and Keith Townsend for a grounded conversation with Dan Waibel, Global Chief Data Officer at Hewlett Packard Enterprise (HPE), who shares a roadmap for successfully scaling enterprise AI programs.

Key takeaways include:

🔹Architecting Enterprise AI Programs: Dan Waibel shares the essential components and strategic blueprints required to design and successfully scale enterprise-wide AI initiatives from concept to widespread deployment.

🔹Mastering Agentic AI Adoption: A focused look at Agentic AI, identifying the critical factors enterprises must strategically address to move beyond experimentation and achieve effective, large-scale implementation.

🔹The Cornerstone of Enterprise Knowledge: The paramount importance of Enterprise Knowledge Management (EKM) and Unstructured Data Management, with a compelling projection of EKM as an indispensable foundation for future enterprises.

🔹Hybrid Architecture for Foundational AI: Insights into the strategic advantages of a hybrid architecture, seamlessly combining cloud and on-premises solutions, to establish robust foundational platform capabilities essential for supporting scalable AI workloads.

Learn more at HPE.

Watch the full video at Six Five Media, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Or listen to the audio here:

Disclaimer: Six Five In The Booth is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript

David Nicholson: Welcome to HPE Discover Las Vegas 2025.

Keith Townsend: We're Six Five in the booth. We have Dan Waibel, Global Chief Data Officer with HPE. I love these in-the-booth sessions, Dave. When I was a customer, this is where I came and got free advice. These folks have seen it all, they've done it all, and they're prepared to give not just free consulting, but a way forward.

David Nicholson: Let's start with a million dollars' worth of free consulting right now. Dan, let's talk about this idea of an AI program. What is it? How do you scale it within an enterprise? And give us an example of how you're doing this within HPE.

Dan Waibel: Awesome. Thank you. First off, thanks for having me, guys. You know, it's fantastic to be here in Vegas at Discover. We just came out of the Sphere after Antonio's keynote, 14,000 people. The energy is absolutely palpable here. If we talk about an enterprise AI program, I think we need to break it down into the "what," which is the strategy, and the "how": how do you execute the program? When we're thinking about the strategy, we want to connect our AI strategy with our business strategy. We want to make it business-driven, both on the customer-facing ROI side (what do those opportunities look like?) and on the efficiency side: how do we automate business processes and start to simplify and enable some of these new AI capabilities? So I think that's the "what." On the "how" side, we tried to simplify it into three components. One is your platforms and your foundation. Two is your governance program; that's what allows the business to interface with it. And the third area is use cases: what are the business use cases that you're going to drive? Each one of those segments of your AI program we can unpack a little bit more, but we try to take a simplified view of how you build an AI program, and that's what we've done here at HPE over the last several years.

Keith Townsend: So, Dan, last year around this time, everyone was talking about chatbots as the primary use case for AI. We've all matured beyond that to agentic AI. Can you talk to us about the rocks? You talked about these three pillars, but what are some of the rocks that large enterprises and customers in general are going to run into as they look to scale their agentic AI efforts?

Dan Waibel: Yeah, there's a number of challenges, and I'll be very honest, we're all dealing with these challenges together. I think the longest pole in the tent is going to be the data: providing context to the models. The models are already incredibly powerful. What is it? Every six or seven months, their capabilities are doubling. Moore's Law is two years; we're moving much more quickly than that. But I think context is going to be the most important thing for the model. That's both on the structured data side and on the unstructured data side. Many organizations have spent time and invested in structured data. It's unstructured data where I think we need to focus a little bit more and think about how we manage all of our knowledge. How do we manage thousands, tens of thousands, in our case millions of documents, and gain that context so that our AI systems can be much more intelligent and informed and have that knowledge and context going forward? So I think that's one of the big challenges we have, one of the big rocks, with agentic. Another one is connectivity around APIs and interfacing with different legacy systems. As a large enterprise, we have an enormous legacy ecosystem of applications and platforms across the company. And we've now implemented a formal program around MCP, the Model Context Protocol, which of course allows us to interface with these platforms through an agentic methodology. And then what we're able to do is take the context and leverage that context to drive activity. So we're moving from chatbots, like you said, to actual execution. And that leads us to another area of challenge, which is risk. How do we manage risk in this world? That's where your AI governance program becomes extremely important. We think about risk both in terms of regulatory risk and ethical risk. On the regulatory side, we've had structured data regulations in place for a long time, but now we have a new set of AI regulations, and they're fragmented across geographies. So it's very, very important to be able to understand, end to end, how your AI platform and your AI ecosystem are running, because ultimately all of that is going to be regulated. You're going to have to be able to explain how things happen and why things happen. You have to understand what data is feeding these models. And the regulations could be different depending on what geography you're operating in, so there are sovereignty concerns. There's a lot of complexity in all of this, and we're learning as we go. But I do think governance is critically important and a big rock for us.
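[Editor's note: as an aside for readers unfamiliar with the pattern Waibel describes, below is a minimal sketch of exposing a legacy system's operation as a named tool that an agent can discover and invoke with structured arguments. It is illustrative only: the registry, the tool name, and the get_order_status function are hypothetical, and this is neither HPE's implementation nor the official MCP SDK.]

```python
# Hypothetical sketch of the "agent calls into legacy systems" pattern.
# Not HPE's implementation and not the official Model Context Protocol SDK.
from dataclasses import dataclass
from typing import Any, Callable, Dict


@dataclass
class Tool:
    name: str
    description: str
    func: Callable[..., Any]


class ToolRegistry:
    """Maps tool names to callables so an agent can act on legacy systems."""

    def __init__(self) -> None:
        self._tools: Dict[str, Tool] = {}

    def register(self, name: str, description: str):
        def wrapper(func: Callable[..., Any]) -> Callable[..., Any]:
            self._tools[name] = Tool(name, description, func)
            return func
        return wrapper

    def list_tools(self) -> Dict[str, str]:
        # What an agent would inspect when deciding which tool to call.
        return {t.name: t.description for t in self._tools.values()}

    def call(self, name: str, **kwargs: Any) -> Any:
        # Structured invocation; a real MCP server adds schemas, auth, audit logs.
        return self._tools[name].func(**kwargs)


registry = ToolRegistry()


@registry.register("get_order_status", "Look up an order in a legacy ERP system")
def get_order_status(order_id: str) -> dict:
    # Placeholder for a real call into a legacy API or database.
    return {"order_id": order_id, "status": "shipped"}


if __name__ == "__main__":
    print(registry.list_tools())
    print(registry.call("get_order_status", order_id="A-1001"))
```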

Keith Townsend: Dave, I asked him for rocks, he gave me boulders.

David Nicholson: Yeah, he did well, yeah. And among those boulders, the idea that data is important isn't new. But for a long time we talked about enterprise content management, and content carries with it a connotation of sort of generic stuff. HPE would argue that we've moved into the age of EKM, Enterprise Knowledge Management. What does that mean?

Dan Waibel: So EKM, I think, is maybe one of the unsung challenges we have in front of us in the AI space, because of all these documents, and by the way, they're multimodal, right? We have video, we have audio, we have presentations, PDFs, all these different documents out there. And it's not just about feeding the context from those documents into the AI models. It's actually about managing the full lifecycle of knowledge at the enterprise level. That's how we're going to build this entire cohort of knowledge workers that we're going to start overseeing with agentic as we move forward. So it's all connected. And I do think that enterprise knowledge management is an area companies are going to need to focus on, spending much more time learning how to manage that lifecycle of knowledge as we transition into this AI era.

David Nicholson: You know, I want to double-click on that really quickly, because when you think about managing that knowledge, are you training these models with that knowledge, with those pieces of knowledge? Is this fine-tuning, or is it retrieval, or is it both?

Dan Waibel: It really can be both. I think it depends on the use case. I will say, though, I think what we're going to see as we move forward is more advancements in RAG capabilities. We offer a product called PCAI, Private Cloud AI. You guys may be familiar with that. It comes prepackaged with a number of accelerators. One of them is called RAG Essentials, which is fantastic: it accelerates your ability to augment a model with context from your organization, dynamically, out of the box. So that's pretty incredible. I think knowledge graphs are also going to be important for certain use cases, as will fine-tuning. So I think it's going to be a mix of methodologies. But what I've seen in the last three to six months is that RAG is really the lightest-weight and quickest way to ground your models in your context and your knowledge within your organization. It's probably also the more affordable way to do this; building and training models from scratch is extremely expensive. And with the advancements in RAG, all of that is coming together. So I think RAG is going to be the operating system of agentic in the future.
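[Editor's note: for readers new to RAG, here is a minimal sketch of the retrieval loop Waibel refers to: find the document chunks most similar to a question and inject them into the prompt as context, with no retraining of the model. It is a toy illustration, not HPE's RAG Essentials; the sample documents, the bag-of-words "embedding," and all function names are hypothetical stand-ins for a real embedding model and vector store.]

```python
# Generic retrieval-augmented generation (RAG) sketch. Illustrative only;
# a production pipeline would use a real embedding model and vector database
# instead of the toy bag-of-words similarity below.
import math
from collections import Counter

DOCUMENTS = [
    "PCAI ships with prepackaged accelerators such as RAG Essentials.",
    "Chat HPE enforces custom guardrails based on internal HPE policies.",
    "Over 80% of enterprise data lives on premises, much of it unstructured.",
]


def embed(text: str) -> Counter:
    # Toy 'embedding': a bag-of-words term-frequency vector.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(question: str, k: int = 2) -> list:
    # Rank documents by similarity to the question and keep the top k.
    q = embed(question)
    ranked = sorted(DOCUMENTS, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


def build_prompt(question: str) -> str:
    # The model never needs retraining: fresh context is injected per query.
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    print(build_prompt("What accelerators come with PCAI?"))
```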

Keith Townsend: So I don't think you could have come to HPE Discover in the past 10 years and not heard the term hybrid cloud. For the first time I can remember, I'm actually seeing hyperscale cloud providers as sponsors on the show floor. Talk to me about the relationship between this AI capability you're talking about, this data pipeline, and the benefits to customers of taking a hybrid cloud approach.

Dan Waibel: It's a great question, and I truly believe that hybrid is the future. And I think it's shifting, because the reality is over 80% of enterprise data is on-prem. Much of that is unstructured data: documents, videos, et cetera. And in reality, for these models to be smart enough, especially in the agentic world where we want them to become more autonomous, we need to provide excellent context to these models. Is it realistic for us to take all of our data, in our case millions of files and documents, and move all those documents to the cloud? Or is it going to be more hybrid, where in some scenarios we may want to move AI to the data, which logically makes more sense? I think it's lighter weight, it's much more cost-effective, and at the end of the day you can still take advantage of cloud for certain workloads, especially for structured data and traditional ML use cases. But in the future, as we really start to lean into the agentic era, it's going to become more and more hybrid, with more on-prem use cases driving a much greater level of transformation than the chatbots that have become prevalent over the last year and a half.

David Nicholson: And you mentioned chatbots. There is such a thing as Chat HPE, right? What's that all about?

Dan Waibel: Yes. So Chat HPE is our internal platform that we've designed and built. We originally built this platform to run on Azure OpenAI. We started building this almost two and a half years ago, so very early in this new gen AI era, let's call it. As you might guess from the name, the platform has a chat interface similar to ChatGPT. But the big difference is that all of the data, the prompts, all the embeddings, and all the documents are completely secure in a separate environment. We've essentially eliminated that risk from the equation. The other thing that's important to understand about Chat HPE is that we built custom guardrails based on HPE policies. So now we can manage the risks in a very fine-tuned way that is specific to HPE. And we've taken this platform and we're now running dozens, actually hundreds, of different use cases on Chat HPE across the company. I mean, we're talking about HR, supply chain operations, finance, et cetera. And we're scaling this up: we originally started with a group of 100 users, went to 1,000, 5,000, 10,000; we now have 15,000 and are about to unlock 60,000 simultaneous users on Chat HPE, which is incredible. We're moving the platform to run on PCAI for a number of different reasons. One of them is efficiency and cost management. Another is security around documents we may not want to move to the cloud. And another reason we are moving to PCAI is that we have much more control of the end-to-end experience versus when we originally built the prototype in the cloud.

David Nicholson: When you say PCs, meaning an entire model would live on somebody's laptop, or be served up from an HPE server, or both? What does that look like?

Dan Waibel: Private Cloud AI. This is...

David Nicholson: Oh, I'm sorry, private cloud. I was thinking about personal computer AI. So no, yeah, good to differentiate.

Dan Waibel: Yes. Let me just clarify. Private Cloud AI, PCAI, is an on-prem solution: full-stack hardware and software, integrated and delivered by HPE together with NVIDIA, along with AI Essentials and a number of accelerators that come prepackaged. So think of it almost like a cloud that runs on-prem in your own private data center. You don't have to worry about data security, and you don't have to worry about cost fluctuations, because this is an upfront investment. And the interesting part with PCAI is that the more you use it, the better the economics are at scale. It's actually the inverse of the cloud, where the more you're inferencing, the more your costs ramp. Here it's a different cost structure. So if you think about what's going to happen with agentic: we went through a forecasting process, and we expect interactions on Chat HPE, our internal platform, to grow 10x in the next six months, 100x in the next 18 months, and 1,000 to 10,000x in the next three to five years. Think about that scaling, and think about the cost to run that environment in the cloud versus an upfront capital investment, where now we can encourage more use cases and more innovation because it's actually cost-effective to do so. In the cloud, eventually you start thinking about token usage and managing those costs. So it's really a different way to think about it. And that's why I mentioned earlier that I think hybrid is the way of the future: we're not going to move millions of documents and files into the cloud, because at some point it's cost prohibitive. Why not take AI and move it to the data, as opposed to moving data to AI?
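[Editor's note: to make the economics argument concrete, here is a back-of-the-envelope sketch comparing per-interaction cloud cost against a largely fixed on-prem investment under the growth multiples Waibel cites. Every dollar figure is a hypothetical placeholder, not HPE or cloud-provider pricing.]

```python
# Back-of-the-envelope comparison: cloud inference cost scales with usage,
# while an on-prem investment is largely fixed. All figures are hypothetical
# placeholders, not HPE or cloud-provider pricing. A real model would also
# scale on-prem capacity with volume; this simply shows why per-use pricing
# dominates at very high interaction counts.
BASELINE_INTERACTIONS_PER_MONTH = 1_000_000   # hypothetical starting volume
CLOUD_COST_PER_INTERACTION = 0.02             # hypothetical $/interaction
ONPREM_UPFRONT = 2_000_000                    # hypothetical capex
ONPREM_OPEX_PER_MONTH = 50_000                # hypothetical power/ops cost
MONTHS = 36

for growth in (1, 10, 100, 1_000):
    monthly = BASELINE_INTERACTIONS_PER_MONTH * growth
    cloud_total = monthly * CLOUD_COST_PER_INTERACTION * MONTHS
    onprem_total = ONPREM_UPFRONT + ONPREM_OPEX_PER_MONTH * MONTHS
    cheaper = "on-prem" if onprem_total < cloud_total else "cloud"
    print(f"{growth:>5}x growth: cloud ${cloud_total:,.0f} "
          f"vs on-prem ${onprem_total:,.0f} over {MONTHS} months -> {cheaper}")
```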

Keith Townsend: So Dave, it sounds like it's a lot easier to move AI than it is to move the boulder. That's your data. You see what I did there?

David Nicholson: I do, but all I heard frankly was Keith Townsend was right about Hybrid and he's always been right about Hybrid. So that's what I heard.

Keith Townsend: For our guest Dan Waibel and my co-host Dave Nicholson, this was an "In the Booth" jam session: two technologists talking to another technologist about the problems faced in the enterprise. Stay tuned for more Six Five coverage. We're all over HPE Discover, helping you understand how to properly invest in your AI stack to get the business returns you're looking to receive.

David Nicholson: In Las Vegas, 2025. 

