The Data Problem: Building Infrastructure for the World’s Most Valuable Enterprise Asset - Six Five On The Road
Vlad Rozanovich of Lenovo joins Patrick Moorhead and Daniel Newman on Six Five On The Road to discuss why enterprise data remains underutilized and what it takes to operationalize data and AI at scale.
Data isn’t scarce, but turning it into something usable, timely, and actionable at scale is still where most enterprises fall short.
From Lenovo Tech World in Las Vegas, Patrick Moorhead and Daniel Newman are joined by Vlad Rozanovich, SVP, ISG Sales, ISO at Lenovo, to focus on why enterprise data continues to fall short of its potential despite widespread AI investment. They break down where AI ambition is running ahead of infrastructure readiness and why legacy data architectures are struggling to keep pace.
As data stretches across cloud, core, and edge environments, the idea that a single platform can handle every workload is breaking down, causing organizations to make tradeoffs as they move compute closer to where data is created, balancing responsiveness with governance, security, and cost. The message is clear: unlocking data value requires platforms built for distributed execution, not just centralized analytics.
Key Takeaways Include:
🔷 Enterprise data remains underutilized: Most organizations collect massive amounts of data but lack the infrastructure to activate it effectively.
🔷 AI ambition often outpaces readiness: Infrastructure gaps, not algorithms, are slowing progress from pilots to production.
🔷 Single-environment strategies no longer scale: Data and AI workloads increasingly demand architectures that span cloud, core, and edge.
🔷 Execution determines advantage: Organizations that modernize data platforms early are better positioned to turn AI into sustained value.
Watch the full video at sixfivemedia.com, and be sure to subscribe to our YouTube channel so you never miss an episode.
Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Patrick Moorhead:
The Six Five is On The Road here at Lenovo Tech World in Las Vegas. Daniel, what an incredible event at the Sphere. Lenovo CEO, YY, kicked it off. Just a ton of luminaries, a ton of new products announced, a lot of initiatives. It was pretty awesome.
Daniel Newman:
It was a great event, Pat. It was really good to be there, to take it all in at one time, with so many of the key thinkers of our industry. I like the word luminary. That's good, but you already used it. All there, sort of swapping stories and talking about what the future is going to look like. This is a big year for AI.
Patrick Moorhead:
No, it's huge. And whether it's literally from pocket to cloud and everything in between, AI is definitely, and you and I agree on this, not a bubble. I mean, we haven't even seen true deployment yet aside from certain chatbots and certain agents.
Daniel Newman:
It's barely, barely being used, especially in the data in the enterprise.
Patrick Moorhead:
It's crazy to think, Daniel, you and I were among the few that were invited to Redmond to see this guy that really nobody had heard of called Sam Altman. And Satya was there, and they announced it. And it's funny, I don't know if you had this, but I was sitting there thinking, wow, this is great for recipes, but how am I going to do this for enterprise? Data is going to be the biggest issue that's out there. And every year, in every research we do, it shows up. I was in London and did an advisory board for a vendor, and sure enough, data was still number one or number two four years later. So data management, and then, complexified, is that a word? It is now. It's now across hybrid AI, and regardless of where it sits, it's going to be a challenge. And here we have Vlad from Lenovo to chat about data. Vlad, how are you doing?
Vlad Rozanovich:
I'm doing great. It's great to see both of you.
Patrick Moorhead:
We were at Supercomputing together, rocking it here. And here we are once again doing what we do, go to events, doing videos, and having a good time.
Vlad Rozanovich:
Well, thanks. And thanks for coming last night. I mean, I think, like you guys said, it was really kind of a visionary event that YY and all of the industry leaders really talked about where the future of AI is heading and where Lenovo's aligning to those types of visions.
Daniel Newman:
Was there a part that just blew your mind? The Gwen Stefani concert? Yeah.
Vlad Rozanovich:
That was good. I was trying to figure out how AI fits into that, but that was actually just a good close.
Patrick Moorhead:
We've got to AI-wash everything, including music.
Daniel Newman:
The thing was, she wasn't actually there.
Patrick Moorhead:
She was.
Daniel Newman:
No, it was a hologram AI bot of Gwen Stefani, and it's so lifelike and real.
Vlad Rozanovich:
It even fooled you. And that's how advanced Lenovo is today.
Daniel Newman:
Like Madame Tussauds Wax Museum, you didn't know if it was real or not. Well, listen, we started talking a little bit about data, and I think, Pat, before even AI, every company's been talking about this data thing for like two decades, the big data era.
Patrick Moorhead:
Garbage in, garbage out, back in the 80s.
Daniel Newman:
And AI really only made this problem larger, right? More data being created more quickly that now needs more compute. Networks are more complex, and it all needs more power in order to do anything. And by the way, the board pressure, top down, is even bigger than before. So, you know, 95% of the world's data is behind the firewall. Only 1% or 2% of enterprise data has even touched AI yet. So those are kind of the overarching statistics, Vlad, but why has this gap been so hard to fill? Everyone knows their data is what's going to make AI useful, but we keep saying this, and years go by, and nothing changes.
Vlad Rozanovich:
That's right. And I think part of this is that data has been the crown jewels of every enterprise for the last 30 years. Where your data resides, the security of that data, the privacy of that data, the way that data pipeline has been filled and utilized by application sets over the last 30 years has not changed much. You look at block, you look at file, you look at what's secure, what can be accessed, what cannot be accessed. And now you have this complete new architecture with AI of what do you want to start analyzing, doing, and really creating a pipeline for, where you're using data pipelines and models that today are completely unstructured, that may be at the edge, they may be in the cloud, they may be in a private cloud. And so you're seeing this hybrid architecture between the three, where the traditional data models and the traditional data pipelines don't comprehend that today. And what I think you see is that current architectures and traditional IT server and storage deployments are not working, because the data, as big as it is, as unstructured as it is, as quickly as you need to access and analyze it, those systems were not designed 15 years ago to take advantage of what AI looks like today.
Patrick Moorhead:
So with that said, where are they underestimating what it takes to pull this together? Because we keep, you know, it's funny, we talk about data lakes, we talk about a data lake house, we talk about, you know, your point on unstructured data is huge, right? We kind of figured out structured data, you know, columnar, you know, X, Y axis, figured that out. But why are enterprises underestimating it, or are they not underestimating it, and it just takes time to, let's say, put in a holistic data fabric that touches everything?
Vlad Rozanovich:
I think today, with AI, what is still not defined yet at the enterprise level is what is the business outcome that most CXOs are looking for? And if you don't know what your outcome is, you have no idea what data lake you need. And so what I see happening is there's a couple trade-offs that are happening right now in the market that every CIO I talk to, they talk about cost versus complexity. They talk about governance versus frameworks. They talk about how do I look at software-defined versus traditional architectures. And today, because you don't have that outcome, you don't really know what that pipeline to fill that outcome is really going to look like. And then you start thinking about, OK, well, if speed's important, is it cost is no object? Or if you really want it to be flexible and remote at the edge, now you have to start thinking about what's your network architecture going to look like to actually frame this entire environment?
Patrick Moorhead:
Yeah, it's interesting. One thing that pops up that is actually surprising to me, because it was obvious, is this: it's one thing to have, like, ERP or SCM data and be able to nail that and the applications around that. But when you start applying unstructured data that's somewhere else, and, let's say, front-end data like CRM that's completely different, and by the way, those things have been solved through data lakes and data lakehouses, but when you're coming up with a query or an agent that's hitting 10 different data sources, that is fundamentally a new thing, especially to do it at scale and make it predictable. And that was one of the unexpected things that I'm hearing in my conversations. I don't know if you're hearing the same thing.
Vlad Rozanovich:
It absolutely is. Every CIO I talk to, every VP of IT, even here at our Tech World event, CES, I've probably had 20 meetings just leading up into today. And one of the conversations I had was with a large oil and gas company yesterday. And they said, Vlad, hey, we've tried some of these AI models to kind of play around with in a sandbox environment. And then when we actually went to something at scale, we found out that, boy, we were just completely underestimating the complexity of this hybrid environment of edge, cloud, on-prem. Because even if you try a POC, you don't get your efficiencies until you get to scale. And with AI, you have so much data, you don't know what your scale is. And so there's a lot of complexity here, and I think from a Lenovo perspective on how we look at the market, hey, where can we help? First of all, you can't just utilize your existing infrastructure that you have today. You can't use your existing networking architecture that you have today. Compute, storage, network, security, permissions, it's all going to change. It's all going to change, and you need to look at what this new hybrid AI environment means for where your data sits, where it resides, where you act upon it, where you analyze it, how you protect it. That's where this new thinking has to really start with CIOs.
Daniel Newman:
So these are the foundational shifts. You kind of actually read my mind of what I was going to ask you about. But you've got these different environments. Data is being created at rapid breakneck pace in data center, core enterprise IT, and of course, at the edge. All these things are happening at once. What are the shifts and sort of things that you're recommending to these customers to be thinking about to make sure their data-driven initiatives can go to scale?
Vlad Rozanovich:
Yeah. Well, Dan, the first thing is it has to be a data-driven initiative. That's the key word. Because it can't be a compute initiative. It can't be a security initiative. It can't be a performance initiative. The thing that we have to figure out is what that data outcome is. And the way we look at it, it is this hybrid environment. If you need real-time inference at the edge, well, you have to figure out what your edge architecture is going to look like for that data. But if you're going to need burst capacity, where it's so much data that you can't do it at the edge, because you either don't have the power going to your edge data centers, or you don't have the compute from a GPU scale perspective, and you've got to burst back into the cloud, well, what does that architecture look like? What does that pipeline of data look like? And what are you going to try to actually have happen in a burst capability versus an edge capability? And so for me, that whole data architecture starts with: what are you trying to accomplish from a business standpoint? And then, what are your budgets? What does your architecture look like? What does your data scientist engagement look like? There are so many different combinations of this, and it's complex. Today, when we think of AI, it's been large language models, frontier models, big burst neocloud capabilities. Now we're starting to talk about enterprise. And now that we're starting to talk about enterprise, it is going to be this hybrid environment.
Patrick Moorhead:
Yeah, the true value is what the enterprise has. And I don't think it took us long to realize that. What took us a long time was figuring out how to bridge the gap between all of this hyperscaler and neocloud excitement and how you blend that with true enterprise outcomes. And so I want you to get really practical here. What are some of the trade-offs? You can't have everything, right? I want a pony, but I may not get it. I might be in the Wizard of Oz and I want to go home, but I may not be able to get back. But it's like, how do I have this consistent governance, performance, security? What are the trade-offs? What do I have to give up?
Vlad Rozanovich:
What do I have to think about? Well, there are so many variables to this, and there are a lot of those trade-offs. You know, for me, governance, security, data sovereignty, every CIO has to think about, how do I protect my data? Because, you know, it's just like 15 years ago. That's a non-negotiable. That's a non-negotiable, that's right. That's not a trade-off. That's like baseline. That's right. But what I see happening is people are negotiating right now, saying, hey, if I need my budget to go re-architect my network, storage, and compute solution for AI, what can I cut? And so people need to remember there are some non-negotiables. Because once your crown jewel of data gets compromised, you're in a really, really bad position for the business. So from my perspective and from a Lenovo perspective, that data sovereignty and security aspect is something that you can't trade off. Now, when you start looking at things like performance versus cost, this is where, okay, there could be a situation where too much data analysis actually becomes a detriment because you're paying for things. It's too expensive. That's right. And so how do you make sure you understand, okay, what are the parameter sizes of the variables that I'm looking at? And how do I make sure I design to it the right way? Do I go an RTX 6000 route, or do I have to go down a GB300 route? And those are some of the things that Lenovo can come in and talk about. For what you're trying to do, there's been this FOMO for CIOs over the last two years saying, I need to buy, you know, the GPU with the biggest capacity. And what you're figuring out is you're just wasting power. You're wasting energy. You're not getting done what you need to get done. And so really frame it down to what your application size is, find where you hit a point of diminishing returns, and then act on that cost profile based off of the return you want to see there.
Daniel Newman:
Sounds like something that you could probably solve with a combination of quantum and HPC. I'm not joking. All the optimizations, that's what quantum is good at. So anyways, probably in 10 years, by the way. So let's do kind of the speed-round finisher here. The companies that are going to succeed with data and the companies that aren't: what's going to separate them?
Vlad Rozanovich:
Yeah, I think companies that are going to succeed with data are the ones that know what they're trying to accomplish as a business outcome. If you know what your business outcome is and how you are going to use that as a strategic differentiator for the value of your corporation, those are the ones that are going to win. The companies that are not going to win are the ones that are going to say, or who are being told by their board or their CEO, hey, you need to show that we're an AI company, and they have no idea where to go.
Patrick Moorhead:
Yeah, the same companies that said, hey, I want to buy 12 AIs, right?
Vlad Rozanovich:
Now they want 13. Yeah, exactly. Yeah. And we've seen it. I've seen it from CIOs from some of the largest corporations in the world. Three years ago, they were talking about, I'm going to build my own LLM. Well, guess what? Many of those CIOs shifted very quickly when they realized what the complexity and the cost and the people resource of that was really going to require.
Daniel Newman:
And by the way, that's one of those trade-offs. What can I open source?
Vlad Rozanovich:
That's right.
Daniel Newman:
When can I do it more efficiently in the cloud? When do I need to own the infrastructure? That just goes to that. So Vlad, I want to thank you so much for joining us here. It's always fun to have these conversations. Same. Great to see you guys. Good to see you, too. Thanks so much. Thanks. Thank you, everybody, for being part of this Six Five. We are on the road here in Las Vegas, Nevada at Lenovo Tech World 2026. Check out all our other content, all other episodes here at Lenovo Tech World, and of course, all the great content on the Six Five. But for this episode, it's time to say goodbye. We'll see y'all later.