The New Backbone of Enterprise IT: Why Is Hybrid AI the Next Compute Era? - Six Five On The Road
Flynn Maloy of Lenovo joins Patrick Moorhead and Daniel Newman to discuss why hybrid AI has become the practical foundation for enterprise AI, and how data location, infrastructure design, and orchestration shape what can scale.
The AI era isn’t being built in one place, and enterprises are finally accepting that reality.
From Lenovo Tech World in Las Vegas, Patrick Moorhead and Daniel Newman are with Flynn Maloy, Chief Marketing Officer for ISG at Lenovo, to examine why hybrid AI has become the operating model enterprises are embracing in practice. Their discussion moves past cloud ideology to focus on how organizations are distributing AI across on-prem, cloud, and edge environments as performance tradeoffs, governance requirements, and data gravity dictate where workloads can realistically run.
Key Takeaways:
🔷 Hybrid AI is becoming the default, not the exception: Enterprises are learning that no single environment can meet every AI requirement across performance, cost, and governance.
🔷 Data location drives architecture decisions: The question of where data will live increasingly determines where AI workloads can run, reshaping infrastructure roadmaps and becoming as important as the models themselves.
🔷 On-prem, cloud, and edge must work together: Modern AI depends on coordination across environments, not isolated stacks.
🔷 Infrastructure choices today define flexibility tomorrow: Early architectural decisions shape how easily organizations can scale and adapt AI over time.
Watch the full video at sixfivemedia.com, and be sure to subscribe to our YouTube channel so you never miss an episode.
Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Patrick Moorhead:
The Six Five is On the Road here at Lenovo Tech World in Las Vegas. We are coming off an incredible event at the Sphere, headlined by Lenovo's CEO YY and pretty much a cadre of industry luminaries.
Daniel Newman:
Pat, that was a wonderful experience. I've been waiting to say that all morning. It was really nice. It was just all of the who's who at this event. And Pat, by the way, this has really become like an AI infra event. I know the C in CES is for consumer, but really, I'm kind of thinking maybe the C should be for chips, because this show has been all about chips and enterprise AI infrastructure. I mean, the stuff in our pockets and the stuff on our desks is certainly going to influence the work we're going to do with AI. But man, CES has become a bit of an enterprise show, hasn't it?
Patrick Moorhead:
No, it is for sure. And I just remember, five years ago, infrastructure was this flat, you know, single-digit growth business, and people were wondering what was going to happen with it.
And then generative AI hit, and here we are with AI. It started pretty much as a consumer play in the cloud, but over the years it's very quickly become an enterprise play. A conversation we knew was coming was: hey, is all AI going to be delivered from the public cloud? And the reality is, the answer is no, right?
We're going to have this hybrid AI, right? We had hybrid cloud, and of course we'll have hybrid AI.
Daniel Newman:
Yeah, absolutely, Pat. I think we saw it in the first era. There was a little tug of war over it. There were certain clouds that were like, it's all going to the cloud. And by the way, there were pundits that said it was all going to the cloud.
I think you were one of the smart pundits that said, oh no, this thing's going to end up hybrid. And then, by the way, not just hybrid, it's going to end up multi. There's going to be a lot of different places in which infrastructure is going to exist. And I think, if anything, AI is just accelerating the fact that you're going to be in a multi, hybrid AI environment, because where does the model run? I mean, that's why, why don't we bring him in here? No, I just like talking to you, okay. We like our guests.
Patrick Moorhead:
I want to bring in Flynn Maloy. Flynn, congratulations on an incredible.
Flynn Maloy:
Thank you.
Patrick Moorhead:
Uh, event last night. It really was amazing. And I urge everybody out there, if you haven't seen it, to go and check it out.
Daniel Newman:
Pat, by the way, did you see we ended up on the big screen? Yeah, we did. I didn't know that.
Patrick Moorhead:
As they were closing it out, there was a video of Dan and me put up, doing the simulated smartphone, doing the pre-game. Oh, fantastic.
Flynn Maloy:
That was cool, I know that. But yes, we are basking in the glow of that show this morning.
You know, it was the biggest thing that Lenovo has ever done. And we talked about it before: Coca-Cola has the trophy launch, Adidas has the ball launch, and this was the technology launch for the FIFA sponsorship. We used it, and CES, as a platform to showcase our partnerships, to launch our new technology, and to kick off what's going to be an amazing year for Lenovo over the next six months. You know, we're going to be all over FIFA, all over global football, for the largest event that mankind has ever seen.
Daniel Newman:
Hang on, folks. As a soccer guy myself, I was very excited. I will say, though, the enterprise AI section that kicked it off was awesome, but that video of the race cars gave me all the feels. You know, we're car guys. For everyone out there who doesn't know, he likes driving muscle cars a little bit, and someday maybe we'll just do a whole show about that. But I don't know about you, I hear those engines roaring and I just get.
Flynn Maloy:
They even took the big sky camera to Europe, and you saw the Hungary F1 track. There were also shots of Beacon Hill, which is the F1 data center, which is now running liquid cooling, as announced yesterday. So, the Sphere: we're all in on shooting the most exciting places and the things that we do at Lenovo. And that was my favorite part of the show. Yeah, we might have had a little bit to do with that. But yeah, it was great. Your baby is beautiful.
Daniel Newman:
All right. So let's talk a little bit about hybrid AI. You know, we talked about it in the preamble a little bit, and you heard us. Pat and I almost went on our own diatribe and just did the show by ourselves, but we decided to bring you in here. Now I really want to hear this from you. Hybrid AI: what is that? What are people supposed to be picturing when you say hybrid AI through the lens of Lenovo? And why is it important that they have the right mental model for what that's going to look like? Because, you know, it's still very early.
Flynn Maloy:
Well, it is. And we talked about that earlier in the pregame; we've talked about it for the last year together. Businesses, small and midsize businesses, have been doing a lot with AI. They've been getting a lot of subscriptions, running stuff in the cloud, trying to figure it out. I mean, there are some things with AI that, once you see them, you can't unsee.
Like, once you see what it can do, it's really amazing. And everybody is kind of stretching: how should my workflows go, how should my functions go? How do I build this into my business to drive growth, to save cost? But the answers for how to do that are not immediate. And a lot of it comes back to, as we say, what does hybrid mean? As we've said about hybrid cloud: where's the data? Where does the data want to stay? Where does it want to go? What about data gravity and sovereignty? All of those issues follow the data, and almost all of your data is being generated out at the edge. And when we say edge, that's not just the far distant lands or out on a ship in the ocean. The edge is the retail store.
It's the manufacturing site. It's the branch bank. It's a hospital. That's all edge. That's where the data is. That's where your customers are. That's where AI wants to be delivered. So when you think about what hybrid is: for some companies, there isn't an issue with latency.
There isn't an issue with data gravity. Costs aside, we can run it all out of the public cloud. But for the majority of customers, you know, we've seen over 65% of our customers say, hey, we would like a hybrid answer. Which means some can run in the public cloud, and for others we want to bring the AI out closer to where the data is, where the customers are, where the business is happening. And that can be edge, it can be far edge, it can be in the data center. But hybrid is: what is the right mix of that for your business? How do you land on what goes where?
Patrick Moorhead:
So can you elucidate that a little bit more? I mean, it's pretty clear on the edge when it comes to your data, particularly where it's created, right? Health care, retail, manufacturing, transportation. The action is not inside the massive monolithic data center. But what is still happening in, let's say, the cloud data center, or even, I guess, the mothership, the on-prem data center? What does that partition look like? Is it as easy as, hey, we're going to train the big stuff in the big cloud and deploy it on the edge? How does that look?
Flynn Maloy:
Well, I don't think the industry has one answer, and I don't think there will be just one answer. Obviously, training in the cloud and inferencing out in the business at the edge is the immediate, obvious answer. But I think it gets back to what customers, what businesses, want out of AI. We talked about that before. Do businesses want to build AI? When I talk to a CIO and ask, hey, what do you want to get out of AI, it's not "I want to build AI." They want to use AI. They want to use it at speed, at scale, around their company. And that, I think, speaks to what kind of AI they're going to get out of the public clouds, what they're going to put in their own data centers, and what they're going to deliver at the edge. Think, for example, of a hospital, with the doctors and nurses walking around, with the radiologist. There are going to be clients up and down that environment.
There are all kinds of different on-device intelligence applications that are going to be private to that hospital environment, and then other things, diagnosis, that they can go and get up in the public cloud, and it's going to be the smooth delivery of all of that. Maybe it's smaller models out at the edge, maybe it's being able to bring big models, like some of the inferencing servers we announced yesterday, the 675.
There's enough processing power in that to put a multi-billion-parameter model all the way out at the very edge, so you get your real-time inferencing where you need it. That's going to be a choice for what kind of workloads you have and what you need. And making that decision, what should go here, what should go there, is the key to successful implementation of hybrid.
Daniel Newman:
Makes sense. And there are a lot of constraints right now, a lot of pressures. Enterprises may not be overly thinking about some of the things that maybe Lenovo is, you know, where do we get enough memory, where do we get enough energy. But at the same time, they're seeing all these pressures, they're hearing all this, and their boards are saying, look, you need to do AI. I mean, that's kind of what it is. The board comes into the room, the CEO sits down: are you doing AI? Are you doing enough AI? Give me an AI. I want five AIs.
We need all this stuff on AI. We need AI, you know? And so obviously that's a big pressure. But then there are these smaller pressures: can I get the equipment? Can we do it within a price? Can we drive ROI? Talk a little bit about that and how it's driving compute strategy, because I think you've probably heard me say it before.
Like, I don't know that the board and the CEO are necessarily thinking, well, I need the server to sit here or sit there. That's not what they're thinking about. But then the pressure on the people underneath them is: hey, I need to do as much AI, get as much productivity, and spend as little as possible in the process.
Flynn Maloy:
So I completely agree with that. There was an initial wave of, you know, oh, I need five or six AIs. And the CIO was trying to answer: well, that's something, but I can't give you five or six. What do you want to use it for? And I think it's kind of similar, by the way, to how cloud services ramped.
You know, in the beginning there was this big rush, and then everybody kind of sat back and said, okay, we're not immediately seeing ROI. The CFO started to get engaged, going, hey, wait a minute. And then we started to focus on productivity, right? Which is back office first. And a lot of the workloads that are really using AI today, in a lot of our clients, are back-office stuff.
So productivity: it used to take me four hours, now I can do it in one hour. You can measure that speed, throughput, productivity. That's where it has to start. And I think as those metrics start to increase, the productivity and the ROI of the various workloads you're doing are going to dictate where, and what kind of, AI you should have.
Right. If you can drive, hey, it used to take me four hours, now it takes me one hour, and that's all out of the cloud, great, because it doesn't really matter per se. But if you're trying to do a thousand chatbot interactions an hour for your clients, and round-tripping up to the cloud has too much latency, then you put that here to drive your thousand chatbots an hour, and you can measure the output, the productivity, and the cost associated with a thousand chatbot interactions an hour. So if you can get down to that, workload by workload, I think that's where you start to measure the benefit.
Patrick Moorhead:
Yeah, historically we've always had compute in different areas, right? In fact, the first business machine was a cash register on the edge. And then mainframes, minis, and then client-server. The challenge always became: hey, I don't want to create a brand new layer of management nightmare for myself. And here we are. I think we all agree that hybrid AI is the way to go. The board isn't asking where it is, and the users don't care; it's about the experience and the outcome. So how are these different environments being managed?
Is it, you know, we've seen people have a data fabric they can pull from, agentic orchestrators that operate regardless of where the compute is happening. What are you seeing, and how are they not adding more complexity?
Flynn Maloy:
Well, I think that's where our services and our solutions come in. There is complexity being added, and nothing's going to stop that. AI is adding a layer of complexity wherever it's coming from. And, as was even mentioned on the stage, you've got so many different organizations grabbing this, putting the subscription in.
It's kind of the same thing as cloud in the early days, where everybody was swiping their credit card and getting some cloud. And that's what we're seeing now, right? Everybody's swiping their credit card and getting some AI. And, you know, it's interesting, because we talk to our clients about treating it the same way.
You know, it took us a couple of years as an industry to sort of realize that cloud ops is not IT ops: you need new processes, new people, new skills to run cloud. Same thing for AI, right? AI ops is not cloud ops or IT ops. So put in place a PMO model, put in place a broker, put in place rules and compliance, people who know what they're doing. Partner with a vendor that has expertise; you might not have a ton of AI experts in your business or on the IT team. We work with clients all around the world, and we see patterns of good behavior and bad behavior. So working with ourselves, or any other vendor, to put together a PMO, a structure, rules for turning the lights off at night, compliance rules for what you should and shouldn't use: that's kind of step one in getting control of it. And then you can start to make the proper centralized trade-offs for what should run where, and you can properly broker AI.
Daniel Newman:
So as we kind of pull this all together, Flynn: you're talking to enterprises. One of the great things about where you sit, and we talked about all the partners on stage, is the partners you've worked with, the chip companies that are building these systems. They end up looking to a company like Lenovo, which deals with tens of thousands of enterprises around the world, to deliver this stuff at scale.
So, if I'm a CIO and you're talking to me right now, and I say, what do I need to do? What's the big shift in my roadmap that I need to be thinking about right now if I'm going to be successful in delivering five AIs to my board? And I'm kidding, but in delivering AI to the enterprise.
Flynn Maloy:
Well, I think step number one, going back to a previous theme, is to understand what you want to do with it.
If you're a manufacturing company, what do you want to do with it? If you're a retail company, what do you want to do with it? That's where we offer a bevy of solutions in our AI library. And you don't just go into the library and pick a book. You sit down, and we do this with our consultants as well.
Let's workshop it. Okay, you want five AIs? Great. What are they going to do? What do you want them to do? Do you want to deliver agents to your marketing and sales team? Do you want to drive a computer vision solution to speed up your factory? Get down to what outcomes you want. Then we can back into: all right, can your existing infrastructure, which it probably can't, handle what needs to happen? Do you have the data environment there? Do you have the compliance rules, the gravity rules? Where do you want it to run? All of that comes after.
What do you want to do with it? And that's where a company like Lenovo brings forward a solutions library you can sit with and look through. We've got solutions for development, we've got solutions for content generation, we've got solutions for HR and legal. And as you look through those, that's where you start to clarify: all right, these are the outcomes I'm looking for. Then you can back into: do I have what I'm going to need, and how do I procure what I might need, or get it from the cloud?
Daniel Newman:
Well, Flynn, I want to thank you so much for joining us here. It's been a great week, a great event, and it's just getting started. Yeah, let's go. I have a feeling we're going to be doing this again at some point, but we're going to get more experiential, more outcome-based, because I do think '26 is going to be a big year for outcomes. And all this work that's been done, it's like, hey, tell me how AI is delivering value to the enterprise. Maybe it's three AIs, maybe it's five, but all these AIs, how are they bringing success? It's going to be.
Flynn Maloy:
Well, the AI inferencing wave is on its way. We're just beginning there. And when that comes, it's going to change how all businesses do it.
Daniel Newman:
The huge wave of growth.
Patrick Moorhead:
I think we're in a bubble.
Daniel Newman:
It's not a bubble. We're talking about growth here, not a dot-com bubble. A rocket ship.
All right, everybody, thank you very much for being part of this Six Five. We are On the Road here at Lenovo Tech World 2026. Subscribe and join us for all the great coverage and conversations here at Lenovo Tech World, and of course all the great content on the sixfive.com. But for this episode, it's time to go. We'll see you all later.