The Six Five Pod | EP 293: AI Factories, Memory Crunch, and the Models vs Infrastructure Showdown

AI momentum is accelerating, but real-world constraints are tightening. From hyperscaler infrastructure lock-ins and sovereign AI expansion to RAM shortages and enterprise AI pivots, Ep. 293 examines what truly determines leadership in the next phase of AI.

The handpicked topics for this week are:

  1. Meta & NVIDIA’s Long-Term AI Infrastructure Partnership: Meta confirmed a deep infrastructure expansion across NVIDIA Blackwell and Rubin GPUs, Grace CPUs, and advanced networking. Pat & Dan discuss hyperscaler AI factories, overflow capacity strategies, and long-term compute commitments.

  2. Microsoft’s Global South and Sovereign AI Expansion: As Microsoft continues major investment across India and emerging markets, the hosts explore sovereign cloud strategy, geopolitical positioning, and how global AI infrastructure buildouts shape long-term competitiveness.

  3. California AI Oversight and Regulatory Fragmentation Risk: State-level AI oversight initiatives raise concerns about a patchwork regulatory environment that could slow U.S. innovation relative to centralized global competitors.

  4. The HBM Memory Crunch and Long-Term Supply Constraints: High-Bandwidth Memory shortages continue to shape AI deployment timelines. Relief may not arrive until late this decade, with downstream impacts on data centers, PCs, and consumer devices.

  5. Infosys & Anthropic GSI Pivot to Enterprise AI Agents: Infosys partners with Anthropic to accelerate enterprise AI agent deployment. Hosts examine whether global systems integrators can pivot fast enough in an agent-driven economy.

  6. The Flip – Models vs Infrastructure Leadership: Is AI dominance determined by model quality or infrastructure scale? Pat & Dan debate whether gigawatts or algorithmic efficiency define long-term advantage.

  7. Bulls & Bears – Cyber, Power, EDA, SaaS & AI Infrastructure Plays: Earnings and market signals across Palo Alto Networks, Analog Devices, Cadence, ServiceNow, Dell, and Marvell highlight how execution, supply chains, and capital discipline matter in this cycle.

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Pod so you never miss an episode.

Disclaimer: The Six Five Pod is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript

Daniel Newman:
The gigawatts are going to be the giga-get of the giga-future of the giga-win of this giga-cycle. Don't have enough infra, doesn't matter what your models do today, someone will outdo you tomorrow.

Patrick Moorhead: 

I mean, Daniel, you forgot about the giga-chads, which we're very disappointed about.

Daniel Newman: 

Love that intro. Well, you do. You love that one? I love that. You only love that one because you shifted from the original "I think we know what we're talking about" to "of course we know what we're talking about." Remember that? Remember, for the longest time, that intro was "I think we know." And I said, isn't that a little undervaluing how smart you are, Moorhead? You, of course, know what you're talking about.

Patrick Moorhead: 

It's not about I. There's no I in we. There's no I in team, Daniel.

Daniel Newman: 

Yeah, but there is a "me." And I'm in... anyway, there is a "me" in "team." So don't forget that. Well, we're back. It's episode 293. You know, it's funny. Like, you start getting to these numbers and every week you feel like you're not going up a number anymore. Like, in the beginning you felt like you were really making strides. It was like, no, it was 292. Well, this is episode 293, Pat. So don't anyone out there say we are not committed. We are committed and we are running fast. But you know, Pat, for the first time in a long time, I did not get on an airplane this week. I did not go anywhere. It was nice. Same. Same, buddy. Yeah, it was really nice. Won't be able to say that again next week, sadly, but off we go. But a lot going on, a lot of stuff happening, fundraisers, deals being made, hands not being held in Black Panther power moments from Sam Altman and Dario Amodei, where they wouldn't hold hands, but they just raised their fists in, you could call it, non-unity. I know it's crazy, but thank God AI fixed it. You know, there are so many great uses of AI, and we applied that to creating fight scenes and hugging scenes. And by the way, what a spiral of just brain rot. What an absolute spiral of brain rot. I hope you're all enjoying the brain rot as much as I am. My IQ dropped two points this week. But rumors out there, nonsense that you and I are out there dispelling. No, 14A did not get delayed a year. Sorry, everybody. But a lot going on, Pat. You know, I guess before I jump in, just got to ask you: did you get the biceps bigger this week? I mean, you know, you look kind of jacked today. The week back in the saddle, you know, you feeling good?

Patrick Moorhead: 

Yeah, I feel great. You know, two weeks not on the road. My HRV is at a two-week high. Amazing. My Oura ring is telling me that I'm almost back, essentially completely back, after two years. Recharged. Restored. Yeah, but I've got a new malady I'm dealing with right now, this massive sinus infection apparently I've had for, like, years. Trying to deal with that. But you know what? Gotta move forward. So I'm pumping steroids, no, not those type, and antibiotics. Yeah, the control room is probably gonna have to delete the steroid part or we're not gonna get any views. But... Why?

Daniel Newman: 

You're not allowed to talk about pumping steroids on the show?

Patrick Moorhead: 

You know, not when they're prednisolone. Yeah, methylprednisolone. So not the real ones.

Daniel Newman: 

Oh, that's too bad. I wanna know when you're blasting tren.

Patrick Moorhead: 

Yeah. No, well, you know, we might have to do a side show just talking tren talk.

Daniel Newman: 

You know, if you guys think it would be fun, maybe Pat and I shoot one of these while working out. That'd be fun.

Patrick Moorhead: 

That'd be good. Listen, I saw, you know, uh, uh, Kennedy and kind of doing their thing and you know, maybe instead of just, you know, posting text and our workouts and, and AI, AI modified gym pictures, we, we do a video.

Daniel Newman: 

Wait, you've been AI modifying yours?

Patrick Moorhead: 

Yeah, my arms are like at that side. Kidding.

Daniel Newman: 

No, man, I actually make everything smaller, just to

Patrick Moorhead: 

You know, cause people ask you if you actually work out on your legs. I don't, I don't know.

Daniel Newman: 

There's no point. All right, man. That's enough. Now you've, now you've crossed the line. And I'm challenging you to a leg day. We're going to do a leg day here.

Patrick Moorhead: 

Tell you what, we're going to, and we're going to do it. And given the length of my legs, that's going to be my, um, my handicap.

Daniel Newman: 

No, I don't want to hear any of these excuses. No pre-excuses. You know, you can go ahead and chime in here, folks, if you agree. No pre-excuses on the workout challenge. All right, buddy. You know the format, everyone. We start with the decode. We hit the top stories of the week. Then we'll go into the flip, where I win an argument, a simulated argument. And then we're going to hit the markets, the bulls and the bears. We're going to talk about what's going on in the markets. And then we're going to get everybody on their way. So, Pat, without further ado, let's hit the decode. All right, it was a media tour week for me. I don't know about you, but Meta made an announcement of a new deal with NVIDIA. Pat, what happened?

Patrick Moorhead: 

I really thought about this one. I think I did three press interviews trying to decode what the heck this thing was. But let me give you the news first. So Meta and NVIDIA confirmed a long-term infrastructure partnership across Spectrum-X, GPUs, confidential computing, and Vera Rubin clusters. NVIDIA used the words "millions" of Blackwell and Rubin GPUs, plus Grace and Vera CPUs, for deployment in 2027. Meta did not use the millions number, which I thought was interesting. Having done a lot of these large-company deals, it is hard to see why there was a difference here. Let's get into why we even did this announcement, right? I think first and foremost, it helps tame this Meta-going-to-TPU-and-Google-Cloud story. And I just want to remind everybody, every single hyperscaler is looking at every single chip option, and also an overflow capacity model, because nobody has enough. I think what happened is Jensen came in, gave a good package deal that's not just chips, chips plus capacity at one of its outlet partners. I don't know if it's CoreWeave or who it is; it doesn't matter at this point. And I also think it helped NVIDIA score some points on the CPU narrative, that very quickly, in two or three months, it's become cool to be a CPU in AI. A lot of this is driven by reinforcement learning. Also, if you look at an agentic workflow, that agentic orchestrator spawning sub-agents and doing most of the controlling, a lot of that is done by a CPU. And I think, finally, it's kind of interesting fodder to bring out before GTC. It also throws a little bit of a hand grenade into, well, what does this mean for AMD, if anything? What does this mean for Intel, if anything, and any other company that may or may not be bringing out an ARM-based data center CPU in the next six months?

Daniel Newman: 

Yeah, I think it's a big number. You know, we keep talking about that huge CapEx commit, and NVIDIA is going to get the outsized amount. Meta is building an absolute compute factory. It's investing big. It's going to continue to invest big. A lot of dumb takes about this being a zero-sum thing. A lot of people really hammered on AMD specifically about this one. I just don't think it's that specific. I think Meta knows it needs a lot of compute, and it's going to want as much as it can get from NVIDIA, and it's still going to buy AMD as well. I think one of the kind of quiet winners here, you kind of mentioned this, the sort of quiet winner, was Arm. Clearly, with the Manus acquisition and what Meta is doing with agents, you're seeing the rise of the CPU. You're seeing NVIDIA and Meta talking about leaning into more CPU, which of course is going to primarily be Grace and Vera, and it's going to be a huge benefit for Arm. This is another one of these big deals, but not a lot of specifics around it, Pat. But if it's millions of GPUs, it's many, many, many billions of dollars. And so you do the math. I like counting. You like counting. We like counting. So the math's looking good.

Patrick Moorhead: Yeah. I mean, what, a million? Million GPUs. Let's just throw a number out there: $35,000 per, right? That's a, that's a big number. Some people might even say $50,000 per.

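For what it's worth, that back-of-envelope math is easy to sketch. The figures below are the hosts' illustrative numbers (one million GPUs at $35,000 to $50,000 each, and the 72-GPUs-per-rack framing at roughly $4 to $5 million a rack that comes up next), not disclosed deal terms:

```python
# Back-of-envelope sizing for the Meta/NVIDIA deal, using the hosts'
# illustrative figures (not disclosed terms).
GPUS = 1_000_000

low_total = GPUS * 35_000    # $35B at $35k per GPU
high_total = GPUS * 50_000   # $50B at $50k per GPU

racks = GPUS // 72                    # 13,888 racks at 72 GPUs per rack
rack_total_low = racks * 4_000_000    # ~$56B at $4M per rack
rack_total_high = racks * 5_000_000   # ~$69B at $5M per rack

print(f"Per-GPU math: ${low_total / 1e9:.0f}B to ${high_total / 1e9:.0f}B")
print(f"Racks: {racks:,}; per-rack math: "
      f"${rack_total_low / 1e9:.0f}B to ${rack_total_high / 1e9:.0f}B")
```

Either framing lands in the tens of billions of dollars, which is the point the hosts are making.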
Daniel Newman: 

Yeah, or you do, what was it, 72 per rack? You can break the million down into how many racks, at something like $4 or $5 million a rack. A couple different ways you can math it. And that, of course... You know, a million might not even be enough to cover it. Millions. Millions. All right. So that's that. You know, but while we're on spending, another company that keeps spending big is Microsoft. You know, I kind of teased in the beginning of the show, the India summit, you had all the big CEOs on stage, all in unity, holding hands, except for the two big model guys. We'll debate models later, though. But Satya Nadella, I think, committed about $17 billion last year to India. And they talked about having something like a $50 billion Global South AI push. What's the Global South? These are basically developing nations, Southern Hemisphere, where they plan to spend more. This is where they're going to expand their AI and cloud infrastructure. They're going to go after these emerging markets. You know, Pat, I think it's opportunistic. One, there are probably going to be some meaningful tax benefits for these investments going into these regions. And when you look at India, it's a region that's going to have to figure out its pivot, because it's grown on these BPOs. And the BPOs are not going to need as many people in the AI era. But all these people became talent; they became coders and engineers and developers and systems and project managers, and did this all over in that region. A lot of smart, capable people that are going to pivot into AI research. They're going to pivot into infrastructure deployment. They could become data center support. So I think this is going to be a good region with talent, and I think talent's going to matter. AI will not replace all jobs, despite my own panic at times, you know, just because I think it could replace me.
This is where you're supposed to tell me I'm irreplaceable, everybody. But yeah, I mean, I think this is an opportunity to capture growth. And again, the AI train is not going to leave all stations at the same time. So investing in these other regions is going to create growth opportunities for Microsoft and others to play in these emerging markets.

Patrick Moorhead: 

So, Daniel, this really hits a huge topic, one of the big topics being discussed all around the world, which is sovereign AI, right? And this really got a head of steam when the new administration came in. And even under the prior administration, there was an incident where Microsoft had cut off access for a certain country, and we were trying to get data from them. And so what happened is, obviously, countries got concerned, and they certainly didn't want to build AI that they didn't feel they could control. Meaning, hey, you can't unplug us. Our data is going to be sovereign and stay in one place. Hence this whole concept of sovereign AI. I spent a bunch of time in the UAE, and that was one of the biggest topics out there. They're trying to be the sovereign cloud, and the UAE with G42 even has this moniker of a digital embassy, using Microsoft technology to really dial in and kind of have your cake and eat it too. But India is just the next run for Microsoft, right? Microsoft made a $15 billion investment, and that's not over 10 years, in the UAE. And now they're making what looks like a four-year, $50-billion-by-2030 investment out there. And hats off to Microsoft for kind of getting ahead of the game here, because they could potentially be at risk, right? And what this does is potentially circumvent, I'll call them local heroes, from popping up, like you've seen in Western Europe, that compete with the Microsofts and the AWSes out there.

Daniel Newman: 

So let's move on to California. While they're busy, you know, sending their wealthiest and most capable creators out of the state, they're also continuing to want to launch more oversight. You know, there's an investigation ongoing into Zuck, but there's another one, an AI oversight accountability program focused on XAI and Grok. What's going on there, Pat?

Patrick Moorhead: 

Yeah, so I will first say that what XAI did with the, you know, "put this person in a bikini" needed to be wrangled in a lot sooner. But California is using it as leverage to create what could become 50 small islands of AI that make the United States as hard to do business with as Europe. It's just the other way around, where the EU is the one making all these onerous mandates, and literally not a single, well, maybe there are two, VCs and AI companies want to be in the region. They're all here in the United States, and in China, and over in the UAE. So I understand they're using something that I think everybody can agree with, which is what XAI did with the images. But if it sets a precedent, I think the growth and the investment won't just be affected in California, because if California does it, then New York's going to want to do it. And then New Jersey is going to want to do it. I mean, a lot of the coastal states will want to mimic it. Now, some of them will copy exactly what California does. But as we've seen over time, it would be like having completely different rules for the highways. You need different tires, you need different engines, you need different gasoline. I couldn't fuel up in Nevada and drive into California; I would have to buy California gas. That would just be completely onerous and completely ridiculous. It would be completely inefficient and bad for AI, and quite frankly, bad for society, because we would have to slow down AI for good, for healthcare, for education, for poverty, for what ails this world, first and foremost. I'm not saying that AI should move forward with reckless abandon. But I don't think having state-by-state rules for this, at this point, is a smart thing.

Daniel Newman:

Yeah, I mean, California's not the bellwether, but it's influential. I don't know that what California does can be looked at as what the rest of the country will do, but California wants to send a signal, and you've got tough politicians that want to stand up. This is tough, though, because this kind of patchwork at the state level is going to slow things down, especially if we really do believe we're in an AI arms race with China. China has a top-down mandate, and it's all in. If we start adding these frameworks and slow ourselves down, it will materially impact the U.S.'s ability to lead in AI. But at the same time, I don't want to trivialize things that are being done with AI that shouldn't be. And we need to have some sort of policy for that. You know, I'm worried about everything from authenticity, you've got tons of cyber risk created that needs to be managed and monitored, to nuisance, like, think about how AI enables robocalling even further, God, how bad text and voice spam can get. There are things that have to be dealt with. We can't just ignore all of this. But gosh, if every state tries to do it, or many of them try to do this, it could get ugly. It gets really slow and ugly. So I guess we'll see. You know what else could make it slow and ugly, though, Pat? If we don't have enough memory to build the infrastructure that we're committing to, as we try to bring 80-plus gigawatts online. You know, every year it's been something. We had the GPU shortages, we've got packaging shortages. And I mean, '26 is the year of memory. You've seen companies like SanDisk and Micron absolutely rocket. Who would have thought SanDisk? Who would have thought that Intel, had it kept up its Optane business, would have actually done well? But Pat, there really does seem to be no relief in sight.
I've had conversations, as I'm sure you have, with the CEOs of Micron, of Intel, of smaller semi companies, and everyone seems to agree that there's really no relief in sight until 2029. You've got some pretty big shovels in the ground to try to build out more capacity. And of course, the memory crisis is not only going to hit AI, which seems able to afford it; the companies buying for AI can afford to pay more and take some margin erosion. But it's going to really hurt the consumer in some ways, because of the memory we need for our laptops and PCs and smartphones. I mean, let's make no mistake, that's where the impact is going to start to hit. And that's going to start to hit inflation and the consumer. But yeah, it looks like the reallocations are going toward data center, toward HBM. And that's going to have a whole cascading effect. These are wild times, but the shortage year is very real. And I mean, I'm hearing '28 relief, but realistically, that's if projects go perfectly. We could be in '29, we could be into '30. This could be a very long memory cycle, with some very asymmetric pricing and some pretty big downstream impacts to consumers.

Patrick Moorhead: 

Yes. So I've been in tech, this is my ninth cycle. And as I've been very consistent about, there's always too much memory or too little memory. And, you know, lest we forget, three years ago we were doing Bulls and Bears going through Micron and Samsung at negative gross margins. Think about that, negative gross margins. And what happened is exactly what you would guess: cash conservation. They pulled back on CapEx big time. And then AI happened. There's a couple of things here. So first of all, for HBM, you require about 4x the wafers to produce the same bits as standard DRAM, like DDR5 or DDR4. And so you have that issue, and it pulls away from other memory. And then if you look at the cascade of profitability, you have hyperscalers who literally will pay any price to get this, because they're making the most margin at this point. And then you go to the opposite end, the least amount of margin, which is the PC side. So the last people to get the memory will be the PC folks, because not only do they not have the margin, but, I mean, they definitely have the volume, but they can't afford to pay 5X, 6X, 10X like the hyperscalers can and will. The good news is that the memory makers are not being stupid. They realize that this is not forever. If they don't give memory to the people who make smartphones, who make Windows PCs, then those markets will die. Those players are not getting everything they want, or the prices that they want, and the PC makers are reacting as you would expect: eliminating low-end CPU offerings, raising prices, and de-featuring other elements like display technology. If it keeps going at this rate, Daniel, there's nothing on a spreadsheet that I've seen that says we'll be out of this by 2030. You and I caught up with Sanjay from Micron at Davos, and he was speaking from his point of view. He felt pretty comfortable about second half 2027.
He doesn't represent Samsung and Hynix. And also, if you factor in what could be the gigantic, like, truly, if you do believe in the bull case for AI, which I know you do, Daniel, you remind everybody every day on X about that. Not every day. It's like a weekly reminder. We're not going to be out of this mess. Another factor is where you're seeing non-HBM designs that use, let's say, LPDDR instead. That still uses memory. And I'm convinced that part of this is the disaggregation of the AI workflow between prefill and decode, and even training. You don't need HBM for decode. You definitely need it for training, and you need it for prefill. So anyways, we will see, baby.
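
The wafer math in that point can be made concrete. This is a hypothetical fab-allocation sketch built only on the roughly 4x wafers-per-bit ratio cited above; the wafer-start numbers are made up for illustration:

```python
# Hypothetical fab-allocation sketch. Assumption from the conversation:
# producing a given number of bits in HBM takes ~4x the wafers it would
# take in standard DRAM (DDR4/DDR5).
WAFER_RATIO = 4  # ~4 HBM wafers to yield the bits of 1 standard DRAM wafer

def ddr_equivalent_bits(hbm_wafer_starts: int) -> float:
    """Bit output of HBM wafer starts, in standard-DRAM-wafer equivalents."""
    return hbm_wafer_starts / WAFER_RATIO

shifted = 10_000  # made-up example: monthly wafer starts moved from DDR to HBM
print(f"As DDR: {shifted:,} wafer-equivalents of bits")
print(f"As HBM: {ddr_equivalent_bits(shifted):,.0f} wafer-equivalents of bits")
```

Shifting the same capacity to HBM cuts its bit output by 75% under this ratio, which is why HBM demand squeezes the DRAM supply left over for PCs and phones so hard.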

Daniel Newman: 

Well, you heard me talking about the India-based BPOs and SIs. It looks like they're not taking it lying down. They're doing some things to try to increase relevance, scale their businesses, and pivot with AI. And I saw a note this week, and I know you've worked closely with them: Infosys partnering with Anthropic to build enterprise-grade AI agents. What is Infosys doing? Is this a repositioning? What's the move here?

Patrick Moorhead: 

So I would say, overall, Daniel, I've been super disappointed in the reaction of the GSIs. And I'm just talking in aggregate. And the scale, the number of employees, I mean, Accenture alone has almost 900,000 employees. Okay, think about this. And they are doing heavy lifting, anywhere from SAP consolidation to connecting the front end to the back end, as I always like to say. Basically, any major heavy-lift mainframe modernization project, any type of modernization project that we've seen for the last 20 years, has traditionally been done by them. And what I would have expected, Daniel, is that all of these companies would have become the masters of agent creation. Who knows SAP consolidation better than a GSI? Nobody does, right? And typically, you'll spend $3 million just for a study. It tells you what you need to do. It identifies where the data is, who the people are, what the applications are that touch it. That's 3 million bucks right there. It takes six months. And you haven't even done the 10, 20, 30, 40, $50 million consolidation. So with that said, I know I'm boring you, Daniel. I would have expected them to become the absolute masters of agents, to get their people to do this all automatically. And they didn't. Some of them tried and they failed. Okay. So this, to me, seems like a very pragmatic move. I mean, heck, you and I are, at least this week, bullish on Claude Cowork and the agent work that they do and how they actually help get work done. Maybe we'll hate it next month and somebody else will be better. But this is a very pragmatic take: a focus on clear ROI, sovereign readiness as a differentiator. So yeah, I'm glad to see it, and I want to learn more about it. Their emphasis isn't pretending that they can do it themselves.

Daniel Newman: 

Yeah, this was a bit of a show me moment for a lot of these companies. They're making this pivot. They're adding value.

Patrick Moorhead: 

A what moment? Show me. Oh, sorry, I thought you said Shelby. I'm like, did I, did I not read that book? A Shelby book?

Daniel Newman: 

Not a Shelby moment, a show-me moment. It's a show-me moment, meaning there's a lot of paint-the-town agentic narrative out there, and then there's actually adding value and helping companies do this. And, you know, it's not a trivial task. Like, if you're building a business from zero today, you can be totally agentic quickly.

Patrick Moorhead: 

Yeah.

Daniel Newman: 

But if you have legacy systems and data sitting inside of different places, it is not trivial to get that data exposed to AI and then have the agents working with, quote-unquote, agency to do things, and be sure they're doing it correctly. That's real work. And so there is a big market. And I mean, there's a lot of these open companies, like these Sierras and others, that are kind of saying, we're open-platform agent-framework technology companies. Sierra is the one from the former Salesforce CEO, I'm forgetting his name. He'll hate that. But the bottom line is, I still think there's a layer for humans to fix and make this stuff work, and verify and validate. In time, though, Pat, like the 900,000 employees, I just don't see it. I just don't see how that works. It's going to be really interesting to see how that plays out long term. Because even infrastructure-as-code, a lot of the work was standing infrastructure up, and you're seeing so much of that stuff being automated. So I think it's going to be lean and mean, but I think these are really important pivots. They'll be fun to watch.

Patrick Moorhead: 

One more interesting thing. I think the private GSIs are going to weather the storm better. They can fly under the radar. They can minimize profitability without their stock being destroyed. I mean, sure, the partners aren't going to be happy, but I've seen the private folks, and you know who you are, move a lot quicker. And quite frankly, I had somebody, it was an India conversation, can't tell you who, say that they were going to eliminate 75% of their jobs. Right? I'm not kidding. 75%, Daniel. Think about that.

Daniel Newman: 

It's massive. I just don't see another way.

Patrick Moorhead: 

Yeah.

Daniel Newman: 

I mean, look, you and I are, you know, curmudgeonly at times, but I've been in the lab. While everybody's kind of still talking, I'm in the lab testing everything that you can do with this stuff. Every process: can it write good research? Can it do market models? Can it script a podcast? Can it create social content flows? Everything that we do, I'm looking at right now. And I'm just beating the crap out of the AI platforms and the agents and the tools. And like I said, it's not all easy to do at a level where you'd say, I would put this out to the public. But when you're collaborating, iterating, and starting to really build that kind of data corpus that it can see, and you train the voice, it starts to get to the point where you're like, yeah, this is good. And some of the things I get, I'm like, not just good, this is amazing. Like strategy stuff. Poor McKinsey, because if you put in your data, and you have all the kind of metrics of what you're trying to do, and you want to talk strategy with Claude, it's really good.

Patrick Moorhead: 

It's really thoughtful. I saw a great plugin for Claude, a plugin for PowerPoint, that said, refactor this like a McKinsey presentation, and it went between McKinsey and BCG. But McKinsey's, I don't know, moving down or up the food chain. And they're doing a lot of work that Gartner would normally do now, making, you know, big technological recommendations. So, God, the Gartner stock is just getting pummeled. It's crazy. That and Forrester. Tough.

Daniel Newman: 

All right. All right. Speaking of tough, tough, tough. You want to battle? You want to wage war with me? I will. I mean, I got to give you, you got to win on something, buddy. All right. Give me a shot. We're going to go into the flip here. And today we're going to basically talk about, there seem to be two big pivots: it's model leadership, or it's infrastructure, AI factory leadership, Pat. So the question is: the companies that are chasing the best model versus the companies that are chasing the best infrastructure. Let me say it another way: is it all about locking in AI factories, or is model quality the most important in 2026? Let's see who believes infrastructure over models. And let's have it out.

Patrick Moorhead: 

Look at you, the hardware guy.

Daniel Newman: 

Well, you know me, I've always been there. I've always been a hardware guy.

Patrick Moorhead: 

Oh my gosh. You were the biggest software guy when I met you.

Daniel Newman: 

Deep down, hardware guy. Look, I don't even think this is going to take me very long. You know, models are table stakes right now. Every day there's a new model. Some are open source; it's China, it's the U.S., it's Claude, it's OpenAI, it's Gemini Advanced Pro 3.1++ Sidebar Kick Double Bench Press 325er. And every day, one new model does something better than another, and they're all chasing revenue, you know, folks like us, where I'm subscribing today: oh, Perplexity is great; oh, Google's great; I'm moving over to Claude today. Models themselves don't ensure a long-term moat. This is why Sam Altman was chasing trillions of dollars to build a fab network, because he knew, even as the model guy at the company that broke this space open, that if they don't have enough infrastructure, it's over. That's why he's spending trillions on infrastructure and wanted to spend more trillions on fabs: he knows you've got to have the infrastructure in place. So the smartest companies have actually been the ones that have been somewhat on the sidelines in terms of fighting the model battle, and they've been making sure that they have enough infrastructure. And why does Mark Zuckerberg want more infrastructure? Why does Satya Nadella want more infrastructure? It's because it gives them the ultimate flexibility. The future will be all about having the flexibility, having enough compute, to basically be able to pivot and move as quickly as the technological transformations underneath are taking place. If you don't have enough compute, even if you have the best model today, that will not be the best model tomorrow. All these guys know it. So a lot of people criticized Zuck when he spent all that money on the superintelligence lab and then didn't roll out the next Llama fast enough. And then they said China's taking over open source. But I think they've figured out they can wait on building the models.
What they can't wait on is building out enough infrastructure. So this is very simple, Pat. The gigawatts are going to be the giga get of the giga future of the giga win of this giga cycle. If you don't have enough infra, it doesn't matter what your models do today. Someone will outdo you tomorrow.

Patrick Moorhead:

I mean, Daniel, you forgot about the GigaChads, which I'm very disappointed about. No, I mean, listen, in this simulated debate, you're basically saying that software quality doesn't matter, right? That I can put out any hardware that I want and it just doesn't matter. I mean, how many times have we discussed that it's the combination of the software and the hardware that makes the magic happen? Whether that's an Apple iPhone or a PC. And, oh, by the way, what actually matters in this world is outcomes, not the size of your infrastructure. And my gosh, we saw this with DeepSeek, and what it could do with limited infrastructure. You get a finely tuned model, it can use 10% of the infrastructure, particularly on the inference side, to do what you need to do. And that's all driven by the model. Let's look at the industry today. Let's look at Google as an example. Google will come out with a new model, and then they'll come out with a flash model. A flash model is an optimized version of the model that came out maybe a month before, and it's all based on model improvements. And typically the pricing comes down as much as 90% between the standard model and the flash model. So I would much rather put my money into the best model. Don't get me wrong, there is a certain level of infrastructure that's required to do a certain task. But look at the other side: who has the most infrastructure right now with GPUs? It's xAI. Who do you know that's super impressed by the xAI model out there? Who's using it? Who's talking about it?

Daniel Newman: I mean, I know- The richest man in the world loves it.

Patrick Moorhead: 

True. True. It's not really doing much of anything for productivity. It's doing a very good job when I'm trying to figure out what the heck somebody is tweeting about. What is this nonsense? I think it's amazing at that. It seems to do political ideology in a fair way. It's not biased on one side or the other. But it's like nobody's using it, and they've got the most hardware. And that's probably the best example out there that makes sense. And I also think what's happening is not just the model. It's also about the tools, the plug-ins, the fine-tuning, and how it's locked in to solve a specific problem. It's not just Claude the autonobot that sits there, right? You've got Claude for PowerPoint. You've got Claude for Excel. You've got Claude Code. You've got all these different variations that are basically a golf club bag. You pull out the club you need to do that. That's really delivered by the software. So I rest my case. Once again, I have won the debate. Thank you for playing. That was terrible. That was terrible.

Daniel Newman: 

You got smoked. It's fine. It wasn't even a fair fight. It was a fair fight because it was such a one-sided answer. I could end this whole thing by just tweeting out that Pat's not a hardware guy. I could really break you. I could be like, Pat just said infrastructure doesn't matter.

Patrick Moorhead:

Nobody would believe that.

Daniel Newman: 

Well, I'm just saying, I mean, I could, if I tweeted, it would be true. And I put a rocket afterwards and that would be it. Hey, hey, if it works well enough, I might, you might put a rocket at the end of some of your tweets soon. I can see it happening. The only reason I don't is because you do.

Patrick Moorhead: 

It's kind of like my brother. Ruined it. I ruined it for you. My brother was an engineer and therefore I wasn't. Not that I could have been an engineer, but it's like, no, my brother's an engineer. But I do appreciate you pulling back on the bubble bear tweets. I do. I almost had to mute you. Almost.

Daniel Newman: 

Well, I think I won the debate. Like I stood my ground until when AI started to eat everything, you came to the realization that there's not, it can't be both. So I was right. You know, I just, I had to stay the course until I was done being right. And so, and by the way, you know, you can't really get anyone to look at your stuff if you don't inflame some people. So I had to inflame all the bubble bears, including Pat Moorhead, who thinks that AI is a bubble. He's an anti-infrastructure bubble guy. How did this happen?

Patrick Moorhead: 

How did we get here? Oh, look at that. By the way, no minoxidil, none of that, nothing.

Daniel Newman: 

Your hair looks good.

Patrick Moorhead: 

You know, some days it's kind of looking like a comb-over, or I don't know if it's my increased testosterone that's kind of eating away. DHT? I don't know what's doing it, but I feel like I am losing hair. Now look at my brother, who's completely bald on top. My dad didn't even know he was bald, because he was bald in the back, until somebody told him one day.

Daniel Newman: 

Because he's tall and bald, but never looked in the mirror? I get told sometimes I'm bald. Like people that want to be shitposting me, and they're like, you're bald. I'm like, yeah, no shit. I know. I mean, anyways. All right. Well, you know, that was a good debate. That was fun. That's always fun. Pat, there's some earnings this week, so let's hit it. Let's hit it. Let's hit bulls and bears. I'm bearish on your hair. I'm bullish on the, there's no bubble, but I'm bearish on your hair. Bearish on the thickness of your hair.

Patrick Moorhead:

All good.

Daniel Newman: 

All right. So let's talk first some earnings, Pat. Palo Alto Networks reported.

Patrick Moorhead: 

How'd they do? All right. Palo Alto, we had a triple beat, and the market rewarded them with an 8% cut in share price. It was all about margins, the cost narrative, and the guide. So pretty, pretty, pretty straightforward, right? The market's punishing them for margin uncertainty due to integration costs, but it's not questioning the demand. I do think the CyberArk acquisition was pretty good. I think they made a decent case justifying why it was good for them. I think Palo Alto is doing a decent job becoming kind of a one-stop shop. It's tough to compete with people like Cisco, though, who is kind of their arch enemy there. The Street's not impressed. JPMorgan lowered its price target, but kept them overweight. A lot of compression there. So the question is, can Palo Alto Networks maintain the margin execution at the same time they're absorbing all of these acquisition costs, at a time when you don't want to be looked at as a software play? And this is the conundrum that they're in. Is Palo Alto an AI play? Are they a software play? Or are they both? Or a cyber play?

Daniel Newman: 

But what is true is, as we bring agents online, every agent is a badgeless, faceless, nameless reckoning in terms of cyber risk. We know that from our Claude bot, Molt bot, OpenAI bot. While we are proliferating and investing and digesting and implementing tech quickly, there's a ton of risk being created as we scale out all this tech. That's going to open doors. I think the cyber theme actually catches somewhat of a tailwind. You know, I went on CNBC and talked about this with Palo Alto. I think that the super platform that Nikesh Arora is building is going to do well, because people are going to struggle with the fragmented space that is cyber. It's always been a struggle. You go to RSA, there are so many vendors. The consolidation layer has to take place. Can Palo Alto win? You mentioned Cisco. Google with Wiz. I mean, the consolidation is happening. It's going to be who can consolidate best, has the broadest solution, and can kind of overlay this AI transformation that's going on. But Palo Alto has one of the most complete security portfolios in the industry. It's made a lot of good acquisitions. Its business did perform better than CyberArk and Palo Alto individually were expected to, but not as well as maybe the synergies some were hoping for straight out of the gate. I'm still optimistic that cyber is one of the parts of the industry that can do really well throughout this, because I still think we always underestimate the amount we need to invest in cyber. And if AI is as transformative as we think it is, we're going to need to spend more there to make that happen. Let's talk a little bit about Analog Devices. We don't talk about these guys too often, so we'll just do a quick hit here. Some of these industrial and data center companion companies, the ones with specialty, maybe non-leading-edge chips, all seem to be doing really well, Pat.
They're all having a bit of a renaissance after a few tough years. So, you know, they came out and forecasted second quarter above estimates. They're saying demand for industrial and data center are both outsized, and AI is a tailwind for them. And like I said, they've been underperforming some of the more leading-edge parts of the market. There's not a lot here in this one, Pat. But we talked about this with Lattice, and we've had this with Coherent. These companies that are participating in the AI boom but haven't necessarily been the front runners, you're starting to see a nice catch-up. That's pulled the stocks forward, and it's also broadened out the chip space. And a lot of these companies have critical components that have to go into a lot of these big systems that are being deployed.

Patrick Moorhead: 

Yeah. I mean, they're all about power, right? And the data center is not all about power, but… A lot about power? They're a lot about power? Yeah. Well, hey, last night at the dinner, somebody, a chip CEO, asked me what I'm most excited about. Beyond the personal stuff, I talked about power.

Daniel Newman: 

But didn't you say like, you're most excited about Birkin bags full of cash or something?

Patrick Moorhead: 

I may have said something similar to that. That's what I need.

Daniel Newman: 

Nobody would believe you would say that, Pat. You're way too experienced and polished to come out like that. And we may have been at the same dinner. We may have eaten the same meal.

Patrick Moorhead: 

We may have sat across from each other. And we may have played footsie. I don't know. We couldn't hold hands. That would have been awkward. So instead we just did the footsie thing. Anyways, let's move on. Power. Yes. Anytime there's power: power on the chip, power on the card, power on the mainboard, power in the rack, power across the racks, power conditioning of the fleet, connection to the grid. There are about 17 different places you can add value. Now, in the end, these folks are primarily an edge company in what they do, but they did pretty well. I mean, unlike a lot of the other triple beats, they were up 8% and did pretty well.

Daniel Newman: 

Yeah, I think there's the catch up. The companies that have not run, that were beat up for a long time, that are starting to outperform are getting a tailwind. Where the ones that have been outperforming are getting crushed. They can't do enough. So let's hit real quick cadence, Pat. Another really important player. I think they had some news this week, yeah?

Patrick Moorhead: 

Yeah, I mean, they had at least a double beat, and the forecast was in line. They were up 4%. Again, this is a class of players that I think will be power players in four or five years. And it's not that they're not power players today, they are, but as more people design and develop their own chips and their own infrastructure, and platforms go more vertical, companies like Cadence and Synopsys as a category get a lot more important. And you really have to balance that against the two fiefdoms, the China and the U.S. or Western fiefdoms, because I think Cadence would be able to drive a lot more business in China if we didn't restrict them. What makes it challenging is that a lot of the tools you can use to do a low-end power regulator are the same tools you can use to make a GPU or an accelerator. So it's not like, oh, I can sell them detuned EDA tools. So yeah, it was the backlog that got everybody excited here, and I really need to drill in on why a company like this would actually have backlog like it has. There's probably implementation timing that goes into it, but to have an $8 billion backlog is, I can't fully explain that.

Daniel Newman: 

Yeah, I don't have that either. So that might be worth us coming back to. Let's keep going, Pat, because there's a few more here. An interesting one: as SaaSmageddon or SaaSpocalypse continues, there's no better vote of confidence than when the CEOs and executives of these companies stop selling their own stock into what's considered a downturn. Now, one of the things a lot of people don't understand is that most stock sales from executives are planned well in advance, usually anywhere from a quarter to several quarters. And this tends to be a large part of how these people get paid. People always think CEOs are making a lot of money, and they are, but a lot of the money they make is stock. They get grants, they get awards. Many times those awards were made years in the past for performance, and then they sell some of the shares over time. But when you're in the middle of the fire, one of the worst things you can do is sell your stock. It doesn't really matter when you planned it or how far in advance you planned it. So it was a positive catalyst, I think, for ServiceNow this week when Bill McDermott and a number of other executives at ServiceNow decided to cancel their previously planned stock sales. Actually, McDermott announced that he's going to purchase another $3 million of his own stock, which, again, some people are like, that's not that much, but a $3 million purchase is a lot. Because not only is he not taking cash out, now he's putting his cash back in. So I think it was a good vote of confidence. But I think this is one of those kinds of things we need to see from more and more of these CEOs.
If they truly believe that their product is stable, that their disruption is not real, and that their business growth is going to continue, there's no better way to show the street that you're confident in your future than putting your money where your mouth is. This is a moment, Pat. You got to show people you're willing to invest. And I think that was a good move.

Patrick Moorhead: 

Yeah, it's funny. I think it does show the confidence, but is it also, hey, I'm going to cancel the sales agreements because I'm not going to make any money? Is that what it's about? Or, now that I think about it, I think it's a confidence play. Right. We should do an hour-long show on SaaSpocalypse. I think that would be interesting. I think that's what people want to hear from us. And maybe we even do a deep dive on memory. I think that'd be fun.

Daniel Newman: 

What else we got, Daniel? Dell, Pat, you know, it saw a 4.1% comeback. Lenovo had flagged some memory issues and Dell sharply sold off. But I think you and I have had some great conversations over the years with folks at Dell. And if there's any company that does supply chain well, and it rhymes with doing supply chain well, it's a company that's called Dell. Yeah.

Patrick Moorhead: 

Yeah. I mean, listen, Lenovo came out with earnings, and Dell stock went down 9%. And then I think reality set in when people looked at the percentage of Dell revenue that comes from PCs versus infrastructure, and that's why Dell snapped back 4%. I think both Lenovo and Dell have very good supply chains, as evidenced by what happened during the pandemic. I do think that on the memory side, Dell probably favors long-term agreements, and that should give Dell a pricing and availability advantage over its biggest competitors. Over the long term that evens out, and you pay the price on the opposite side of an LTA, where you might be spending a little bit more when memory prices are low compared to the folks who buy more on the spot market. But if you were a company that was buying on the spot market right now, you were getting absolutely wrecked. I am doing a bit of research trying to figure out who is going to get wrecked and why. What I do know, what I do feel comfortable talking about, is that Dell will be one of the vendors, if not the vendor, on top related to memory.

Daniel Newman: 

All right, you kind of got my take. This is all about smart planning: the companies that have the best planning and the best supply chains will win, and the companies that underplanned or did a lot of spot pricing to try to defer costs and expenses are going to get hammered. And hammered not just on paying too much, but on maybe not being able to get stuff at all. That's an even worse scenario right now. All right, one last note before we take off here, Pat: Marvell got a strong reiteration of a buy. We haven't talked about Marvell a lot. Broadcom gets a lot of attention, NVIDIA, AMD, but Marvell is still there churning away, helping to build some of the most advanced AI chips. And when we went to their analyst day, they were really doing a lot of the work pulling together the networking: scale up, scale out, scale across. They've got one of the broadest and biggest portfolios. There's been less focus lately on just XPUs, and who's winning which XPU, as we've started to see the infrastructure scale from the chip level to the whole rack level. But there's a lot of Marvell content in a lot of these racks, isn't there? So it's not surprising that they would be reiterated as a buy, or that the market would be looking at it positively. It's got a lot of upside there.

Patrick Moorhead: 

Yeah, I mean, I think you did a great job. I mean, scale up, scale across data centers, across fleets, and even the question of what does scale up really mean? If you can do scale out at the speed of scale up, and you're limiting the amount of hops, then you're doing great. So I'm looking forward to the earnings, getting kind of caught up, narrowing the rumor mill and getting down to brass tacks. I'm looking forward to that.

Daniel Newman: 

Fun, fun, fun.

Patrick Moorhead: 

So much going on. Hey, Pat, guess what's next week? Gosh, somebody having earnings. It's NVIDIA Earnings Week, right? Or no, it's HP Earnings Week. Yes.

Daniel Newman: 

And NVIDIA Earnings Week. And as we all know, it's a national holiday. It's one that you plan and you travel around. Pat will probably be on TV 10 or 20 times that day.

Patrick Moorhead: 

Don't rub it in my face that you're a TV boy, okay? Don't do that.

Daniel Newman: 

Hey, I carried the books around to get your laptop at the right level for the last NVIDIA earnings. I'm here for you.

Patrick Moorhead: 

I think you may have, uh, may have nuked my position on a certain show. We're going to have to have a talk about that.

Daniel Newman: 

I don't think that show exists anymore.

Patrick Moorhead: 

Same name.

Daniel Newman: 

Same name. Yeah, same name. Yeah. I think these shows and these media companies are in constant flux, man. I'm telling you, the hosts, the shows, the moderators. But you know what they always get? Good insights from Pat and Dan. More from Pat, but they get rocket ships and bubble bears from me. So I'm here for y'all. I'm here for y'all, Pat. If he ever mutes me, that's gonna be the best day ever, because then he doesn't see all my great stuff. Bubble Bear. I love that. All right, everybody. Well, thank you so much to all you Bubble Bears out there watching the show. You're a clown. Sorry, you're all clowns. But everyone else, we love you. We appreciate you. We appreciate you subscribing and being part of our community. Six Five out for this week. We'll see you all soon.
