Compute Wars, AI Reality Checks, and the Infrastructure Breaking Point
AI is now an execution race defined by infrastructure. Patrick Moorhead and Daniel Newman break down how compute shortages, energy constraints, and security risks are reshaping the race from building models to actually running them at scale. From chip supply and hyperscaler strategy to AI-native security and the growing case for regulation, this episode maps the pressure points defining what it really takes to turn AI investment into production reality.
The handpicked topics for this week are:
1. Meta, Broadcom, and the Reality of the Compute Shortage — Meta’s multi-year MTIA partnership with Broadcom reinforces a critical truth: there is no surplus compute. Hyperscalers are simultaneously investing in NVIDIA, AMD, ARM, custom silicon, and networking just to meet demand. The discussion breaks down why “compute deficiency” is now the defining constraint in AI, and why every viable chip, regardless of performance tier, will find a buyer. (The Decode)
2. Anthropic 4.7, Model Degradation, and the Hidden Cost of Scale — The hosts debate performance tradeoffs in Anthropic’s latest release, including degraded real-world usability, throttled reasoning quality, and SLA concerns. As token usage increases and compute constraints tighten, model providers are quietly balancing performance against availability, raising questions about reliability for enterprise deployment. (The Decode)
3. Enterprise AI and the Rise of AI-Native Security Architectures — IBM’s Autonomous Security platform signals a shift from AI-enhanced tools to fully AI-native security orchestration. As models increase attack surface through agents and prompt injection risks, enterprises must rethink cybersecurity at the system level, not just the application layer. (The Decode)
4. Energy, Not Just Compute, Is the Next Bottleneck — Oracle’s partnership with Bloom Energy highlights a parallel constraint: power availability. With data center expansion accelerating, companies are investing in fuel cells, natural gas, and off-grid solutions to sustain AI growth. The discussion makes clear that AI scaling is now as dependent on energy infrastructure as it is on silicon. (The Decode)
5. Hyperscaler Strategy: Everyone Is Talking to Everyone — Google’s reported discussions with Marvell are not an exception; they are the rule. The hosts introduce the principle that every hyperscaler is constantly evaluating every chip partner. With stakes this high, redundancy, diversification, and supplier leverage are mandatory, not optional. (The Decode)
6. The Flip: Should AI Be Regulated as a Public Utility? — One side argues that AI’s scale, energy consumption, and societal impact justify utility-style regulation, comparing it to infrastructure like electricity and the internet. With trillion-dollar CapEx commitments and concentration among a few players, the case is made that access and governance will inevitably require oversight. The opposing view warns that premature regulation would lock in incumbents, slow innovation, and weaken global competitiveness, particularly against China. (The Flip)
7. Semiconductor Policy, Tariffs, and Global Leverage — Section 232 semiconductor tariffs emerge as a geopolitical tool rather than pure trade policy. The discussion outlines exemptions, unresolved packaging questions, and how tariffs are being used to influence global supply chains and negotiations with China. (Bulls & Bears)
8. TSMC Signals Unstoppable AI Demand — TSMC’s earnings confirm what the market has been debating: AI demand is not slowing. With record margins, increased CapEx, and continued expansion, the company validates long-term infrastructure investment and reinforces that supply, not demand, is the limiting factor. (Bulls & Bears)
9. ASML and the Fragility of the Supply Chain — ASML’s performance highlights strong demand but also exposes geopolitical risk, particularly around China restrictions. The conversation expands to include broader supply chain dependencies across equipment makers and the long-term implications of restricting access to advanced manufacturing tools. (Bulls & Bears)
10. Quantum Signals: DARPA, IBM, and the Next Compute Frontier — The episode closes with a look at quantum computing’s trajectory, including DARPA contracts and IBM’s push toward measurable business value. While still early, quantum is positioned as the next layer of heterogeneous compute that could redefine long-term infrastructure. (Bulls & Bears)
Subscribe to our YouTube channel so you never miss an episode.
Listen to the audio here:
Disclaimer: The Six Five Pod is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and reference share prices, but nothing discussed should be taken as investment advice. We are not investment advisors.
Daniel Newman:
Hey, we're back. We're past 300 now, so you don't have to talk about this every single time, but it is episode 301. It is that time of the week. Patrick Moorhead, we are back. We are back, baby. You're not home. You're not in that coffin you call an office, you know, a big… Buddy, I had an early workout, a lot of stuff to do.
Patrick Moorhead:
You know, I had surgery this week and, you know, I'm just getting better. I had four different things done. I know you all care. I had a septoplasty, a balloon sinuplasty, a turbinate reduction, and a sinus lavage in 15 minutes.
Daniel Newman:
Yeah, that's gonna keep the viewers here. Thanks for telling us. By the way, who checked on you at least three times?
Patrick Moorhead:
No, I appreciate that. Dan is not a robot. Dan cares and Dan is actually a friend and not just interested in what I can provide him financially. Yeah, I appreciate that. Yeah.
Daniel Newman:
Is that weird?
Patrick Moorhead:
People? Well, to some people, I mean, it's kind of normal for most people.
Daniel Newman:
Ask not what your friend can do for your bank account. Ask not what your friends can do for your bank account. Yeah, so good week though. You were home all week.
Patrick Moorhead:
I was home all week. In fact, I have been home for two and a half weeks. And it's been amazing. But listen, the surgery was a setback, but the long term, no pain, no gain. I did work out today to do a great back day. Pulled back a little. Doctor said, don't work out for a week. But maybe I broke the rules. Couldn't help myself. But yeah, I mean, in between that, I worked all but one day. Forwarded my agents. We always have to have an agent's talk. My health app, I added two new APIs. I added the Oura Ring API, and I added the Withings API to my stack. Feeling pretty good about that. I forgot to wear my ring for two days. I don't even know if I was alive. We don't know if you were stressed or not.
Daniel Newman:
I used to know naturally. And just like with AI, I used to actually not have any talking points. Now I do, though. But yeah, crazy week. I was on the road a couple of days in New York City and attended an event for World Quantum Day. And then I was doing some Six Five work for an upcoming techumentary series that we're putting out: interesting new innovative products, Pat. You and I are kind of like the product guys. What's the next thing? These are things that we've thought of.
Patrick Moorhead:
Hey, don't just gloss over the quantum day. I saw you at the New York Stock Exchange, in the back row as IonQ rang the bell. Daniel Newman with an N was right underneath the N. It's not hard to find. Daniel's the guy below the N without hair.
Daniel Newman:
Frame mugging. I was mugging that frame. Yeah, so it was cool. So obviously you and I both know their CEO very well. We both advise the company. And so, doing something a little different, they did this World Quantum Day event, and it was World Quantum Day everywhere, but IonQ did it at the New York Stock Exchange, and they brought customers, partners, and advisors. And so instead of the executive team going up and doing the closing bell, they invited a bunch of customers, advisors, and partners to do it. It was a neat experience. I mean, you know, I love Wall Street, and I've stood on that little podium where they sign the book, but anyone can do that. Very few people, especially those of us that don't actually run public companies, get the chance to actually sit up there. It's kind of a cool experience, because they still bang the gavel. It's like a real thing. You know, you go to NASDAQ, they just push a button. This feels very visceral: you're up there and they smack the gavel and they clap and they cheer, and then you sign the wall as you walk out, you know, so it was really neat.
Patrick Moorhead:
That is fun, that is special.
Daniel Newman:
Yeah, different, you know, maybe someday I'll be up there, you know, ringing the bell for something else.
Patrick Moorhead:
Ringing your own bell.
Daniel Newman:
You know, we're supposed to ring the bell for the RKNG ETF that we partnered with Defiance on at some point, but that's NASDAQ. So, you know, I'm getting some cool experiences in my old age. They're filling in for my hairline.
Patrick Moorhead:
You're living a good life here, Bestie.
Daniel Newman:
So, yeah, but I also, you know, did not get any weeks at home, and I have a few more weeks on the road. Then there's a week maybe at home. I say that now because anytime a week seems to open up, chaos ensues. But you and I are out next week. Unfortunately, Pat, you are not at home. You're back on the road. I like being out. I like seeing events and customers. We're going to be a couple of days at Google Cloud Next, and we're going to be making a stop through Adobe Summit. But we also had a busy week this week in tech. The wreckage is over, the S&P is back to normal. How fast was that, by the way?
Patrick Moorhead:
No, I know, right? I hadn't looked at my portfolio in a long time and it was pretty awesome.
Daniel Newman:
Yeah. If our producers were awesome, they'd find my tweet that I just posted, where I actually went on Fox Business on April 2nd and told Varney and Company, I said, this April is going to prove to be a month where you're going to need to hold your nose and buy. So I said it early, I said it often, and I actually took my own advice. And as much as it absolutely wrecked my gut and my heart and my stress levels, I bought the fear, and I'm feeling good today. And, you know, I'm feeling good about those predictions. But that happened. There was a ton of AI news too. And by the way, even more this morning. I don't even know if it's in our docket, so we don't have, like, the OpenAI-Cerebras news. I'm just pointing that out now. Maybe it's in our docket; I've got to look down. But we've got a ton to talk about. Meta. Anthropic might've launched something this week; I mean, that's like a first bullet point every week. We had some news from IBM, Bloom Energy, Oracle, Google, Marvell, and then we'll talk a little bit about what's going on in China. I guess one of the least credible sources said something about NVIDIA is gonna buy a PC company. Not sure we'll get to that one, but just pointing out that it was a very uncredible source. It wasn't incredible, it was uncredible. And thank God people like you and I pointed out quickly that that wasn't gonna happen. And then we'll get into our flip, and then we'll hit the Bulls & Bears, where I will continue doing victory laps for the rest of this show. But Pat, you know, the Decode is where we dive in, it's where we go deep, and man, was there a lot of good stuff to talk about. So why don't we jump in? All right, so Meta. I thought, you know, Meta's getting out of chips, or are they, Pat? What's going on there?
Patrick Moorhead:
Yeah, so a big announcement for Meta and Broadcom: a multi-year, multi-generation chip deal for their MTIA series of AI accelerators through 2029. There is an initial commitment of, I believe, one gigawatt of capacity. And also, Hock Tan is leaving the board of Meta to shift to a more advisory role, to remove any governance conflict. You know, it's funny, when Meta did the deal with AMD and NVIDIA, everybody said that MTIA was dead. And I said, no, it's not. And then on the Broadcom earnings call, Hock Tan said, you know, no, it is still moving. And this, I'll call it a ribbon cutting or something like that, just pulls it all together. That's one point. And the other thing is, I think you say this at least once a day on X: we don't have enough compute, and, you know, screw you, bubble bears, or something like that. So yeah, everybody needs more compute. We are compute deficient relative to what companies need to do. We're not doing the hand-wringing anymore about whether there's downstream demand in business. So it really is about when we can get more compute. Hock Tan stepping down: no surprise. It makes sense. I was actually surprised he was on there this long. This does show, though, how much of a big deal this must be. This tells me that the relationship has gone into absolute overdrive. The other thing that was pretty underdiscussed is that Ethernet networking is a huge part of it, right? You've got the Broadcom Tomahawk and Jericho Ethernet stack to connect these AI clusters, for the accelerator and the scale-out fabric. So this bundles two multi-billion dollar silicon categories into one, and it directly competes with NVIDIA's Spectrum-X and NVLink scale-up story. So, bigger picture, final comment: this is all part of Zuckerberg's personal superintelligence plan that he's framing with investors to justify the capex. And this is just a part of that.
Daniel Newman:
So I just want to point out for everyone out there, we do not have enough compute. And the zero-summing continues to be the reason people just say dumb shit. I'm sorry, but when the Broadcom deal with Google and Meta was announced, everyone was like, MTIA, it's a failure, they're getting out of chips. It's like, no, they're not. But they had to address the immediate, and then they have to address the longer tail. And we just don't have enough compute. So the problem is, realistically, these guys, and we're seeing it with the news today, we're seeing it with OpenAI and Cerebras, we're seeing it with Meta going down the path with TPU, with MTIA, with NVIDIA. I think they also use AMD, they use ARM, they use,
Patrick Moorhead:
Right.
Daniel Newman:
Right. So the point is, the fact is, there are literally no more wafers, there are no more chips; we can't even build more. And right now, what ends up happening is it's just Jensen deciding who gets the supply. And so all of these companies have to be hedging a little bit, because they all need more compute to accomplish what breaks through every week, when, you know, 4.7 gets announced, and then when Mythos gets dropped. Every one of those things creates exponential growth in compute. We're going to have exponentially more memory requirements. We're going to need more networking. You and I, because I think it's our job, have to talk about what competes with what. But at this point, I don't know if it actually matters. I mean, heck, I was talking to a reporter at the San Diego Union-Tribune this week about Qualcomm's AI chips. And I'm like, they'll probably sell everything they can build. Because right now it doesn't matter if it's better or worse, someone's going to want that access to compute. And there's only so much available. So Qualcomm's a rocket ship. By the way, I don't know if you've seen their stock. It went up literally like a hundred bucks this week. Now it's back to 400. It's rocking towards all-time highs, all in a week. That's amazing.
Patrick Moorhead:
Yeah, I was very proud of my tweet. I'm pretty sure it was literally surgery day and I was recovering, so I had a ton of stuff in my system. I said, for all the dummies who said Meta MTIA was dead. And then there were some people who said it was dead that I respect a lot. And they commented on it. I felt really bad. Why? I don't know.
Daniel Newman:
I like to keep it simple. You and your thoughtful, nuanced, objective take, without a rocket ship or a fist bump. Nobody gives a shit about those. You need to call out the stupidity of the internet. But again, it just goes to prove we're right. That's why you should be here, because what we say tends to be more accurate than the nonsense that some of those higher-rated posters put out.
Patrick Moorhead:
I mean, how many times did people say Intel was going bankrupt? God, how many, Daniel? And, you know, we never said it was a done deal. It was: here's the chance, and here's what they have to do. And if they do this, they can do this. And, oh, by the way, they're too big to fail. And are they not the highest-appreciating chip stock of the last six months?
Daniel Newman:
I think that's right. And I'm just trying to pull that up really quickly. I mean, it's saying that as of April 17th, they're up about 56% for the month. So, you know, not trying to do too many victory laps in a single topic, but, yeah.
Patrick Moorhead:
Are we on topic one and we're 14 minutes?
Daniel Newman:
We're on victory lap like 14 here. Yeah. And we got that one right too. So, yeah, I mean, look, Intel was made, is what I like to say. They were made by the administration. Everything else after that was just a matter of time. So, Anthropic, next topic. Let's talk about Opus 4.7. So there's a lot going on with Anthropic. You've been tweeting about how nerfed 4.6 has been. This is, by the way, in case anybody doesn't know, because we have a compute capacity problem right now. And so a lot of the model providers are having to figure out, especially in the cloud versions of these things, the non-API versions, how to throttle quality. And part of what they're doing is tuning down the quality of the reasoning engines. And basically, if you want anything to happen, you have to yell at your agents and do max effort over and over again. So it's interesting, because now you've got 4.7, which they're saying has possibly 1.3x, maybe I've heard even higher, potential token usage, and that's all going to keep pushing on the amount of compute that's available. So this is really interesting. But the first question I'll ask you, Pat, before I even dig into anything else here: have you used it much yet? Have you played with it?
Patrick Moorhead:
I have. I unfortunately was up at two o'clock in the morning, cranking on it. So yeah, I've definitely used it, but I'll wait for my slot. Okay.
Daniel Newman:
I'm just curious if you used it. So, according to the benchmarks, the model is currently meaningfully better on every enterprise dimension that matters: coding, long-running agents, document work, vision. And of course, we're now seeing Anthropic as a whole going through some real growing pains. I think there have been two outages now in 10 days. And the SLA obviously matters, because again, this is a question of running your business on this stuff versus you and I nerding out and building POCs independently. So, as token consumption goes up, it continues to give OpenAI and Google some incentive to keep running the race. I don't know if you guys saw this, but I believe this week the kind of enterprise valuation of Anthropic surpassed OpenAI on a number of different forward projections that have come out in terms of what the companies are going to be worth, meaning that we could actually see Anthropic be worth a trillion dollars, the third trillion-plus-dollar IPO of the year. You know, overall, though... what are you laughing about?
Patrick Moorhead:
I mean, I remember when billions used to be impressive, you know, and a hundred billion was unattainable, right?
Daniel Newman:
Yeah. I was talking to some investment bankers this week and they said literally the entire IPO market is on hold for SpaceX, because the SpaceX deal is going to take every dollar of liquidity out of the market. There's like no market for anything, because every dollar of available liquidity is going to go into SpaceX. Rumor has it. We'll see if that's true. So, I don't know, Pat. I mean, the one thing I'll say is they continue to talk about Mythos, and they're saying that this is not even a meaningful step function compared to how much more powerful Mythos is. So it's interesting that Anthropic seems to already be three or four generations ahead of 4.7, and they're just basically dropping their old stuff now and presenting it to us as new. But I did use it yesterday for a couple of interesting research tasks and app tasks. I used it both in the CLI and in the Claude app, and I was impressed so far with what it generated.
Patrick Moorhead:
Seriously? Yeah. God, it is literally dog shit.
Daniel Newman:
Like right now it's not even... Are you using the API, or are you just using Claude?
Patrick Moorhead:
I'm using claude.ai, and essentially it doesn't know how to create a Google Doc. And it's telling me all the things that I have to do to create a Google Doc. With 4.6, even 4.6 nerfed, I was doing this completely, and now it's telling me, I don't have a bash tool available anymore in this term, but I have the full base64 content from your early blah, blah, blah. And by the way, it's not just people like me, it's also developers. This morning I got up and there was a post from Theo on X, and he is a hardcore developer. He does an absolute takedown of 4.7 from a developer's point of view. There are settings in there. I've used the CLI tool, don't get me wrong, where you basically tell it it's unbridled, right? Log into everything. And even when you do that, it is asking for permissions. It thinks that your code is actually a prompt injection. So you should check out that video. But
Daniel Newman:
I'm not creating Google Docs, because I create DOCX files. So I don't know, it works fine. Like, what's the reasoning? How are the responses to the questions?
Patrick Moorhead:
Yeah, it's not telling me why. But I did tell it, hey, you used to do this fine without issues in 4.6. And it worked. And it said, you're right, let me just do it properly this time with the complete base64. By the way, I don't even know what base64 is. I have no idea what they're even talking about here. But I think it brings up a bigger question, Daniel, which is: what are SLAs? Should there be SLAs? Can you dumb down a service that you were paying for? And by the way, you said it wins on all the benchmarks. It doesn't. It actually is a step back on some of the benchmarks that matter. Like the web benchmark: anytime an agent has to call out to the web to get content, that actually went down, I believe, 10%. It was in the 80s, and I think it's now in the 70s. So this release was really focused on agents, which is an important thing to do, but it did actually go backward on a couple of those.
Daniel Newman:
All right, so we disagree. That's great. I think for what I've been doing, it works great. I don't really care about anyone else. So... just kidding. I love you all. All right, let's get to the next thing, dude. We're going extra slow today: 30 minutes, two topics. I hope everybody's still with us. IBM launches Autonomous Security to defend against AI-powered cyberattacks. So, you know, as Mythos comes, is IBM the solution?
Patrick Moorhead:
It was great to see IBM come out of the gates really quickly. It's called IBM Autonomous Security. It's a multi-agent service, coordinating agents to scan for vulnerabilities, detect exploit paths, and, look at that, contain the threats. IBM says it is the first cybersecurity product explicitly built for a post-Mythos threat model. The interesting part is, I didn't remember IBM being part of the early club here. Maybe they were and they just weren't in the press release. But I give IBM credit, right? IBM isn't always looked at as being first to announce or first to market. This was actually the first enterprise end-to-end service available for enterprises, for real, GA, and it looks like they are out here essentially competing with CrowdStrike and Palo Alto Networks, right? IBM's trying to move the conversation from, I'll call it, AI-enhanced security to AI-native security orchestration. And those two things are different. But I do find it super interesting how that puts them on a collision course with CrowdStrike and Palo Alto Networks.
Daniel Newman:
Yeah, I mean, look, this is going to be a growing concern and need for enterprises. They're going to be looking for technology that can speed up the management of their agent deployments and the security of all these things. I don't think there's one solution that's going to cover it all, but I think that companies looking to deploy agents quickly are following the pace of innovation and change and all the risks that everything, including prompt injection, creates for this new attack surface. They're going to be looking for companies like IBM, like Cisco, like Microsoft to provide that. And so it's funny, because when you had all this wreckage of "enterprise software is dead," well, these are going to be the companies that you're going to have to look to to provide this level of security. So I'm interested in how well these things work and how we're going to benchmark how well they work. Because is this something that works instead of Glasswing? Is this something that works in addition to it? Do you need both? Where do enterprises go that want to move fast with AI but want to make sure they're not creating new and unexpected vulnerabilities? So this will be kind of an interesting one. Hey, you know what has run up really, really quickly? Oracle. After its demise, I think the company was left for dead. Nobody wanted it. And then all of a sudden, it announced this massive fuel cell partnership with Bloom Energy, and it just started ripping. And, you know, fuel cells are a different type of technology. They don't require the grid, which gives access to a ton of additional capacity. So just a little bit about the deal: it was April 13th, a strategic partnership, contracting 2.8 gigawatts of Bloom's fuel cell systems to power AI and cloud computing infrastructure. They said 1.2 gigawatts of capacity has already been contracted and is deploying across the US. So the deal got bigger.
And this one looks like they're addressing how to solve for this lack of power that's out there. I mean, I don't know why, Pat, personally, this was the thing that suddenly got Oracle back on track. But I guess it's the belief that they've got the power they've needed, or maybe it was just a byproduct of the Strait of Hormuz. I'm not really sure if this was a direct hit or just happened to land in the right week. But I do think that what is happening is we're seeing more and more commitments to alternative energy. We're seeing commitments to trying SMRs. We're seeing commitments to fuel cells. We're seeing commitments to, basically, how do you get power off the grid to power these data centers. And so that's probably the biggest point here. And with Oracle, I said this many times: I think the selling was overdone. I think the cash flow concerns are real, but I think they were overblown. I think that Oracle has proven its business is sturdy. And of course, its cloud business has been the fastest growing cloud on the planet. So you do have to give some credit to the fact that this company is growing, and they're going to grow into their RPO, into that backlog number. Otherwise, I think this just goes to show that across this entire sector, we're going to see more commitments from Microsoft, from Meta, from Amazon, from Google, whether that's SMRs or fuel cells. And I also think that the GE Vernovas, the Constellation Energies, these companies are going to continue to be interesting to watch, too, because as they are able to create more energy capacity, I think it's going to get gobbled up, and those companies are going to stay interesting in the market.
Patrick Moorhead:
Yeah. I mean, whether it's memory or energy, we can debate what's going to keep us from being able to fulfill all that compute demand, so that we don't get nerfed models from Anthropic, or they don't have to do the magic tricks that make them worse. This just shows how innovative companies can be if they have a motivation. And I always like to say "current course and speed," right? Current course and speed, we're going to run out of energy in 2028. I know there are a lot of people saying, and I saw reports, that 25 percent of data centers that were planned were scrapped. The reality is they're being picked up by other companies for the most part. And you do have places like Maine that have decided they don't want any data centers; they want a moratorium on them. But you went down the list of ways companies can be innovative here. What I love about what we're doing here in Texas with Stargate is natural gas, which is not even connected to the grid. It's essentially using gas turbines to create the electricity. It doesn't put any consumer pricing at risk, and I think that's a great move. The other thing we're seeing is: how do we get more efficient? Elon Musk and a couple of other companies started up a nonprofit asking, okay, the energy that, let's say, a city is already creating today, how could they do that better? By putting batteries there. Because what these energy companies have to do is prepare for peak power, and it's not always going to be peak power. If you had battery systems there, you'd cover that. The other part is efficiency. You have some of these electrical companies using technologies that were based in the 1950s and 1960s, and if you were to modernize that, you could squeeze out maybe another 25 to 35%. When there is a need and there's money to be made, there will be innovation. And this is just an example of that.
Daniel Newman:
Where there's a will, there's a way. Or, where there's a need for more chips, hyperscalers partner with more custom chip partners. So Google apparently is in talks with Marvell. Is this the MediaTek thing not working out? Is it that they need more capacity than Broadcom can provide? Is it that they just need all of the above?
Patrick Moorhead:
What's going on? Daniel, I got a little bit frustrated with the traction this got out there in the news and out in the Twitterverse. So what I did is I invented Moorhead's Law of Always, which is: every hyperscaler will always do the due diligence to look at every chip maker in every way, shape, or form possible. And anybody making their own chips will always look for alternatives for where they can be manufactured and who their manufacturing partners are. The stakes are too high. Anytime you have stakes this high, and the differentiation is there, we're not talking about choosing between brands of potatoes of the same type, right? So, yeah, it would be very natural. Marvell is an absolute player in the XPU market. They bring a ton of the increasingly important networking IP, be it copper or optics. So yeah, I 100% think that Google is in talks with Marvell on TPU development. And what we're also seeing is hyperscalers investing hundreds of millions of dollars in NRE just for a backup, to see how that vendor works. And let's not forget, a lot of these vendors like Broadcom and Marvell have relationships with TSMC that have been in place for decades. And there's also the potential for these companies to have more sway than even the hyperscalers. That gets down to the relational manner in which TSMC operates. We'll see if the MediaTek chip actually comes to life. I think it probably will, and it'll probably be an inference chip. It is having some challenges. But yeah, I think Marvell will be in the game for a lot of reasons.
Daniel Newman:
Yeah. To your point, I think this is happening every day, always. So it's probably an article from The Information. Misinformation, disinformation, whatever you want to call them.
Patrick Moorhead:
I loved it. I love Nikita Bier. Anybody who puts the red siren in their posts, they're going to deprioritize. Good. I was kind of making fun of it last week: red siren, breaking, "Patrick Moorhead is in talks with Daniel Newman on working out." People thought that was funny. Do we need to do another one of these? Look at these. Dude, are you scared?
Daniel Newman:
I am freaking out.
Patrick Moorhead:
Look at me.
Daniel Newman:
Shaking. I'm shook. I'm shook. Yeah. So your law of everybody's always talking to everybody is the law. It doesn't take a lot of imagination to make up these kinds of stories, and they can even be right, even if you don't know. But yeah, look, Pat, we have a capacity problem. So everybody's talking to everybody. We need more compute. We need more connectivity. We need more memory. We need more energy. Everybody's talking to everybody. That's what's happening. So that should be your little red light alert: everybody's talking to everybody. All right. We have a few more topics. We're not going to get to them today because we've got to get to a Flip. You and I need to choose a side on a topic, and this week we're going to talk a little bit about the regulation of AI. So, Pat, we're going to ask the question, and one of us is going to be for it, one of us against it. The question is: should AI be regulated as a public utility? Who's for this? Did they update the coin and make your face better looking than the original coin?
Patrick Moorhead:
I mean, maybe I had a special request to the production team.
Daniel Newman:
I'm just curious why my coin's not better looking.
Patrick Moorhead:
I mean, I told them I want to look like a looks-maxer.
Daniel Newman:
All right. So should AI be a public utility? And give me the argument that I will defeat.
Patrick Moorhead:
You know, Daniel, it should be. I mean, you know me, I'm a big regulation guy, and I think AI should be regulated as a public utility. If I look at things like water, gasoline, internet, the highway system, everything that we need is regulated. It's like air and water and self-actualization for all those Maslow's hierarchy people out there. And it should be, right? It is a God-given right that everybody should have access to AI. In fact, OpenAI itself is proposing utility-style governance. The company that has benefited most from a laissez-faire regime so far has publicly floated the idea of widely available AI access through industry and government collaboration. And if you think about it, AI CapEx is utility at scale. Amazon's $200 billion 2026 AI infrastructure commitment exceeds the annual budgets of most nation-states. Any system with CapEx, allocation, and grid impact at that scale eventually gets regulated like a grid. So it's a self-fulfilling prophecy. And we talked a little bit earlier on the show about power consumption, right? Power consumption is already triggering utility-style regulation. Look at what happened with Maine's April 14th moratorium. We weren't talking about a gigawatt data center; we're talking about a 20-megawatt data center. And we're seeing this in a lot of different U.S. states, and we're seeing it happen in Western Europe. I would also say utility-style tariffs are creating cross-subsidy protections for the smaller players. What I mean here is that without structural access guarantees, enterprise AI becomes an oligopoly of four to six clouds and two to three frontier labs. How can that not get regulated? You know, all the dystopian movies I watched in the 80s and 90s and 2000s where the company is basically running the country.
Daniel Newman:
Oh man, you are a socialist. You want to bring down and end all things that are good. This is the internet in 1995, and you want to stop the party in 1995. Just imagine if we'd had strong government regulation come in in 1995. We probably wouldn't even have apps like Facebook yet. We'd still be on web zero, dialing up on Prodigy. So yeah, I guess if that's what you want. I mean, look, Boomer, the bottom line is that when you're this old and you've experienced as many things as you have, you're going to say, "Hey, young man, slow things down a little bit here." Grandpa is not ready for what's coming next. But look, utilities are really designed for stable, mature technologies with predictable demand curves. AI is doubling, tripling, quadrupling every 12 months. We saw the Stanford study that said coding performance went from 60% to nearly 100% in a year. We saw agent performance go from 12% to 66% of what humans are capable of in a year. And I think for you and me, it's closer to 100% of what we're capable of. We built it. Yeah, baby. What utility regimes actually do is lock in incumbents. Of course OpenAI wants that. They want to guarantee that they will be the long-term incumbent, because right now their lead is being chewed at every single day. Yes, you could treat AI a little bit like the internet being regulated like a telephone service back in 1995. Some of the dumb ideas they have, like taxing robots. Well, most enterprises use AI at huge scale, and their innovation would start to get constrained. You'd literally be saying, "Hey, stop innovating right now, because we're going to tax the crap out of what you're trying to build." And while people say it's going to replace jobs, what actually happens is that when companies build stuff and see optimistic futures, they hire more.
And that's what we're going to find. So it just isn't the truth. Utilities are also single products in most cases: natural gas, water, electricity. AI is not one thing. It is many things. So how exactly are we going to build a utility framework around it? And the last thing is, who the hell are we going to actually trust to do this? Bernie Sanders? Nancy Pelosi? What genius that really understands the power of this technology should we be relying on to set the standards for the U.S., let alone the world? Look, if we want to slow the economy, if we want to cripple ourselves and hand the world's technology leadership to China, then sure, let's regulate these things like a utility. But I would rather think it's 1995 and let's have that party. The internet had a long way to go then, and AI is even bigger. Sorry, regulators, you're going to have to sit this one out.
Patrick Moorhead:
I am a deregulation person, but we are going to need to regulate this the way we would regulate nuclear energy, something that could really go wrong. Anybody who doesn't agree with that is just not being intellectually honest with themselves. I'm not buying into the mythos BS; that is not AGI. I think it's great marketing at this point. But at some point, it is going to be scary, and we should ask the hard questions as well. But we shouldn't regulate it now. And I think the number one thing is China. Because if China gets it before us, we're cooked.
Daniel Newman:
Yeah, we're walking a fine line, aren't we? I mean, there's security risk, there's utility and energy distribution risk. Communities could not have enough power for their homes because we're powering agent bots. There are these fine lines of things that need to be considered as we proliferate the technology. Like all things, the simulated arguments are fun because we zero-sum this shit and argue a strict point when the devil is in the details. And you're right, I don't agree entirely with the mythos BS either, but what I do agree with is what that mythos, from a marketing standpoint, stood for: the real potential of what these models could do in the future in terms of exposing vulnerabilities in most businesses and in utility companies, the grid. That, by the way, is nation-state warfare, hacking into your grids and creating risk in our utilities. We need to make sure we're prepared to protect against that when you have technology moving this fast. So it's the real deal. There's a real conversation that needs to be had here. But I do think we have to be very cautious about letting China run away, and we have to be very cautious about stifling economic growth. All right, Pat. We've done a little bit of this throughout the entire show, a little victory lapping. Let's go to Bulls and Bears. So, Pat, we talked about this 232 a little bit. It looks like the deadline has now passed. What's next?
Patrick Moorhead:
Yeah, so Section 232, semiconductor tariffs. If I look at how many appearances I've done on CNBC or Yahoo Finance, I'm pretty sure a third of them were about tariffs. And this is a special tariff, really for security purposes. The proclamation set a narrow 25% tariff on certain advanced semiconductors not intended for U.S. domestic use, and directed Commerce and USTR to engage trading partners and report back within 90 days. So the 90-day shot clock started January 14th, and there's going to be a separate July 1st Commerce review to actually look at what Commerce and USTR came back with. First of all, a couple of nuances here. There's a U.S. data center exemption. That is the number one thing. So if you're consuming advanced GPUs and CPUs, you don't have anything to worry about. It's very similar to what we saw in Mexico with USMCA tariffs on systems built outside of the U.S.; USMCA said no tariffs as long as there was a conversion. This is not the same, but it's related. The XPU and final assembly questions are somewhat unresolved, right? Advanced custom ASICs, derivative products, final assembly treatment, that has not been fully clarified. And you just don't crank out a wafer and stick it into a server, right? You've got the wafer, and then you have thousands, tens of thousands of dollars of HBM memory and other things that go into the final packaging, whether it's CoWoS or EMIB or Foveros or some derivative of those. And I think this is really a China negotiation lever, not necessarily a tariff. You've got Trump coming in and meeting with Xi in a couple of weeks, and I really think this has a lot to do with what he might bring in there to negotiate. The one thing that nobody's talking about yet is the equipment makers, right?
The ASMLs, the Applied Materials, the Lam Researches and KLAs, they're facing direct exposure on Chinese and Asian customer shipments. And I think it's one of the things we need to be very mindful of, because we are creating a Chinese chip equipment industry where we may not really need to be doing that. But through these tariffs and other forms of regulation, that's where we're left.
Daniel Newman:
Yeah, a lot of good inputs there. I think this is going to be a moving target. The focus will continue to be strength for the U.S. and U.S. companies and mitigating the strength of anywhere else in the world. We'll talk a little more about ASML momentarily, but first let's talk about a company that's on a pretty good run here: TSMC. Earnings: record profit, margins crushed guidance, AI demand is extremely robust. And you know what this very conservative company did, Pat? It plans to spend more on CapEx. And when TSMC starts to spend more on CapEx, actually saying they're willing to make multi-year commitments, that is a pretty dang good sign that the bubble bears were wrong, that demand here is sturdy, and that there's going to be a lot more spending on AI. A couple of the headline figures: revenue at $35.9 billion came in above the $35.7 billion figure. Remember, this company kind of gives guidance throughout the whole quarter; they report every month what they're doing. They were well above the range. Gross margin surged to 66.2% when it was supposed to be 63% to 65%, which basically means pricing power is very, very high right now. They also said they're driving cost improvements through operational excellence, and operating margin came in above guidance as well. HPC was 61% of revenue with 20% quarter-over-quarter growth, and 3-nanometer now represents about 25% of their wafer revenue. They guided the next quarter to 10% sequential and 32% year-over-year growth, with margins reaching 65.5% to 67.5% and CapEx moving to the high end of its $52 to $56 billion range. C.C. Wei called the demand for AI "extremely robust" while noting that there is still some macroeconomic concern. This company is a rocket ship. I mean, you can't talk about this without mentioning Intel. What about EMIB?
You know, what's going on there? But again, I don't think it's so much at the expense of TSMC. I just don't think TSMC can build any more. Everything it can build, it will build. So Intel can offer some additional capacity. But look, TSMC is and will remain an absolute rocket ship, as long as China doesn't take over Taiwan at any time in the future. Pat, nothing to add? I'm good. I got it.
Patrick Moorhead:
So, yeah, sorry about that. I was zoning off, admiring my guns in the mirror. No, seriously, I think you hit it right out of the gate. What everybody waits for is this: NVIDIA can have their forecasts, AMD can have theirs, but until TSMC comes through and says, "We're going to build more capacity," it's not real. And that capacity does take time. If you believe that the value of a stock is the terminal value of its net cash flows, then you have to look into the future beyond the next 18 months; it's three to four years from the decision to build a fab to fully yielded chips. For them to do that is hopefully a sign of less risk. So they raised CapEx, gross margin, and AI demand commentary in the same quarter. It's not just about them; it's really a verdict on the whole AI supply chain. I think that's another reason why the chip stonks are skyrocketing at this point.
Daniel Newman:
What about ASML?
Patrick Moorhead:
Yeah, ASML, right. Revenue beat, full-year guidance raised, demand outpacing supply, even without selling much to China. They beat on revenue, they beat on net income, they beat on gross margin. They pretty much beat on everything. Forward guidance was at 8.4 to 8.9 billion euros, and they came in with a midpoint of 8.7, so it was kind of in line, but I think it took risk off the table for them. And they still raised it, though the expectation was that they were going to raise it in the same vein. And, you know, everybody likes to think that ASML is the only company out there that sells chip equipment. That's EUV. When it comes to packaging, when it comes to stuff like backside power, or even the type of transistor, those are really for other companies like Applied Materials. And very few people even talk about metrology, which is: did I actually build what I thought I was trying to build, and does it work? Those are companies like KLA that I'm tracking. But listen, everybody's up and to the right, and the only question is China. Okay? You've got to have the entire supply chain. Who has more power, TSMC or the chip equipment companies? I don't know. We should debate that as a future topic.
Daniel Newman:
Yeah, I mean, the U.S. is trying to get ASML to stop shipping even DUV into China.
Patrick Moorhead:
That's right. And they've cut off a lot of Applied Materials and KLA as well. And here's the difference: the Japanese, you know, Tokyo Electron, they can ship away to China, not a problem.
Daniel Newman:
Hmm. Interesting. Well, you'd expect ASML's numbers to be strong. I've been interested in how disciplined they've been, maybe to their own downside, in not pushing their pricing higher. But I guess, unlike car dealerships during COVID, they feel like at the end of this constraint it would all come back to bite them. I'm still wondering how it is that this is such a hard problem that no other company has been born to solve it yet. Just out of necessity, you'd think. Elon, where are you at, dude? Isn't this something you can do in your spare weekend with a couple of...
Patrick Moorhead:
Yeah, that whole thing ended up exactly where I think we all thought it would. You know, he was going to do new machines, or he wasn't going to need ASML. And my first question on that Saturday was really, okay, well, it's going to have to involve Samsung or Intel, and you're going to have to start putting in equipment orders, because you don't have 10 years to figure out how to replace all the equipment makers. And here we are.
Daniel Newman:
100%.
Patrick Moorhead:
All right.
Daniel Newman:
So, a quick one to end here. Since I was down there, there was that World of Monopoly Day, but IonQ also announced a massive DARPA contract this week. I mean, look, this company went from 80 to 30. I want to point out, I'm a shareholder, and I believe you might be too, and an advisor, so I just want to say that. I obviously root for good results like these, but it's interesting to see this quantum inflection. I spent time at IBM with Jay Gambetta the week before, and now I'm seeing this quantum-is-real movement that IBM is focused on. You've got DARPA, which is often very early in the process of commercialization, and they're looking specifically at defense, which is where quantum seems to be really taking off. This contract apparently is called HARQ, H-A-R-Q, heterogeneous architectures for quantum. Basically, they're trying to build a new class of networked quantum computers combining trapped ions, neutral atoms, and superconducting qubits. IBM and Google use different approaches; I guess heterogeneous is the thing this year. It's all about heterogeneous. But I think what's probably driving a lot of this is that you saw Chamath and other people kicking around this Google paper about how maybe you only need like 10,000 qubits to break encryption. So we're hearing a lot more about quantum networking because of it. It's a big story that's coming. You know, Pat, I think quantum is still one of these things where we're trying to figure out where the commercial curve is, where the revenue ramp is. I think networking might be that first massive inflection, and compute seems closer to the end of the decade before it really starts to take off. People love these space trades. They love the quantum trades. They love the nuclear trades. They love stuff that doesn't quite have a ton of clarity but has just kind of unbounded potential into the future.
Patrick Moorhead:
Yeah. So I'm an advisor to IonQ as well; let's get that out there. You know, it's interesting. The market's pricing these DARPA contracts kind of like OpenAI deals at this point, right? If quantum actually hits the timeline, like you said, end of decade, IonQ at 35 will probably look cheap. If it doesn't, there's a lot of beta going in there. I am really optimistic, in differing ways, and I applaud IBM for what it's doing. Think of benchmarks, right? There are a lot of different types of benchmarks. I like to call them measures of merit, because people get all wrapped around the axle on those. Essentially, what is a defining thing that can be measured that's important to people? It used to be, in quantum, supremacy and things that really didn't have anything to do with the downstream benefit that a business or government might get from it. What I applaud IBM for is coming up with benchmarks for real value, real business value. That indicates we're going from university science project to real business and real downstream benefits. Don't get me wrong, there are people doing fluid-simulation types of work in quantum that are showing value, but it's very, very light, and there's not a ton of it at this point. So I think it's just a great thing for the market. And if people wonder, well, what's next after GPUs? It's QPUs. And it'll be heterogeneous. I've been around long enough; it's another Moorhead Law, the law of "and," right? We don't replace any technology; we just add new ones and new environments. By the end of the decade, it'll be heterogeneous: CPU, GPU, QPU, XPU, and probably something new that we've never heard of, or we're taking one of those PUs and splitting it like we've done with AI. And I wouldn't call it an ASIC. It's so funny, even that gets debated, right?
You put a lot of memory on an ASIC, or you put any type of RISC-V controller chip on it, and people say that's not an ASIC because you can boot an operating system on it. Anyway, it's nonsense.
Daniel Newman:
We love nonsense here. Well, Pat, that was great. Great show, great conversation, an hour on the head. We care a lot about your time, and we appreciate you taking the time to spend it with us. We will be back, of course, next week with more of this, only we'll do it even better. We're like fine wine; we get better with age. Appreciate everybody tuning in. Thanks so much for being here with us this week. Episode 301 is in the books. We'll see you all later.