The Six Five Pod | EP 280: Qualcomm, AMD, & Intel: Navigating the AI Revolution's Key Players

On this episode of The Six Five Pod, hosts Patrick Moorhead and Daniel Newman discuss AMD's groundbreaking deal with OpenAI, Cisco's advancements in AI networking, Intel's foundry progress, and Qualcomm's strategic acquisition of Arduino. The hosts also explore IBM's TechXchange event, debate whether we're in an AI bubble, and analyze Dell's ambitious growth forecasts. With insights on Oracle's AI economics, xAI's funding round, and the rise of AI utility companies, this episode offers a comprehensive look at the rapidly evolving AI landscape and its impact on the tech industry.

On this episode of The Six Five Pod, hosts Patrick Moorhead and Daniel Newman discuss the tech news stories that made headlines this week. The handpicked topics for this week are:

  1. AI Investments and Market Dynamics: OpenAI and AMD's 6-gigawatt deal announcement. NVIDIA's response to the AMD-OpenAI partnership. An analysis of the competitive landscape in the AI chip market.

  2. Cisco's AI Networking Advancements: Hosts cover Cisco's announcement of its Silicon One-based router for AI hyperscaler data centers and discuss Cisco's role in addressing network constraints in AI infrastructure.

  3. Intel's Technology Showcase: Intel's event in Arizona highlighting PC chips, server chips, and the 18A process node. A look at the strategic importance of Intel Foundry and its competitive positioning.

  4. Qualcomm's IoT and Physical AI Aspirations: Qualcomm's acquisition of Arduino and its implications for IoT development. Hosts reflect on Qualcomm's strategy in expanding beyond mobile chips.

  5. IBM TechXchange Highlights: IBM shared its focus on orchestration and agents in its AI strategy and highlighted partnerships with Anthropic and other AI companies.

  6. The Flip - AI Bubble Debate: A simulated debate on whether the current AI investment trend is a bubble, with an analysis of market valuations, capex trends, and potential risks.

  7. Dell Technologies Analyst Meeting Insights: Hosts talk Dell's increased revenue and EPS forecasts, particularly in the data center segment, plus its strategy shifts in the PC market and enterprise AI adoption.

  8. Oracle's AI Business Economics: A discussion of a recent controversial report on Oracle's AI infrastructure profitability. Analysis of the challenges in accurately assessing AI infrastructure economics.

  9. Industry Updates and Future Events: xAI's $20 billion raise and NVIDIA's investment. Applied Materials' revenue takes a hit due to new export restriction rules. Upcoming tech events and conferences.

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Pod so you never miss an episode.

Or listen to the audio here:


Disclaimer: The Six Five Pod is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript

Patrick Moorhead: And chips are cool. Chips are sexy. And you know, you know, Cisco's bet, right. AI scale isn't just out or up, it's also across.

Daniel Newman: Remember though, I just want to show this. I shared this yesterday. I think everybody should see this. Remember, don't be on the wrong side.

Patrick Moorhead: Yeah, that was good. That was good. So I love that.

Daniel Newman: But you know, the net-net is that they weren't talking about it for a period of time, and it's time for them to talk about it. Right? Everyone wanted to be software, everybody wanted to be infra, nobody wanted to be a chip company. It's kind of cool to be a chip company.

Patrick Moorhead: Welcome to The Six Five Pod. It is episode 280. It has been a crazy week. I mean, every week seems like it's a crazy week. What investments are being made, who's doing new offerings, who's looking for new financing for AI. Daniel, this pace is great, isn't it?

Daniel Newman: Yeah. We were in, what, New York on Monday for an event, and a whole event happened before our event happened. You know, we were both going to do some media, we were going to talk about some things that had happened last week. But then new news happened and we had to completely flip, went on a full jolt. By Monday at noon, I was completely exhausted by the newest news of the week. And that was just the first piece of news to drop in a week full of more news, Pat.

Patrick Moorhead: But I want to memorialize something that happened for the first time with you and I. Even though we've done, I don't know, what, a thousand podcasts together, you know, the Six Five Pod plus the Six Five Summit and all the other stuff we've done, we had never been on the same broadcast show at the same time. And you and I were in New York and were with Josh Lipton for his show Market Domination. And it was fun. You know, Daniel, some people accuse us of being the same person, and you and I both know that's not the case. And you know, we look at things through a slightly different lens. And I got to tell you, I was a little bit intimidated, you know, all the stocks and all the, you know, futures and equities that you're in. I mean, I may have the finance degree and almost, you know, went to work for an investment banking house, but, you know, that's just not what I do. So, you know, could Patrick Moorhead hang with Daniel Newman?

Daniel Newman: Well, I don't know. I mean, every week we do a debate here and very rarely is there a decisive winner. But you know what you're talking about. I mean, look, the introduction of the show: do you know what we're talking about? Do you know what you're talking about? And you know, you changed it. Remember, in the beginning of the show you said, I think we do, and now you say, of course we do. Of course you know what we're talking about. So yeah, we both had a lot of media, but that was a lot of fun. I even included you in a few of my clips. I left you in, you know, did not cut you out. I might have cut out any part where you were talking, but I at least left you in so people could see you there. I didn't pretend you weren't there, you know, I didn't do that. But in all seriousness, I've also just gotten to the point where I have figured out that you can't share videos longer than like 30 seconds if you want people to watch them. So we as a society are becoming increasingly moronic. We can't stay with anything. So I think even our podcast here, although, you know, we have 103,000 subscribers now, getting people to sit down and watch the whole thing is hard. That's why we've got to do TikTok clips and we've got to do Shorts all the time now, because we have to ask people to consume this thing in like 30-second bites. But hopefully we said some wise things. I don't know, it was a fun conversation. And by the way, we agreed on most things. I don't think we were disagreeing on too many things. I thought it was a lot of fun. We should do it again sometime. I also saw you, by the way, in the studio at the New York Stock Exchange on Squawk on the Street, one of my favorite shows on CNBC. You know, I stood there and watched you like a little fanboy. So you know, you did some with me, you did some without me. But it was a really busy week for sure.

Patrick Moorhead: Hard to keep up with you, Daniel. But hey, we've got a great show for you today. I mean, so much news, whether it's OpenAI and AMD and, you know, Nvidia's response to that. We've got Cisco rolling out, you know, multi-terabit switches, a big Intel event, Foundry news, Qualcomm buying companies. Dan, you and I attended the Dell securities analyst meeting. We had TechXchange and we met with the senior executives at IBM. So a lot of stuff is going on here. Why don't we jump into the decode, Daniel? Like you said, news that hits before the news, right? We woke up in New York City to attend the IBM event and this AMD and OpenAI news hit big. A 6 gigawatt deal.

Daniel Newman: Yes, yes, yes, yes, yes. Look, this one came out of kind of nowhere. I mean, just a week before, Pat, there was like a $100 billion investment, and I think it just blew the tops off. Nvidia bulls couldn't believe it. They were absolutely stunned that this had happened. Was Jensen deceived? Was he left out in the cold? I mean, don't people actually have to, like, go by the office and say, hey, I think I want a favor, can I go ahead and use a few of these other chips? You know, but in all seriousness, Sam Altman knows what everyone else knows about the future of this market. And it's going to be that the more compute and the more capacity that you can offer, the more likely you are going to be successful in this space. So people say, oh, he's committed a trillion dollars. Well, you know what, if he doesn't commit a trillion dollars, he'll never get to the valuation that people are gobbling up today. Remember, 500 billion, oversubscribed. More people wanting to get in on this, and trust me, when that trillion dollar IPO comes, retail will be there for exit liquidity. But the deal itself is super interesting, because unlike the deal with Nvidia, where Nvidia is putting dollars in, this time Lisa Su and the AMD team basically used the company's stock as collateral to incentivize OpenAI to accelerate its utilization of AMD's future Helios MI450 rack-scale systems, in a way that it's essentially about a 2-to-1 return if Sam Altman and the OpenAI team can deploy the AMD capacity and AMD stock appreciates in the process of all those sales. Remember, they were at like a five billion dollar forecast; now a hundred-plus billion dollars of sales is expected in a short period of time. Their number just went up. It just went parabolic.

Patrick Moorhead: Yes.

Daniel Newman: In terms of how much they're going to sell, Pat. Remember our forecast just weeks ago: you know, the market had AMD at somewhere between like 5 and 8% of GPUs, which is expected to be like a 400 billion dollar market by 2029. So they were going to do what, 20 to 30 billion in that year alone? They just guaranteed that they're going to be doing at least that just with OpenAI. So that obviously sent the stock soaring 40% in a week. How much would you have now if you'd held all that stock, Pat? Don't, don't answer that.

Patrick Moorhead: You don't want to know how much money I would have.

Daniel Newman: I do actually, because I just, it's like you can see us sit and sink in our chairs. But the net of it is it was a really clever deal. Sam Altman now has all the incentive to use AMD and succeed. He has all the incentive to see that AMD and the share price appreciation continues. And of course, Lisa Su, it was a brilliant move to make sure that AMD now has its marquee customer that's going to use a ton of its next generation systems. And while some pessimists want to say, wow, can't believe you had to give it away, it wasn't a giveaway. It wasn't a giveaway. The only way it ends up being a giveaway, and it is entirely possible, is if all those sales don't drive any share price appreciation, which, by the way, it's already driven 40% before they sold the first unit to OpenAI. So look, I keep saying there's enough market for everyone. People don't have to always look at zero sum. This isn't the end of days for Nvidia. It doesn't mean no one's going to use custom chips from Broadcom. It just means there's this much demand, and all the demand that can be built, every chip that can be made right now, will be sold.
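
For anyone who wants to see the rough math the hosts are walking through, here is a minimal back-of-the-envelope sketch. The TAM, share, warrant, and price-target figures are the ones quoted in this conversation and in public reporting on the deal, treated here as loose assumptions rather than guidance.

```python
# Back-of-the-envelope sketch of the AMD / OpenAI deal math discussed above.
# All inputs are the rough figures quoted in the conversation, not official guidance.

gpu_tam_2029 = 400e9             # ~$400B GPU market expected by 2029, as quoted
amd_share_range = (0.05, 0.08)   # prior market expectation of AMD's GPU share

low, high = (share * gpu_tam_2029 for share in amd_share_range)
print(f"Implied AMD GPU revenue in 2029: ${low/1e9:.0f}B to ${high/1e9:.0f}B")
# -> roughly $20B to $32B, the "20 to 30 in that year alone" range mentioned above

# The warrant side: OpenAI can reportedly earn up to ~160M AMD shares (~10% of the
# company) at a nominal strike, vesting against deployment milestones and share-price
# targets that run up to roughly $600 per share.
warrant_shares = 160e6
top_price_target = 600
print(f"Warrant value if fully vested at the top target: ~${warrant_shares * top_price_target / 1e9:.0f}B")
# That upside, set against the 6 GW of Helios MI450 systems OpenAI commits to deploy,
# is the incentive alignment the hosts describe rather than a simple giveaway.
```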

Patrick Moorhead: Gosh, Daniel, I gotta first ask you, how did you possibly miss this in your forecast?

Daniel Newman: You know what, I, you're right. I don't know. I mean, I'm actually getting out of the business because of it, so there'll be some happy people.

Patrick Moorhead: You're supposed to ask me, well, when are you going to actually start doing forecasts?

Daniel Newman: I'm happy to not be competing with you on that.

Patrick Moorhead: So, yeah, I've actually always thought that these forecasts were nonsense, having made the decisions myself. But there is value in putting it out there. Yeah. So, Daniel, this deal puts AMD on the map. There were a lot of snipers and ankle biters that said AMD wasn't serious, and I totally understand why, right, which is not giving out a forecast. Which, by the way, there are a lot of good reasons why Lisa didn't put out a forecast on that. But now you would have to put up a pretty strong argument that says AMD does not have the technology. This is not about price, because whatever the 10% at a $600 share price means, OpenAI still wouldn't adopt it if it weren't good tech. Because at the end of the day, they need to profitably create value and get to the hundreds of billions of dollars of revenue by 2029. And you know, there was an interesting reaction from Nvidia. I actually thought it was really good.

Daniel Newman: Right.

Patrick Moorhead: Let me just close out the AMD piece. I thought that there is a lot of execution to go. That chip had to have taped out already, maybe even sampled, before anybody would sign on the dotted line. But there is still a lot of execution to go. And this will be AMD's first scale-up full system that they're putting out, and that's tough. I think it also, if anybody was wondering why they bought ZT Systems, you now have your answer. So there were a lot of skeptics out there. And by the way, the shorts are eating it big time; I hope you're having fun right now. So I thought that the Nvidia response was very clever. I think they played it really, really well. Right? They, as you would expect, teed up that, you know, hey, they had to give away 10% of their company to do it, right, and that is not what Nvidia is doing. Nvidia is actually giving $100 billion to OpenAI over a multi-year period without any strings, apparently. So yes, it is different. But I really loved, he used the word on Cramer, he used the word clever. Okay. And what that does is it says to me, I acknowledge that you made some gains, I'm not ignoring it. Okay. One of the big mistakes Intel made is they ignored AMD too long in the data center market, and AMD went from, well, 0% share in the hyperscalers to 60% share in the hyperscalers, and Intel barely even acknowledged AMD until about maybe midway. And what that does is, if you ignore your competitor who's making big gains, you look like you're out of touch. And it is a huge positional mistake. What did you think, Daniel, about how Jensen positioned this?

Daniel Newman: Look, Jensen's incredibly competitive and you know that.

Patrick Moorhead: And oh, what happened externally is very different from probably what happened internally.

Daniel Newman: Yeah. And he's also very, you know, he's very calculating in his words. You heard it this week, whether it was talking about, you know, some of the new China policies, talking about the Trump administration and of course talking about AMD. I mean, I think right now Jensen understands he's got his own race to run. He's got to continue to focus on building and delivering on his roadmap, and he's got, you know, competition coming from all sides. He's got constraints that are going to be within his control and those that are not. You know, he's got challenges that we're going to run out of transformers and turbines to be able to power these data centers over the next few years. He's got every hyperscaler on the planet trying to build their own infrastructure. And at the same time he needs to continue to pull through the developer ecosystem to basically say, we want you to build. And you actually heard him in the interview say this; he kind of says, build on the American tech stack. But what he means is build on Nvidia. He's not actually just saying the American tech stack. But having said that, I still believe that it's not zero sum, and I just continue to say that he's going to sell what he's creating as fast as he can build it. And so the fact is, he's going to try to add more capacity. Of course he wants to lock in every customer to use a hundred percent, but he also knows that that's not realistic, and he knows he's got real and formidable competition. But I don't think in the end that, you know, he's overly focused on that. I think, you know, if 80% of his brain is on building Nvidia, maybe 20% is on all those other external issues. And I think he's dialed in and focused on making sure that, you know, there's no attrition where he can control it and that he continues to build what the market sees as the leading stack. Not just hardware, because the hardware is more competitive, but the leading stack, to make sure that it's as sticky as possible within the regions and markets that he can control his sales to.

Patrick Moorhead: Yeah, it is game on. It is funny, I saw a new forecast that came out. You know, all of these, you know, intelligent financial analysts are upping their 2026 numbers, and you've got AMD with 20% market share for 2026. And I even think that is probably conservative. I think we could see AMD at probably 30% share. But you know, as we always say, just because AMD is gaining on a percentage share basis doesn't mean that Nvidia can't have monster growth, because they absolutely will. So hey, Daniel, let's go into another one. I mean, it's funny, it's AI and it's chips and it's networking. So Cisco announced a Silicon One-based router, and essentially it scales across AI hyperscaler data centers. There's a traditional way that you would connect data centers together, but this is more for data centers where you're using multiple data centers to train a model. Okay. Or you need to connect multiple data centers to do even hardcore inference. There were quotes from Azure Networking, Alibaba Cloud. Right. So it was great to see them. And oh, Patrick Moorhead was quoted in this press release as well. Daniel, what do you think? I don't know if you had a chance to catch this, but it's pretty impressive.

Daniel Newman: Well, your quote, I mean, that was the only thing that really mattered, was that you were quoted. So thanks for mentioning that. In all seriousness, you know, Cisco, we've said a lot, I think both of us have made comments, is underappreciated for its work in semiconductors. Silicon One is going to be a critical and expansive opportunity for the company. It doesn't get much market appreciation for the fact that it's in the chip space, but throughput, Pat, connectivity, network, it's a huge part of making AI work. You know, the way it's basically described is, we have a compute constraint, but we also have a network constraint. Like, as we use more compute, how do we move all the data? So this 8223, you know, focuses on Cisco's kind of industry standards in areas like security, using AI for networking to create more efficiency in networking, but also enabling, was it like 51 terabytes? Right, what's that, terabits per second? Tbps. But it's terabyte storage, terabit speed.

Patrick Moorhead: It's only off by eight.

Daniel Newman: Just a letter. Anyway, riff raff. Anyway, you know, essentially, though, we've got enterprise data centers, we've got tier 2 data centers, you've got hyperscale data centers, and, you know, if you had three constraints, we always talk about compute and energy; well, the third is going to be the network. And so Cisco has an opportunity to play a critical role in terms of connecting data centers, connecting and speeding up the network and connecting compute. And so it looks like they've got some breakthrough technology here, and, you know, it's encouraging. So this is something I've said numerous times to the Cisco leadership team. They should be doubling and tripling down on this, because it's not just the fact that they build the routers themselves, but they also build the IP and the silicon that drives them.
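
To unpack the "off by eight" joke for anyone skimming the transcript: the 8223's headline figure is 51.2 terabits per second of capacity, and bits and bytes differ by a factor of eight. A one-line sketch, using the spec quoted in the conversation:

```python
# Cisco's 8223 router is specified at 51.2 Tbps (terabits per second).
# "Terabyte" vs. "terabit" is the factor-of-eight slip being joked about above.
tbps = 51.2
terabytes_per_second = tbps / 8
print(f"{tbps} Tbps is roughly {terabytes_per_second:.1f} TB/s of data movement")
# -> 6.4 TB/s
```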

Patrick Moorhead: Yeah, it's interesting, the Chinese had to do this because they didn't have the density in their clusters. So they had to use multiple data centers because there just wasn't enough power inside a specific data center. I had an interesting conversation with somebody from Meta about a year ago, where they had just started doing multi-data-center training runs, and they did talk about how hard it was and how the networking became an issue. I think you and I both talked to Microsoft about its Wisconsin facility, and they really went out of their way to talk about multiple facilities that were going to be connected. So the payoff here is you've got Microsoft Azure networking in here talking about that need. For the networking chip geeks out there, does this put to rest the deep buffer conversation that comes with the Silicon One architecture, which can do basically one architecture, many chips, for routing and switching? I really feel like Silicon One and Cisco are finally coming into their own here. You know, they kind of didn't talk about Silicon One for about five years, and now they're coming out in full force. And chips are cool. Chips are sexy. And you know, Cisco's bet, right: AI scale isn't just out or up, it's also across.

Daniel Newman: Remember though, I just want to show this. I shared this yesterday. I think everybody should see this. Remember, don't be on the wrong side.

Patrick Moorhead: Yeah, that was good. That was good. So I love that.

Daniel Newman: But you know, the net-net is that they weren't talking about it for a period of time, and it's time for them to talk about it. Right? Everyone wanted to be software, everybody wanted to be infra, nobody wanted to be a chip company. It's kind of cool to be a chip company.

Patrick Moorhead: Now I also think, and I think the Cisco folks have been just very open with me, which I appreciated, they had a lot of work to do, right, to be able to pay off on that. And it's good to see this happening. And you know, do we have a third or a fourth networking chip beast here? Right. This is already being deployed into hyperscalers, by the way. And this is, you know, their router that they actually sell to people who don't build their own. Let's move to the next topic here, and that is another chip topic. Intel had a huge event out in Arizona to really talk about three things: the latest in PC chips, Panther Lake, a server chip coming out in the first half, and then really about 18A. And they brought a bunch of analysts and press. I had two analysts out there, Anshel and Matt, that attended. But here's the way I look at this. I want to talk more strategically than the bits and the bytes on the performance and power claims of the part itself. One of Intel's biggest strategic advantages or offerings is Intel Foundry, okay? And to get people to sign up, to increase their confidence, they have to successfully bring up their own chips to demonstrate to other customers that they have this. Now, to go from an internal foundry to an external foundry, you need to add a lot more things, like intellectual property from third parties, from Cadence, Synopsys, Arm, things like that, and the RISC-V community. You also have to use standard methodologies, right? In other words, you have to look like TSMC, okay? And it looks like the broad-based part of that is going to be 14A. But you have to make your decisions in the next couple of years to go in and be a high volume 14A customer. So this is why this is so important strategically. So if they can deliver the PPA, which, by the way, doesn't stop with 18A, you go to 18AP, which increases performance and improves leakage for things like mobile. Again, it increases confidence. And you know, all this, oh, people need to be doing Intel because, you know, of the US thing. One of the things that Intel has to do is have a competitive foundry process, and this is a milestone along the way. I will talk about Panther Lake a little bit, and what I will say is I feel like this is the chip that Intel wanted at the beginning of this AI PC bonanza, but didn't have. Lunar Lake was really only supposed to be a low volume part, right? You know, it had integrated memory on a SiP package, and literally it was more about sipping power than it was about performance, and Intel really stretched that even farther. Panther Lake, with its, you know, P-cores and E-cores, I think is a much better planned solution as well. It doesn't have a SiP part. It's taken memory off the package, which takes a little bit of a hit on battery life, but you max out flexibility for OEMs. So all in all, I mean, this is a good step forward. I want to see parts, I want to see parts inside of OEMs, I want to see Signal65 test this. Okay? So I think one of the big differences in timing is that Qualcomm had notebooks that you could test as a benchmarker to be able to do that. Intel showed benchmarks on stage; you are not allowed to benchmark them. So I think from a schedule perspective, Qualcomm is ahead, and maybe we'll see some big OEM announcements at CES. Daniel?

Daniel Newman: Yeah, I'll keep this one pretty brief. I think the biggest thing for me, outside of what you already suggested, Pat, is that it's a really big moment. There have been a lot of critics out there for a long time about Intel and its ability to execute its vision on its foundry strategy. Getting 18A up and running, getting the part ramped and having it be successful, of course, is going to take all the parts and pieces. But this is the product competitive with 2 nanometer that Qualcomm, sorry, that Intel has in its foundry lineup. And while it didn't become an external node at any scale, having it become a very successful internal node for the Intel products, I think, will say a lot. So, you know, I give credit, Pat, where credit's due and congratulate Intel on getting 18A and getting Panther Lake ramped. You know, the AI PC platform on its own technology is a lot more compelling than, you know, using TSMC.

Patrick Moorhead: Yeah. And just for the record, and to be complete, not all of the silicon inside of Panther Lake is done at Intel. And by the way, that's not a surprise; Pat Gelsinger told everybody that, and, you know, it's a diversification play. I think if 18AP had shown up, I don't think they would have had to have used TSMC. There might even be some wafer agreements that said they had to hit the volumes that they had pre-committed to. So let's move on, Daniel. Qualcomm and its ongoing IoT and physical AI aspirations made a move that's interesting, not big in dollars, but big in strategy. Qualcomm bought Arduino.

Daniel Newman: Yes, they did. And this is a big moment for the company. You know, I've been saying for a while publicly that Qualcomm is in a really interesting, unique, and exciting position to expand its business. You know, we know that we're at a certain maturity right now for handsets. PCs, of course, have ramped; they've had some really good parts and products, but again, that's another area that's not in hyper growth. Where a lot of excitement for Qualcomm has been is, for instance, in automotive, where they've diversified the business and scaled and grown. Another area of excitement and growth for the company is in IoT. Their IoT growth has been incredibly strong. And this of course sets them up for what I keep talking about, their sort of next generation, their next big moment. And that's autonomy, physical AI, robotics. And so, you know, one of the things you have to have to really create scale in an area like this is a large community of developers. You have to be building kind of from the bottom up; you have to be creating the edge at scale. And this Arduino acquisition gave access to something like 33 million developers, users, hobbyists and those who are going to basically use this ecosystem to build more loyalty. And I think loyalty to Qualcomm early on at this stage is going to be super important. You know, I also think the company having more of a full stack at the edge, you know, this Dragonwing technology they've been talking a lot about, I think a lot of it's going to start there. It's going to start with kind of the real-time microcontrollers, smaller things used for sensing at the edge. And so this again adds to what they're doing there on the IoT side. This has become like a billion-dollar-a-quarter business for Qualcomm, something that's really, really interesting and exciting. And of course, you know, it gives the company more scale in this area, it gives them a bigger, broader ecosystem to play with. And I think that one of the things is, you know, we are only in the infancy of seeing all the potential in robotics, physical AI, IoT, and how it's going to connect to the broader AI opportunity. And so if you can build that loyalty, remember, CUDA kind of started this way in terms of being very heavily deployed in research institutions and educational capacities. They have successfully gotten access to that kind of audience. So it's going to be kind of a bottom-up and across strategy for the company. But I think ultimately this is all about making a stronger claim, making a stronger place, having a bigger community and being able to execute across their vision of being not just, you know, a leader in handsets and a leader in, you know, the next generation of AI PCs, but also as they go into autonomy, physical AI, robotics and IoT.

Patrick Moorhead: Yeah, it's a good assessment, Daniel. This is one Lego block in a big set of things that Qualcomm is doing. You may recall the NXP acquisition that, you know, was an absolute giant chess move that unfortunately China blocked and it never went through. So what Qualcomm is doing is investing organically. They've created alliances with other companies and they've made some tuck-in acquisitions with companies like Edge Impulse and Foundries.io to build a leading ecosystem for the next generation of physical AI. And one of the things is Arduino has a very rabid and excitable and interested audience out there. Right. And Qualcomm could have come in and basically gone all heavy-handed and, you know, gotten rid of everybody and just taken over. But they're smart, and what they're going to do is have Arduino maintain its brand. And I think as long as they hold a loose rein, and by the way, Arduino has to execute too, the reaction should remain positive. Okay, Daniel, you're up again here. IBM had TechXchange, and you and I, you know, did a preview of it. We met with the senior leaders of IBM, including Arvind Krishna, and the heads of pretty much all of their different businesses. Their stock did pop right after the announcement. I think it settled down after a while. But it, you know, looks to be a success. I had three people from the Moor Insights team there, and they were giving me some positive feedback and also some stuff to work on.

Daniel Newman: Yeah, I think there are a number of things there, Pat. We were at the analyst council and our teams were both there. I think that, you know, one of the big focuses of TechXchange was orchestration and agents. I've said for a long time that I think IBM has a really unique opportunity to play and be successful in the orchestration and the platform layer for agents. Right now you've got these kinds of three forces. You've got the force of people wanting to add agents to their current business and SaaS applications. You've got cloud companies sort of saying, let us be your end-to-end orchestrator; so we've seen announcements like this week's Google Gemini for business, and of course Microsoft, Amazon, all want to be your sort of agent orchestration layer. And then you kind of have these companies like IBM, and I use the word middleware because it's probably not the most sexy but the most realistic, sitting across all these different business applications and software applications that run inside of a business. And you say, you know, is there a place in the market for an orchestration layer from a company like IBM, one that came in early with an AI platform, came in early with a data platform, came in early with compliance and orchestration platforms? And can they become kind of a unique provider to the market of all the tools that will help companies be successful? So, you know, they're looking at how they can help with agent ops, which is something they focused on here at TechXchange. You know, they're building agentic workflows for scaling, and of course building on all of this, you know, even adding more and more capabilities to add agents on top of watsonx with Z. So they're really building across their whole software ecosystem. And then, you know, the company made a few other announcements. With HashiCorp, it did Project Infragraph, which is helping the company reduce all the fragmentation on the infrastructure layer of all the tools to provide better observability. And then of course Project Bob was another one that was announced. Hey Bob.

Patrick Moorhead: Which is, I can't unsee Microsoft Bob, by the way. Hi, Bob.

Daniel Newman: Hi, Bob. Which is an integrated development environment designed with advanced task generation capabilities for enterprise software development lifecycles. You know, all of these things, in my opinion, are about accelerating businesses' ability to implement and utilize AI at scale. And that's what I like about IBM: they're very focused on enterprise, which I keep saying is an area that still has a lot of weakness in actually seeing enterprises take AI proof of concepts and get them into market at scale. And so, you know, it was a fairly big set of announcements. I think they also made a few partnerships, Pat. They made a big Anthropic one; maybe you could talk more about that one. They announced one with Groq as well. But it was a busy week. It seems that they're really doubling down on enabling companies to put AI to work for them.

Patrick Moorhead: Yeah, so I think Anthropic got the markets going on this, and essentially they're going to pull in Anthropic, who is viewed by pretty much everybody in the software industry so far as the best company for programming. And not just code completion, right? Code checking, security checking, laying out the system. My son is an AI developer and a huge user of Claude Code. Gosh, even Jensen said that every one of his folks is using an AI-assisted tool. I think Google said that 80% of their code will be written by AI in the next few years; it's 20% actually, 30% now, and rising. This is the real deal. And then you put yourself in the shoes of these IBM clients who are looking for these tools. It's a good adder. What I want to figure out is, okay, IBM is pretty good at this, things like COBOL and refactoring, I'll call it, let's say going from COBOL to Java with its own models. What exactly does this add? I want to go back to the analyst day. So the analyst day that we had, it was an NDA conversation with the senior leadership, including Arvind. I'm always struck at how real Arvind is on things. And you know, it's funny, I have to say I admire him too. When he gets a stupid question from the audience, he just leans right in and tells it the way that it is. I do think that IBM has the chance to be one of the big large-enterprise AI winners, especially in areas that are highly regulated and in large industries. And long term, the biggest bet that it has is in quantum, where I can argue that it could be the leader, not just a leader, out there, given its scale, its resources, its focused end customers and what it's doing. And, you know, while they're not making claims about upsetting the GPU and the XPU market, this is exactly what will happen if quantum is truly successful. I do want to see the company kind of rethink its messaging. Okay, here's a for-instance: IBM's AI platform is the only hybrid multi-cloud fabric that encompasses models, training, inference, data and governance. That is exactly what enterprises want. What they don't want is to have to use a different set of tools for every hyperscaler. It's got to be frustrating that that is a fact, yet more people aren't recognizing it and they're not getting recognition for that. I really appreciate it. And one thing I will share is, you know, Arvind did talk about opening up messaging, right? And, you know, could it be better? How could it resonate better with clients? But all good stuff. All right, buddy, let's go. It is time for the flip. Now, Daniel, you and I have debated many things many times, but there is one thing I think people bring us on these broadcast shows to talk about. In fact, you know, I think you were on two or three shows in New York. I was on two. What did we talk about? We said, with all these massive AI deals, all the money going in there, does it signal that we are in an AI bubble? So let's flip the coin and dive into this simulated debate. Are we in a bubble? We are in a bubble. And of course, and I will, you know, cite you, I'm a boomer and therefore of course I would think this is a bubble. So first off, just a little background. Let's establish a little credibility here. I was at a company called AltaVista, which was a spin-off of Compaq, which was acquired by CMGI. CMGI was the NASDAQ's highest appreciator one year and then it was the biggest depreciator the next year. I have seen the bubble.
I have worked in the bubble. Pets coming in, massages, the whole thing. And it all burned down, and there were a couple of similarities in there. Right. Let's talk about valuations. Historically, I'll just take the S&P 500, right now at around 23x forward earnings versus about 19x over the last 10 years. So we are absolutely hot, even on a forward earnings basis. And the other thing that's similar is market concentration risk. Right. You've got a handful of AI-exposed names accounting for the majority of the index gains, profits and capex. Classic late-cycle setup here. Okay. And, you know, this capex tsunami, I will call it, it's awesome. We love it. It's fun to put out there on X, and stocks to the moon and rocket ships. Big tech is on pace for $320 billion of capex just this year. I challenge anybody to do the math on, let's say, 2026, where that would turn into ROI. And it's big now, but it's also rising. Like Microsoft's '26 capex-to-sales ratio could be around 38%, and that's up 28%. Bain is talking about how the ecosystem could need $2 trillion per year of revenue by 2030 to fund the projected compute demand. What are people going to do? Are they not going to buy food? And even let's say it's a share shift in enterprises. Okay, I'm going to stop buying from Salesforce, I'm going to buy this. This is 2 trillion more dollars that we're going to need in revenue to fund projected compute demand out there. Oh, and let's look at gross margins on even GPU rental. We're looking at a low-teens gross margin. I know we love to say, oh my gosh, the people who are the brick and mortar, picks and shovels, blah blah blah, what a great business. Low-teens gross margin. Not net margin, not operating margin. And that's why, you know, particularly the neoclouds are losing a ton of money with really no future here. So I think I'll end by saying a lot of promise is already in the price, and cash returns, power and time may not cooperate.
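
A rough sketch of the arithmetic behind the bear case above. The $320 billion capex pace and Bain's roughly $2 trillion-per-year revenue figure are the numbers quoted in the episode; the 50% blended margin is a purely illustrative assumption, and the low-teens GPU-rental margin is the figure from the report being debated.

```python
# Illustrative sketch of the capex-versus-revenue math in the bubble debate above.
# Capex pace and the ~$2T revenue requirement are the figures cited in the episode;
# the margin assumptions are hypothetical inputs, not measured numbers.

big_tech_capex_2025 = 320e9       # big tech capex pace quoted for this year
required_revenue_2030 = 2_000e9   # Bain's ~$2T/year of AI revenue needed by 2030, as cited
assumed_blended_margin = 0.50     # hypothetical software-like blended gross margin
gpu_rental_margin = 0.13          # "low teens" gross margin on raw GPU rental, as quoted

print(f"Required 2030 AI revenue vs. this year's capex pace: "
      f"{required_revenue_2030 / big_tech_capex_2025:.1f}x")
print(f"Gross profit if that revenue earned a {assumed_blended_margin:.0%} margin: "
      f"${required_revenue_2030 * assumed_blended_margin / 1e9:,.0f}B per year")
print(f"Gross profit at low-teens GPU-rental margins: "
      f"${required_revenue_2030 * gpu_rental_margin / 1e9:,.0f}B per year")
# The bear case is the gap between the revenue the buildout needs and the margins the
# rental layer is earning today; the bull case is that margins and demand both rise.
```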

Daniel Newman: Well, there were a couple of good points in there. But first of all, anytime you cite an article from The Dis... I mean The Information, the disinformation... I'm kidding. I don't know, maybe I'm kidding. Anytime you cite that as real data, I don't even know how they would have come up with that, being able to parse out a specific part and the amount being charged, and then pull it out and extrapolate it into something. By the way, these reporters must also have really great degrees. MBAs, finance quants. Maybe they're using Perplexity or ChatGPT to figure this out.

Patrick Moorhead: Possibly.

Daniel Newman: Maybe they asked the wrong model and it was simply creating a problem because it knew it could create some funky information. But I heard Amazon this week was able to actually raise prices because the demand is so high for its GPUs that it was increasing its prices. I saw that more than 10 Bitcoin mining companies this week exploded on expected demand and deals being made with CSPs and hyperscalers. And sure, there might be some backlogs on, you know, transformers and turbines, but that's because the demand is insatiable and the world wants more AI. So, you know, is it a bubble? I mean, sure, are there bubbles in the bubble? Are there companies with no revenue that have seen their stock prices surge? Are there certain trends and themes inside of this market that are going and ripping at an undeniable pace? Maybe. But is there not really going to be demand? I just simply don't agree. The fact is, Pat, every one of these companies, whether it's Lisa Su, whether it's Jensen Huang, whether it's Arvind Krishna, these are very smart people that are very aware of what's going on and they're investing in front of a trend. They know this trend is coming. They're doing it in their own companies and they're doing it for the customers that they serve. Pat, I can say that I've seen a bubble in my life. I was very young. The Internet bubble, that was real. The difference in the speed and the proliferation here is exponential: 3 to 4 trillion dollars of expected capex by 2030. Might it slow down and only become two and a half to three? I don't know. Maybe that's possible. But listen, the world's entire GDP is growing because of the expected productivity gains that we're going to get from AI. Is there work to be done in terms of getting those gains?

Patrick Moorhead: Sure.

Daniel Newman: Does software need to be reinvented? Absolutely. Do we need to move from people to tokens? Yes. Outcomes instead of overall fees? I simply cannot see any situation where this much investment by companies with balance sheets as strong as the Mag 7 and others falls into a bubble the way we saw back in the 2000 era. Different situations, different times, different types of companies. It'll be a different outcome. Sorry, not a bubble.

Patrick Moorhead: Yeah, I drew the short straw in that one. Here's where I think we are. We are in a bubble, and that's okay. I don't think all bubbles burst. You need bubbles for investment into future awesome things. Here's something, Daniel, that nobody is thinking through: the point of no return. Like, who are really the bag holders? So first of all, most of the capex in the next two years can be paid through cash flow, net cash flow. Okay. And then you have to start looking at equity and debt. Right. So for two years, we can fund this puppy through free cash flow, given forecasts. Right. That is not a big issue. We can pull back on spending; nobody's thinking through that. If we see waning demand, if we see something like that, everybody can pull back. The bag holders are the people who have to deploy cash that can't be pulled back two years before you're going to start seeing something. Which, by the way, is the foundry model by definition. So on the ability to pull back, I think SNRS might be at risk. I haven't done the full analysis on this, but look at the people who have to commit cash two years ahead of time before they see anything. Those are going to be the only ones truly at risk if we dial back the spend. All right, folks, let's move into bulls and bears. There were no earnings last week, but a lot of market discussion. Let's dive in. All right, let's hit this really quickly. Daniel, like we said, you and I were both at the Dell Tech securities analyst meeting. Gosh, not only did we sit in the audience, but we talked with Michael, Jeff, and Arthur Lewis, among other executives there. We always appreciate the time that they commit to doing that. But pretty big news, right? First of all, they more than doubled their revenue growth forecast and nearly doubled their EPS growth forecast, and the data center is pretty much where all the growth is going to come from. Okay. But the strategy did not change. It's pretty much stick to: we're going to drive some serious business through tier 2 CSPs, the neoclouds, and that's where they're driving a lot of that, and then the slow build to enterprise AI, consistently doing that. You know, Arthur came out and spent more time on storage than they normally do. Quite frankly, they don't get credit for what they do. I also don't think that they communicate or market their innovations as well as they could, and they need to change that. The one change that did happen was essentially that Jeff came out, and not only is he the vice chairman, but he also runs the PC group. He essentially said, and Jeff did this, I don't know, six or seven years ago: we're done losing market share. Okay. While Dell had increased market share in premium commercial, it had lost market share in every other segment. What he said is, we're going to participate in every price point, we are going to drive peripherals harder, we are going to get more serious about the consumer market, and, by the way, we're going to hit the same profitability, which is a challenge. And check it out on X, right? There are only a few ways that you can do that, but hopefully we'll get the download this week from leadership. They have an analyst day. So yeah, they can margin-dollar average between consumer and commercial, optimize the supply chain, where increased volume means lower prices. Right. More platforms mean more NRE from people.
Any incremental supply leverage from PC plus DC, the company is a master of that. Margins are higher on peripherals. The question is, do people want to buy more Dell peripherals? Dell peripherals and MDF should rise as volumes rise. So we will see next week.
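
A quick, purely hypothetical illustration of the "margin dollar averaging" idea Pat mentions: lower-margin consumer volume can still add gross margin dollars while the blended rate stays workable. Every number below is made up for the sake of the sketch, not a Dell figure.

```python
# Hypothetical sketch of margin-dollar averaging between commercial and consumer PCs.
# All units, ASPs, and margin rates are illustrative assumptions, not Dell figures.

commercial = {"units": 30e6, "asp": 1100, "gross_margin": 0.24}
consumer   = {"units": 15e6, "asp": 700,  "gross_margin": 0.12}

def revenue(segment):
    return segment["units"] * segment["asp"]

def margin_dollars(segment):
    return revenue(segment) * segment["gross_margin"]

total_revenue = revenue(commercial) + revenue(consumer)
total_margin = margin_dollars(commercial) + margin_dollars(consumer)
print(f"Blended gross margin rate: {total_margin / total_revenue:.1%}")
print(f"Incremental margin dollars from consumer: ${margin_dollars(consumer)/1e9:.2f}B")
# The rate dilutes versus commercial-only, but absolute margin dollars go up, which is
# the trade Pat describes, alongside supply-chain and peripheral leverage.
```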

Daniel Newman: Yeah, I'm going to keep this brief for the sake of keeping things moving. I mean, you hit a lot of the top points. I think their big raise in guidance was just an absolute barn burner. I think everybody's super excited about the fact that, you know, they have the confidence. We know margin pressure is real. We know the company is going to have to fight that in their AI space. But look, what I love about the company is it has really genuinely convinced the market that it's been able to make the AI pivot. And in their space they are leading that particular pack. They are setting the pace. You know, we're seeing strides being made by HPE, Lenovo and others, but Dell has been the pacesetter. It's hit the top revenue pace and growth and it's able to show that it's got an economic plan to keep doing what it's done, which is, you know, be more growth oriented but still return strongly to their shareholders and deliver value. And of course they've, you know, picked their focus areas, like in PCs leading in the high end and more commercial and not necessarily trying to be all things. But in the AI space they are going to be, you know, the company that keeps setting, to use that word again, the pace.

Patrick Moorhead: Dan, you talked a little bit, let's move to the next topic here, you talked about, you know, Oracle slipping on this, The Information article that came out. What's going on there? Is this a bad business or bad reporting?

Daniel Newman: Look, I said it in the flip, so go back and reference it. But I just think it's very difficult. It feels very disingenuous. Look, when people build factories, there are always big expenses up front, and depreciation accelerates. Depreciation obviously eats up a ton of, quote unquote, EBIT, you know, your EBIT after depreciation and stuff. So they kind of used a mixed bag of numbers. They counted labor in parts, didn't count labor in other parts. They counted in some of the getting-set-up-and-running parts. And then there are all the timing issues of, well, when did they start really billing versus when did this stuff start to come online and be usable? And in the end, like a factory, whether you think of a physical factory or you think of an AI factory, it takes time to bring value. So, you know, my read on the whole thing is that it just feels very much like The Information once a month comes out with a story that feels like it wants to smash Nvidia, because it was really not just about Oracle, it was about Oracle not making money with Nvidia. And it's like, I don't know what the point is here, but I think it's very hard to extrapolate a piece of the data on a piece of the hardware that doesn't get individually financially reported, and then say a source inside the company is giving this information. So all these things to me are a little bit alarming. Could it be real? Like I've said before, it could be right, it could be real. It still doesn't mean that they're not going to make money on AI. I just think Larry and Safra and now Clay and the team, they're not that stupid. So I just don't think these companies are that dumb, and I don't think the economics can be that bad.
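
Daniel's point about depreciation timing is easier to see with numbers. Below is a minimal, entirely hypothetical model of an "AI factory" where the hardware is depreciated straight-line while revenue ramps over the contract; none of the inputs are Oracle's, they only illustrate why an early-quarter snapshot can look worse than the lifetime economics.

```python
# Hypothetical illustration of the depreciation-timing point made above.
# None of these numbers are Oracle's; they only show how a capex-heavy AI buildout
# can look unprofitable early even when the lifetime economics work out.

capex = 10e9                  # hypothetical GPU cluster cost
useful_life_years = 5         # straight-line depreciation period
annual_depreciation = capex / useful_life_years

revenue_by_year = [1e9, 4e9, 7e9, 8e9, 8e9]   # assumed ramp as capacity comes online
other_cost_pct = 0.35                          # power, people, facilities (assumed)

cumulative_ebit = 0.0
for year, revenue in enumerate(revenue_by_year, start=1):
    ebit = revenue * (1 - other_cost_pct) - annual_depreciation
    cumulative_ebit += ebit
    print(f"Year {year}: revenue ${revenue/1e9:.0f}B, EBIT ${ebit/1e9:+.2f}B, "
          f"cumulative ${cumulative_ebit/1e9:+.2f}B")
# Early years show accounting losses because depreciation is front-loaded relative
# to the revenue ramp, which is why snapshotting one early quarter can mislead.
```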

Patrick Moorhead: Yeah, I think in the end, and I called this during the original Oracle deal, I think their margins will be in between what their current margins are and like a neocloud; that's where I think it'll end up. I don't think they'll be the same, because that just doesn't make sense unless they're putting a lot of value-added services on top of it. The funny part is Jensen came in and said Oracle is going to be wonderfully profitable. He said that on CNBC, but he said they will be wonderfully profitable over time. Some quick hits here. As we saw with OpenAI, xAI is going to raise 20 billion and Nvidia is going to jump in and put some money in there. This strikes me as like an OpenAI type of deal, which to me makes a lot of sense.

Daniel Newman: Absolutely. I'm like a broken record here. Yeah, I mean, my xAI point is super quick. You've got to be in on all of them, and actually it is awesome. Like, I love Grok, I love X. So I'm super supportive. It's a pretty small investment. It's another one that's going to fund more growth and more chips. But yeah, I think what he said about basically investing in anything Elon does makes sense. I think there will be a good return, and that's a pretty small investment. I know 2 billion is a lot to most of us, but that's a pretty small amount of investment, plus the financing vehicle to help them buy all the chips, which again, Pat, of course supports the continued utilization of that technology. So yeah, I'll stick with it. I think it's a smart raise. I think it's a smart place to be; kind of like, you know, Sam's betting on different chips and Jensen's betting on different models.

Patrick Moorhead: A quick hit on Applied Materials. They came out and said essentially a $700 million revenue hit from new export restriction rules. And what this was: these were subsidiaries. So there's a watch list and an entity list, and what wasn't encapsulated in there was a sub-entity of an entity. And it's interesting, other capital equipment folks, WFE companies, didn't take the hit, and people are like, well, why is that? It's proximity to their earnings date, okay, on why Applied had to do this and others didn't. Let's close out with IREN. Daniel, I think you tracked this a lot more closely. New multi-year AI cloud contracts.

Daniel Newman: Yeah, this is another one of these, sort of, I call them AI utility companies. We've got the GPU-as-a-service companies and now we've got the rise of the AI utility companies. Iris Energy, now IREN Limited, has become basically this massive player that's building out compute capacity. And so the biggest point here is that, you know, they're going to now provide not only energy, utility and grid, they're of course going to provide the GPU as a service as well. You've got a lot of other companies entering the space. You've got the Ciphers. You've got Cumulus AI, which you've maybe talked to; they're coming into the market now. And then of course you had Applied Digital; we both know someone that actually went over there as their COO. Now, a lot of these are Bitcoin mining companies, and so they've basically turned all their compute capacity into scale and compute and then made it available. And by the way, these companies are just ripping. They're going up like 10, 20 percent per day right now. Because the belief, going back to our simulated debate earlier, is that there is just simply not enough energy. So these guys are creating energy, they're building more power, then they're powering the compute, and then they're making that all available to the hyperscalers and application builders. So this is a whole new system of creating the capacity that the market's going to need. And this is another company getting a big tailwind from it. Wow.

Patrick Moorhead: We covered a lot of topics, Dan. This is great stuff. You'll be watching this on Monday. It's a big week of events: the Dell Tech Analyst Summit, Oracle AI World, the renamed CloudWorld, Dreamforce, you've got NetApp. Very busy. And then we have F1 here in Austin, which I'm looking forward to. Hey, thanks everybody for tuning in. Hit that subscribe button and check out Daniel and me on the socials and on the broadcast. Thanks and take care.
