Ep. 288 OpenAI’s Valuation Debate, Marvell’s Network Bets, and the Next Bottlenecks for AI Growth

This week, Patrick Moorhead and Daniel Newman unpack why today’s AI moment feels less like the endgame and more like Netflix’s DVD-by-mail phase—the very beginning of a transformation that will redefine the tech industry. They dig into soaring AI valuations and the growing debate over whether today’s leaders signal durable platforms or bubble dynamics, then shift to what really matters under the hood—AI infrastructure, with a sharp focus on networking and memory, informed by insights from Marvell’s Industry Analyst Day. This episode also breaks down recent market moves across major tech players, before closing with a forward-looking take on where AI is headed and what it will take to stay competitive as the pace of change continues to accelerate.

The handpicked topics for this week are:

  1. The Importance of Input in AI: The effectiveness of AI is heavily dependent on how well users can prompt these systems. The shift from skepticism to embracing AI in research and development showcases the need for businesses to adapt and leverage these tools effectively. The right input can yield incredible outputs, and understanding this dynamic is crucial for anyone looking to harness AI's capabilities.
  2. Big Funding Headline: OpenAI’s reported mega-round and valuation chatter (and why it’s increasingly about funding the industrial build-out, not just a “software company”). 
  3. “Circular deal” / Corporate venture logic: A look at examples like AWS investing in infrastructure, while also supplying it, and why that doesn’t automatically invalidate the investment case.
  4. AI build-out constraints: Pat & Dan break down why the real stack is compute + networking + memory + storage + power—and the permitting, community, and political friction around scaling it.
  5. Marvell Industry Analyst Day recap: Why “the AI network is the AI computer,” plus Marvell’s positioning and strategic bets (optics/photonic fabric, scale-up/UA-Link, silicon/design enablement).
  6. The Government “Tech Force” Discussion: Bringing AI/tech talent into government, modernization, and execution capacity.
  7. The Flip: Hosts debate whether OpenAI is the real deal or the canary in the coal mine. Pat argues durable platform + enterprise distribution + investor conviction. Dan argues model parity, commoditization risk, and the need to become more than a model company.
  8. Market Volatility “Bulls & Bears”: Why AI-linked names sold off; how sentiment swings on OpenAI risk and capex intensity. Hosts break down the Oracle sell-off with a revenue/EPS/guidance narrative, capex and debt concerns, and why RPO/IaaS growth still matters. And Broadcom’s pullback: margin/insourcing fears, why “X goes to zero” narratives are overblown, and the reality of hyperscaler in-house efforts.
  9. Micron’s Breakout Moment: HBM demand surge, memory becoming a primary constraint, and why Micron’s quarter/guidance hit so hard.

For a deeper dive into each topic, please click on the links above. Be sure to subscribe to The Six Five Pod so you never miss an episode.

Disclaimer: The Six Five Pod is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript

Daniel Newman:

They're the Netflix-shipping-DVDs moment for AI. It's like, we're not seeing the potential of this thing. We're seeing the teaser. We're seeing that here's one little thing that you can do with AI. People are actually trying to say that the entire TAM is how many people are subscribing to OpenAI.com. If that's what you think AI's TAM is... CEO of Moor Insights & Strategy, Patrick Moorhead. CEO of The Futurum Group. Do we know what we're talking about, Moorhead?

Patrick Moorhead:

We absolutely know what we're talking about. Wow, look at that new intro, you know? They brought in some Skinny Dan and some Skinny Pat.

Daniel Newman:

There's some.

Patrick Moorhead:

I still saw some Fat Dan. Jack Dan. I know, though. I mean, that was when you were with Famous People. I think they snuck one Fat Pat in there as well. But no, it's good stuff. Good stuff.

Daniel Newman:

Can't we AI that? Can't they, like, fix the image, Pat? Where, like, it's still the same, but it's... Skinny Pat is now being superimposed over Fat Pat in the interview with Jensen or whatever it was.

Patrick Moorhead:

I don't know. I've used some of these tools, and the last video I did on Sora 2, it gave me three chins. I asked it to do a video of me doing a bench press with three plates. It gave me three plates, but it also gave me three chins. It thinks you have a steel-plate chin. I know, something like that. Anyways. Hopefully, for all those who are working this week, like Dan and I, at least Monday and Tuesday, I appreciate you tuning in to episode 288. Dan, we've made it another year. I guess we have to make it another two weeks to see if we get canceled between now and then. Do people get canceled anymore? I don't think so. Nope, I've tested it. Yeah. Anyways, before we dive in, I just want to say: Happy Hanukkah, happy holidays, Merry Christmas, whichever one applies to you, whatever you do or don't celebrate. Exactly, exactly. It's so funny, I just got into the habit of saying, for those who celebrate. Of course that's what we all mean, but we are more politically correct in this age. We don't want to offend anybody. We don't want anybody running to their safe spaces. So we have to put in that caveat: for those who celebrate, whatever you celebrate, hopefully you'll have peace with your family and your faith, whatever you sign up for.

Daniel Newman:

But hey, before we dive into this.

Patrick Moorhead:

I say don't @ me and then you @ me. Don't @ me. I don't care. I'm the guy who does stuff that you tell people not to do just to keep you on your toes, Daniel. Let you know, remind you that I love you. I am your business wife.

Daniel Newman:

I missed you, though, man. You were away. Our show has been gone. I know.

Patrick Moorhead:

Yeah, my fault for the last show. I went to London for a couple of days and just took off time. Probably the first time in, I don't know, 10 years where I literally did no work when I was gone. And it was worth it. That's what everyone out there knows, you know, I didn't bother. I really missed you though. And then you were kind of playing hard to get, uh, when I came back and I, I was wondering about, uh, the level of affection that you have for me and the status of our relationship. I got, I got a little bit insecure.

Daniel Newman:

Yeah. Well, you know, I, I, I, I may have found new love while you were gone, you know. Or maybe abstinence is making the heart grow fonder.

Patrick Moorhead:

Listen, I saw what you did. I saw you taking pictures in front of a big nuclear fusion or fission. I forget which one. That's fusion.

Daniel Newman:

That's fusion. Yeah, fission or the big pipe, but also that's the SMRs. But yeah, those are the ones that melt down.

Patrick Moorhead:

Yeah, I mean, you stayed at a holiday inn last week and you became an expert in this technology. It's pretty impressive.

Daniel Newman:

It took, you know, thanks to AI. And I know we've got some stuff to actually cover today. But what I can say is, thanks to AI, when you know a little bit about something, you can learn a lot really fast. Now, that's one of the kind of hacks. I spent about six years, from around 2014 when I met you to about 2020, figuring out the semiconductor industry. Unbelievably, we've been at this more than a decade, or we're having our more-than-a-decade anniversary at this point. So it took about six years to learn semis, and obviously I still learn every day. I think you would agree there's no end to learning there. But what it took me six years to do before AI, I learned in about six weeks about nuclear, just because of what you can actually do with these tools right now. It's unbelievable. The progress that we're able to make as humans, if we embrace this stuff, is mind-blowing. And it's not just about three plates and three chins. Although that stuff is pretty cool, there's real valuable stuff we can do. It is good.

Patrick Moorhead:

Yeah. That AI stuff's way overrated. In one year, I went from, I'm going to fire anybody in my company from writing with AI to, I'm going to fire you if you're not using AI to do research and all this stuff.

Daniel Newman:

It got so good. Like a year ago, it was garbage. It really was slop. And now that you've figured out the input-output thing, the prompting you're doing, and I've watched you do it, the work you do with AI is really impressive. Because it takes real work on the input. And that's what people don't understand. They want to use AI like a search engine. So they just put in, like, write me a paper about nuclear fusion, and it'll write something, and you'll be like, this is garbage. This is absolute crap. Because a real prompt might be a page long to actually get this thing to work, but it will give you a 20-page report in response that's going to be really valuable. Again, Pat, this goes back. I saw a funny commercial the other day about the massive uptake of GLP-1s. It was talking about how we got into this unhealthy movement of people being out of shape and how we were trying to have body positivity. And it turns out, in the end, that it was never about body positivity. People just didn't want to do the work, because the second the GLP-1s became available, everybody got on them. It's the same thing with AI. If you don't do the work, it doesn't actually deliver a whole lot of value.

Patrick Moorhead:

Don't do that. Don't do that. Yeah, I love that. So before we dive into The Decode, a couple shout-outs here. We've got a Decode Summit, and consider this a mini summit, talking about next-gen AI infrastructure, which I think we probably spend three quarters of our time here on the pod discussing. December 29th, register at sixfivemedia.com/decode. Daniel, the Six Five, you and I and a couple of our compatriots are going to be at CES doing Lenovo Tech World. We're going to be broadcasting from the Sphere. Check it out on X. I think the Lenovo data center group put some content out already. We're still lining people up. We've got Synopsys, a fireside chat with CEO Sassine Ghazi, there as well.

Daniel Newman:

And I'm also, I'm sitting down with, uh, NXP CEO, Rafael Sotomayor, new CEO. Oh, that'll be wonderful. Yeah. A lot of stuff going on over there. And that'll be between Pat's runs down the black diamonds. That's going to be like get off the mountain, come to Vegas, go back to the mountain.

Patrick Moorhead:

It is. I don't know. There's not a whole lot of snow out there, so I might be sledding or hiking. We will see. And then a couple of weeks after CES, because we needed another event, is A View from Davos. Daniel, I've been doing this for two years. I went there last year, where essentially we go in and have a ton of important conversations about tech, about policy and AI, but we also interview some of the leading CEOs in tech out there, and we're putting our calendars together. If you have a CEO, or you're a famous person that you think we need to get with there, hit up our website and... Only if you're famous. No, we only interview famous people. Famous. Or that's how you know you're famous: you've been on the Six Five.

Daniel Newman:

That's true. It's like Rogan, Six Five. I don't know.

Patrick Moorhead:

We got 100,000 subscribers, and we were cranking out some pretty good page views. So I think we're doing well. I mean, listen, I'm happy but never satisfied; we can do a lot better. And Daniel, you know, I need to step it up for 2026 and make this even better.

Daniel Newman:

Yeah, I'm generally miserable and dissatisfied, which is what's really been the fuel of the growth of our company.

Patrick Moorhead:

It's true. I'm the same way. You know, I sat in my company offsite and, like, anyways, we didn't need to dive in.

Daniel Newman:

You're like, that might be too much story.

Patrick Moorhead:

It might be too much story. I'm not going to talk about that. So, hey, let's dive into The Decode, where we look at the latest tech topics out there and try to separate the signal from the noise. Let's dive in. All right, a lot of news out there. Probably the biggest one that got the most chatter is OpenAI raising more money, another round at a $750 billion valuation.

Daniel Newman:

Hey, Pat, I gotta correct you. I heard this morning it could be $830 billion.

Patrick Moorhead:

Yeah, I heard there was somebody out there saying, hey, stupid sovereign money is coming in. Let's push this thing to $830 or $880 billion or something like that. At this point, it's less about just funding a company. It's more about funding the entire industrial build-out. You look at who benefits. You've got compute, power, networking, memory, training, and inference commitments. And even Amazon is in talks to invest $10 billion. Some people are calling it a circular deal. I call it vendor financing. Who cares? And I think we'll likely see some Trainium coming in here. I think it reinforces the insatiable demand for AI compute. On the AWS side, it also validates XPUs, which I've been very consistent about. Go back 10 years, when I first started talking about ASICs, what they do well and what they don't do well compared to GPUs, CPUs, and FPGAs. And here we are. I think there are some operational risks: permitting, grid connects, community pushback. We saw Bernie Sanders going crazy. I'm sure if somebody finds a way to pay him some money or do something, he will lighten up. You've got the House trying to push through quick permitting. Heck, I was in Washington, D.C., watching from the audience as Donald Trump signed an executive order on reducing the number of steps and the time it should take to get permitting to build these data centers and the power around them.

Daniel Newman:

Yeah. You know, Pat, I wrote a long opinion on the Amazon $10 billion stuff. I think I've said this before: corporate venture really has a dual mandate. One is to make good investments, and two is to invest in companies that will benefit from whatever resources that corporation has to offer. So if OpenAI is willing to use more Amazon infrastructure, more chips, more whatever, it's not a terrible thing. But the idea that $10 billion is being given and $10 billion is being given back, I think, misses a lot of important points. One is that the 10-and-10 framing implies the 10 going in is never worth more than that. Of course, if they invest 10 and got in at, say, a $500 billion valuation, and then there's a round at $830 billion, that 10 just became 17 marked to market, right? That's one thing that's being completely missed. And if they do a trillion-and-a-half-dollar IPO, that 10 became 30, which is a pretty good bet. In the meantime, you're getting a company to use your stuff, and you potentially have the ability to say they're using it. I get that the AI naysayers want to just poke holes in all of this, but all of these corporate venture teams have that mandate. They have to make investments in things that are going to be worth more. And so, yeah, I guess if OpenAI fails and this whole thing crashes and burns, then it's a bad investment. But if OpenAI does an IPO and these guys get liquidity at 2x or 3x, in some cases the early investors at 5x or 10x, that's great investing. One other thing that keeps coming to mind is that OpenAI must be telling investors something about what it's doing and what it's building that is just not being presented to the public.

Because I can understand it: right now, when you're only doing around $20 billion in revenue and the models only get you to maybe two or three hundred, people modeling this thing are going to say, yeah, it's never going to make money. Why would you pour your money in? I got a lot of pushback. A lot of people will tell me I'm stupid, and that's fine, whatever. But I don't think Amazon and NVIDIA and Microsoft and Google are full of stupid people who just want to incinerate their capital. I do think they have a willingness and an appetite for risk. And I do think they know that if they get three or four right out of every 10, they'll make a ton of money. So they are willing to take some chances and lose, but they're not stupid. And whatever it is they're seeing that gives them the confidence to pour in at these eye-watering valuations is probably something we're not seeing, even you and I. I think that's one of the big misses right now: people kind of assume that what we're seeing from AI today is all of AI. I wrote a Forbes op-ed about this, and I said LLMs are a parlor trick. They're the DVD, the Netflix-shipping-DVDs moment for AI. We're not seeing the potential of this thing. We're seeing the teaser. We're seeing one little thing that you can do with AI. And people are actually trying to say that the entire TAM is how many people are subscribing to OpenAI and Anthropic. And it's dumb. I just want to say that. Hopefully you clip this. If that's what you think AI's TAM is, you're dumb. Don't @ me. I'm going to start doing that more. I really like that line. You're going to @ me all the time, right?
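The back-of-the-envelope mark-to-market math Dan walks through can be sketched in a few lines. This is a minimal illustration using the round numbers from the conversation (amounts in billions of dollars), assuming a simple proportional stake with no dilution, which is a deliberate simplification of the argument:

```python
def mark_to_market(invested: float, entry_valuation: float, current_valuation: float) -> float:
    """Reprice a stake bought at entry_valuation against a newer valuation.

    Assumes the stake is a fixed proportion of the company (no dilution),
    so its implied value scales linearly with the headline valuation.
    """
    ownership = invested / entry_valuation
    return ownership * current_valuation

# $10B invested at a $500B valuation, repriced at an $830B round:
# roughly $16.6B ("that 10 just became 17" in round numbers)
print(mark_to_market(10, 500, 830))

# Repriced at a hypothetical $1.5T IPO: $30B ("that 10 became 30")
print(mark_to_market(10, 500, 1500))
```

The 2x and 3x liquidity multiples mentioned later in the discussion fall out of the same ratio: the return is just the current valuation divided by the entry valuation.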

Patrick Moorhead:

Absolutely. I'm going to retweet you on all that stuff. So, Daniel, one of the big discussions, the risk points: I always like to say it's not just compute, it's compute, memory, networking, and storage, but it's also power, right? We had Bernie Sanders and, I think, one other senator come out and basically say that it's going to push costs up for everybody. I know you've been working on a project for a company in that sector, but what do you make of this? You know, it's funny. I think it's less of a populist pushback; it looks like a senator discussion, and even the House is getting involved. Yeah, it's stupid.

Daniel Newman:

I'm putting this in the category of also stupid: the idea of just putting a moratorium on the build-out of what is the most important technological advancement for keeping our economy the world leader. You can't control it in the case of the powerful politicians, but look, we have to keep going. We have to keep building. We have to keep leading. We have to speed up development. We have to deregulate. We have to make energy abundant. I spent a lot of time on this deal with nuclear fusion. We need to move beyond just coal-fired plants and turbines, and we need high-volume renewable energy that's abundant and safe. We need to build data centers, but we also do have an issue: we do need to figure out how to make energy more affordable. I've looked at my energy bills and they're high. It's expensive. And for the average citizen, we have to do two things at once. I say that a lot. Too often we look at doing one thing at a time. We need to do both: make energy more abundant and make it more affordable. We need to bring pricing down so that we can keep inflation lower, which, by the way, the inflation print came in super good this week. But the idea of stopping is just so binary, Pat. It's dumb to me. It's like, yeah, okay, let's just stop. All the shovel-ready projects, all the electricians that are out working now, all the construction workers involved in these projects, all the small and mid-sized companies at these sites that are pouring concrete, selling rebar, and actually pulling plumbing and water into these facilities. And yeah, we're just going to stop all the jobs that we're creating through manufacturing, through the build-out of AI. It's almost too dumb to talk about.

Patrick Moorhead:

Listen, you've got senators worth a hundred million dollars. I find it hard to imagine that they're fighting for everybody, with their insider trading, and that's not just a Democratic thing. It's pretty much everybody. Everybody's looking for their nut, okay? Which is: what do I get out of this, and how do I make money? Or how can I create more power to be able to do this? Like you said, in the short term, all of this build-out is going to create more jobs. The alternative is for us to slow-roll AI as a society, lose competitiveness on all fronts with China, and essentially go into a depression. We're going to see how Europe deals with this, because they're kind of in between China and us, right? They've over-regulated. One of the biggest things that came out of Davos last year was an affirmation: when I polled policy leaders from four different countries, they said, we over-regulated this and we need to get moving, or we're going to find ourselves behind. You already see Europe crumbling on the manufacturing side as Chinese vehicles come in. And you've seen what happened to the German automakers, Porsche, BMW, and others. They're just getting eviscerated. And the amount of job loss, I think, is the tip of the iceberg. I think the smart thing to do is to move forward. We really don't know exactly what is going to happen. If I relate this to the time frame of every other major industrial and electronic revolution, there will be job loss, but there's also job creation. The AI doomers were out there, they got tamped down, and now they're back under the power thing. I do think that if power prices do rise for the average person, that is an issue and should be addressed in some way, shape, or form, but primarily by building more energy,
not subsidizing, and doing it with fusion, fission, gas turbines, solar, whatever it takes. Good conversation. Hey, let's jump into Marvell's Industry Analyst Day. You and I were both there, front and center, and we also shot a video that everybody should look at. My net from this, Daniel, was the AI network is the AI computer, right? In an AI factory, the network isn't just plumbing. It's about performance. And it's very simple. If you look at the core of what a GPU or an XPU is, it has access to a specific amount of memory that the application, or more importantly the model and its token generation, has to fit inside. The more memory, the more stuff you can do. But when you bounce off that HBM wall, you have to connect in a very fast way to another accelerator with HBM to have one planar surface of memory. And the way you do that is through networking. It was one thing to network two GPUs or XPUs sitting right next to each other. Then you started networking the different sleds in a rack, then you started connecting racks, and then racks and fleets to other fleets. And now we're connecting multiple data centers to have this gigantic contiguous memory footprint. The only way you do that is with the highest-speed networking. If nothing else, what I walked away with was that the company has three big bets. First, Celestial AI. Although people like to call it a networking thing, you can also disaggregate HBM from the GPU and have a pool of HBM. I don't think the company is leaning into that yet because it's a little bit complex. Second, UALink, which is scale-up, where Marvell is the key silicon provider. And it was funny, they threw out something really interesting. You think of scale-up as inside the rack, but if you do it well enough and your cables can go longer, you can go across the rack and between racks, 50 meters in fact, which kind of scrambles your brain on the definition of scale-up. Third, the power-efficiency side on the ASICs: they showed an 80% power reduction on embedded SRAM, which I think is a key differentiator out there. And what I'll end with is that Marvell is the Switzerland of AI networking. You want to buy NVIDIA? Marvell will sell you some optical DSPs for the scale-out network. Build a custom ASIC like Trainium? Marvell designs the chip and the packaging, or they'll just sell you an I/O chiplet or I/O IP. And that's where, by the way, all of these day traders and hedge funds are getting really confused on the amount of content that Marvell sees. If you're open to an alternative scale-up to NVLink, Marvell will sell you the UALink switch chip. And finally, if you want to break out of the HBM memory wall, Marvell will sell you Celestial AI photonic fabric. So check out the deep-dive analysis I did on my website.

Daniel Newman:

Yeah, you did a great deep dive, Pat, and I don't have a lot to add. But one thing really worth adding is the volume and size of the opportunity that sits beyond the XPU: the XPU attach. That is such a missed point, and it's why I think Marvell gets so unfairly judged in the public markets. People are obsessed with just the XPU. And by the way, even if Broadcom took an XPU, or a project goes in-house, or Marvell loses part of a design, a lot of these designs aren't any one vendor's. It could be GUC or Alchip, and it could be an in-house team and then partially Marvell. It's not like Broadcom, which is kind of known for doing the whole design, and even for Broadcom that's a vulnerability too: becoming less of the full design and more of a part of it. But the point is, Marvell's overall attach is two or three to one on the XPU. So of that big data center number, only about 25% is actually custom chips. The rest of it is attach. And one of my big takeaways is that the attach is going to get bigger with optics and everything you called out. I just think a lot of people miss that, and that's one of the really valuable things we need to keep beating on: this opportunity is so much bigger than just compute. And now that we're seeing memory explode, Pat, and other parts of the supply chain really get constrained, we'll realize that moving data is the next big constraint as we increasingly figure out how to do compute more efficiently.

Patrick Moorhead:

Yeah. Hey, I'm going to skip this topic because we already talked about it: the House bill to support the AI data center build-out.

Daniel Newman:

I guess the only thing to note, right, Pat, is they want to go faster while Bernie wants to go slower.

Patrick Moorhead:

You know, I hear you. I think that's good. What I'm trying to figure out is the difference between Trump's executive order that says we're going to do this, which obviously must have gotten tied up, and this bill. Okay, let's go to the next topic. The U.S. government launches Tech Force to bring in AI and tech talent. I think it's really an admission that AI competitiveness is not just all the infrastructure; it's the people, the procurement, and the ability to execute. I read an analysis that said the average government employee gets an annual rating of 95, okay? And the other thing is that government employees, at least in my experience, are not exactly the most competitive people outside of the military. So you attract people who are looking for a very steady paycheck with a pension, which typically are not the biggest go-getters in technology. So whether it's cybersecurity, automation, fraud detection, or internal productivity tooling, they're trying to attract people in a very cool way. It kind of has a military vibe to it: come join the Tech Force, right? And let's see what we can do here. So yeah, pretty interesting, and I think it will attract some decent tech talent.

Daniel Newman:

Yeah, I mean, I think that's pretty straightforward, Pat. You can look at it as a snowball: more energy enables more manufacturing, more manufacturing enables more job creation, and more job creation enables more opportunities. And with AI, as we go down this path of building, we're going to need more skilled people. To me, I don't want the government to lead everything, but I want them to lead by example in the things they do do. And the example here is: we want to lead in talent, both public and private. In this case, we don't want the public sector to fall so far behind, like it historically has with tech, that it can't move and becomes a paralysis for our ability to innovate. It's good stuff. I think we'll see a lot of this kind of thing for some time. I don't think this is going anywhere.

Patrick Moorhead:

Yeah. Hey, let's jump right down to The Flip here. And I think a good debate we saw is the big OpenAI discussion: is this the tulip bubble, the canary in the coal mine, or is OpenAI the real deal? It really started with that Brad Gerstner interview that I think Satya was part of and Sam Altman was on, where they were discussing valuation. I think people walked away thinking that Sam Altman was very defensive in how he justified the amount of investment. Then we saw a lot of other things come out, and I think that was where the latest bearish cycle in AI started. Then Gemini 3 came out, where Google really wasn't relevant in the conversation until then, and suddenly they were. So, hey, let's dive into this. Is OpenAI the real AI deal? All right, I chose my own side. I called my own name. Thank you. So, yes, I do think that OpenAI is the real deal. Google may have put out the Transformer paper first, but they sat on it for seven years and didn't do anything with it. Then OpenAI took the ball, and here we are today. I always like to say, and I even use this in team meetings, that the best way to gauge the true value of anything is how much money people are willing to pay for it. That $750 to $800 billion valuation implies that investors believe they're buying core infrastructure. I think, Daniel, you even said earlier in the podcast that there is information investors see that we don't. And I highly doubt that these super-informed billionaires, countries, hedge funds, and family offices investing in OpenAI would do so if they didn't believe in it.

And if we truly believe in the $4 trillion GDP adder, putting an $800 billion valuation on it makes perfect sense. You know, it took OpenAI about a week to bring out 5.2, which on many important applications actually trumps any other model. It was almost like: okay, red alert, let's decide we're not going to do some of these sideshows, let's actually focus for a couple of weeks on something, and boom, here we go. The other thing is, look at who OpenAI has hitched its wagon to from an enterprise perspective. I think you and I have both agreed that the biggest value unlock is going to be for businesses, and the leader in business IT today at scale is Microsoft. They are the absolute full stack there, all the way from the infrastructure to the applications and the application developers and everything they create. Yes, OpenAI is having to go get money from a lot of different companies, but that's just because they need a lot of infrastructure to drive this. Some people are poo-pooing partners like Oracle, but I think that's just noise if we look at the value this company can bring them. And that's it. They're the real deal. OpenAI is the real deal.

Daniel Newman:

OpenAI is absolutely, positively the canary in the coal mine. This is where the AI trade implodes. But the good news is, I don't believe AI as a whole has any risk — I just think all the risk sits with OpenAI. This is a company that's overcommitted, it's overextended, and it's no longer over-the-top the best at anything. Its models are just okay; they're at parity with others. You've got companies with more money, more resources, and bigger customer bases building products and services that compete. And once you say, hey, do I really care which model is doing the work for me? Am I okay with Gemini? Am I okay with Anthropic? Am I okay with Nemotron 3 or some other open-source model — not DeepSeek, but the point stands — then OpenAI, to be successful, can't just be a model company. It has to be a hyperscaler. It has to be a software company. It has to be an agentic platform. It has to actually do all the work to justify a valuation of a trillion dollars or more. It can't just be the company providing models to other companies. So here's what's going to end up happening: we're going to continue to see model parity. We're going to continue to see other models become better and better, more accessible, more usable, more affordable, less volatile, and less arrogant than OpenAI. Sam Altman himself is a liability to the company. He wasn't just dismissive about the valuation — he was flippant, basically saying, screw you, put your money in, we're worth all this. But in the end, everybody evaluating this thing asks: why would this be worth a trillion? Why is this worth a trillion and a half if I could just replace 5.2 with Gemini 3 or Opus 4.5 or any other model out there? It's not a problem for AI. The great news for everyone is that Oracle is not really in trouble, because Oracle will see that demand come from someone else.
If they build the compute, another company building AI will use the compute they make available. Even CoreWeave — and I don't love the fact that they use six-year depreciation and lever up all their GPUs — could be okay as long as the AI industry keeps growing, and I think that insatiable demand exists. But the problem with OpenAI is that what they're doing can be replaced instantly by another company. Once it gets replaced, the demand goes elsewhere, and OpenAI is no longer worth what people are saying it's worth. When that happens, that will possibly be the first implosion. This could absolutely be the Netscape of the AI era. It's not that what they did is wrong, it's not that they weren't first to market, and it's not even that what they built isn't good. It's that "good enough" — and in some cases parity — will drive a seismic transfer of usage from one platform to another. There's a real possibility it doesn't stick, that people go elsewhere and use other models. And if these models do in fact become commoditized, which I think they could in some way, and OpenAI isn't plugged in as part of that enterprise stack or that robotics stack, then they just become one more company offering a model. No way they can grow enough revenue to support the trillion and a half dollars of commitments they've made. No way they can drive enough revenue to reach profitability. The musical chairs are going around, and somebody is going to sit down on a round of capital for OpenAI. It's going to be the last up round before the down rounds begin. AI is going to be fine, but OpenAI, you're in trouble.

Patrick Moorhead:

All right. Thank you for participating and giving me another W. Thank you.

Daniel Newman:

I don't think so.

Patrick Moorhead:

All right, Daniel, let's jump in.

Daniel Newman:

I think I won big.

Patrick Moorhead:

Bigly. I was actually curious how you were going to thread the needle between the AI trade and OpenAI. You did a pretty good job.

Daniel Newman:

I could do it without blowing up everything I say every day. Exactly. And by the way, as I was making that argument, I was really thinking to myself: isn't that the risk, though? I don't think the demand for AI goes away — I think we're just getting started — but isn't there a world... Talk to your son who builds, right? If the things ChatGPT does for us can be done by another model like Gemini 3, and you're building these agents and workflows, isn't there a point where you can just plug in one for another? So unless the enterprise has said, we're going to pour all our data into OpenAI and we're going to build our entire agentic workflows through OpenAI, isn't that a huge risk? They're a model that could be replaced.

Patrick Moorhead:

I mean, they've got a huge user base, but I think you answered it yourself: you can't just take what you've created with an OpenAI agent tool and import it into Gemini. So it gets down to this agentic orchestrator, this agentic tool set, which is the new lock-in at this point, because there's no standard for any of this stuff. And I think there are 10x more developers on OpenAI than on Google at this point. The cracking question is: how deep in are they, to where they can't do an about-face? Now look at the cost of it — Google is full stack and OpenAI is not, so conceptually Google can get the revenue and they have a lower cost, which... Can I make an interesting prediction?

Daniel Newman:

I know we've got to get to the last segment. I think OpenAI is too big to fail, but I think it could very easily semi-fail into Microsoft, which was your argument. Microsoft doesn't have what OpenAI has — all that compute capacity it's acquired, all that IP it's built — to basically become Microsoft's version of what Google's built. That to me is the ultimate fallback: they just get swallowed up by Microsoft, Microsoft takes that, and it becomes the platform. And then OpenAI and Microsoft fully integrated — not what they're doing today, but fully integrated — becomes meaningfully competitive with what Google can do in the long run. Good stuff.

Patrick Moorhead:

Hey, let's dive into the next section here, Bulls and Bears. All right, Daniel, time to talk the tech and the market. We had the biggest run-up with infrastructure, and I think we saw one of the biggest sell-offs with Oracle, Broadcom, and CoreWeave. What is going on here?

Daniel Newman:

Yeah, what a beatdown. Look, some of this is just that we've had a big run, and sometimes people go, ooh, big pullback. But Oracle for sure went all the way up and all the way back down — a full round trip. So what happened is, I think, the first potential unwinding: everybody started to zero out everything OpenAI did, because there's concern about whether OpenAI can actually sustain and pay for all this stuff it's committed to. Obviously, today's news of them raising a hundred billion more helps the story that they're going to be able to get there. But that impacted companies like Oracle, where a huge part of its RPO and a huge part of its growth commit was tied to OpenAI. It's hit companies like CoreWeave, which has had to keep raising money, because there's basically a split in market sentiment over whether this demand is real. If you believe the demand isn't real, you start to look at which companies would be most impacted: OpenAI, Oracle, CoreWeave — companies that are heavily debt-levered, that have diluted themselves to raise more money, whose depreciation is massive. Oracle was raising a ton of money, its bond values went down, and that made the stock absolutely crater. But Pat, the thing is, the debt fears are only really a problem if AI demand were to meaningfully wane. What we're seeing now is that there's an appetite for investment and there's an appetite — demand — for AI. As long as those two things exist, these are little corrections. But if you're an investor, you're probably thinking: look at companies like Google and Meta that just print cash all day and can finance this themselves.
Because I think I saw Goldman Sachs came out with something saying that basically 90% of the boom is being financed by cash and cash flows; only 10% of it is being funded by debt.

Patrick Moorhead:

And so debt will go up as the investment goes up. But yeah.

Daniel Newman:

But that's not a "tulip bubble." Correct. And I'm sorry, but at the same time, clearly Oracle is like: we don't care what happens to our stock for the next year or two, we cannot miss this. They just don't have the cash flows of a Meta or a Google to fund this stuff, so they're taking a different route. But I've always said, Pat, Oracle is just one of those companies that's been perpetually underestimated but tends to consistently outperform. Yeah. So I'm not sweating Oracle. I'm a little more concerned about CoreWeave, just because what they're trying to do with both debt and dilution really depends on hypergrowth to keep working.

Patrick Moorhead:

Good stuff. That's my take. Yeah. One of the drill-downs here: we saw Oracle's earnings. Net-net, they had a revenue miss, an EPS beat, and a guidance miss, and I'm going to attribute this to the lumpiness of all of these data center build-outs. There was a lot of conjecture — I think it was after earnings — about them getting pushed out, which I think ended up being nonsense. You leaned into that in a couple of hard-hitting X posts. I mean, you and I both went to the Financial Analyst Day — or maybe actually just I did — and Oracle has a very clear story on when they commit capital and on their margin structure. Anybody who has a question about how Oracle makes money in infrastructure, I defer to those materials. When you step back, though, RPO was still up 438%, okay? And that is contracted demand — this is not LOI demand, right? IaaS plus SaaS was up 34%, which obviously will go up dramatically when that RPO turns into actual revenue. And if you parse it out, IaaS was up 68%. So I think most of the negative connotation was about the capital intensity going in, and we just had the debt discussion. One little side note: they sold their stake in Ampere, and I'm still trying to figure out what this means for the company and how it can be competitive in the future with all the other hyperscalers doing their own custom silicon.

Daniel Newman:

Probably their close tie to Stargate and the Arm crowd — isn't Ampere involved in that? That whole circle; maybe they didn't need to actually fund it to get out. Funny, they said something along the lines of, "we don't want to be in the business of manufacturing chips, but Ampere's a fabulous company." So the choice of words was a little interesting. I have to imagine there's something there with the whole SoftBank side of the house, because they're all so close together. Not much to add there, Pat. I think I beat the Oracle drum in the last topic, so let's move on. Wasn't there a company that crushed it in their own NVIDIA moment this week? Yeah — Micron. Let's dive into that. What happened? Beast mode, right? Crushed EPS, crushed revenue. But let's be clear, this was about the guide: what did they guide, like $5 billion above the expectation? The company is shifting its consumer capacity over to HBM for the data center. The demand is absolutely at its peak — customers are on their knees trying to get access, Pat. If you don't have HBM now, you are screwed. Absolutely S-O-L. And guess what? A lot of roads — especially if you want U.S.-made HBM, all roads — lead to Micron. Pat, this was kind of their 2022 NVIDIA moment. You and I have a fun little squabble about the boom-bust cycles at times, but I keep saying: is this the end of the boom-bust cycle for memory? And by the way, there's a concern that all these companies are saying, "we can make way more money just doing memory for data center compute," so I start to worry about phones and PCs, because you start to see that rotation — we still have to serve that market. But Pat, this guide was just eye-watering, and this is a company that was only trading at like 10 times forward, so it wasn't a company that's been heated up.
And, you know, for months I've been pounding the table saying this was going to follow, and we're seeing it now. What's the thing after compute? It's memory. So listen, I don't want to get too into the granularity here — we've got to keep moving quickly, Pat — but gosh, this is their moment. I just don't see anywhere in the next three to five years where they're not going to be able to beat, just based on the demand that's expected for these data centers.

Patrick Moorhead:

Yeah, listen, I'm hopeful that Micron continues to invest so they solidify their future. I think their competitors are banking on them just giving all their money back to investors. This is the good time, but they also need to prep during the bad times, and I'm not convinced that Micron is currently doing that. Their executives keep holed up a lot and don't talk much outside of financial events, so you really don't know what is going on there. So hey, let's move to Broadcom, right? The faster they go up, the harder they fall. I think the question on Broadcom was all about XPU margins in the future, where customers like Google will be doing more themselves — almost a mirror image of the criticisms of Marvell. And it's funny, I think it came down to one comment Hock made in response to a question that some investors just didn't like. They were concerned about the margins, but Hock was very clear that it would take many, many years for these companies to take over what Broadcom does, aside from the connectivity intellectual property. He wasn't talking about the IP, the quality of the SerDes, the quality of the networking, or the work in photonics; it was all about the process of integrating the front end and the back end with the IP and spitting out a chip — Broadcom monetizes that entire base. So that's what happened: Broadcom beat the quarter but missed the vibe, and margins are the new question. And it wasn't even any specific number; it was just this classic AI fear.

Daniel Newman:

Yeah, by the way, my read on it was that the idea that Broadcom won't be needed by the hyperscalers bubbled up. Yeah. And he was a bit flippant about it — he just didn't really seem to consider that possibility. You know, Pat, you and I have advised — and we can't say anything specific — some of these companies that try to take more in-house than they should. And chips are hard. These are big companies that have taken big swings at minimizing the outside help they get to build their own in-house chips, and it doesn't always go well. So there is a role to play. As much as I'm the guy who thinks AI replaces everything, you can't just punch this into an AI right now and get a great accelerator; there's a lot of work to be done. So I don't know that there's a quick... Again, every week there's another story of "Broadcom is going away," or "Google is going to abstract away PyTorch and no one's going to want NVIDIA." These stories are so freaking stupid. They are so unreasonable and so zero-sum that you and I have a full-time job just dispelling them. It's not like the story is zero — it's just not 100, you know what I mean? It's not like NVIDIA or Broadcom is overnight no longer needed, and it's not like they won't innovate in other ways in between. But maybe that just deserves some margin, or maybe there's just some compression that comes with it. I don't know.

Patrick Moorhead:

So Daniel, we have hit time. We've got a lot of other topics we could have leaned into, but hey, we can't do it all in one show.

Daniel Newman:

Our poor audience is gonna have to come back for more. Exactly — if they want more. I bet there'll be some AI stories next time.

Patrick Moorhead:

I don't know, Daniel. I think that's just a fact.

Daniel Newman:

Quickly — I know you gotta go — what are you doing between now and the end of the year? You get a little time, a little rest?

Patrick Moorhead:

Yeah, yeah, I'm gonna hit the ranch. We're gonna hit the ranch next week for Christmas. Family's coming in and then gonna go skiing in Snowmass the week after.

Daniel Newman:

Amazing. I'm going to go hug my mom, going to Chicago, give her a big hug. Might see my dad too, I don't know. And then I'm going to go to an undisclosed location somewhere in the Caribbean. And I'm gonna walk around the entire time with my shirt off, and I'm gonna come back as Tan Dan. My kids have called me Tan Dan. That's my new nickname is Tan Dan.

Patrick Moorhead:

Yeah, I am definitely not tan and probably won't be tan when I get back, but you never know. The sun is good, but the sun is bad. The sun is very good. So hey, I just wanna thank everybody for tuning in. We appreciate you. It's been a great 2025. Dan and I may or may not crank out a final show for the year; if we don't, we really appreciate your support and your loyalty. Hit that subscribe button, tell your friends and family at holiday dinner that they need to tune in here. And if you wanna shitpost and play armchair quarterback and podcaster, you know where to find us on X. Take care. We love you. Bye-bye.
