Edge AI, Physical Intelligence, and the Path to Scalable Automation

At Davos, Qualcomm’s Nakul Duggal explains why Edge AI and physical intelligence are redefining how AI scales, and why robotics, efficiency, and real-time execution are becoming the next major frontier beyond the data center.

AI conversations at Davos often start in the data center. This one doesn’t stay there.

From Davos, Daniel Newman and Patrick Moorhead sit down with Nakul Duggal, EVP and Group GM for Automotive, Industrial, Embedded IoT, and Robotics at Qualcomm, to unpack why the next phase of AI scale depends less on model size and more on where intelligence runs.

The discussion moves beyond cloud-centric narratives to focus on Edge AI as a system-level requirement, not an architectural afterthought. Duggal explains why real-world AI, from autonomous vehicles to robotics and industrial automation, demands real-time execution, deterministic latency, safety guarantees, and extreme energy efficiency. These constraints fundamentally change how AI must be designed, deployed, and monetized.

As AI becomes physical, the conversation highlights why robotics, industrial platforms, and intelligent machines are emerging as the next major growth vector. Qualcomm’s evolution from mobile-first innovation to automotive, industrial, and robotics leadership illustrates how edge-native design principles, mixed-precision compute, and ecosystem enablement are becoming decisive advantages for scaling AI outside the data center.

Key Takeaways

🔷 Edge AI is execution, not experimentation: AI systems that interact with the physical world must operate in real time, with safety, latency, and reliability baked in. This shifts the AI conversation from training to deployment discipline.

🔷 Physical AI introduces non-negotiable constraints: Power, cost, thermal limits, and deterministic performance define what can scale. These constraints favor architectures designed for efficiency over brute-force compute.

🔷 Robotics is accelerating faster than expected: Advances in model distillation and physical AI are compressing timelines. Robotics is moving from research to deployment faster than most industries anticipated.

🔷 Industrial AI requires ecosystem enablement: Scaling AI at the edge depends on developers, partners, and platforms that can prototype, deploy, and iterate quickly across vertical-specific use cases.

🔷 The next AI wave is hybrid by necessity: Data centers and edge systems must work together. Trillions in AI-driven value depend on connecting intelligence across cloud, network, and device layers.

Listen to the Audio

Disclaimer: Six Five Media’s The View from Davos is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors and we ask that you do not treat us as such.

Transcript

Patrick Moorhead:
The Six Five is On The Road with The View from Davos. Daniel, it's been a good show so far. It's pretty much everything you would expect from WEF: this intersection of policy and technology, with AI as the primary discussion. And we even had some tariff discussions as well.

Daniel Newman: 

Well, it became a very interesting lab. It was set up in advance, and there was definitely some escalation going on. You know, you're bringing leaders from all over the world here. But this is also just such a good platform. It's getting all these people here, putting them out in front to have the opportunity to share their vision. And of course, every year we're here, technology tends to reign front and center at this event. And once again, here we are, AI running front and center, the most important topic in Davos. And even some of the stuff related to tariffs and related to policy, Pat, really sits at the heart of it. It's about critical minerals. It's about access to power and energy, all the things that are basically setting up the ability to lead the world's economy, which is going to be dictated by who wins in AI.

Patrick Moorhead: 

That's right. And most of the AI discussion so far has been focused on the data center. But quite frankly, that's an incomplete discussion, as AI at the edge is arguably as important as the data center. And I can't imagine a better person to discuss this with than Nakul from Qualcomm. Nakul, welcome back to The Six Five.

Nakul Duggal: 

Good to see you guys.

Patrick Moorhead: 

Good to see you too. I guess first question, this is your first Davos and I was just curious to get your impressions. Did it end up being everything you expected?

Nakul Duggal: 

It was expected to be an eventful Davos, so certainly one of a kind. No, I think it was a great show for me. My big takeaway was that if you get past the data center focus on AI, pretty much every business leader that you talk to has implementation of AI on their mind. And folks are getting to the point where there is clearly value that is ascribed to the technology. But what is the recipe that allows you to go scale this such that it actually starts to show productivity gains, starts to show what the world looks like as you deploy AI in your business? I think there is a lot of discussion around it. I did not walk away with a clear recipe that folks have come up with. So I feel, actually, that 2026 is going to be all about how you implement this.

Daniel Newman: 

Well, I think a lot of these sort of philosophical debates that are going on in the market, Nakul, are related to all this investment that's being made. And by the way, we've seen the same thing before, you know, when we talk about both the edge and the cloud. I mean, when you go from 4G to 5G and the service providers are deploying tons of infrastructure, they're thinking, what are the killer apps that are going to drive ROI? Well, right now we're seeing the same in the data center. The first killer app that everybody's looking at is LLMs, but we also see they're not necessarily profitable. So we've got to get enterprise AI. We're going to have to also build this connective tissue to the edge, which, to Pat's point, is the incomplete story: there are multiple trillions of TAM out there that are going to depend on the edge and the data centers working together. Qualcomm, you guys are running down that path now. Anyways, you're getting into the data center too. You're not just the edge. But edge isn't a new thing. For like three decades, it's been a topic, Nakul. I mean, talk a little bit about it.

Nakul Duggal: 

I'll take you back 10 years or so, as you started to think about the modernization of legacy platforms as they started to intersect with advanced semiconductors. The car. If you think about a modern car today, central compute isn't something that we even talk about anymore. But if you go back 10 years, people were even asking, why do I need the car to be connected to the internet?

So when you think about inflection points, to me, what it really comes down to is, what is the utility value of technology to a large business at scale? And when the time between when you are massively investing and when you actually start to recover that investment is predictable, then you'll be able to get to those inflection points quickly.

Today, if you think about how normal it's become to deploy robo-taxi services, as an example, it assumes that the car is running an edge AI model that is designed to be completely safe, can replace the driver, take the driver out of the loop. The car is fully connected. It's a robot on wheels and it's performing the function of transporting you. There is no dependency on the cloud when that function is happening.

So as folks start to think about deployment of AI at the edge, one thing that I think folks don't realize is, there's a training part to the process, and then there is the execution part of the process, especially for edge AI applications, where you're actually performing a function that is real-time, that is latency-grade, that is safety-grade, and that's already at scale.

Now, if you apply this to the infrastructure that is needed across large industries, the last 10 years were mostly spent on connecting endpoints to the internet. But the idea was that you're still processing in the cloud. You're not actually processing at the edge. We are talking to a bunch of different industrial leaders and it's all about OT change, operational transformation change. How do you get situational awareness? How do you make intelligent decisions? Because you know that the outcomes are all happening at the edge. That's where we're spending a lot of our time in terms of our internal transformation.
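
To make the training-versus-execution distinction concrete, here is a minimal, illustrative sketch of a latency-bounded edge inference loop of the kind Duggal describes. The model call, frame budget, and handler are hypothetical placeholders, not a Qualcomm API; the point is simply that inference runs locally against a hard per-frame deadline rather than waiting on a cloud round trip.

```python
import time

FRAME_BUDGET_MS = 33.0  # hypothetical ~30 fps perception budget, not a Qualcomm spec


def run_perception(frame):
    """Hypothetical stand-in for an on-device model call (e.g., a distilled perception net)."""
    return {"objects": []}


def handle_deadline_miss(frame_index, elapsed_ms):
    # A safety-grade system would switch to a fallback behavior here,
    # not retry against a remote service.
    print(f"frame {frame_index}: deadline missed ({elapsed_ms:.1f} ms)")


def control_loop(frames):
    for i, frame in enumerate(frames):
        start = time.monotonic()
        result = run_perception(frame)  # inference happens at the edge, no cloud dependency
        elapsed_ms = (time.monotonic() - start) * 1000.0
        if elapsed_ms > FRAME_BUDGET_MS:
            handle_deadline_miss(i, elapsed_ms)


if __name__ == "__main__":
    control_loop([None] * 10)  # dummy frames for illustration
```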

Patrick Moorhead: 

Yeah, so Nakul, when I first started covering Qualcomm, you did two things, right? You had wireless IP and you enabled smartphone platforms. And I remember six or seven years ago, we had a conversation. I came into San Diego and you talked to me about this new business you were getting into, basically intelligent vehicles. And I know there are a lot of naysayers out there, but here we are, it has become a juggernaut. And now you're extending that edge to even robotics and the industrial edge, which I know you've been doing for a while. You made some pretty big announcements recently. Can you talk about the highlights for everybody?

Nakul Duggal: 

So I run three businesses. I run automotive, industrial, and most recently robotics. Automotive is growing at a great pace. We are doing almost a billion dollars a quarter now, and this business has been growing at about 25% over the last five years. This is a business that I would say is relatively new to us, but it's also a good poster child for the transformation that's going on in the company, where we take a lot of the tech that we build for a smartphone platform, but we have now massively adapted it to a completely different industry, to the point where we are now actually building tech for automotive. It is an edge-only application. So, in terms of the transformation, what it takes to do this at scale, we are already doing it at scale.

We spent a lot of time over the last couple of years thinking about what we need to do in the industrial space. And industrial for us was really about what is changing in the industrial landscape that will create an inflection point in this decade, in the next couple of years. If I go back two years, it was about what is AI going to go do. When we started to look at this, robotics was not on our mind. But if you think about the rate at which AI has progressed across all of the various types of models, physical AI, embodying models into something that is physical, this space is moving far faster than anything you can imagine. However, what it does is you have to be able to implement it at the edge to get feedback as to whether you're making progress or not.

So, in terms of the acquisitions that we've made over the last two years, we acquired Arduino, a massive developer ecosystem. Why? We wanted to be able to get closer to people who are actually prototyping, who are close to problem statements and use cases, who want to be able to run this inside their companies, create labs. One request we get from customers all the time is, as AI starts to get deployed inside our organization, our employees, our engineers don't know how to actually convert these use cases into what we will use. That's something that we will now go do with Arduino: build out Arduino Pro such that it allows customers to experiment and prototype internally.

The other area we are spending a lot of time on is camera AI and video AI. One massive change that's happening because of all of the geopolitics is, I want to be able to control situational awareness. I want to be able to see what AI can see, and I want that to then translate into safety, into protection of my assets, protection of my people. If you think about all of the L.A. fires as an example, massive tragedy, massive loss of life, loss of property, a lot of money lost there. So there is a lot of focus now, as you start to build out new industrial infrastructure, on what the risk is. That's where we are spending a lot of our time.

And then robotics is frankly just growing at this enormous pace. I think how I would summarize it is, there is a lot of focus on building a generalized set of skills that you can apply to really any physical robot. And as those skills improve, you have the ability to create yet another edge platform, an edge application that will drive automation. That's something that we announced at CES. We are working with Figure, we're working with KUKA, and a number of other new players that we're starting to work with.

The way I would summarize it is, to scale this, you have to have products that are designed to operate efficiently, with the right energy constraints, the right cost constraints, and the right level of integration. That's what Qualcomm has excelled at over the last many years.

Daniel Newman: 

So I just want to drill into this a little bit more. Personally, I want you to validate my social posts on this podcast, because I've come out very bullish on Qualcomm's robotics, and I've talked about some of the reasons I think you have permission. And I think right now, you know, there's sort of this bifurcated opinion of who gets to play in robotics, right? Is it just Elon Musk and Tesla? Is it going to be Jensen and NVIDIA? And one of the things I've said is, you know, Qualcomm's ethos, its sort of provenance, its pedigree: it's low power, it's vision. You've built success for the edge, for mobile. So I actually said I think Qualcomm has a very significant and distinct advantage, especially at bringing AI and robotics and autonomy to scale at affordable pricing. Because that's another thing. You can do a $30,000 Optimus, but that's not going to scale. Talk a little bit about why Qualcomm has the permission to win in robotics.

Nakul Duggal: 

I think you have to dig a little bit into the technology, because a lot of things get conflated very easily. If you hear one thing, you start to believe the next thing is that. I think the big thing that has happened over the last two years is that, as models have gotten smarter, folks have figured out that I don't need a mega model that needs to do everything all the time. I actually need the model to represent a set of skills, a set of problems I'm trying to solve. And the term that is used is distillation. If I have a data set that I can train that model on, once I get it to good enough, then I can deploy it in a constrained environment. And that actually has nothing to do with Qualcomm or Tesla or NVIDIA. It just has to do with the physics of how you would go solve the problem. These things are not expected to run in the cloud. They have to run at the edge. So the design set point of the problem statement is that it's a constrained problem statement. You have to have something that runs efficiently.

We've always done mixed-precision AI hardware. We are not a company that does floating point only, as many others do. And if you think about what's happened over the last year or so, everybody's talking about, we need to build mixed precision, we need INT. We've been doing INT for the last 15 years. There's nothing special about it; how else would you get energy efficiency if you don't have a mixed-precision format? What we have excelled at, frankly, is, if you think about our progress in ADAS, for example, which is another good example of how models have to run efficiently: once you figure out what you're trying to run, then the entire game is about how you take cost and power out, which is something that the company is built around.

So I feel for robotics, or any edge application that has to get to scale, where you're actually automating something at the edge, those constraints will always remain. Frankly, for us, the question really is how quickly can we accelerate it. And then it comes down to the recipes. Do you have enough education in the ecosystem? Do you have enough partners that can go scale this? We feel pretty good about the plans that we put together.
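
To connect the mixed-precision point to something concrete, below is a minimal sketch of post-training INT8 quantization using a generic PyTorch flow. It is illustrative only: the toy model is hypothetical, and this is not Qualcomm's toolchain (Snapdragon targets use Qualcomm's own SDKs). It simply shows the trade Duggal describes: storing weights as 8-bit integers gives up a little precision for much cheaper, more energy-efficient math.

```python
# Illustrative only: a generic PyTorch post-training quantization flow,
# not Qualcomm's toolchain. Requires: pip install torch (a CPU build is enough).
import torch
import torch.nn as nn

# Hypothetical stand-in for a small, distilled, task-specific model.
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
).eval()

# Dynamic quantization: Linear weights are stored as INT8 and activations are
# quantized on the fly, cutting memory footprint and using integer kernels where supported.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
with torch.no_grad():
    print(quantized(x).shape)  # same interface as the float model, smaller and cheaper to run
```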

Patrick Moorhead: 

Yes, so Nakul, I appreciate this conversation. What I'd like to end on: I think we've done a really good job talking about your overall strategy, permission to play and permission to win, and some of the key announcements that you've made. Let's talk about 2026. What are the things that you're focused on for the year? What do you have your finger on the pulse of?

Nakul Duggal: 

So we started the year off really well with CES. We had a very large number of announcements; I had lost track. We announced a global partnership with Volkswagen. We announced a global partnership with Google for all of our cockpit platforms. A variety of different partnerships across various tier 1s and OEMs. We announced our leading-edge SA8797 platform at Snapdragon Summit in October of '24, and that went commercial in December of '25, so we have significantly shortened the timelines.

I think this year is frankly mostly going to be about how we accelerate industrial deployments. We've made a number of acquisitions in '25. At Mobile World Congress we will announce a strategy around industrial, and then at Embedded World we'll announce our next platform for AI development at the edge for a wide variety of ecosystems. So there's going to be this rolling thunder. My expectation is that by the end of this year, we will actually have vertical-specific portfolios across our entire connectivity, compute, and AI products that start to become super relevant across a variety of different ecosystems.

The other thing that we are spending time on is not only solving for the edge, but what services we connect. So we acquired a video SaaS platform in the middle of last year, and we were awarded a nationwide contract in India for security and surveillance solutions. So you're going to see a lot of progress on that front as well. It's going to be a pretty busy year, but we've spent a lot of time in '25 preparing the portfolio. So I think it's about execution. Exciting.

Daniel Newman: 

Well, I think after what I've heard this week, you can just go home. You can vibe code this in Claude Code and have your entire business managed.

Nakul Duggal: 

I wish.

Daniel Newman: 

And off you'll be running. But Nakul, always great to spend time with you.

Nakul Duggal: 

Good to see you guys.

Daniel Newman: 

And thank you, everybody, for being part of this Six Five. We are on the road here with The View from Davos. If you liked what you heard, hit subscribe and be part of our community. Check out all of the content we did here in Davos, and of course, all of our regular programming on The Six Five. Nakul Duggal, thank you so much for joining us. We'll see you all later.
