HPC, Data, and AI in Research: A Conversation with MPCDF and Lenovo - Six Five On The Road

Scott Tease, VP, ISG Product Group at Lenovo, and Dr. Erwin Laure, Director at MPCDF, join the Six Five team to discuss how their organizations are partnering to drive innovation in HPC, data, and AI for groundbreaking scientific research across Europe.

How are leading research institutions and technology providers redefining the scientific computing landscape with advanced HPC, data, and AI solutions?

From Supercomputing 2025, hosts Patrick Moorhead and Daniel Newman are joined by Lenovo VP, ISG Product Group, Scott Tease, and Dr. Erwin Laure, Director at Max Planck Computing and Data Facility, for a conversation on how the partnership between MPCDF and Lenovo is advancing high-performance computing (HPC), data infrastructure, and AI in research. The discussion centers on evolving demands in research computing, the strategic considerations behind technology partnerships, and the integration of cutting-edge compute and data workflows at scale, highlighting key trends shaping the future of scientific discovery.

Key Takeaways Include:

🔹Evolving Research Demands: Insights into how MPCDF supports diverse scientific research across Europe and globally with advanced HPC and AI.

🔹Strategic Technology Partnerships: The factors and shared values that led MPCDF to select Lenovo as a critical infrastructure partner.

🔹Measurable Collaborative Impact: Early results and joint outcomes illustrating the tangible benefits of the Lenovo-MPCDF collaboration.

🔹Sustainability in Practice: How sustainability considerations are prioritized in infrastructure strategy and decision-making.

🔹Roadmap Innovations: Emerging compute, data management, and AI technologies driving the future trajectory of scientific computing.

Learn more at Lenovo.

Watch the full video at sixfivemedia.com, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript

Patrick Moorhead:

The Six Five is On The Road here in St. Louis at the Supercomputing 2025 show. Daniel, it's been a great show so far, and it's everything we would expect: big compute and big data for HPC workloads.

Daniel Newman:

Yeah, look, you and I have been through this transformation that is Supercomputing. For a long time, it was a show for research. It was a show for, you know, large national labs. And it was definitely exciting for that community. But with the advent of AI, the ChatGPT moment, you know, we like to say from flops to tops, but it's flops to tops and back to flops. And it is a huge show. And I mean, I've been coming to St. Louis for years, by the way, many of you may not know that. My wife actually grew up here, so I've spent a lot of time here. This is the busiest I've ever seen it, besides maybe a World Series parade for the St. Louis Cardinals.

Patrick Moorhead:

Yeah, so while some things hold true, compute, networking, and data, as they always have, they are evolving in this age of AI. And that's what I want to drill into today with Lenovo. Scott, great to see you.

Scott Tease:

Nice to see you, too.

Patrick Moorhead:

Dr. Laure, great to see you. Good morning. Max Planck Institute. So let's chat. Great to have you here, Scott. You've been on the show a couple of times. A few times. And you just keep coming back, and we appreciate that.

Scott Tease:

I love being here, and this show's electric, isn't it? Every year.

Daniel Newman:

So, Erwin, let's start with you, the Max Planck Society. Tell us a little bit about it, give us a little bit of the background, and tell us how your scientific research is really being distributed, not just in Europe, but really globally.

Dr. Erwin Laure:

So, the Max Planck Society is the largest German research organization focusing on basic science, and it literally covers all fields of science. It starts from astrophysics and goes to studies of the history of law. And we at the Max Planck Computing and Data Facility are the competence center for computational science, artificial intelligence, and data-related science, and we support all these diverse use cases and researchers in their computational, data, and AI needs so they can do better science and excel in their science, right? And being a large scientific organization, Max Planck is of course in worldwide collaboration with researchers from the US, from Japan, from all over the world.

Patrick Moorhead:

Do you have any focuses like fluid dynamics, drug discovery?

Dr. Erwin Laure:

It's really all of that, right? As I said, it starts from artificial intelligence. We have a lot of molecular dynamics being computed. We have plasma fusion, right? A very important topic for the future of energy. Sure. Climate science. And thanks to AI, we are literally talking to the people in the history of law. They are starting to embrace computing thanks to artificial intelligence. So there is no field anymore that is not using computing.

Daniel Newman:

I was going to ask you really quick, how much acceleration has the society seen over these last few years? I mean, you were probably always heads down, but has this AI boom led to an explosion in your work and in your society?

Dr. Erwin Laure:

So the majority is still what we would call classical simulation and modeling, classical HPC workloads, right? From the physicists, from materials science, and so forth. But artificial intelligence has really gained a lot of momentum in the past couple of years. Researchers are trying to understand what role artificial intelligence can play in their scientific workflow. How can certain steps in their discovery process be helped and made faster thanks to artificial intelligence? On the other hand, we have a lot of people who work on artificial intelligence research itself. They try to come up with faster models, better models, better inference. As you know, the energy consumption, which comes with the compute demands of artificial intelligence, is just huge, right? So there is a lot of work to be done to make artificial intelligence less resource hungry and more sustainable. And that's what we are working on too.

Patrick Moorhead:

So Scott, I think you have had a relationship with the Institute for a while here. Can you talk about how the two of you met, like this is a relationship, right?

Scott Tease:

Long time.

Patrick Moorhead:

And why did they choose you, right? And you have your customer sitting right here.

Scott Tease:

I know, I know.

Patrick Moorhead:

So I know it's going to be on the up and up.

Scott Tease:

Yeah, yeah. I'd actually rather you ask him why they chose us, but I have opinions on why they've done it. So I've known this gentleman a very long time. I knew his predecessor before that. I think we've created a synergy between the two groups where we understand what they're trying to achieve, we understand the problems they're having, and we're trying to apply the best IT to help them solve those things. What's nice about dealing with these folks is they're not asking us for a certain number of processors or a certain number of GPUs. They're telling us what task we need to solve, and we work it together to figure out how best to go about it. Right now they're going through a big transition, as Dr. Laure said, from traditional HPC to AI and everything in between. Trying to help them guide what the future of their IT looks like, what the data center needs to look like, that's a big part of what we do. And we've got people embedded with his team, with their researchers, to make sure we understand what they're working on, what the new codes are, what the new areas of research are, so we can try to help give that guidance.

Patrick Moorhead:

All right, Dr. Laure, is that 100% right?

Dr. Erwin Laure:

Yeah, let me underline what Scott just said. It is the flexibility that Lenovo brings in, and really the whole company is driven to achieve results. They're not just selling us pizza boxes like some others. They are selling solutions, and they work with us on the solution. And the date we signed the contract is actually the beginning of our relationship. So we work through the whole lifetime of the system and beyond on meeting the challenges that are coming up. And as I told you before, with the diverse user set we have, we need a company that is flexible, that supports us in all different kinds of use cases. There is not one model that just fits us. Lenovo is really the perfect partner for that, so we have had an excellent relationship over the years.

Daniel Newman:

Let's drill into that then, Erwin. In the end, partnerships are usually built on successes. Of course, you sometimes build in the trenches, as I like to say, but it's the successes of getting those outcomes, getting those results. In the partnership so far, can you speak a little bit to some of the outcomes and results that have come from this collaboration, from this joint effort, that really indicate success?

Dr. Erwin Laure:

The biggest measure of success is when you are in trouble, when you have issues. And we've been through a few of these times when there were issues. It was not Lenovo's fault, but the environment is changing. Lenovo, and Scott in particular, have been driving this through with a lot of commitment and goal orientation, and we made it work excellently. So we have now had the latest system with you guys for, I think, four years, right? It's working perfectly, it's being used at a hundred percent, both GPUs and CPUs, with the diverse workloads that we have on it, and Lenovo makes sure that the system is up and running and supporting our scientists. This is just working.

Scott Tease:

These guys are on the cutting edge every time. Every generation they deploy is a cutting-edge system, and sometimes things don't go right. I mean, we get cancellations of programs, or things don't come out on time. We always find a way to work together with them to make sure we're supporting the needs of the researchers, and I think that's a big part of why the relationship continues to be so strong.

Patrick Moorhead:

So Scott, HPC has always been what I would call high performance and high power, okay? And that's just what you get, that's nothing new. But I think AI in particular has added increased power and energy requirements that lead to new and novel ways to, you know, cool the systems, but also use less energy. And a lot of times, that's a twofer, you get that at the same time. But talk to me about your strategy. How are you optimizing your solutions for the HPC market?

Scott Tease:

Yeah, I mean, again, what we've done together is a great example, and we'll get to some of the things we've done together on that. But you've probably seen us talk in the past about our Neptune solutions. Oh, yeah. Exactly. So going back to 2012, we started doing warm-water-cooled systems. Back then it was a way to save power, and we started in Germany because power tends to be relatively expensive there. We knew we could do away with a lot of the power consumption at the data center level, not having to move heat around the data center, not having to move heat with fans inside of the servers, you know, trying to save 40% of the power just by changing the way the systems are cooled. We've continued that. We've made it even grander. We're trying to build systems now that are doing 100% heat transfer to the liquid, so you don't need any traditional air conditioning, any kind of specialized air conditioning around those systems. So we helped the society get to direct liquid cooling. They've had a very modern data center, but they saw the change that was coming with AI and with higher-power HPC, and we helped them make that transition from air cooled with liquid assist to direct liquid cooled, and I think it's gone pretty smoothly, right? Will you ever go back?

Dr. Erwin Laure:

Never, ever. Yeah.

Patrick Moorhead:

It's definitely the trend. It's the trend. I remember years ago even the hyperscalers were saying, we're never going to use DLC. Guess what? Here we are. For everybody, water cooling and DLC is cool.

Scott Tease:

The thing that I like about it is you've got the benefit of the power savings and fitting in the data center infrastructure, but you've also got the sustainability benefits on the back side. Normally, we think being green is going to cost you more. In this case, with liquid cooling, the two go hand in hand. You're going to get the economic benefit of saving power, you're going to better utilize the data center, and needing less power means less CO2 in the environment. So they all go hand in hand.

Dr. Erwin Laure:

There's even another perspective to this. Once you have reasonably warm water, you can actually reuse it on campus, in your environment. We are talking to the city where we are located about whether they can use our water, get it into their district heating system and the like, right? So it doesn't stop at the data center. It really goes through the whole environment, the whole place you're in. And we've been working with direct liquid cooling from the very beginning. We've been one of the first to deploy it. And with Lenovo, we found an excellent partner.

Patrick Moorhead:

I just remembered there's a university that actually warms its sidewalks when it's snowy outside. That, I thought, was an interesting and unexpected use case.

Daniel Newman:

I just choose to live somewhere that doesn't have snow. But yeah, I'm kidding. I lived in Chicago for 40 years. I've been snowed in plenty of times. But I did go south, Erwin. I did. It's a little bit nicer there. So let's just take a walk into the future a little bit. We've got a lot of innovation. We've got it in infrastructure and compute. We've got it in data management. We've got it in AI. What do you think is going to drive your roadmap? What are some of the innovations, Erwin, I'll start with you, and I'd like to hear from both of you, that are going to drive the roadmap forward in terms of what you're doing at the Max Planck Society?

Dr. Erwin Laure:

It's clearly this convergence of HPC and artificial intelligence, right? We are just at the beginning of it. So the AI hype, while with ChatGPT and all that you have it in the public, in science people are just starting to understand what this new tool is. And it's really a new tool, it's a new method. It's not a revolution, it's an evolution. You get a new tool to do your science, and people are just beginning to understand what that means. And we don't know yet how the workflows will really work, right? How will the HPC component talk to the AI component? How will the AI component influence the HPC component? We need a lot of flexibility in designing our systems, right? We will not deploy a system now and keep it running in the same configuration over five years, as we used to do in the good old times when there was only simulation and modeling. I expect that we will frequently reconfigure the system, not necessarily in hardware, but in software, in the way we use it. Will it be a batch system? Will it be a Kubernetes, more cloud-like system? So it will be a constantly evolving system, and that's the big challenge we are facing, from a partially hardware, but mostly operational, point of view.

Scott Tease:

So for us, I think it's around the same kind of theme, that move towards more AI. Right now, it feels like we're throwing a lot of hardware at these AI problems and not being very elegant about it. One of the big focuses for us is how we transform these codes and make them more efficient, so you don't need as much hardware to do the same kinds of tasks. That's a big part of it. And again, having people embedded inside of the society so we see what they're doing, that's a big win for us, just to be able to be that close to those kinds of researchers. This customer here plays a really special role in enabling the ecosystem. They're not just a consumer, they're also helping other people follow, and being close to that is really great for us.

Daniel Newman:

It's a great story. It's also a very exciting time, so we're certainly very interested, Erwin, in hearing about the continued innovation. I know we talk regularly to Lenovo; hopefully next year or in the near future, we'll come back and hear more. We know the convergence here, and we know there are other breakthrough innovations like quantum that are probably drawing a lot of attention to organizations like yours. Can't wait to hear more about that, but there's enough going on with AI, so we'll stop there, enough going on with supercomputing. Scott, Erwin, thank you so much for joining us here on The Six Five. Thank you so much. And thank you everybody for being part of this Six Five On The Road. We're here in St. Louis, Missouri at Supercomputing 2025. Hit subscribe, join us for all of the great coverage across Six Five here at the event. We appreciate you being part of our community. We've got to go for now. See you all later.
