How Enterprises Are Rethinking Infrastructure for AI at Scale

Next-gen AI infrastructure isn’t about chasing the latest technology. It’s about building flexible, data-driven systems that can adapt as fast as AI itself.

In this session from the Decode Summit: Next-Gen AI Infrastructure, host Patrick Moorhead is joined by Bob Venero, President & CEO of Future Tech, for a discussion on how enterprises are rethinking infrastructure as AI moves from experimentation to operational scale.

Together, they explore what truly differentiates next-generation AI infrastructure beyond faster hardware—including the growing importance of partner-led, vendor-agnostic ecosystems; the convergence of hybrid cloud, edge computing, and security; and how organizations can manage complexity, cost, and risk while designing for long-term outcomes in an AI-driven era.

Key Takeaways Include:

🔹 Outcome-Driven Infrastructure: Next-gen architectures are being designed around business results like cost efficiency, resiliency, and speed to value, rather than raw CPU/GPU specs or single-vendor stacks.

🔹 Data Quality as the Primary Constraint on AI Value: Across diverse sectors spanning government, healthcare, education, and enterprise, poorly structured and poorly governed data remains the primary barrier to trustworthy, scalable AI.

🔹 Core-Based Software Licensing Driving Architectural Change: As vendors price software by core count, efficiency per core matters more than ever, pushing enterprises to rethink how they modernize compute and AI platforms (see the illustrative cost sketch after this list).

🔹 Hybrid + On-Prem Infrastructure as a Strategic Safety Net: Cloud continues to play a key role, but resiliency, sovereignty, and operational control are keeping on-prem and colocation firmly in enterprise AI strategies.

🔹 Modular, Vendor-Agnostic Architectures to Avoid Lock-In: With AI evolving faster than traditional infrastructure lifecycles, enterprises are prioritizing modular designs that reduce lock-in and allow systems to adapt without disruption.
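To make the core-count point concrete, here is a minimal back-of-envelope sketch in Python. The core counts and per-core price are hypothetical assumptions, not figures from the session or any vendor's price list; the only point is that when software is licensed per core, delivering the same workload on fewer, more efficient cores directly shrinks the license bill.

```python
# Minimal sketch (hypothetical numbers, not real vendor pricing): why per-core
# software licensing rewards consolidating onto fewer, faster cores.

def annual_license_cost(total_cores: int, price_per_core: float) -> float:
    """Yearly software cost when the vendor charges per licensed core."""
    return total_cores * price_per_core

legacy_cores = 4096    # hypothetical: older cluster with many slower cores
modern_cores = 1536    # hypothetical: same throughput after consolidation
price_per_core = 120.0 # hypothetical $/core/year license fee

legacy_cost = annual_license_cost(legacy_cores, price_per_core)
modern_cost = annual_license_cost(modern_cores, price_per_core)

print(f"Legacy licensing cost: ${legacy_cost:,.0f}/yr")
print(f"Modern licensing cost: ${modern_cost:,.0f}/yr")
print(f"Savings from fewer cores: ${legacy_cost - modern_cost:,.0f}/yr "
      f"({1 - modern_cost / legacy_cost:.0%})")
```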

Learn more at Future Tech.

And subscribe to our YouTube channel so you never miss an episode.

Listen to the audio here:

Disclaimer: The Six Five Decode Summit is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.


Transcript:

Patrick Moorhead:

Welcome to the Six Five Summit, Next-Gen AI Infrastructure: Decoded. One of my favorite topics over, gosh, I don't know, the last 18 months or so, it seems like forever, is AI and infrastructure. Once every five years, it seems like something new comes in and takes the world by storm. There was once a meme that talked about how infrastructure was a commodity and there was no growth. But here we are in the age of agentic AI and boom, it is reshaping everything in IT and also with the consumer. And today I'm joined by Bob Venero, who's President and CEO of Future Tech, a solutions provider that is on the front lines, whether it's with Fortune 500 enterprises or large governments. Bob, welcome to the show.

Bob Venero:

Patrick, great to be here.

Patrick Moorhead:

What a great view you have back there. I'm sure everybody asks you, is it real? Is it fake? I mean, I've got this cityscape here, downtown Austin, but it's beautiful.

Bob Venero:

Yeah, in today's day and age, people like AI-generated, it's AI-generated–to stay on topic. But it's not, it's legit. And we love it down here in good old sunny Fort Lauderdale.

Patrick Moorhead:

Very nice. So Bob, you and the company are on the front lines, at the intersection between your clients and the technology. And for decades, they have looked to you for your recommendations, and also implementation, and quite frankly, just making it happen. And there's been this phrase, "modern infrastructure," that has been tossed around to get AI to scale in the enterprise. So from your point of view, what does that mean? What defines this next generation of infrastructure beyond, let's say, a faster CPU or GPU and a bigger platform?

Bob Venero:

I think it's a combination. Those pieces play a certain role within the modern infrastructure, but they're not quite as important as they used to be, right? So, you know, being in this for 30 years, speeds and feeds were key to how a company was going to be able to perform based on what they adopted internally within their organization. And that has definitely changed from an AI perspective, from a modern infrastructure perspective, where a lot of it's tied to: how do we become more efficient rather than chasing bigger speeds and feeds? How do we become more effective? And how do we become a little bit more agnostic, right? The ability to have multi-OEM platforms working with each other so that your business outcome is a lot better than what you would have had, say, on a single platform. And I think that's really changing the way that the OEMs are creating their platforms. The software is a big piece of it as well, right? As you look at companies that are now charging based on core count in the processor, and what that means to an organization that is not used to that, where you have these processors with tons and tons of cores, you're talking thousands and hundreds of thousands of cores within an environment where you're now getting charged per core for the software, and that is exponentially raising the prices. So you're seeing modern infrastructure being designed to say, hey, I can do this with fewer cores and be more efficient and effective. There's a lot of different areas there, but that's a big piece of what we're looking at.

Patrick Moorhead:

Yeah, we're mature. I guess I've been in tech for 35 years, and we've seen the ups and downs, the ebbs and the flows, but in the end, it does come down to the outcomes. And sometimes the outcomes are defined by a core count set by a vendor. But it sure seems like in this massively complex arena of AI, you also have hyperscalers, quite frankly, that are developing things for their sandbox that don't necessarily work in anybody else's sandbox. And the reality is that the end customer wants this all to work together. So it would make sense to me that this partner-led model change, first of all, would be very good for you, which is a plus. But I'm curious, how does this model change the way that these enterprises design and operationalize infrastructure compared to, dare I say, the old days?

Bob Venero:

It's very different, right? The design, the architecture, the pre-planning, and then the post-planning are very different than they were years ago. What you're trying to design is an outcome versus a platform. And that is changing the way that you look at things. You need to be much more nimble as an organization as well, because the speed of change and the rate of change is just amazing. If we went back and had this conversation five years ago, and we were talking about AI, it would be a very different conversation than what we're having today. And organizations, especially the large organizations, need to be able to pivot and adjust and be nimble in order to take advantage of what the companies are releasing out there, whether it's from a hardware perspective or a software perspective, and be able to plug that into their environments. And to your point, from a hyperscaler perspective, the hyperscalers are figuring out what works best for them, and then working it out to the end-user community, i.e. the larger companies of the world. There's value and there's risk there. We always talk about on-prem and colocation as very important components for enterprise organizations, government agencies, and government systems integrators. They need a good sense of their on-prem because when incidents happen and outages occur, which as we all know they do, having the capability to pivot to their internal systems to support their customers, and/or their own internal customers, is vital.

Patrick Moorhead:

Hey, Bob, we've talked about outcomes a bit here. I'm curious, how are they expressed? Are they more high-level, like, "Hey, I want to reduce costs here, I want to increase revenue here, I want to accelerate this," or is it more specific, let's say related to an application?

Bob Venero:

Or an environment. It's a combination, right? It depends on how high up the chain you want to go within an organization. So your CFO is like, we need to reduce costs and do it with less people, right? And then you've got the folks on the ground who are tasked with making that happen, and they're being very application-specific and very use-case-specific. So what you're trying to do is design the right modern solution around the use cases they're trying to accomplish, and then be able to measure what those end results are, so that when they report back up, they can say, hey, I've actually reduced costs versus spend. And that's something where there's a lot of thought being done in advance right now. At what level do you start? Do you start by building an AI factory and then figuring out what systems can run on that AI factory to accomplish the goals? Or do you figure out, all right, these are the end use cases I want, and then I'm going to build a use-case-specific factory that is going to accomplish those goals?

Patrick Moorhead:

Pros and cons to both. No, that's absolutely spot on. It's like big models, small models, vertical models. And quite frankly, we're all kind of figuring this out here. I do a lot of CIO and CTO roundtables, and I don't think anybody has figured everything out. And if they have, well, I don't know, maybe they're stretching a little bit. So Bob, you support customers across a wide variety of verticals, but specifically your strengths are in federal, healthcare, education, and then the overall commercial markets. A lot of different needs for these folks. What are some of the commonalities in their infrastructure challenges that cut across this new agentic AI and data-driven phenomenon that we're seeing here?

Bob Venero:

Yeah, you used a key word, right, which is data. That's the common theme across all of these organizations, whether you're in healthcare, whether you're in government agencies, whether you're in corporate. It's the value and the quality of the data, the availability of that data, and then how that data is brought in and how that data is used. I don't care where you are, the cleanliness of your data is key. You can look at the ChatGPT models, the Grok models, right? Remember, they're pulling data. And you can ask a question, and you know the result you just got is absolutely wrong. And you'll ask it again, and it'll go, oh, my bad. That's all data, right? So making sure your data modeling is consistent across your environments is key. And I think companies have struggled with that, because they have decades of data that they didn't categorize in the right way. And now they're struggling with how to actually do that. And they're making mistakes. Now, look, mistakes are good as long as you can fail fast and then adjust, and then hire the right talent that can help from a data integrity perspective and align that to what you're doing. That's number one. That's one big common theme across the board. The other piece of it is around structure and security, and how you're going to secure the things that you're doing with the proper guardrails within your environment. And that goes across public corporations, private corporations, government agencies with different hierarchies of what they need, and then systems integrators as well. Those are two very common themes. The speeds and feeds and all of that, it's really not what differentiates a government systems integrator from a Fortune 8 company in the corporate space.

Patrick Moorhead:

Yeah, Bob, one thing that's been consistent in all of our research has been this data challenge. And the way that I like to look at it is you've got a couple of things going on. First of all, with generative AI and agentic AI, the biggest value is when it's cutting across your ERP, your CRM data, your HRM data, your PLM data, and coming up with the best answer. And we have done that in different ways by setting up data lakes, particularly, let's say, SAP back-end data plus Salesforce front-end data, but it's hard to do and it's not hitting all of the relevant data. And the second thing is, this is the first time where data that we used to put on ice or take somewhere else, cold data, frozen data, now has value that you can put into the system. And that is a relatively new phenomenon here. So not only do you need to have the right data architecture, but storage is being looked at as well.

Bob Venero:

Yeah, and storage is a big one, right? I mean, when we look at what's happening from an AI perspective, if you look at the cost trends, storage for data is right up and to the right. And that's what's causing a lot of the challenges that you see in the market right now, from an availability-of-product perspective tied to SSD drives and all those things that are creating major shortages in the market and increasing pricing. It's a little bit of the wild, wild west there, but it's all data-driven. It's all storage-driven that's making this phenomenon happen out there in the market.

Patrick Moorhead:

So Bob, there seems to be this convergence right now between AI, hybrid cloud, edge computing, right? Oh my gosh, did we forget about that? Remember how hot that was seven or eight years ago? And security. How is Future Tech creating the longest-term value across these three or four elements while managing complexity and risk? Because this stuff is hard, this stuff is valuable, but if you do it the wrong way, it's risky.

Bob Venero:

Yeah, and long-term value, putting those two words together in today's market is a little bit of an oxymoron, right? Because today's long-term value is tomorrow's shelfware, right? In a lot of different environments out there, you have to make certain bets, and that's what we do as an organization. We help our customers try to make the smartest bets. Because we're an agnostic provider, and what I mean by that is, as their trusted advisor, we're putting in front of our customer the best solution that we feel accomplishes the goal. And whether that's one singular OEM or a combination of multiple OEMs, that's what we're going to build and design, with the longest possible shelf life and the best possible outcomes. And when we build it, we try to make it modular so that we can unplug somebody and plug somebody in without disrupting the environment. And that's key. You know, in the old days you had these big Compaq ProLiant boxes that had all the infrastructure and all the compute and the memory and everything in one big box. We're breaking that up now based on value and based on protection for the customers that we have, and then trying to create longer-term pricing agreements that help protect them against the challenges that you have, whether it's tariffs or supply chain, all of those things, and then making sure they have the equipment ready when they need it. We have a unique process called VMI, which is Vendor Managed Inventory, where we actually bring the inventory in on our own dime to support our customers' needs and forecasts over a long period of time, to make sure that the quality of what's being delivered in their environment is consistent with the models that they look for.

Patrick Moorhead:

Yeah, it's interesting. It's very straightforward. It doesn't move from the initial promise that you have made over the past decades, but it's modernized for the age of AI. And in the end, one thing stands apart: customers want choice. And they want the ability to move and make changes and not be locked in, and to do it without paying an arm and a leg to move between different environments. And I see nothing in the future of agentic AI that makes that need go down. In fact, and again, I'm a researcher of the public clouds too, but the public clouds are absolutely looking inward to pull everybody to them. A little bit of softening on the, hey, let's do this data-sharing thing, after 15 years, Bob. And also an understanding, particularly when it comes to sovereign AI, that customers want to bring the AI to the data. And if the data is created at the edge or created in your data center, that's where they want this to happen. So I appreciate that. So Bob, let's go from the long-term value that you're creating today to long-term vision, okay? I want you to put your futurist cap on here and talk about how you see AI, automation, and intelligent infrastructure adding value to enterprise IT. And then what are you doing at the company to be able to intercept that need?

Bob Venero:

Great question, right? And an interesting answer. I was asked the other day on our town hall, what's your next five-year plan for the company? And I told them, I have no idea. Five years is an eternity. I can talk about revenue and growth and all of those things, but what we're actually going to do as an organization to get there, I don't know yet, because what is going to happen in the next 6, 9, 12, 18 months from a technology perspective? And what you see today is amazing. With what AI is driving, the speed of change is light speed now, because AI is helping drive those changes. Before, it was the humans trying to drive the change and create the things. Now AI is helping tweak and master and recreate and do things much faster. Whether it's coding, and as we know, it's coding itself, it's doing a lot of different things. So when we look at how to future-proof, and that's what we look at, how do we future-proof our customers and our business, it's making sure that we understand what components are going to make the biggest impact in the environment, and how those can be adopted into an organization so they can leverage them. AI is going to drastically change what companies do in the next 12 to 18 months. Right now, companies are still trying to figure it out, and they're testing the waters and seeing what guardrails they need to protect themselves. But that is changing very quickly, and full-blown adoption is happening within organizations. And you're going to see a collapse of AI, modern infrastructure, and edge computing, like we talked about, into one ecosystem within an organization. And it's going to teach and talk to itself. And that's going to make a big difference. And then it's going to be the software applications that take advantage of that. The OEMs have talked about the NPUs in a laptop for years, and it's been, what, two years now with NPUs? But it hasn't taken off because the applications don't take advantage of it. When the applications can take advantage of the technology, then you'll see scale. We're at that tipping point from an AI, modern infrastructure, and edge computing perspective, where applications are starting to take advantage of it, people are starting to understand what the value is, and now they're driving agentic into that environment, getting really, really good outcomes, and then reinvesting back in. So when we look at what Future Tech is going to do, we're going to consistently bring new technologies to our ecosystem of partners and our customers, and we do that by making sure we're eating our own dog food as well. So we've integrated ServiceNow and NextThink with their AI and business intelligence, and ChatGPT Enterprise. We're doing all of these things so that we know what works and what doesn't work. And there's a bunch of other technologies that our teams have in a sandbox. And when we feel something is ready for a customer, we're going to bring them in and talk about the value of what it is. Chargeback is a huge thing. There's a company called Parallelworks that we work with that has created something really great, and we've taken that out into an environment that allows customers to do chargeback down to the GPU level within organizations, and the data. Phenomenal stuff. Now we've put that on our line card, and we're bringing that out to the customers that we support.

So a lot of that stuff, Patrick, that's what we're going to do. We're going to be the speedboat, right? The one that's able to nimbly move and cut in between. My son would rather I say a Porsche GT3 RS, right? But that type of speed and maneuverability is where we're driving.
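The GPU-level chargeback idea Bob describes can be illustrated with a minimal sketch. This is not the Parallelworks product or its API, just a hypothetical aggregation of GPU-hours per team multiplied by an assumed internal rate, to show the kind of accounting involved; the team names, usage records, and rate are all made up for illustration.

```python
# Minimal, hypothetical sketch of GPU-level chargeback (not a real product
# or API): aggregate each team's GPU-hours and bill at an internal rate.
from collections import defaultdict

# Assumed usage records: (team, gpu_id, hours). In practice these would come
# from a scheduler or telemetry system.
usage_records = [
    ("research",    "gpu-node01-0", 120.0),
    ("research",    "gpu-node01-1",  80.0),
    ("engineering", "gpu-node02-0",  45.5),
    ("finance-ml",  "gpu-node03-0",  12.0),
]

RATE_PER_GPU_HOUR = 2.50  # hypothetical internal $/GPU-hour rate

def chargeback(records, rate):
    """Return a per-team bill: total GPU-hours multiplied by the internal rate."""
    hours_by_team = defaultdict(float)
    for team, _gpu, hours in records:
        hours_by_team[team] += hours
    return {team: round(hours * rate, 2) for team, hours in hours_by_team.items()}

for team, cost in chargeback(usage_records, RATE_PER_GPU_HOUR).items():
    print(f"{team}: ${cost:,.2f}")
```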

Patrick Moorhead:

Yeah. So I love it, it's interesting. I don't know if this is a thing, but it's pragmatic vision, which is: been there, done that. And then you set the table of the modality of how you want to work, to be ready for the twists and the turns that we're going to see in the future. I mean, I remember when we were all just focused on LLMs and then agentic popped up. Well, guess what? There is going to be a next new thing that comes up, agentic orchestrators, right? Your customers are going to be coming to you for that. What's an agentic orchestrator that I can maneuver across the public cloud and my private cloud and the edge? Bob, this has been a great conversation. I really appreciate your time and effort here, and I look forward to hearing more about how things are going in the future, what your customers are telling you, and which solutions you're putting against those needs that are truly winning. So thanks again. Appreciate it.

Bob Venero:

Great being here. Thanks, Patrick.

Patrick Moorhead:

Thank you. And thank you for watching this spotlight of Next-Gen AI Infrastructure: Decoded, defining what next-gen AI infrastructure looks like. So tune in to all of the mini summits and the agenda at the 65media.com website. More industry insights are coming up shortly. Take care.


Speaker

Bob Venero
President & CEO
Future Tech

Bob Venero is the CEO and Founder of Future Tech Enterprise, Inc., an award-winning, global IT solutions provider headquartered in Fort Lauderdale, FL.

He started Future Tech in 1996 from the basement of his home. Since then, he has spearheaded the company’s exceptional growth, positioning Future Tech as an IT partner-of-choice for leading companies in the aerospace, defense, education, energy, healthcare, and manufacturing sectors.

A member of the Forbes Technology Council, Bob is often quoted in major business and IT trade media, providing expert insights on key industry trends, technology developments, corporate culture, and tips for entrepreneurial success. Bob has appeared in a wide range of media, including Forbes, Forbes TV, Buzzfeed, Fierce CEO, CIO, Information Week, Entrepreneur, Inc., and more.

Today, Bob and the Future Tech team are at the forefront of the most advanced technologies, helping companies understand and implement Artificial Intelligence (AI), Internet of Things (IoT), Virtual Reality (VR), and 3D Print solutions.

Recognized as a dynamic speaker and onstage personality, Bob is frequently asked to participate in major tech industry events as a host or emcee. He most recently served in these capacities for CRN’s 2019 Best of Breed Conference (October) and NVIDIA GTC 2019 (Washington, D.C.).

Patrick Moorhead
Founder, CEO, and Chief Analyst
Moor Insights & Strategy
