Research note: Dell Technologies World — AI in action

Dell’s AI top 10 and a learning-by-doing approach to scaling enterprise AI

As enterprises move from AI experimentation to operationalization, Dell Technologies is positioning itself as both a vendor and a case study. COO Jeff Clarke used his Dell Technologies World keynote to outline not just the company’s AI offerings but also its internal AI transformation, which offered a window into the mechanics of enterprise-scale adoption.

Toward the top of his talk, Clarke set the stage for the latest and greatest from Dell, as well as the AI landscape in general, using a top 10 list. “You know I love a good top 10 list,” he said. With that, here’s the list, lightly edited for clarity and flow: 

  1. “Change, change, change, roadmap, roadmap, roadmap, acceleration, acceleration, acceleration.” This includes commercialization of NVIDIA B200, GB200, B300, and GB300 platforms on Dell hardware. 
  2. “Really capable multimodal models and large context LLMs have vastly improved outcomes.” 
  3. “The techniques are advancing at near exponential rates. Quantization, distillation are enabling more capable, smaller models and enabling faster training of new models.” 
  4. “One of my personal favorites — inference and reasoning models are far more compute intensive than we thought a year ago. At least 100x more compute intensive than we thought just a year ago, and likely going to grow from that.” 
  5. “Tokens, tokens, tokens…We’re building token factories. In 2024, 25 trillion tokens were generated. By 2028, that number will be 35,000 trillion tokens…And it’s likely under-called.” 
  6. “Very capable NPUs will be in our PCs. It’s going to be disruptive. It’s going to enable RAG at the very edge of the network.” 
  7. “Despite…a 10x increase in the cost to train the next new generational model, there are more foundational models today than there were a year ago.” 
  8. “The cost per token has actually decreased four orders of magnitude in the last four years. So each successive training of a new model is 10x more expensive, while the cost per token has declined four orders of magnitude over the same period.” 
  9. “Agentic is the answer to everything…by 2028, one-third of all interactions with gen AI will use autonomous agents to complete tasks.” More on that later. 
  10. “The largest companies in the world have moved beyond proof of concept.” 

The point here is that AI is scaling fast, reshaping infrastructure demands, and pushing compute to every edge of the network. We’re seeing exponential growth in model complexity, inference intensity, and token volume, and that growth is forcing a rearchitecture of IT systems from the PC to the hyperscale data center. 
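
To put rough numbers on items 5 and 8 above, here is a quick back-of-envelope check of what Clarke’s figures imply (the arithmetic is ours, not his):

```latex
% Item 5: token volume grows from 25 trillion (2024) to 35,000 trillion (2028)
\[
\frac{35{,}000\ \text{trillion}}{25\ \text{trillion}} = 1{,}400\times
\quad\Rightarrow\quad
1{,}400^{1/4} \approx 6.1\times \text{ per year}
\]
% Item 8: cost per token falls four orders of magnitude over four years
\[
\bigl(10^{-4}\bigr)^{1/4} = 10^{-1}
\quad\Rightarrow\quad
\text{roughly } 10\times \text{ cheaper per year, on average}
\]
```

In other words, the forecast works out to token volume growing roughly 6x per year while the cost per token falls roughly 10x per year, which is the dynamic behind the “token factory” framing.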

Dell lays out its own enterprise AI journey

Granted, Dell is a complex, enormous operation, but Clarke detailed the company’s own internal AI journey to give customers instructive insight into how they, too, can benefit from AI. He cited polling of 3,800 enterprise decision-makers to understand their thoughts on AI, and pulled out two data points: the top two technical challenges are the scale of implementations and the underlying data work needed to operationalize AI. He also noted that 39% of all data center power isn’t utilized. Keep those ideas in your head as we proceed. 

“In the spirit of misery loves company,” Clarke said, “I thought I’d share the detailed experience we’ve had in this journey over the past two-and-a-half years…And I’ll tell you, we were pretty horrified when we started.” He said Dell had more than 900 “AI projects” within the company, and was grappling with suboptimal data governance and a general lack of clarity and purpose. 

With that, Clarke used something of a list to lay out the underlying structure that would (and does) guide Dell’s internal AI ambitions. 

First, Dell had to define an AI data architecture, then build an enterprise data mesh which connected all the relevant data, he said. “Processes had to be simplified, standardized, and automated…It became very clear to us that if you apply AI to shitty process you just get a shitty answer faster, and a not particularly good one.” 
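
Clarke didn’t describe how the data mesh was implemented, but as a rough illustration of the idea (domain-owned datasets published with explicit owners, locations, and quality contracts), here is a minimal, hypothetical sketch; the domain and dataset names are our assumptions, not Dell’s:

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """One node in a hypothetical enterprise data mesh: a dataset owned by a
    business domain, discoverable by other teams, with explicit quality checks."""
    name: str
    domain: str                      # owning business domain
    owner: str                       # accountable team
    location: str                    # where consumers can find it
    quality_checks: list = field(default_factory=list)

# Illustrative registry only -- these entries are assumptions, not Dell's domains.
MESH = [
    DataProduct("case_history", "global_services", "services-data-team",
                "warehouse.services.case_history",
                ["no_null_case_ids", "closed_cases_have_resolution_codes"]),
    DataProduct("dispatch_logs", "global_services", "field-ops-team",
                "warehouse.services.dispatch_logs",
                ["dispatch_ids_join_to_case_history"]),
]

def discover(domain: str) -> list:
    """Let any team look up the governed datasets a domain publishes."""
    return [p for p in MESH if p.domain == domain]

if __name__ == "__main__":
    for product in discover("global_services"):
        print(f"{product.name} -> {product.location} (owner: {product.owner})")
```

The point of the pattern is the one Clarke makes: connecting data with clear ownership and standardized, automated process has to come before the AI layer.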

Next, the AI strategy and attendant use cases needed to be aligned with the things that matter most to Dell; if there wasn’t alignment, “we didn’t do it,” Clarke said. And, finally, there had to be committed, meaningful ROI. “Unless you were willing to sign up for real dollars, real efficiency, and productivity, we were not going to fund it.” 

Before diving into the use case, let’s circle back to Clarke’s reference to the things that matter most to Dell, which he also detailed in a list: end-to-end product strategy, go-to-market engine, worldwide supply chain, and global services capability. 

To that last point, Clarke went through how Dell leveraged data and AI to drive efficiency, productivity, and quality in its global services organization, which is made up of tens of thousands of personnel in more than 170 countries supporting more than 250 million field assets. He said examination of the underlying data revealed five key datasets that, if properly connected, held significant insights: case history, knowledge base, self-reported info from products, dispatch logs, and repair information. 

“All of that data existing in our company…It was a little all over the place,” Clarke said. And the company had previously tried to connect and harness those datasets unsuccessfully. “It wasn’t until the advancements in generative AI that the reality of extracting value from all five of those datasets simultaneously came to fruition.” 

The result was a digital “genius on their shoulder” available to global services team members. “We created service assistants for all of our service members in our organization…It’s quite extraordinary for us.” Clarke said service agents now close more cases, faster, and with an increased resolution rate; dispatch rates are down, and repeat dispatch rates are down even more. “And then most importantly, the thing that matters most is customer satisfaction is up. So we’re able to solve more cases, faster with a higher resolution rate whether it’s the first time or the second time that we’re out dispatching a part. Those rates are down and the ultimate outcome is a more happy customer. And this didn’t require the latest models, the latest GPUs, nor did it require a significant amount of resources…And the ROI was less than six months. In fact, it was less than three months.” 
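
Clarke didn’t share implementation details, but the system he described (retrieve relevant context from the five connected datasets, then have a generative model draft guidance for the human agent) has the shape of a retrieval-augmented assistant. Here is a minimal, hypothetical sketch under that assumption; the dataset labels, the `search` stub, and the `call_llm` stub are placeholders, not Dell’s actual stack:

```python
from dataclasses import dataclass

# The five datasets Clarke named; "product_telemetry" stands in for the
# self-reported info from products. Retrieval here is a stub for whatever
# search or vector index an enterprise actually runs over each dataset.
DATASETS = ["case_history", "knowledge_base", "product_telemetry",
            "dispatch_logs", "repair_information"]

@dataclass
class Snippet:
    source: str
    text: str

def search(dataset: str, query: str, k: int = 3) -> list:
    """Placeholder retriever: return top snippets from one dataset."""
    return [Snippet(dataset, f"(top result from {dataset} for: {query})")]

def call_llm(prompt: str) -> str:
    """Placeholder for a call to whatever generative model is in use."""
    return "(model-drafted guidance grounded in the retrieved snippets)"

def service_assistant(case_description: str) -> str:
    # 1. Pull context from each of the five connected datasets.
    context = [s for d in DATASETS for s in search(d, case_description)]
    # 2. Ground the model in that context and ask for a recommended resolution.
    prompt = (
        "Suggest a resolution for this support case.\n\n"
        f"Case: {case_description}\n\nContext:\n"
        + "\n".join(f"- [{s.source}] {s.text}" for s in context)
    )
    return call_llm(prompt)

if __name__ == "__main__":
    print(service_assistant("Laptop fails POST after a recent BIOS update"))
```

This squares with Clarke’s point that the payoff didn’t require the latest models or GPUs: most of the work is in governing and connecting the five datasets, and a modest model grounded in good retrieval gets the agent to a faster, higher-quality resolution.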

An enterprise AI playbook — use cases and things to think about

Big picture, and based on Dell’s own experience, Clarke identified (in list form) six common enterprise use cases AI is poised to disrupt and, not surprisingly, listed out five things enterprises need to think about as they ramp AI investments. Without further ado: 

The six common enterprise use cases AI is poised to disrupt are: 

  1. Support assistance
  2. Content creation and management
  3. Natural language search
  4. Design and data creation 
  5. Code generation 
  6. Document automation

And the five things enterprises need to think about: 

  1. “It’s really time to get busy…The threat is existential…If you haven’t started, you’re behind.” 
  2. “There is no one-size-fits-all approach.” 
  3. “Many of you have the power, cooling, and space in your existing data centers already.” 
  4. “You don’t need the latest models, you don’t need the latest GPUs, to get started.” 
  5. “There’s a compelling ROI out there for the right use cases inside your organizations.” 

Clarke’s point, if you read between the lines, is that the barrier to enterprise AI is not technology. It’s organizational discipline, architectural readiness, and ruthless ROI tracking. His keynote reflects Dell’s strategy to lead not just with infrastructure, but with experience. The message to CIOs is clear: don’t wait for perfect conditions. Optimize what you already have, start with aligned use cases, and let results drive iteration.

Read more of our research from Dell Technologies World. And here’s a video of Clarke’s keynote if you’re so inclined.

ABOUT AUTHOR

Sean Kinney, Editor in Chief
Sean focuses on multiple subject areas including 5G, Open RAN, hybrid cloud, edge computing, and Industry 4.0. He also hosts Arden Media's podcast Will 5G Change the World? Prior to his work at RCR, Sean studied journalism and literature at the University of Mississippi then spent six years based in Key West, Florida, working as a reporter for the Miami Herald Media Company. He currently lives in Fayetteville, Arkansas.