Capitalizing on the long tail of AI means scaling down from the Fortune 50 to the family room
Dell Technologies is the enterprise IT powerhouse behind the world’s biggest firms—and it’s also in our homes and offices, powering how we learn and play. As the AI era takes shape, Dell is positioning itself not just as the infrastructure backbone for the Fortune 50, but as the on-ramp for AI adoption everywhere. In a keynote at Dell Technologies World, CEO and Founder Michael Dell made the case for disaggregated, flexible, and scalable IT architectures and infrastructure designed to bring AI from Wall Street to Main Street, from corner offices to living rooms.
“AI is the new electricity,” he said, “and Dell is the grid powering this transformation — connecting the data, the intelligence, and the innovation.” He added: “It’s not a destination, it’s a street of continuous innovation. And we’ve been walking on this path for a long time.”
Recalling last year’s launch of the Dell AI Factory with NVIDIA, Dell noted the product already serves 3,000 customers. Now for the real test: scaling from 3,000 early adopters to the long tail of mainstream deployment.
JPMorganChase has around 84 million customers (71 million interact digitally with the banking giant), moves around $10 trillion in payments on any given day, and has $4 trillion under management within its asset and wealth management arm. The company operates in 100 markets globally, has a workforce of around 300,000 people, depends on some 6,000 applications, and has an $18 billion tech budget.
Larry Feinsmith, managing director and head of global technology, strategy and innovation, joined Dell on stage to lay out the bank’s four priorities: provide a best-in-class digital experience for customers and employees, put data and AI at the center of its operations, maintain an unwavering focus on cybersecurity, and “to have all that run on a modern, resilient, scalable infrastructure.”
Chase exemplifies a modern, hybrid, multi-cloud, multi-provider strategy—an architecture increasingly viewed as essential for scaling enterprise AI.
And it runs a constellation of AI models across its footprint. Whatever the application, whoever the end user, it all requires a lot of compute. Discussing JPMC’s AI strategy, Feinsmith said the firm has around an exabyte of data that is treated “as a first class asset…We catalog our data, we understand lineage, how we permission and govern it, clear ownership, but most importantly that the data is discoverable.” Use cases include quality assurance, summarization, and content generation for workers at all levels of the organization.
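Feinsmith didn’t describe the bank’s tooling, but to make “data as a first-class asset” concrete, here is a minimal, hypothetical sketch of a catalog entry that captures ownership, lineage, permissions, and discoverability. The DatasetEntry class, its fields, and the sample names are illustrative assumptions, not anything JPMC disclosed.

```python
# Hypothetical sketch of a data-catalog entry; the fields mirror the qualities
# Feinsmith listed (lineage, permissions, ownership, discoverability), but the
# structure itself is an illustrative assumption, not JPMC's implementation.
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    name: str
    owner: str                      # clear ownership
    upstream_sources: list[str]     # lineage: where the data comes from
    allowed_roles: set[str]         # how it is permissioned and governed
    tags: list[str] = field(default_factory=list)  # what makes it discoverable

    def can_read(self, role: str) -> bool:
        return role in self.allowed_roles

# A catalog is then just a searchable collection of entries.
catalog: dict[str, DatasetEntry] = {}

def register(entry: DatasetEntry) -> None:
    catalog[entry.name] = entry

def discover(keyword: str) -> list[DatasetEntry]:
    """Find datasets by name or tag so teams can actually reuse them."""
    kw = keyword.lower()
    return [e for e in catalog.values()
            if kw in e.name.lower() or any(kw in t.lower() for t in e.tags)]

if __name__ == "__main__":
    register(DatasetEntry(
        name="card_transactions_daily",
        owner="payments-data-team",
        upstream_sources=["core_ledger", "merchant_feed"],
        allowed_roles={"fraud-analytics", "risk"},
        tags=["payments", "fraud"],
    ))
    print([e.name for e in discover("fraud")])
```

The hard part at JPMC’s scale isn’t the data structure; it’s applying this discipline consistently across an exabyte of data and some 6,000 applications.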
In terms of AI-enabled automation, Feinsmith said, it’s “all human-in-the-loop right now. But the next exciting horizon is going to be using agents and reasoning models, and how you orchestrate all these agents together” to think, plan, rethink, and execute complex business processes. This (again) points to the need for more compute, and suggests that agentic orchestration at this scale is not just a software challenge but an infrastructure mandate.
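Feinsmith didn’t get into implementation details, but the shape of what he’s describing, a planner that breaks a process into steps, worker agents that execute them, and a human checkpoint before anything ships, can be sketched in a few lines. Everything below (the call_model placeholder, the approval gate, the loop itself) is an illustrative assumption, not JPMC’s architecture.

```python
# Illustrative plan/execute agent loop with a human-in-the-loop gate.
# call_model() stands in for any hosted or on-prem reasoning-model API.
from dataclasses import dataclass

@dataclass
class Task:
    description: str
    result: str | None = None

def call_model(prompt: str) -> str:
    # Canned reply so the sketch runs without an external service;
    # a real system would call a model endpoint here.
    return f"[model reply to: {prompt[:60]}]"

def plan(goal: str) -> list[Task]:
    # A planner agent breaks a business process into discrete steps.
    steps = call_model(f"Break this goal into numbered steps: {goal}")
    return [Task(s) for s in steps.splitlines() if s.strip()]

def execute(task: Task) -> Task:
    # A worker agent handles one step; a real system would route by skill.
    task.result = call_model(f"Complete this step and report the outcome: {task.description}")
    return task

def human_approves(task: Task) -> bool:
    # The human-in-the-loop checkpoint Feinsmith says is the norm today.
    answer = input(f"Approve result for '{task.description}'? [y/n] ")
    return answer.strip().lower() == "y"

def orchestrate(goal: str) -> list[Task]:
    completed = []
    for task in plan(goal):
        execute(task)
        if not human_approves(task):
            # Rework the step if a human rejects the agent's output.
            task.result = call_model(f"Revise this step: {task.description}")
        completed.append(task)
    return completed
```

In a real deployment the interesting problems are exactly the ones Feinsmith flagged: routing steps to the right agents, governing what they can touch, and deciding when a human has to sign off, all of which consume yet more compute.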
From global scale to aisle-level intelligence
Moving from Wall Street to one of the 1,700 Lowe’s locations, EVP and Chief Digital and Information Officer Seemantini Godbole told Dell her goal is to make AI impactful and meaningful to customers and associates, and avoid dying a death of 1,000 pilots. To that end, Lowe’s is applying AI to how it sells, how its customers shop, and how its 300,000 associates work.
In terms of what that looks like in the real world, Godbole reiterated a point made throughout Dell Technologies World: AI is most effective when it’s taken to where data is generated rather than the other way around. In the back of every store, she said, there are four Dell PowerEdge servers kitted out with NVIDIA GPUs, two more for SD-WAN, and one more for video analytics.
“My role is technology,” she said. “When we were thinking about AI…there are some companies who are building LLMs and building models. We are not in the business of building LLMs.” Lowe’s is, however, in the business of providing niche expertise to consumers walking the aisles of its stores. “We always thought that our store associates should look like super associates,” Godbole said. With the rise of gen AI, “That dream is so close.” Her vision of augmenting frontline workers with contextual intelligence is the kind of purposeful, practical application that could (should?) define the next phase of enterprise AI adoption.
Store associates have handheld devices loaded with a ChatGPT-like interface that helps them answer questions that may or may not be within their area of expertise. Computer vision algorithms clock customers lingering in aisles, likely because they have questions, then send alerts to associates, essentially dispatching them to assist. “When you are in a Lowe’s aisle and you think, ‘I wish I had help,’ magically an associate is going to appear,” she said. For its e-commerce business, Lowe’s similarly has AI-enabled expertise on hand to help would-be buyers not just find answers to their questions, but arrive at a decision.
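Lowe’s hasn’t published how its aisle alerts work, but the underlying pattern, track how long a detected person lingers in one spot and page an associate once a dwell-time threshold is crossed, is straightforward to sketch. The detector stub, the threshold value, and notify_associate below are assumptions for illustration, not Lowe’s system.

```python
# Illustrative dwell-time alerting loop; the detector, threshold, and
# notify_associate() are assumptions, not Lowe's actual implementation.
import time

DWELL_THRESHOLD_SECONDS = 90  # how long someone lingers before we page help

def detect_people(frame) -> list[str]:
    """Placeholder for a vision model that returns IDs of people seen in an aisle."""
    return []  # a real system would run person detection and tracking here

def notify_associate(aisle: str, person_id: str) -> None:
    print(f"Dispatch: customer {person_id} has been lingering in {aisle}")

def monitor_aisle(aisle: str, get_frame, clock=time.monotonic) -> None:
    first_seen: dict[str, float] = {}   # when each person was first spotted
    alerted: set[str] = set()           # who we've already paged help for
    while True:
        now = clock()
        visible = set(detect_people(get_frame()))
        # Start the clock for newcomers, drop anyone who has left the aisle.
        for pid in visible:
            first_seen.setdefault(pid, now)
        for pid in list(first_seen):
            if pid not in visible:
                first_seen.pop(pid)
                alerted.discard(pid)
        # Page an associate once someone has lingered past the threshold.
        for pid, t0 in first_seen.items():
            if pid not in alerted and now - t0 >= DWELL_THRESHOLD_SECONDS:
                notify_associate(aisle, pid)
                alerted.add(pid)
        time.sleep(1)
```

The point is less the code than where it runs: a loop like this lives on the servers in the back of the store, next to the cameras, which is exactly the take-AI-to-the-data pattern Godbole described.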
“I feel like the possibilities are limitless,” she said.
The long tail of AI is where the real impact begins
While a handful of firms are “in the business of pure intelligence,” as Dell put it, most companies aren’t building frontier models. They’re looking for AI that helps them do what they already do, but better. That’s the real opportunity: helping organizations of all sizes put AI to work in purpose-built ways.
Even with JPMorgan’s physical presence in local markets, it’s still a $700 billion multinational. And while Lowe’s feels more Main Street, it’s a $100 billion public company. The real long tail of AI adoption — and maybe its most impactful frontier — lies in the businesses that aren’t in the S&P 500: clinics, a local brewery, mom-and-pop shops, startups; that’s where AI needs to go, and creating on-ramps is how it gets there.
Exhibit ‘A’ Brewing Company in Framingham, Mass., for instance, uses a combination of Dell PowerEdge servers, NVIDIA chips, and 5G connectivity to essentially brew more beer. This digitalization project includes sensors in fermentation tanks that stream pressure and temperature readings, helping brewers catch fluctuations that could affect quality or result in product loss. On the canning line, a computer vision system detects when cans fall over, freeing people for higher-value work.
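Exhibit ‘A’ hasn’t shared specifics beyond sensors streaming pressure and temperature, but the core of catching fluctuations is simple threshold monitoring. Here’s a minimal sketch; the tolerance bands, reading format, and tank names are made-up values for illustration, not the brewery’s actual setup.

```python
# Minimal sketch of fermentation-tank monitoring: flag readings that drift
# outside expected bounds. Bounds and the reading source are assumptions.
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Reading:
    tank: str
    temperature_c: float
    pressure_psi: float

# Hypothetical tolerance bands a brewer might set for a given beer style.
TEMP_RANGE_C = (18.0, 22.0)
PRESSURE_RANGE_PSI = (10.0, 15.0)

def out_of_range(value: float, bounds: tuple[float, float]) -> bool:
    low, high = bounds
    return not (low <= value <= high)

def check_stream(readings: Iterable[Reading]) -> list[str]:
    """Return human-readable alerts for any reading outside its band."""
    alerts = []
    for r in readings:
        if out_of_range(r.temperature_c, TEMP_RANGE_C):
            alerts.append(f"{r.tank}: temperature {r.temperature_c:.1f} C out of range")
        if out_of_range(r.pressure_psi, PRESSURE_RANGE_PSI):
            alerts.append(f"{r.tank}: pressure {r.pressure_psi:.1f} psi out of range")
    return alerts

if __name__ == "__main__":
    sample = [Reading("FV-3", 21.4, 12.2), Reading("FV-4", 23.1, 12.8)]
    for alert in check_stream(sample):
        print(alert)
```

Wired to live telemetry, logic this small is exactly the kind of workload that fits on the brewery’s on-site PowerEdge servers.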
Then there’s Norby, an Australian startup that uses Dell workstations and NVIDIA GPUs to make a “clever little language robot” for linguistic education and speech therapy. Norby founder Adrian Mullan, who was in Las Vegas for the show, discussed in a video how he can run and fine-tune LLMs locally, at the edge. He recounted that his youngest son spent several years in speech therapy and his daughter struggled to focus in conventional learning environments, which led to the idea for Norby. Seeing his son use the robot, he said, “as a parent…[has] been very fulfilling.”
It’s impressive to see what JPMorganChase and Lowe’s are doing; investment at that scale is what pushes the whole field forward. But Exhibit ‘A’ and Norby illustrate the shift from global scale to local impact.
Dell’s pitch — and my takeaway from his keynote — is about giving everyone the tools they need to think faster and act with more intelligence, more purpose. Scaling AI is about pushing intelligence out to the margins, where decisions are made in real time. In an age where “AI becomes as essential as electricity,” as Dell put it, that also means it needs to be as accessible as electricity. “AI is for all of us,” he said. “AI is for human progress. And it’s powerful whether it’s in the hands of a Fortune 50 CIO…or a dad.”