Michael Dell is pitching a “decentralized” future for artificial intelligence that his company’s devices will make possible.
“The future of AI will be decentralized, low-latency, and hyper-efficient,” predicted the Dell Technologies founder, chairman, and CEO in his Dell World keynote, which is available on YouTube. “AI will follow the data, not the other way around,” Dell announced at the kickoff of the company’s four-day Las Vegas customer conference on Monday.
Dell believes that the complexity of deploying AI on-premises is driving companies towards a vendor who can provide all the parts plus 24-hour support and monitoring.
On the second day of the show, Dell Chief Operating Officer Jeffrey Clarke cited a Dell survey of enterprise customers in which 37% said they wanted an infrastructure vendor that would “build their entire AI stack for them,” adding, “We think Dell is becoming an enterprise’s ‘one-stop shop’ for all AI infrastructure.”
Dell’s new offerings include products for so-called “edge computing” – that is, computing on customers’ premises rather than in the cloud. Dell AI Factory, for example, is an on-premises managed AI service that Dell claims can be “up to 62% more cost-effective for inferencing LLMs on-premises than the public cloud.”
Dell brands one offering of its AI Factory with Nvidia to showcase the chip giant’s parts. This includes, in particular, PowerEdge servers with as many as 256 Nvidia Blackwell Ultra GPU chips, along with Grace-Blackwell CPU and GPU configurations. Dell has not provided any further details.
Dell has also announced new networking switches that run either Nvidia Spectrum-X networking silicon or Nvidia InfiniBand. All of these parts – the PowerEdge servers and the network switches – conform to the reference design that Nvidia has outlined as the Nvidia Enterprise AI Factory.
A second batch of updated PowerEdge systems will support AMD’s Instinct MI350 GPU family. Both PowerEdge flavors are available with either liquid cooling or air cooling.
“This is a game changer for faster AI deployments,” claimed the company. “We’ll leverage direct memory transfers to streamline data movement with minimal CPU involvement, making it ideal for scalable AI training and inference.”
Dell AI Factory also emphasizes so-called AI PCs and workstations, which are tuned for running inference. These include a new laptop carrying a Qualcomm circuit card, the AI 100 PC Inference Card, designed to make predictions locally with generative AI without going out to a central computer.
The Dell Pro Max Plus laptop is “the world’s first mobile workstation with an enterprise-grade discrete NPU,” meaning a standalone chip for neural network processing, according to Dell’s analysis of workstation makers.
The Pro Max Plus is expected to be available later this year.
A number of Dell software offerings have been put forward to support the idea of a decentralized, “disaggregated” AI infrastructure.
The company, for example, made a detailed pitch for its file-management software, Project Lightning, calling it “the world’s fastest parallel file system per new testing” and claiming it can achieve “up to two times greater throughput than competing parallel file systems.” This is important for inference operations, which must quickly consume large amounts of data.
Also in the software bucket is what Dell calls Dell Private Cloud, which is designed to allow customers to switch between different software offerings, such as Broadcom’s VMware hypervisors, Nutanix’s hyperconverged software, and IBM Red Hat’s competing offerings.
Dell Private Cloud’s automated capabilities, according to the company, can help customers “provision a private cloud stack in 90% fewer steps than manual processes, delivering a cluster in just two and a half hours with no manual effort.”