Welcome to Technology Trends

Providing technology buying information for more than 10 million IT and business executives.


Why Developing and Deploying AI Technology on Workstations Makes Sense


AI has taken off as an important, differentiating capability across industries, and the hardware required to run it is evolving rapidly. The technology industry often fixates on the exponential growth in the size of the most advanced AI models: the discussions center on tens of billions of parameters, reduced precision, expanded memory, high-performance computing (HPC)-like requirements for AI training and inferencing, and racks of accelerated servers. In reality, AI computing at this extraordinary scale is the exception, especially in the enterprise.

Today, many businesses are hard at work on AI initiatives, including generative AI, that do not require a supercomputer. Indeed, a great deal of AI development, and increasingly AI deployment, notably at the edge, actually takes place on powerful workstations.

Workstations offer numerous advantages for AI development and deployment. They free AI scientists and developers from having to negotiate server time, and they provide GPU acceleration even where server-based GPUs remain hard to obtain in the datacenter. They are far more affordable than servers, representing a smaller, one-time expense rather than a rapidly accumulating bill for a cloud instance, which also relieves scientists and developers of the anxiety of racking up costs while merely experimenting with AI models. And there is the comfort of knowing that sensitive data is stored securely on premises.

IDC is seeing the edge grow faster than on-premises or cloud environments as an AI deployment scenario. Here too, workstations play an increasingly vital role as AI inferencing platforms, often performing inferencing on software-optimized CPUs without requiring GPUs at all. Use cases for AI inferencing at the edge on workstations are growing rapidly and include AIOps, disaster response, radiology, oil and gas exploration, land management, telehealth, traffic management, manufacturing plant monitoring, and drones.

This white paper examines the growing role of workstations in AI development and deployment and briefly discusses Dell's portfolio of workstations for AI.


White Paper from Technology Trends
