The next phase of artificial intelligence may require very different processors

  • Mar 20
  • 1 min read


THE ECONOMIST — Times are changing fast. Demand for AI computing is shifting from training models to getting them to answer real-world queries, a process known as inference. McKinsey, a consultancy, estimates that by the end of the decade inference will account for three-fifths of demand in AI data centres. 


Nvidia appears to recognise the shift. On March 16th it unveiled a new chip designed specifically for inference tasks, the Groq 3 LPX, with an architecture that departs from the traditional GPU design.


This time, it will have plenty of competition: a crop of startups is building chips aimed at running AI models faster and more efficiently than Nvidia's.


Read the full story  |  THE ECONOMIST



© 2026 UnmissableAI
