
DeepSeek touts new training method as China pushes AI efficiency

  • lastmansurfing
  • Jan 4
  • 1 min read


DeepSeek published a paper outlining a more efficient approach to developing AI, illustrating the Chinese artificial intelligence industry’s effort to compete with the likes of OpenAI despite a lack of free access to Nvidia Corp. chips.


The paper, co-authored by founder Liang Wenfeng, introduces a framework the authors call Manifold-Constrained Hyper-Connections. It is designed to improve scalability while reducing the computational and energy demands of training advanced AI systems, according to the authors.
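The paper's details aren't covered here, but "hyper-connections," the idea the framework's name builds on, have appeared in prior literature as a generalization of residual connections: instead of one residual stream, a model keeps several parallel copies of the hidden state and mixes them with learnable weights at each layer. The following is a rough illustrative sketch of that general idea only, not DeepSeek's method; the `block` function, the `alpha` and `beta` weights, and all shapes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 8, 4  # hidden width, number of parallel residual streams

def block(x):
    # stand-in for a transformer block's transformation f(x)
    return np.tanh(x)

# learnable connection weights (random here, trained in practice):
# alpha mixes the n streams among themselves (generalizing the identity
# skip connection), beta distributes the block output back into the streams
alpha = np.eye(n) + rng.normal(0, 0.02, (n, n))
beta = np.full(n, 1.0 / n) + rng.normal(0, 0.02, n)

h = np.tile(rng.normal(size=d), (n, 1))  # n streams, each of width d
out = block(h.mean(axis=0))              # block reads a mix of the streams
h = alpha @ h + np.outer(beta, out)      # streams updated via connections
```

With `n = 1`, `alpha = [[1.0]]`, and `beta = [1.0]`, the update reduces to the ordinary residual connection `h = h + block(h)`, which is why the construction is described as a generalization.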


Such publications from DeepSeek have foreshadowed the release of major models in the past. The Hangzhou-based startup stunned the industry with its R1 reasoning model a year ago, developed at a fraction of the cost incurred by its Silicon Valley rivals.


Read the full story | BLOOMBERG




 
 

© 2026 UnmissableAI
