The success of ChatGPT could be a nice boost for Nvidia and AMD as well

Chatbots have been a big topic in recent weeks, but it is not just about the software itself. Powerful hardware also plays a significant role here, and according to the analyst firm TrendForce, this could mean a significant boost for companies involved in the development and production of hardware for AI applications. Above all, TrendForce mentioned Nvidia, whose recent financial results were not the best. According to TrendForce, the GPT system on which ChatGPT is built grew from 120 million to 180 billion training parameters between 2018 and 2020, and the firm estimates that around 20,000 GPUs were needed for this three years ago.

Today, although per-card performance keeps rising, even more accelerators will be needed: TrendForce estimates that training GPT now requires over 30,000 Nvidia A100 cards. It is this surge of interest in AI applications that drives up demand for the hardware on which these models are trained, which could greatly help Nvidia. According to the firm, it could also mean a boost for competitor AMD with its MI200 and MI300 accelerators, and the foundries that manufacture these chips, primarily TSMC, should benefit as well. Good times could also await other companies that develop AI chips for various applications, such as GUC, AIchip, Faraday Technology and eMemory.