
NVIDIA Earns Up to 1,000% Profit on Each H100 AI Accelerator: Analysis


17.08.2023 23:02, Mykola Khizhnyak

NVIDIA earns up to 1,000% profit on each sale of its H100 dedicated AI accelerator. That is the claim of Barron's journalist Tae Kim, who cites an analysis by the financial firm Raymond James.

Image Source: NVIDIA

At the moment, each NVIDIA H100 accelerator sells for an average of $25,000-30,000, depending on the sales region and supplier, and this refers to the less expensive PCIe version of the product. According to Raymond James, the GPU used in the accelerator, together with additional materials (the circuit board and other auxiliary components), costs $3,320. Unfortunately, Kim does not elaborate on how deep the cost analysis goes, nor does he explain whether this figure includes factors such as development costs, engineers' salaries, or production and logistics expenses.
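For context, a minimal back-of-the-envelope sketch shows how the quoted markup follows from these figures, assuming only the $3,320 component estimate and the $25,000-30,000 street price (development, salaries, and logistics costs are not included):

```python
# Rough check of the "up to 1,000%" claim, using the figures cited above.
bom_cost = 3_320  # estimated cost of GPU + board + auxiliary components, USD

for price in (25_000, 30_000):
    profit = price - bom_cost          # gross profit per unit over the BOM estimate
    markup_pct = profit / bom_cost * 100
    print(f"price ${price:,}: gross profit ${profit:,}, markup ≈ {markup_pct:.0f}%")

# Result: roughly 650-800% over the component cost, approaching 1,000%
# at the higher end of street prices.
```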

The development of specialized accelerators requires significant time and resources. According to Glassdoor, the average salary of a hardware engineer at NVIDIA is about $202,000 per year. That figure covers just one engineer, but it is obvious that an entire team of specialists worked on the H100 and that thousands of working hours went into its development. All of this should be factored into the final cost of the product.

Still, it is clear that NVIDIA currently faces no real competition in supplying hardware for AI computing. Demand for the "green" company's specialized accelerators is so high that they sell out long before they reach the proverbial store shelves; suppliers say the queue for them now stretches into the second quarter of 2024. And given the latest analyst estimates, which project that the AI computing market will grow to $150 billion by 2027, NVIDIA's immediate future looks decidedly comfortable.

On the other hand, the high demand for AI computing accelerators has negative consequences for the market as a whole. The latest analyst reports say that sales of traditional servers are declining globally. The main reason for the drop in demand is that hyperscalers and data center operators are shifting their attention to AI-optimized systems built around solutions like the NVIDIA H100. For the same reason, DDR5 memory manufacturers have had to revise their expectations for how quickly the new RAM standard will spread, since data center operators are now investing heavily in AI accelerators rather than in new memory. Against this backdrop, DDR5 is now expected to reach adoption parity with DDR4 only by the third quarter of 2024.


