$NVIDIA Corp(NVDA)$ Jensen at GTC: to train GPT-4, a 1.8T-parameter model, on A100s would take ~25k GPUs and 3-5 months. On H100s, ~8k GPUs and ~3 months. On B100s, ~2k GPUs and ~3 months.
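Those cluster sizes roughly pencil out under the common training-cost rule of thumb, FLOPs ≈ 6 × active params × tokens. A minimal sketch, where the active parameter count (GPT-4 is widely rumored to be MoE), token count, and utilization are assumptions of mine, not figures from the keynote:

```python
# Back-of-envelope check of the GTC training-time claims.
# ASSUMPTIONS (rumored/estimated, not confirmed by NVIDIA or OpenAI):
ACTIVE_PARAMS = 280e9   # active params per token (1.8T is the total MoE size)
TOKENS = 13e12          # training tokens
MFU = 0.35              # model FLOPs utilization

# (GPU count from the keynote, approx. peak dense BF16 FLOP/s per GPU)
CLUSTERS = {
    "A100": (25_000, 312e12),
    "H100": (8_000, 989e12),
}

# Rule of thumb: total training compute ~ 6 * N_active * D tokens
total_flops = 6 * ACTIVE_PARAMS * TOKENS

for gpu, (count, peak) in CLUSTERS.items():
    seconds = total_flops / (count * peak * MFU)
    print(f"{gpu}: ~{seconds / 86_400:.0f} days on {count:,} GPUs")
```

Under these assumptions both clusters land near 90 days, i.e. roughly the ~3 months quoted; Blackwell-generation peak numbers were still preliminary at the time, so B100 is left out of the sketch.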
Moore's Law pace: 10X every 5 years, so 100X every 10 years. Accelerated computing over the last 8 years is at 1000X, and per Jensen that's "still not fast enough."
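As annual multipliers, those two paces compare like this (just restating the quoted rates, no new data):

```python
# Moore's-Law-style pace: 10x every 5 years -> 100x per decade.
moore_per_year = 10 ** (1 / 5)        # ~1.58x per year
# Pace Jensen cites for accelerated computing: ~1000x over 8 years.
observed_per_year = 1000 ** (1 / 8)   # ~2.37x per year

print(f"Moore's Law: {moore_per_year:.2f}x/yr -> {moore_per_year**10:.0f}x/decade")
print(f"Cited pace:  {observed_per_year:.2f}x/yr")
```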
If NVDA can find new growth drivers, such as boosting shipments of the B100 lineup, raising full-year profit guidance (TSMC's capacity is maxed out, so only structural mix shifts are possible), or industry breakthroughs that exceed expectations (think nonlinear gains in model capabilities or truly game-changing applications), the market could roll its valuation basis forward from 2024 to 2025 earnings, or even further out. 🚀