咖哥
liked
It is estimated that there will be a sideways trend today
咖哥
commented on
$NVIDIA (NVDA.US)$ Bought at 127. No way I'll let this go.
咖哥
liked
$NVIDIA (NVDA.US)$ Last week, on January 20, DeepSeek-R1 was officially released, with performance directly comparable to the official release of OpenAI o1. When I first saw the news, I didn't think it would have much impact; after all, low-cost computing power has been hyped in China before, whether with the earlier Doubao or now with DeepSeek. Models like DeepSeek achieving ChatGPT-level performance at a lower cost genuinely reflect progress in the field of AI. However, that does not mean the future development of AI will completely eliminate the computing-power barrier. This phenomenon is the result of technical optimization and efficiency gains, but demand for computing power remains one of the core challenges in AI development. The field still follows the scaling law: larger models and more data usually bring better performance. For example, the training cost of GPT-4 exceeded $100 million, and future directions such as multimodal and embodied intelligence may require even more massive computing support. This is the trend of AI development and an irreversible law. Tasks such as autonomous driving and protein folding rely on massive real-time computation and simulation; even as individual models become more efficient, the overall system still requires ultra-large-scale computing power. While it is undeniable that DeepSeek's new model plays a crucial role in redefining computing power, compute remains a key resource in AI development today, barring a technological breakthrough such as quantum computing.
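The scaling law mentioned in the post can be sketched with a Chinchilla-style loss formula. This is an illustrative sketch, not anything from the original post: the constants are the published Hoffmann et al. (2022) fits, and the model sizes chosen below are arbitrary examples.

```python
def scaling_loss(n_params: float, n_tokens: float,
                 E: float = 1.69, A: float = 406.4, B: float = 410.7,
                 alpha: float = 0.34, beta: float = 0.28) -> float:
    """Chinchilla-style estimate of pretraining loss:

        L(N, D) = E + A / N**alpha + B / D**beta

    where N is the parameter count and D is the number of training tokens.
    E is the irreducible loss floor; the other constants are empirical fits.
    """
    return E + A / n_params ** alpha + B / n_tokens ** beta

# "Larger models and more data usually bring better performance":
# a bigger model trained on more tokens lands at a lower estimated loss.
loss_small = scaling_loss(1e9, 20e9)     # ~1B params, 20B tokens
loss_large = scaling_loss(70e9, 1.4e12)  # ~70B params, 1.4T tokens
assert loss_large < loss_small
```

The same formula also shows why compute demand does not go away: pushing loss toward the floor E requires growing both N and D, i.e. ever more total compute, even as per-model efficiency improves.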
咖哥
commented on
$NVIDIA (NVDA.US)$ Want to buy or sell?
咖哥
commented on
咖哥
reacted to
$NVIDIA (NVDA.US)$ How long can a scam last?
It is impossible for them to report that they have 50,000 H100s, because that would be illegal.
咖哥
liked
$NVIDIA (NVDA.US)$ Predicting a 132-134 range. This is Nvidia. Those waiting for 110 or 100, please keep dreaming.