ニャルガ reacted to ニャルガ: Set a live reminder
$Nippon Telegraph & Telephone.JP$ A financial results briefing for the first quarter of fiscal year 2024 is scheduled for Wednesday, 2024/8/7 at 4:00 p.m. Japan time. If you would like to watch it, click the "Reservation" button.
![](https://usliveimg.moomoo.com/2024073111135562de56b8.png/thumb)
Aug 7 02:00
ニャルガ
Set a live reminder
We will explain market information before and after the Tokyo stock market close, individual stocks with notable movements today, and the market outlook for tomorrow onward.
Main commentary: Shinichi Kamata (Radio NIKKEI Commentary Committee Member)
*This program does not solicit investment or any other action.
Please use your own judgment when making final investment decisions.
![](https://sgliveimg.moomoo.com/live_client/182209400/20240712/76e436cba163d98f674aa6b6b1fd832c.png/thumb?area=101&is_public=true)
Jul 22 01:00
ニャルガ
liked
ニャルガ
reacted to and commented on
$MU.US$
HBM is high-bandwidth memory designed primarily for data centers and high-performance computing; it is not currently common in edge AI devices.
Edge AI devices usually face strict size, power-consumption, and cost constraints, making it difficult to incorporate a high-performance, large-capacity memory system like HBM. Instead, edge devices typically use smaller, more power-efficient memory such as LPDDR (Low Power Double Data Rate).
However, as AI models improve, smaller and more efficient HBM-like technology may be applied to edge devices in the future. It is already becoming possible to run AI models on edge devices such as PCs and smartphones, but these mostly rely on existing memory technology.