Very insightful interview with a former high-ranking $Amazon (AMZN.US)$ AWS and $Intel (INTC.US)$ employee working in the field of AI & inference:
1. In his view, the business model for companies with inference chips is to package them into a model or a service. There is a reason why people go to $Amazon (AMZN.US)$ 's SageMaker and pay a premium to run their models there rather than running them directly on EC2 themselves.
2. It is very difficult for any company to keep up with the constant updates to the different software stacks, such as CUDA and TensorFlow, even when they are open-source.
3. Right now, if you are a chip provider, it is key that you also own the software and application layer. That is why the Big Tech companies have a big advantage. In his view, $Microsoft (MSFT.US)$ is partly panicking as they have the longest path to walk in their custom chip development journey compared to other companies like $Amazon (AMZN.US)$ and $Alphabet-A (GOOGL.US)$ . They made the mistake of investing in Graphcore, which didn't pay off.
4. $NVIDIA (NVDA.US)$ has the advantage because of its AI ecosystem. Everyone has learned about CUDA in school and university. It took them 10 years to build the CUDA ecosystem before AI was even a thing. Once you build something like that, it becomes extremely sticky.
5. He thinks that unless a competitor delivers a chip at least 10x better than whatever $NVIDIA (NVDA.US)$ is doing, it is hard to justify the investment in switching. He believes that whatever the next big model innovation in this space turns out to be, it will come from $NVIDIA (NVDA.US)$ .
Disclaimer: Community is offered by Moomoo Technologies Inc. and is for educational purposes only.