In addition to these screenshots, xAI also shared details about the model behind Grok in a blog post. According to the post, Grok-1, the large language model powering Grok, reached a strong level of capability after only two months of training, though it did not surpass GPT-4 and supported only a relatively short context length. Grok-1 was trained with JAX, a deep learning framework, rather than PyTorch.
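The blog post itself contains no code, but for readers unfamiliar with JAX, the sketch below shows what a basic training step looks like in that framework: pure functions transformed by `jax.grad` for differentiation and `jax.jit` for XLA compilation. This is a minimal, hypothetical example (a toy linear model with plain SGD), not xAI's actual training code; every model, function, and parameter name here is made up for illustration.

```python
# Minimal JAX training-step sketch (illustrative only, not xAI's code).
import jax
import jax.numpy as jnp

def init_params(key):
    # A hypothetical single linear layer: y = x @ w + b
    k_w, k_b = jax.random.split(key)
    return {
        "w": jax.random.normal(k_w, (4, 1)) * 0.1,
        "b": jnp.zeros((1,)),
    }

def loss_fn(params, x, y):
    # Mean squared error for the toy linear model
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit  # compile the whole update step with XLA
def update(params, x, y, lr=0.01):
    # grad differentiates loss_fn with respect to its first argument
    grads = jax.grad(loss_fn)(params, x, y)
    # SGD: walk every parameter down its gradient
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = init_params(key)
x = jax.random.normal(key, (32, 4))  # toy batch
y = x @ jnp.ones((4, 1))             # toy targets
for _ in range(100):
    params = update(params, x, y)
print(loss_fn(params, x, y))
```

The design difference this illustrates is that JAX centers on composing transforms over pure functions, whereas PyTorch's default workflow mutates stateful modules; real large-model training stacks layer sharding and optimizer libraries on top of either.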
Clue: It's just the beginning.
calm Flamingo_6328: You don't look Vietnamese.
Carter West (OP), replying to calm Flamingo_6328: Of course I'm not, haha.
Carter West (OP), replying to Clue: But I doubt whether it can succeed. I mean, there are so many competitors now.
Clue: Well said, well done, Carter! Guide us.
Clue, replying to Carter West (OP): Musk has charisma and vision, and he is a physicist.