Grok-1
Open-source release of xAI's LLM
Listed in categories: GitHub, Twitter, Artificial Intelligence
Description
Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI. It is a base model checkpoint released under the Apache 2.0 license, not fine-tuned for any specific application such as dialogue. The model is designed for text processing, with 25% of the weights active on a given token.
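To make the "25% of weights active" figure concrete: in a Mixture-of-Experts layer, a router selects a small subset of expert networks per token, so only those experts' weights are used. Below is a minimal, illustrative sketch of top-k MoE routing in Python. The names (`moe_layer`, `gate_w`) and the toy 8-experts/2-active configuration are assumptions for illustration, not Grok-1's actual implementation.

```python
import numpy as np

def moe_layer(x, experts, gate_w, top_k=2):
    """Route a token through the top-k experts of a Mixture-of-Experts layer.

    x        : (d,) token representation
    experts  : list of callables, one feed-forward expert each
    gate_w   : (d, n_experts) router weights
    top_k    : number of experts activated per token
    """
    logits = x @ gate_w                    # router scores, shape (n_experts,)
    top = np.argsort(logits)[-top_k:]      # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts only
    # Only the chosen experts are evaluated, so only a fraction of the
    # layer's expert weights is active for this token.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy demo: 8 experts, 2 active per token -> 2/8 = 25% of expert weights used,
# mirroring the "25% of weights active on a given token" figure above.
rng = np.random.default_rng(0)
d, n_experts = 16, 8
experts = [
    (lambda W: (lambda v: np.tanh(W @ v)))(rng.normal(size=(d, d)))
    for _ in range(n_experts)
]
gate_w = rng.normal(size=(d, n_experts))
out = moe_layer(rng.normal(size=d), experts, gate_w, top_k=2)
print(out.shape)  # (16,)
```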
How to use Grok-1?
To use Grok-1, follow the instructions provided at github.com/xai-org/grok-1. The model is ready for text processing tasks and experimentation.
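As a rough sketch, the getting-started flow looks like the following; the file names (`requirements.txt`, `run.py`) reflect the repository's usual quick-start and should be confirmed against the current README, which also describes how to download the checkpoint weights:

```bash
git clone https://github.com/xai-org/grok-1.git
cd grok-1
pip install -r requirements.txt
# Download the model checkpoint as described in the README,
# then run the example script:
python run.py
```

Note that loading the full 314B-parameter checkpoint requires substantial GPU memory, so this is primarily practical on multi-GPU machines.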
Core features of Grok-1:
1️⃣ Large language model with 314 billion parameters
2️⃣ Mixture-of-Experts model architecture
3️⃣ Trained from scratch by xAI
4️⃣ Released under Apache 2.0 license
5️⃣ Not fine-tuned for specific applications
Why use Grok-1?
| # | Use case | Status |
|---|----------|--------|
| 1 | Text data processing | ✅ |
| 2 | Language modeling | ✅ |
| 3 | Research and experimentation | ✅ |
Who developed Grok-1?
xAI, a company specializing in advanced AI models and training techniques, is the maker of Grok-1.