
Grok-1

Open source release of xAI's LLM

Listed in categories:

GitHub, Twitter, Artificial Intelligence

Description

Grok-1 is a 314-billion-parameter Mixture-of-Experts model trained from scratch by xAI. It is a base model checkpoint released under the Apache 2.0 license, not fine-tuned for any specific application such as dialogue. The model works on text data, with 25% of the weights active on a given token (8 experts, of which 2 are selected per token).
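
As a Mixture-of-Experts model, Grok-1 routes each token through only a subset of its expert networks, which is how the active parameter count stays at roughly a quarter of the total: with 2 of 8 experts selected, 2/8 = 25% of the expert weights participate in a given token's forward pass. The toy sketch below illustrates top-k expert routing in plain Python/NumPy; the function and variable names are illustrative and do not come from xAI's codebase.

```python
import numpy as np

def moe_layer(x, expert_weights, gate_weights, k=2):
    """Toy top-k Mixture-of-Experts layer for a single token.

    Illustrative only: shapes and names are assumptions, not
    taken from the Grok-1 repository.
    """
    logits = x @ gate_weights              # (num_experts,) router scores
    top = np.argsort(logits)[-k:]          # indices of the k highest-scoring experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                   # softmax over the selected experts only
    # Combine the selected experts' outputs, weighted by router probability.
    return sum(p * (x @ expert_weights[i]) for p, i in zip(probs, top))

rng = np.random.default_rng(0)
d, num_experts = 16, 8
x = rng.normal(size=d)                     # one token embedding
experts = rng.normal(size=(num_experts, d, d))
gate = rng.normal(size=(d, num_experts))
print(moe_layer(x, experts, gate).shape)   # (16,)
```

Only the two selected experts' weight matrices are touched per token, which is the mechanism behind the 25% figure quoted above.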

How to use Grok-1?

To use Grok-1, follow the instructions provided at github.com/xai-org/grok-1. The released checkpoint is suited to text-processing tasks and experimentation.
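
The released weights are also hosted on Hugging Face under xai-org/grok-1, so the checkpoint can be fetched programmatically. A minimal sketch of the download step follows; the "ckpt-0" pattern matches the layout described in the repository's README, but check the repo for current instructions before running.

```python
from huggingface_hub import snapshot_download

# Download xAI's base-model checkpoint (roughly 300 GB).
# Repo id and the "ckpt-0" directory follow the layout described
# at github.com/xai-org/grok-1; verify against the README.
snapshot_download(
    repo_id="xai-org/grok-1",
    local_dir="checkpoints",
    allow_patterns=["ckpt-0/*"],
)
```

With the checkpoint in place, the repository's run.py script (after installing its requirements.txt dependencies) loads the weights and samples from the model.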

Core features of Grok-1:

1️⃣ Large language model with 314 billion parameters

2️⃣ Mixture-of-Experts model architecture

3️⃣ Trained from scratch by xAI

4️⃣ Released under Apache 2.0 license

5️⃣ Not fine-tuned for specific applications

Why could Grok-1 be used?

Use cases:

1. Text data processing
2. Language modeling
3. Research and experimentation

Who developed Grok-1?

Grok-1 was developed by xAI, a company specializing in advanced AI models and training techniques.
