Sauerkraut Mixtral Instruct

SauerkrautLM Mixtral Instruct is a German fine-tune of the powerful Mixtral 8x7B, a high-quality sparse mixture-of-experts (SMoE) model. It was fine-tuned on a mix of translated and augmented German data.
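
As a rough illustration of what "sparse mixture of experts" means, the sketch below shows a minimal top-k routing layer in PyTorch: a small router network picks a few experts per token, so only a fraction of the model's parameters are active for any given input. This is an illustrative toy, not Mixtral's actual implementation; all dimensions, names, and the routing details are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    """Toy sparse mixture-of-experts layer: a router selects the top-k
    experts per token and combines their outputs with softmax weights."""

    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(dim, num_experts)  # gating network
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, dim)
        gate_logits = self.router(x)                          # (tokens, num_experts)
        weights, indices = gate_logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                  # normalize over the chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[:, slot] == e                  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(10, 64)
print(SparseMoELayer()(tokens).shape)  # torch.Size([10, 64])
```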

For instructions on accessing this model or initializing it via API, please refer to our docs.
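
The docs cover authentication and deployment in full; as a hedged sketch, a deployed model endpoint is typically called with an HTTP POST carrying the prompt and generation parameters. The endpoint URL, header name, and payload fields below are placeholders, not the service's actual API; consult the docs for the real request schema.

```python
import os
import requests

# Hypothetical endpoint and payload shape; replace with the values from the docs.
ENDPOINT = "https://example.com/models/sauerkrautlm-mixtral-instruct/predict"

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Api-Key {os.environ['API_KEY']}"},
    json={
        "prompt": "Erkläre kurz, was ein Mixture-of-Experts-Modell ist.",
        "max_new_tokens": 256,
        "temperature": 0.7,
    },
    timeout=60,
)
print(response.json())
```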

Configuration

For more details about this model, visit its page on Hugging Face.
GPU: NVIDIA L40S × 1