BGE-Multilingual-Gemma2 is a multilingual embedding model built on the google/gemma-2-9b large language model. Designed for high-performance text representation across languages, it was trained on a diverse mix of languages (including English, Chinese, Japanese, Korean, and French) and task types such as retrieval, classification, and clustering.
Usage
Generate your API key and query the model through the OpenAI-compatible interface. For more details, see the documentation.
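A minimal sketch of querying the model through an OpenAI-compatible embeddings endpoint. The endpoint URL, environment variable, and model identifier below are placeholders, not values confirmed by this page; substitute the ones from your provider's documentation. The cosine-similarity helper shows a typical way to compare the returned vectors.

```python
import json
import math
import os
import urllib.request

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def embed(texts, api_key, base_url, model):
    """POST texts to an OpenAI-compatible /embeddings endpoint and
    return one embedding vector per input text."""
    request = urllib.request.Request(
        f"{base_url}/embeddings",
        data=json.dumps({"model": model, "input": texts}).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        payload = json.load(response)
    return [item["embedding"] for item in payload["data"]]

if __name__ == "__main__":
    # Placeholder endpoint and model name -- replace with your provider's values.
    vectors = embed(
        ["What is BGE?", "Qu'est-ce que BGE ?"],
        api_key=os.environ["API_KEY"],
        base_url="https://api.example.com/v1",
        model="bge-multilingual-gemma2",
    )
    print(cosine_similarity(vectors[0], vectors[1]))
```

Since the model is multilingual, semantically equivalent sentences in different languages (as in the example inputs above) should map to nearby vectors, so their cosine similarity should be high.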