
BGE-M3

by BAAI

Embedding
BGE-M3 is a versatile embedding model developed by BAAI, distinguished by its capabilities in Multi-Functionality, Multi-Linguality, and Multi-Granularity. It uniquely supports three retrieval methods—dense retrieval, multi-vector retrieval, and sparse retrieval—within a single framework, enabling flexible information retrieval strategies. The model is trained to handle over 100 languages, facilitating robust multilingual and cross-lingual retrieval. Additionally, BGE-M3 can process inputs ranging from short sentences to long documents of up to 8,192 tokens, accommodating various text granularities. Its training incorporates a novel self-knowledge distillation approach, integrating relevance scores from different retrieval functionalities to enhance embedding quality.
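The three retrieval outputs can be inspected directly with the open-source FlagEmbedding library (separate from the hosted API described below). The sketch here follows that library's published interface; treat the model name, parameters, and output keys as assumptions and verify them against the library's current documentation.

```python
# Sketch using the open-source FlagEmbedding library; model name and
# output keys follow its README and may differ in newer releases.
from FlagEmbedding import BGEM3FlagModel

model = BGEM3FlagModel("BAAI/bge-m3", use_fp16=True)

sentences = [
    "What is BGE-M3?",
    "BGE-M3 supports dense, sparse, and multi-vector retrieval.",
]

# Request all three representations in one pass (inputs up to 8,192 tokens).
output = model.encode(
    sentences,
    return_dense=True,
    return_sparse=True,
    return_colbert_vecs=True,
)

print(output["dense_vecs"].shape)      # dense embeddings, one vector per sentence
print(output["lexical_weights"][0])    # sparse per-token weights for the first sentence
print(len(output["colbert_vecs"][0]))  # multi-vector (ColBERT-style) token embeddings
```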
Provider · Context Size · Throughput · Latency · Input Cost · Output Cost

Usage

Generate your API key and query the model through the OpenAI-compatible interface. The preference parameter allows you to define the routing strategy. For more details, see the documentation.
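A minimal sketch with the official OpenAI Python client is shown below; the base URL, the model identifier, and the way the preference field is passed are placeholder assumptions, so substitute the values given in the documentation.

```python
# Minimal sketch, assuming an OpenAI-compatible endpoint. The base URL,
# model identifier, and "preference" routing field shown here are
# placeholders; replace them with the values from the documentation.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example.com/v1",  # hypothetical gateway endpoint
    api_key="YOUR_API_KEY",                 # key generated in your account
)

response = client.embeddings.create(
    model="BAAI/bge-m3",                        # assumed model identifier
    input=["What is BGE-M3?", "How does routing work?"],
    extra_body={"preference": "latency"},       # assumed routing-strategy parameter
)

for item in response.data:
    print(len(item.embedding))  # one embedding vector per input string
```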