KazBERT

Published:

KazBERT is a BERT-based model designed and trained specifically for Kazakh language tasks.

Achievements:

  • Over 14,000 downloads
  • 14 likes on Hugging Face

The model is trained with Masked Language Modeling (MLM) on a multilingual corpus of Kazakh, Russian, and English text.
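To illustrate the MLM objective mentioned above, here is a minimal sketch of BERT-style token corruption: roughly 15% of tokens are selected, and of those, 80% are replaced with a [MASK] token, 10% with a random vocabulary token, and 10% are left unchanged. The toy vocabulary and token lists below are purely illustrative and are not taken from KazBERT's actual tokenizer or training data.

```python
import random

MASK = "[MASK]"
# Toy vocabulary for the "random replacement" branch (hypothetical).
TOY_VOCAB = ["қала", "тіл", "кітап", "city", "язык"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style MLM corruption.

    Selects ~mask_prob of the tokens; of the selected ones,
    80% become [MASK], 10% become a random vocabulary token,
    and 10% stay unchanged. Returns (corrupted, labels), where
    labels holds the original token at each selected position
    and None elsewhere (unselected positions carry no loss).
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)          # model must predict this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK)  # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted.append(rng.choice(TOY_VOCAB))  # 10%: random token
            else:
                corrupted.append(tok)   # 10%: keep the original token
        else:
            labels.append(None)         # position not used in the loss
            corrupted.append(tok)
    return corrupted, labels
```

During pretraining, the model only computes a loss at the selected positions, so the `labels` list carries `None` everywhere else.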

Available on Hugging Face