
BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained language model developed by Google. It learns the meaning of a word from both its left and right context in a sentence, which makes it highly effective for a wide range of natural language processing (NLP) tasks such as classification, question answering, and named-entity recognition.
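
Below is a minimal sketch of loading this checkpoint and extracting contextual token embeddings. It assumes the repository is compatible with the standard Hugging Face `transformers` BERT classes (the repository does not declare a library, so this is an assumption rather than confirmed usage).

```python
# Sketch: obtain contextual embeddings with this checkpoint,
# assuming it can be loaded via the standard `transformers` auto classes.
import torch
from transformers import AutoTokenizer, AutoModel

repo_id = "pt-sk/bert"  # assumed repository id; adjust if loading fails
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

sentence = "The bank raised interest rates."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state holds one contextual vector per token:
# shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```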
