---
license: mit
datasets:
- pt-sk/imdb
---

BERT (Bidirectional Encoder Representations from Transformers) is a pre-trained language model developed by Google. Unlike earlier left-to-right models, it learns the meaning of a word from both its left and right context, which makes it highly effective across a wide range of natural language processing (NLP) tasks such as classification and question answering.
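
A minimal usage sketch with the Hugging Face `transformers` library is shown below. It uses the generic `bert-base-uncased` checkpoint for masked-token prediction as an illustration (this repo's own checkpoint name and intended head, e.g. sentiment classification on the IMDB dataset above, are assumptions not stated here):

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# "bert-base-uncased" is an illustrative checkpoint, not necessarily this repo's weights.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

text = "The movie was absolutely [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and take the highest-scoring vocabulary token there.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
prediction = tokenizer.decode(predicted_id)
print(prediction)
```

Because BERT attends to the whole sentence at once, the prediction for `[MASK]` is conditioned on words both before and after it.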