AgaMiko committed on
Commit
7578e05
1 Parent(s): 80167f1

Update README.md

Files changed (1)
  1. README.md +0 -11
README.md CHANGED
@@ -1,14 +1,3 @@
- ---
- language:
- - pl
- tags:
- - sentence similarity
- license: CC by 4.0
- datasets:
- - Wikipedia
-
- ---
-
  # SHerbert - Polish SentenceBERT
  SentenceBERT is a modification of the pretrained BERT network that uses siamese and triplet network structures to derive semantically meaningful sentence embeddings that can be compared using cosine-similarity. Training was based on the original paper [Siamese BERT models for the task of semantic textual similarity (STS)](https://arxiv.org/abs/1908.10084) with a slight modification of how the training data was used. The goal of the model is to generate different embeddings based on the semantic and topic similarity of the given text.
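
The README text above describes comparing the resulting sentence embeddings with cosine similarity. Below is a minimal usage sketch using the `sentence-transformers` library; the model identifier is a placeholder (the actual Hub id is not part of this commit), and the example sentences are illustrative only.

```python
# Minimal sketch, assuming the model loads via sentence-transformers.
# "your-org/sherbert-model-id" is a placeholder, not the actual Hub id
# from this commit -- substitute the repository's real identifier.
from sentence_transformers import SentenceTransformer, util

MODEL_ID = "your-org/sherbert-model-id"  # placeholder Hub id (assumption)

model = SentenceTransformer(MODEL_ID)

# Two Polish sentences with related meaning.
sentences = [
    "Kot śpi na kanapie.",    # "The cat is sleeping on the couch."
    "Na sofie drzemie kot.",  # "A cat is napping on the sofa."
]

# Encode both sentences into dense embeddings.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Compare the embeddings with cosine similarity, as described in the README.
score = util.cos_sim(embeddings[0], embeddings[1])
print(f"cosine similarity: {score.item():.4f}")
```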