v0.3.7
Release date: 2020-09-30 04:17:02
- Upgraded the transformers dependency; transformers 3.1.0, 3.2.0, and 3.3.1 are known to work.
- Added example code for model distillation: sentence embedding models can be drastically reduced, e.g. to only 2-4 layers, while keeping 98+% of their performance. The code can be found in examples/training/distillation; a layer-reduction sketch follows this list.
- Transformer models can now accept two inputs ['sentence 1', 'context for sent1'], which are encoded as the two text inputs (segment A / segment B) for BERT; see the second sketch below.
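
A minimal layer-reduction sketch in the spirit of the distillation example; the model name and the choice of layers to keep are assumptions, and the actual training loop lives in examples/training/distillation:

```python
from sentence_transformers import SentenceTransformer
import torch

# Assumption: any BERT-based SentenceTransformer model can serve as the student.
student = SentenceTransformer('bert-base-nli-stsb-mean-tokens')
auto_model = student._first_module().auto_model  # underlying Hugging Face transformer

# Keep only a subset of the encoder layers (assumed choice of 4 layers).
layers_to_keep = [1, 4, 7, 10]
auto_model.encoder.layer = torch.nn.ModuleList(
    [layer for i, layer in enumerate(auto_model.encoder.layer) if i in layers_to_keep]
)
auto_model.config.num_hidden_layers = len(layers_to_keep)
# The reduced student would then be trained to mimic the full teacher's embeddings.
```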
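And a sketch of the new two-input encoding; it assumes encode() is given a two-element list per item, which is tokenized as a BERT text pair:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('bert-base-nli-mean-tokens')  # assumption: any BERT-based model

# Each item is [text, context]; the two strings are fed to BERT as segment A / segment B.
embeddings = model.encode([
    ['sentence 1', 'context for sent1'],
    ['sentence 2', 'context for sent2'],
])
print(embeddings.shape)  # (2, embedding_dim)
```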
Minor changes:
- Tokenization in the multi-process encoding setup now happens in the child processes, not in the parent process; a usage sketch follows below.
- Added models.Normalize() to allow normalizing embeddings to unit length; see the second sketch below.
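
A hedged sketch of the multi-process encoding setup this change affects; the model name and sentence list are placeholders:

```python
from sentence_transformers import SentenceTransformer

if __name__ == '__main__':
    model = SentenceTransformer('bert-base-nli-mean-tokens')  # assumption
    sentences = ['This is sentence {}'.format(i) for i in range(100000)]

    # Sentences are now tokenized inside the worker processes, not in the parent.
    pool = model.start_multi_process_pool()
    embeddings = model.encode_multi_process(sentences, pool)
    model.stop_multi_process_pool(pool)
```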
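And a sketch of wiring models.Normalize() into a model pipeline; the base transformer name is an assumption:

```python
from sentence_transformers import SentenceTransformer, models

word_embedding_model = models.Transformer('bert-base-uncased')  # assumption
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension())
normalize = models.Normalize()  # scales each sentence embedding to unit length

model = SentenceTransformer(modules=[word_embedding_model, pooling_model, normalize])
```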