Commit 4bdfada

Merge pull request #1612 from cronoik/patch-1
Spelling
2 parents c9ce433 + a2ad124 commit 4bdfada

File tree

  • examples/unsupervised_learning/TSDAE

1 file changed: +3 −3 lines changed

examples/unsupervised_learning/TSDAE/README.md

Lines changed: 3 additions & 3 deletions
@@ -1,9 +1,9 @@
 # TSDAE
 
-This folder shows an example, how can we train an unsupervised [TSDAE (Tranformer-based Denoising AutoEncoder)](https://arxiv.org/abs/2104.06979) model with pure sentences as training data.
+This section shows an example, of how we can train an unsupervised [TSDAE (Tranformer-based Denoising AutoEncoder)](https://arxiv.org/abs/2104.06979) model with pure sentences as training data.
 
 ## Background
-During training, TSDAE encodes damaged sentences into fixed-sized vectors and requires the decoder to recon-struct the original sentences from this sentenceembeddings. For good reconstruction quality, thesemantics must be captured well in the sentenceembeddings from the encoder. Later, at inference,we only use the encoder for creating sentence embeddings. The architecture is illustrated in the figure below:
+During training, TSDAE encodes damaged sentences into fixed-sized vectors and requires the decoder to reconstruct the original sentences from these sentenceembeddings. For good reconstruction quality, thesemantics must be captured well in the sentenceembeddings from the encoder. Later, at inference,we only use the encoder for creating sentence embeddings. The architecture is illustrated in the figure below:
 
 ![](https://raw.githubusercontent.com/UKPLab/sentence-transformers/master/docs/img/TSDAE.png)
 

@@ -89,4 +89,4 @@ If you use the code for augmented sbert, feel free to cite our publication [TSDA
     year = "2021",
     url = "https://arxiv.org/abs/2104.06979",
 }
-```
+```
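For context on what the edited README documents, here is a minimal sketch of the unsupervised TSDAE training loop it describes, using the `DenoisingAutoEncoderDataset` and `DenoisingAutoEncoderLoss` utilities from sentence-transformers. The model name, example sentences, and hyperparameters below are illustrative assumptions, not values taken from this commit.

```python
# Minimal TSDAE training sketch (assumed setup; sentences and hyperparameters are illustrative).
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, datasets, losses

# Plain, unlabeled sentences are the only training data.
train_sentences = [
    "TSDAE learns sentence embeddings without labels.",
    "The decoder must reconstruct the original sentence from a damaged input.",
    "At inference time only the encoder is used.",
]

# Encoder: a transformer plus CLS pooling produces the fixed-sized sentence vector.
word_embedding_model = models.Transformer("bert-base-uncased")
pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), "cls")
model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

# The dataset damages each sentence (by default via word deletion, which needs nltk)
# and pairs the damaged input with the original as the reconstruction target.
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)

# The loss attaches a decoder that must reconstruct the original sentence
# from the encoder's sentence embedding.
train_loss = losses.DenoisingAutoEncoderLoss(
    model, decoder_name_or_path="bert-base-uncased", tie_encoder_decoder=True
)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    scheduler="constantlr",
    optimizer_params={"lr": 3e-5},
    show_progress_bar=True,
)

# After training, only the encoder is kept for producing sentence embeddings.
embeddings = model.encode(["A new sentence to embed."])
```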
