Normalized training loss vs. baselines (Facebook).
Published: 2025
<p>DCOR reports both the reconstruction-only and the total (with RLC) loss; baselines report the reconstruction-only loss. Each curve is normalized, as in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0335135#pone.0335135.e243" target="_blank">Eq (29)</a>, by dividing by its epoch-1 value, and is then smoothed with an exponential moving average (EMA), computed as s_t = β s_{t−1} + (1 − β) ℓ_t for a smoothing factor β. This normalization enables fair visual comparison across methods with different objectives and scales; the plot therefore emphasizes relative convergence trends (shape and stability) rather than raw magnitudes. Consistent with DCOR’s design, RLC regularizes late-phase training: the reconstruction curve decreases more conservatively than for methods that minimize reconstruction alone, while the total objective continues to decrease.</p>
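The normalization and smoothing described in the caption can be sketched as follows. This is a minimal illustration, not the authors' code: the smoothing factor β = 0.9 is an assumed value, since the paper's actual choice is not recoverable from this extract.

```python
import numpy as np

def normalize_and_smooth(loss, beta=0.9):
    """Divide a per-epoch loss curve by its epoch-1 value, then apply
    EMA smoothing: s_t = beta * s_{t-1} + (1 - beta) * l_t.
    beta=0.9 is an illustrative default, not the paper's setting."""
    loss = np.asarray(loss, dtype=float)
    norm = loss / loss[0]              # epoch-1 normalization: curve starts at 1.0
    smoothed = np.empty_like(norm)
    smoothed[0] = norm[0]              # initialize the EMA at the first value
    for t in range(1, len(norm)):
        smoothed[t] = beta * smoothed[t - 1] + (1 - beta) * norm[t]
    return smoothed

curve = normalize_and_smooth([2.0, 1.6, 1.2, 1.0, 0.9])
```

Because every curve starts at 1.0 after normalization, methods with very different raw loss scales can be overlaid on one axis, and the EMA damps per-epoch noise so that the plotted trends reflect convergence shape rather than fluctuation.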