Commit c8de52e: update docs example for subsampling
1 parent adaefc0

1 file changed: 5 additions, 1 deletion

docs/src/tutorials/subsampling.md

@@ -8,7 +8,7 @@ In this tutorial, we will see how to perform subsampling with `KLMinRepGradProxD
 [^HBWP2013]: Hoffman, M. D., Blei, D. M., Wang, C., & Paisley, J. (2013). Stochastic variational inference. *Journal of Machine Learning Research*, 14(1), 1303-1347.
 [^TL2014]: Titsias, M., & Lázaro-Gredilla, M. (2014, June). Doubly stochastic variational Bayes for non-conjugate inference. In *Proceedings of the International Conference on Machine Learning* (pp. 1971-1979). PMLR.
 [^KTRGB2017]: Kucukelbir, A., Tran, D., Ranganath, R., Gelman, A., & Blei, D. M. (2017). Automatic differentiation variational inference. *Journal of Machine Learning Research*, 18(14), 1-45.
-## Setting Up a `LogDensityProblem` for Subsampling
+## Setting Up Subsampling
 
 We will consider the same hierarchical logistic regression example used in the [Basic Example](@ref basic).
 
@@ -287,3 +287,7 @@ nothing
 ```
 
 ![](subsampling_example_time_accuracy.svg)
+
+But remember that subsampling will always be *asymptotically* slower than no subsampling.
+That is, as the number of iterations increases, there will be a point where no subsampling overtakes subsampling even in terms of wallclock time.
+Therefore, subsampling is most beneficial when a crude solution to the VI problem suffices.
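The wallclock trade-off stated in the added lines can be made concrete with a toy cost model. All numbers below (dataset size, minibatch size, iteration counts) are hypothetical, chosen only to illustrate the arithmetic; they are not measurements from the tutorial:

```python
# Toy cost model for the subsampling wallclock trade-off (hypothetical numbers).
# Assumption: a full-batch gradient costs ~N work per iteration, a minibatch
# gradient costs ~B, but minibatch gradient noise means more iterations are
# needed to reach the same accuracy.

N = 10_000  # dataset size (hypothetical)
B = 100     # minibatch size (hypothetical)

def wallclock(cost_per_iter, iters):
    """Total work = per-iteration cost times number of iterations."""
    return cost_per_iter * iters

# Reaching a crude solution: say 50 full-batch iterations versus
# 2_000 noisier minibatch iterations (hypothetical figures).
crude_full = wallclock(N, 50)      # 500_000 units of work
crude_sub = wallclock(B, 2_000)    # 200_000 units: subsampling wins early

# Reaching a tight solution: gradient noise dominates, say 200 full-batch
# iterations versus 200_000 minibatch iterations (hypothetical figures).
tight_full = wallclock(N, 200)     # 2_000_000 units
tight_sub = wallclock(B, 200_000)  # 20_000_000 units: full batch overtakes

print(crude_sub < crude_full, tight_full < tight_sub)  # → True True
```

Under these assumed figures, subsampling reaches the crude solution cheaper, while full-batch optimization eventually overtakes it for tight accuracy, matching the commit's point that subsampling pays off when a crude solution suffices.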
