Since the support of `σ` is constrained to be positive and most VI algorithms assume an unconstrained Euclidean support, we need to use a *bijector* to transform `θ`.
We will use [`Bijectors`](https://github.com/TuringLang/Bijectors.jl) for this purpose.
This corresponds to the automatic differentiation variational inference (ADVI) formulation[^KTRGB2017].

```julia
using Bijectors: Bijectors

# The support of `σ` is positive, so its bijector maps (0, ∞) onto ℝ.
# Bijectors for the components of a joint support can be combined
# with `Bijectors.Stacked`.
```
A simpler approach is to use [`Turing`](https://github.com/TuringLang/Turing.jl), where a `Turing.Model` can automatically be converted into a `LogDensityProblem` and a corresponding `bijector` is generated automatically.
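As a hedged sketch of that route (the toy model below is a hypothetical stand-in for the running example, assuming `Turing` and `DynamicPPL` are installed), `DynamicPPL.LogDensityFunction` wraps a model as a `LogDensityProblem`:

```julia
using Turing
using DynamicPPL

# Hypothetical toy model standing in for the running example:
# a normal likelihood with a log-normally distributed scale.
@model function demo(y)
    σ ~ LogNormal(0.0, 1.0)  # constrained support: σ > 0
    y ~ Normal(0.0, σ)
end

model = demo(0.5)

# Wrap the model into an object implementing the
# LogDensityProblems interface.
prob = DynamicPPL.LogDensityFunction(model)
```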
For the dataset, we will use the popular [sonar classification dataset](https://archive.ics.uci.edu/dataset/151/connectionist+bench+sonar+mines+vs+rocks) from the UCI repository.
This can be automatically downloaded using [`OpenML`](https://github.com/JuliaAI/OpenML.jl).
The sonar dataset corresponds to dataset id 40.
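As a minimal sketch (assuming `OpenML` and `DataFrames` are installed), fetching the dataset by its id might look like:

```julia
using OpenML
using DataFrames

# Download dataset id 40 (sonar) from OpenML and materialize it
# as a DataFrame.
data = DataFrame(OpenML.load(40))
```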
```julia
q, info, _ = AdvancedVI.optimize(alg, max_iter, model_ad, q_transformed);
```
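Once optimization finishes, the returned `q` can be treated like a distribution from `Distributions.jl`. As a hedged sketch (assuming `q` supports `rand`, as the transformed variational approximations returned by `AdvancedVI` do):

```julia
using Random

# Draw 1_000 samples from the variational approximation; for a
# multivariate q this yields a (dimension × 1_000) matrix.
samples = rand(Random.default_rng(), q, 1_000)
```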
For more examples and details, please refer to the documentation.
[^TL2014]: Titsias, M., & Lázaro-Gredilla, M. (2014, June). Doubly stochastic variational Bayes for non-conjugate inference. In *International Conference on Machine Learning*. PMLR.
[^RMW2014]: Rezende, D. J., Mohamed, S., & Wierstra, D. (2014, June). Stochastic backpropagation and approximate inference in deep generative models. In *International Conference on Machine Learning*. PMLR.
[^KW2014]: Kingma, D. P., & Welling, M. (2014). Auto-encoding variational Bayes. In *International Conference on Learning Representations*.
[^KTRGB2017]: Kucukelbir, A., Tran, D., Ranganath, R., Gelman, A., & Blei, D. M. (2017). Automatic differentiation variational inference. *Journal of Machine Learning Research*.