Replies: 1 comment
Because GPyTorch uses the linear_operator library under the hood, it's not well designed for torch's functional API.
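To illustrate the mismatch, a minimal sketch, assuming the culprit is that the lazy kernel tensor defers its computation until after `functional_call` has restored the original parameters. The `DenseKernel` wrapper below is a hypothetical workaround, not an official GPyTorch API, and I haven't verified it against every version:

```python
import torch
from torch.func import functional_call
import gpytorch

x = torch.tensor([[0.0], [1.0]])  # placeholder inputs
kernel = gpytorch.kernels.RBFKernel()

# kernel(x) returns a LazyEvaluatedKernelTensor; the actual computation
# is deferred until .to_dense() is called. If that happens after
# functional_call returns, the module's original parameters are back in
# place, so the substituted ones never take effect.

class DenseKernel(torch.nn.Module):
    """Hypothetical wrapper that forces evaluation inside the call."""

    def __init__(self, kernel):
        super().__init__()
        self.kernel = kernel

    def forward(self, x):
        # Densify while functional_call's swapped-in parameters are live.
        return self.kernel(x).to_dense()

wrapper = DenseKernel(kernel)
params = dict(wrapper.named_parameters())  # keys like "kernel.raw_lengthscale"
params["kernel.raw_lengthscale"] = torch.tensor([[1.0]])
K = functional_call(wrapper, params, (x,))  # dense tensor, substituted params
print(K)
```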
I want to use torch.func.functional_call to call kernels with different parameters.
The "normal" way would be to change the state dictionary directly.
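Roughly like this (a minimal sketch; the inputs and the substituted lengthscale value are placeholders, not my actual setup):

```python
import torch
import gpytorch

x = torch.tensor([[0.0], [1.0]])  # placeholder inputs
kernel = gpytorch.kernels.RBFKernel()

# Case 1: default lengthscale
K1 = kernel(x).to_dense()

# Case 2: overwrite the raw lengthscale via the state dict
state = kernel.state_dict()
state["raw_lengthscale"] = torch.tensor([[1.0]])
kernel.load_state_dict(state)
K2 = kernel(x).to_dense()

print(K1)
print(K2)  # the off-diagonal entries of K1 and K2 differ, as expected
```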
This prints a matrix with 2.57e-7 in the off-diagonal in the first case and 0.0146 in the second. However, if I try to do the same thing with torch.func.functional_call, as follows:
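(Again a sketch with placeholder inputs and an arbitrary substituted value.)

```python
import torch
from torch.func import functional_call
import gpytorch

x = torch.tensor([[0.0], [1.0]])  # placeholder inputs
kernel = gpytorch.kernels.RBFKernel()

# Evaluate once with the module's own parameters...
params = dict(kernel.named_parameters())
K1 = functional_call(kernel, params, (x,))

# ...and once with a substituted raw lengthscale.
params["raw_lengthscale"] = torch.tensor([[1.0]])
K2 = functional_call(kernel, params, (x,))

print(type(K2))  # LazyEvaluatedKernelTensor, not a plain tensor
print(K1.to_dense())
print(K2.to_dense())  # identical to K1; the substituted lengthscale is ignored
```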
This prints a matrix with 2.57e-7 in the off-diagonal in both cases.
Any suggestions on why this doesn't work? One interesting thing to note is that the output of functional_call is a gpytorch.lazy.lazy_evaluated_kernel_tensor.LazyEvaluatedKernelTensor object instead of a plain tensor - perhaps that is related?
The reason I want to use the second method instead of the first is that I have a mapping x --> lengthscales and I want to take the derivative of the output of a GP wrt x.
Thanks!
Edit: I have also tried this with _reparametrize_module, as suggested here: #2662 (comment), but it yields the same (incorrect) results.
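Roughly what I tried there (a sketch with placeholder values again):

```python
import torch
from torch.nn.utils.stateless import _reparametrize_module
import gpytorch

x = torch.tensor([[0.0], [1.0]])  # placeholder inputs
kernel = gpytorch.kernels.RBFKernel()

params = dict(kernel.named_parameters())
params["raw_lengthscale"] = torch.tensor([[1.0]])

# Temporarily swap in the substituted parameters.
with _reparametrize_module(kernel, params):
    K = kernel(x)

print(K.to_dense())  # still the default-lengthscale matrix for me
```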