Return (inv) root of KroneckerProductAddedDiagLinearOperator as lazy #14
This is a clone of cornellius-gp/gpytorch#1430
Previously, `_root_decomposition` and `_inv_root_decomposition` returned the (inv) root from the eigendecomposition as a dense tensor, which can be inefficient. This now returns the root as a `MatmulLazyTensor` instead. E.g., a matrix-vector product of the root with some vector `v` is now implicitly computed as `q_matrix @ (evals * v)` rather than `(q_matrix @ diag(evals)) @ v`, which can make a big difference since `q_matrix` is a Kronecker product. This can help runtime, but more importantly it significantly reduces the memory footprint, since we don't need to instantiate the (inv) root itself, only its constituent components.
This also fixes an issue with `KroneckerProductAddedDiagLinearOperator` implicitly assuming the diagonal to be constant, and returning incorrect results when it was not. With this change, the tensor falls back to the superclass implementation for non-constant diagonals.
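To see why the constant-diagonal assumption matters: adding `c * I` to a matrix `K` shifts its eigenvalues by `c` but leaves its eigenvectors unchanged, so the Kronecker-structured eigendecomposition of `K` can be reused. Adding a *non-constant* diagonal changes the eigenvectors, and reusing them gives a wrong result. A small numpy illustration (generic symmetric `K`, not the library's actual operators):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
K = A @ A.T  # symmetric PSD stand-in for the Kronecker product part
evals, q = np.linalg.eigh(K)

# Constant diagonal: K + c*I shares K's eigenvectors; eigenvalues shift by c.
c = 0.3
shifted = q @ np.diag(evals + c) @ q.T
assert np.allclose(shifted, K + c * np.eye(4))

# Non-constant diagonal: the same shortcut reconstructs the wrong matrix.
d = rng.uniform(0.1, 1.0, size=4)
wrong = q @ np.diag(evals + d) @ q.T
assert not np.allclose(wrong, K + np.diag(d))
```

Hence the fallback: when the diagonal is not constant, the cheap eigenvalue-shift route is invalid and the superclass's general decomposition must be used.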