Hiya, I wanted to have a go at fitting an exact GP with a non-zero mean modelled by explicit basis functions (basically the model in section 2.7 of Rasmussen & Williams). The obvious way to do this would be to have the mean function and kernel take different subsets of the input dimensions. All the kernels have an `active_dims` argument, but the mean modules don't seem to have an equivalent.
Yes, good point, adding `active_dims` to `LinearMean` would be a good idea for a pull request. One other thing you could try is to do the sub-setting manually in the `forward` call:

```python
def forward(self, x):
    mean = self.mean_module(x[..., :2])  # the mean only takes the first two dimensions as input
    covar = self.covar_module(x)         # the kernel sees all of the input dimensions
    return MultivariateNormal(mean, covar)
```
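For completeness, here is a minimal sketch (not from the original reply) of how the full model might look, assuming a `LinearMean` over the first two input dimensions and an RBF kernel over all of them; the class name, dimension split, and training data are purely illustrative:

```python
import torch
import gpytorch
from gpytorch.distributions import MultivariateNormal


class SubsetMeanGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        # Explicit linear basis functions on the first two input dimensions only
        self.mean_module = gpytorch.means.LinearMean(input_size=2)
        # Kernel acts on every input dimension
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=train_x.shape[-1])
        )

    def forward(self, x):
        mean = self.mean_module(x[..., :2])  # mean uses only the first two dims
        covar = self.covar_module(x)         # kernel uses all dims
        return MultivariateNormal(mean, covar)


# Hypothetical usage with random data
train_x = torch.rand(50, 5)
train_y = torch.sin(train_x[:, 0]) + 0.1 * torch.randn(50)
likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = SubsetMeanGP(train_x, train_y, likelihood)
```

If `active_dims` support were added to the mean modules, the manual slicing in `forward` would no longer be needed.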