
Migration to torch distributions and scoringrules integration #70

Draft · wants to merge 13 commits into base: main
Conversation

@MicheleCattaneo MicheleCattaneo commented Dec 6, 2024

Proposal

  • Remove the TFP dependency in order to move to Keras 3, relying on torch as the backend and torch.distributions for the probabilistic layers.
  • Rely on scoringrules for probabilistic loss functions.
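As an illustration of what such a probabilistic loss computes, here is the closed-form CRPS of a normal predictive distribution, i.e. the quantity that scoringrules evaluates for a normal forecast. This is a minimal stdlib-only sketch, not the library code; the helper name is hypothetical:

```python
import math

def crps_normal(y: float, mu: float, sigma: float) -> float:
    """Closed-form CRPS for a normal predictive distribution:
    sigma * [ z(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi) ], z = (y - mu)/sigma."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)      # phi(z)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2)))             # Phi(z)
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

print(round(crps_normal(0.0, 0.0, 1.0), 4))  # ≈ 0.2337
```

Lower is better: the score grows as the observation moves away from the predicted mean, and shrinks as the forecast sharpens around a correct value.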

Current progress

  • Update poetry dependencies.
  • scoringrules has been tested on normal and exponential distributions with CRPS.
  • Update code to remove tf dependencies.
  • Update all models to the new structure
    • TCN and physical layers are temporarily not supported.
  • Update tests to the new structure.
  • Create the equivalent of DistributionLossWrapper for sample losses.
  • Ensure configuration files are compatible with the updates
  • Optional torch CPU/GPU
  • Add distributions:
    • Censored normal (custom)
    • Truncated normal (custom)
    • Multivariate normal
    • Gamma
    • Beta
    • LogNormal
  • Update README.md (via notebook)
  • Add test for SHAP (include as dev dependency)
  • Extend support to python 3.12
  • Format with black
  • ...
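Since the censored and truncated normals are custom additions, the distinction between the two can be sketched in plain Python (function names and bounds here are hypothetical; the real implementations would operate on torch tensors):

```python
import random

def sample_censored_normal(mu, sigma, low, high, n, seed=0):
    """Censoring clamps out-of-range draws to the bounds, so probability
    mass piles up exactly at `low` and `high`."""
    rng = random.Random(seed)
    return [min(max(rng.gauss(mu, sigma), low), high) for _ in range(n)]

def sample_truncated_normal(mu, sigma, low, high, n, seed=0):
    """Truncation discards out-of-range draws and renormalises, so no
    mass sits on the bounds themselves (rejection sampling sketch)."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        z = rng.gauss(mu, sigma)
        if low <= z <= high:
            out.append(z)
    return out

censored = sample_censored_normal(0.0, 1.0, 0.0, 3.0, 1000)
truncated = sample_truncated_normal(0.0, 1.0, 0.0, 3.0, 1000)
```

With a lower bound at the mean, roughly half of the censored samples land exactly on the bound, while the truncated sampler never produces it.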

Some implementation details:

  • Every distribution will implement BaseParametricDistribution
  • A probabilistic layer will implement BaseDistributionLayer, which contains an underlying distribution and is responsible for mapping its input into the correct number of parameters.
  • DistributionLossWrapper wraps a function from scoringrules so that the loss can be computed between a tensor of observations and a predicted distribution.
  • There is a clear separation between models and layers.
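The relationship between these pieces can be sketched in stripped-down form. The class names follow the PR, but every method name and body below is a hypothetical illustration; the real code maps layer outputs to torch tensors:

```python
from typing import Callable

class BaseParametricDistribution:
    """Each distribution declares its parameter count and how to turn a
    raw parameter vector into named, constrained parameters."""
    num_parameters: int

    def params_from_vector(self, vec):
        raise NotImplementedError

class Normal(BaseParametricDistribution):
    num_parameters = 2

    def params_from_vector(self, vec):
        loc, raw_scale = vec
        return {"mu": loc, "sigma": abs(raw_scale) + 1e-6}  # keep scale positive

class DistributionLossWrapper:
    """Adapts a scoringrules-style function f(obs, **params) into a
    loss(obs, raw_params) callable usable during training."""
    def __init__(self, dist: BaseParametricDistribution, score_fn: Callable):
        self.dist, self.score_fn = dist, score_fn

    def __call__(self, obs, raw_params):
        return self.score_fn(obs, **self.dist.params_from_vector(raw_params))

# dummy score function standing in for a scoringrules CRPS
loss = DistributionLossWrapper(Normal(), lambda obs, mu, sigma: abs(obs - mu) + sigma)
```

The layer only has to produce `num_parameters` outputs per distribution; the wrapper owns the mapping from raw outputs to valid parameters, which keeps models and layers cleanly separated.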

Useful links:

  • Table with common tf operations and their corresponding Keras 3 variants: here

…tegration

Co-authored-by: Francesco Zanetta <zanetta.francesco@gmail.com>
@MicheleCattaneo MicheleCattaneo added the enhancement New feature or request label Dec 6, 2024
@MicheleCattaneo MicheleCattaneo self-assigned this Dec 6, 2024
@MicheleCattaneo MicheleCattaneo linked an issue Dec 13, 2024 that may be closed by this pull request
@MicheleCattaneo
Collaborator Author

test inference code:

        import numpy as np

        # assumes `model` returns a distribution object, `ds.x` is an input
        # batch, and tfd_samples / mc_samples are sample counts defined earlier

        # make predictions
        out_batch = [model(ds.x).sample(tfd_samples).numpy() for _ in range(mc_samples)]
        out_batch = np.concatenate(out_batch, axis=0).astype("float32")
        assert out_batch.shape[0] == tfd_samples * mc_samples

Successfully merging this pull request may close these issues.

Migrate to Keras 3