Resolve gradient mismatches in benchmarks #340
Comments
Is the difference between all options, or is Zygote the outlier? Reverse diff with a compiled tape is known to have these problems with logical branches in the code.

Compiled tape has issues.
That makes sense... The improvement path here is to track down the "wrong" branch(es) that are getting compiled.
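To make the compiled-tape failure mode concrete, here is a minimal sketch (illustrative only, not EpiAware code, assuming ReverseDiff.jl's tape API): a function with a branch is taped at one input, and the compiled tape then silently replays the recorded branch at an input on the other side of the condition.

```julia
using ReverseDiff

# A function with a logical branch on the input values.
f(x) = sum(xi > 0 ? xi^2 : -xi for xi in x)

# Tape recorded at an all-positive input: only the x^2 branch is compiled in.
tape = ReverseDiff.compile(ReverseDiff.GradientTape(f, [1.0, 2.0]))

g = zeros(2)
ReverseDiff.gradient!(g, tape, [-3.0, 2.0])
# The compiled tape still applies d/dx x^2 = 2x at x = -3, giving g[1] == -6.0,
# whereas the true derivative of the -x branch is -1.0. No warning is raised.
```

This is why mismatches against Zygote (which re-traces every call) show up only for models whose control flow depends on parameter or data values.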
In the latest benchmarks in #392 I still see a single instance of this that needs resolving.
Where is this benchmark?
I think we can also use instabilities in our benchmarks (e.g. #400 (comment)) to indicate where problems are. It would be useful to think about whether there is a more formal way of checking this that requires less tracking across PRs (e.g. here it suggests that observation error models are problematic).
#414 localised some of these issues; see #414 (comment).
Running some more repetitions, I see the following throwing gradient issues (due to compiled reverse diff):
```
Warnings from Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (), Tuple{Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#88#89", String}, Vector{Int64}, Vector{Int64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#88#89", String}(NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.01)), PrefixLatentModel{FixedIntercept{Float64}, String}(FixedIntercept{Float64}(0.1), "Ascertainment"), var"#88#89"(), "Ascertainment"), y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100], Y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100]), NamedTuple(), DefaultContext()):
┌ Warning: `ad.compile` where `ad` is `AutoReverseDiff` has been deprecated and will be removed in v2. Instead it is available as a compile-time constant as `AutoReverseDiff{true}` or `AutoReverseDiff{false}`.

Warnings from Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (), Tuple{Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#82#83", String}, Vector{Int64}, Vector{Int64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#82#83", String}(NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.01)), PrefixLatentModel{FixedIntercept{Float64}, String}(FixedIntercept{Float64}(0.1), "Ascertainment"), var"#82#83"(), "Ascertainment"), y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100], Y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100]), NamedTuple(), DefaultContext()):

Warnings from Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (), Tuple{Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#64#65", String}, Vector{Int64}, Vector{Int64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = Ascertainment{NegativeBinomialError{HalfNormal{Float64}}, AbstractTuringLatentModel, var"#64#65", String}(NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.01)), PrefixLatentModel{FixedIntercept{Float64}, String}(FixedIntercept{Float64}(0.1), "Ascertainment"), var"#64#65"(), "Ascertainment"), y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100], Y_t = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100]), NamedTuple(), DefaultContext())

Warnings from Model{typeof(generate_observations), (:obs_model, :y_t, :Y_t), (), (), Tuple{NegativeBinomialError{HalfNormal{Float64}}, Vector{Float64}, Vector{Float64}}, Tuple{}, DefaultContext}(EpiAware.EpiAwareBase.generate_observations, (obs_model = NegativeBinomialError{HalfNormal{Float64}}(HalfNormal{Float64}(μ=0.01)), y_t = [10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0], Y_t = [10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0]), NamedTuple(), DefaultContext())
```

Something of a pattern I think!
For several utilities, benchmarking suggests that different backends give different gradients. This should be investigated as it may indicate performance issues.
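A lower-effort way to surface these mismatches than tracking benchmark runs across PRs might be a cross-backend consistency check. The helper below is a hypothetical sketch (the name `check_gradients` and its signature are illustrative, not from the repo): it compares ForwardDiff, Zygote, and compiled ReverseDiff gradients against each other, using ForwardDiff as the reference.

```julia
using ForwardDiff, ReverseDiff, Zygote

# Hypothetical check: compare gradients of `f` at `x` across three AD
# backends and warn on any disagreement beyond the tolerance.
function check_gradients(f, x; atol = 1e-8)
    g_fwd = ForwardDiff.gradient(f, x)               # forward-mode reference
    g_zyg = first(Zygote.gradient(f, x))             # source-to-source reverse mode
    tape  = ReverseDiff.compile(ReverseDiff.GradientTape(f, x))
    g_rev = ReverseDiff.gradient!(similar(x), tape, x)  # compiled-tape reverse mode
    ok = isapprox(g_zyg, g_fwd; atol = atol) && isapprox(g_rev, g_fwd; atol = atol)
    ok || @warn "Gradient mismatch between AD backends" g_fwd g_zyg g_rev
    return ok
end
```

Running something like this over each benchmark model at a handful of parameter draws would flag problematic models (such as the observation error models above) automatically, rather than requiring manual inspection of benchmark output.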