Describe the bug
```
The following Variables were used a Lambda layer's call (tf.compat.v1.nn.fused_batch_norm), but
are not present in its tracked objects:
  <tf.Variable 'batch_normalization/gamma:0' shape=(32,) dtype=float32>
  <tf.Variable 'batch_normalization/beta:0' shape=(32,) dtype=float32>
It is possible that this is intended behavior, but it is more likely
an omission. This is a strong indication that this layer should be
formulated as a subclassed Layer rather than a Lambda layer.
```
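For context, the warning arises because Keras auto-tracks variables that are assigned as attributes on a subclassed `Layer`, whereas a `Lambda` layer only holds a plain function, so any variables used inside it (here, batch norm's `gamma`/`beta`) are invisible to tracking. A toy sketch of that idea, using stand-in classes rather than the real TF/Keras objects:

```python
# Toy illustration (NOT the actual Keras implementation) of why a
# subclassed Layer tracks variables while a Lambda-style wrapper does not.

class Variable:
    """Stand-in for tf.Variable."""
    def __init__(self, name):
        self.name = name

class Layer:
    """Minimal auto-tracking base: Variable attributes are recorded,
    mimicking Keras's tracked-objects mechanism."""
    def __init__(self):
        object.__setattr__(self, "tracked", [])

    def __setattr__(self, key, value):
        if isinstance(value, Variable):
            self.tracked.append(value)
        object.__setattr__(self, key, value)

class BatchNorm(Layer):
    """Subclassed layer: gamma/beta are attributes, so they get tracked."""
    def __init__(self):
        super().__init__()
        self.gamma = Variable("gamma")
        self.beta = Variable("beta")

class Lambda(Layer):
    """Wraps a function; variables closed over by `fn` are invisible
    to attribute-based tracking."""
    def __init__(self, fn):
        super().__init__()
        self.fn = fn

gamma, beta = Variable("gamma"), Variable("beta")
lam = Lambda(lambda x: (x, gamma, beta))  # uses variables it never tracks

print([v.name for v in BatchNorm().tracked])  # ['gamma', 'beta']
print([v.name for v in lam.tracked])          # []
```

This is why the warning suggests rewriting the offending code as a subclassed `Layer`: the fix belongs in DeepCTR's layer code, not in how the user imports `BatchNormalization`.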
To Reproduce
I think there is a problem with how BatchNormalization is imported.
I tried `from tensorflow.python.keras.layers import BatchNormalization`, but it failed.
I then found the following in tensorflow/python/keras/layers/__init__.py:
```python
class VersionAwareLayers(object):
  """Utility to be used internally to access layers in a V1/V2-aware fashion.

  When using layers within the Keras codebase, under the constraint that
  e.g. `layers.BatchNormalization` should be the `BatchNormalization` version
  corresponding to the current runtime (TF1 or TF2), do not simply access
  `layers.BatchNormalization` since it would ignore e.g. an early
  `compat.v2.disable_v2_behavior()` call. Instead, use an instance
  of `VersionAwareLayers` (which you can use just like the `layers` module).
  """

  def __getattr__(self, name):
    serialization.populate_deserializable_objects()
    if name in serialization.LOCAL.ALL_OBJECTS:
      return serialization.LOCAL.ALL_OBJECTS[name]
    return super(VersionAwareLayers, self).__getattr__(name)
```
but I don't know how to use it.
I tried using `tf.keras.layers.BatchNormalization` instead of importing BatchNormalization directly,
and I also tried `from keras.layers import BatchNormalization`.
With both of these approaches I still ran into the original Lambda layer error.
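For what it's worth, `VersionAwareLayers` is meant to be instantiated and then used like the `layers` module (e.g. `layers = VersionAwareLayers(); bn = layers.BatchNormalization()`); its `__getattr__` hook resolves each name against a registry built for the current TF1/TF2 runtime. A self-contained sketch of that dispatch pattern, with a toy registry standing in for the real serialization module:

```python
# Toy mimic of the VersionAwareLayers __getattr__ dispatch above.
# The real class consults Keras's serialization registry; here we fake
# it with a dict keyed off a "runtime version" flag.

_V2_ENABLED = True  # stand-in for tf.__internal__.tf2.enabled()

class BatchNormalizationV1: ...
class BatchNormalizationV2: ...

def populate_objects():
    """Rebuild the name -> class table for the current runtime."""
    cls = BatchNormalizationV2 if _V2_ENABLED else BatchNormalizationV1
    return {"BatchNormalization": cls}

class VersionAwareLayers:
    def __getattr__(self, name):
        # Lookup happens per attribute access, so a late
        # disable_v2_behavior() call would still be respected.
        objects = populate_objects()
        if name in objects:
            return objects[name]
        raise AttributeError(name)

layers = VersionAwareLayers()
print(layers.BatchNormalization is BatchNormalizationV2)  # True
```

That said, since the Lambda warning comes from inside DeepCTR's layer code, swapping the import at the call site is unlikely to silence it either way.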
Operating environment:
python version: 3.9
tensorflow version: 2.8.0
deepctr version: 0.9.2