
quantize_model() cannot detect a keras.Sequential model #1144

Open
DKMaCS opened this issue Oct 8, 2024 · 3 comments
Labels
bug Something isn't working

Comments


DKMaCS commented Oct 8, 2024

Prior to filing: check that this should be a bug instead of a feature request. Everything supported, including the compatible versions of TensorFlow, is listed in the overview page of each technique. For example, the overview page of quantization-aware training is here. An issue for anything not supported should be a feature request.

Describe the bug
I'm passing a Keras Sequential model into quantize_model(), and I'm getting an error saying that the model isn't a Sequential model.

System information

Carried out in:
Google Colab

TensorFlow version (installed from source or binary):
2.17.0

TensorFlow Model Optimization version (installed from source or binary):
0.8.0

Python version:
3.10.12

Describe the expected behavior
quantize_model() should accept a Keras Sequential model that was built from scratch, saved, and re-imported via load_model().

Describe the current behavior
ValueError: to_quantize can only either be a keras Sequential or Functional model.

Code to reproduce the issue
!pip install tensorflow_model_optimization

import pandas as pd
import numpy as np
import time
import tensorflow as tf
import os
import tempfile
import keras
import tensorflow_model_optimization as tfmot
from google.colab import drive
from tensorflow.keras.models import load_model

drive.mount('/content/drive')
%cd /content/drive/My Drive/CS528/HW3

model = load_model('/content/drive/My Drive/CS528/HW3/q1_model.keras')

quant_aware_model_tflite = '/content/drive/My Drive/CS528/HW3/s_mnist_quant_aware_training.tflite'
quantize_model = tfmot.quantization.keras.quantize_model
q_aware_model = quantize_model(model)


Additional context
I've already checked the same compound conditional using the imported model just before calling quantize_model(), and it behaves as it should. Only when quantize_model() is actually handling the model does it seem to think the model isn't a tf.keras.Sequential object.
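A minimal stand-in (no TensorFlow required) for what is likely happening here: under Keras 3, load_model() returns a keras.Sequential from the new keras package, while tfmot's guard compares against the legacy tf.keras (Keras 2) Sequential class, so the isinstance check inside quantize_model() fails even though the model really is "Sequential". The class names below are illustrative stand-ins, not the real TensorFlow classes:

```python
# Stand-ins for the two distinct Sequential classes (illustrative only):
class LegacySequential:        # plays the role of tf.keras.Sequential (Keras 2)
    pass

class Keras3Sequential:        # plays the role of keras.Sequential (Keras 3)
    pass

model = Keras3Sequential()

# tfmot's guard is roughly an isinstance check against the legacy classes,
# so a Keras 3 model fails it even though both classes are named "Sequential":
is_detected = isinstance(model, LegacySequential)
print(is_detected)  # False -> "to_quantize can only either be a keras Sequential..."
```

This would also explain why a hand-written isinstance check in the notebook can pass (it tests against the same Keras 3 class the model came from) while tfmot's internal check does not.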

@DKMaCS DKMaCS added the bug Something isn't working label Oct 8, 2024

DKMaCS commented Oct 8, 2024

q1_model.keras was created beforehand from a Sequential model with common layers, saved using the save_model() function from tensorflow.keras.models.

pedrofrodenas commented

The problem is that you are using Keras 3. To avoid it, install the tf_keras package that matches your TensorFlow version, set os.environ["TF_USE_LEGACY_KERAS"] = "1" before importing TensorFlow, and use import tf_keras as keras.
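A sketch of that setup, assuming TensorFlow 2.17 as reported above and tf_keras installed (pip install tf_keras); the key point is that the environment variable must be set before TensorFlow is imported anywhere in the process:

```python
import os

# Must be set BEFORE tensorflow (or tfmot) is imported anywhere in the process,
# otherwise tf.keras has already bound to Keras 3:
os.environ["TF_USE_LEGACY_KERAS"] = "1"

# With tf_keras installed, tf.keras now resolves to the legacy Keras 2
# implementation that tfmot expects (left commented as it needs TF installed):
# import tensorflow as tf
# import tensorflow_model_optimization as tfmot
# model = tf.keras.models.load_model("q1_model.keras")
# q_aware_model = tfmot.quantization.keras.quantize_model(model)
```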


MEHDI342 commented Oct 25, 2024

Instead of importing the layers and Sequential this way:

from keras.layers import LSTM, Dense
from keras.models import Sequential

do it this way:

from tensorflow.keras.layers import LSTM, Dense

and for Sequential you can just do:

import keras
from keras import layers
from keras import ops

I think the Sequential import is just a habit from GPT and Claude; the layers and ops packages are sufficient.
