Loading a model returns either an untrained model or broken model #197

Closed
Chriisbrown opened this issue Oct 14, 2023 · 1 comment
Comments

@Chriisbrown
When I save a trained model and reload it in a separate Python file using
model.save(filepath, save_format='tf') followed by model = tf.keras.models.load_model(filepath),

I see that the model type has changed: before saving I had a
<tensorflow_decision_forests.keras.GradientBoostedTreesModel object at ... >

after loading I have a

<keras.src.saving.legacy.saved_model.load.GradientBoostedTreesModel object at... >

The loaded model evaluates correctly, but if I call model.make_inspector() on it I get this error:
AttributeError: 'GradientBoostedTreesModel' object has no attribute 'make_inspector'
This command runs normally on the model that was trained and saved.

My alternative approach was to save just the model weights with model.save_weights(filepath), load them with model.load_weights(filepath), and then call model.compile(metrics=["binary_crossentropy"]).

However, when evaluated, this model returns all zeros, and calling model.make_inspector() raises the following error:

Traceback (most recent call last):
  File "/home/cb719/Documents/L1Trigger/Tracker/TrackQuality/TrackQuality_package/test.py", line 43, in <module>
    NoDegredation.model.make_inspector()
  File "/home/cb719/miniconda3/envs/tq/lib/python3.11/site-packages/tensorflow_decision_forests/keras/core_inference.py", line 411, in make_inspector
    path = self.yggdrasil_model_path_tensor().numpy().decode("utf-8")
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/cb719/miniconda3/envs/tq/lib/python3.11/site-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/tmp/__autograph_generated_filegdj2nxzc.py", line 39, in tf__yggdrasil_model_path_tensor
    ag__.if_stmt(ag__.ld(multitask_model_index) >= ag__.converted_call(ag__.ld(len), (ag__.ld(self)._models,), None, fscope), if_body, else_body, get_state, set_state, (), 0)
                                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: in user code:

    File "/home/cb719/miniconda3/envs/tq/lib/python3.11/site-packages/tensorflow_decision_forests/keras/core_inference.py", line 436, in yggdrasil_model_path_tensor  *
        if multitask_model_index >= len(self._models):

    TypeError: object of type 'NoneType' has no len()

I see from #169 that pickling a model is not supported, though if it were, maybe that would also fix my issue.

Any advice on best practices for saving and loading models would be welcome; I would like to avoid retraining a model every time I want to inspect it.

Tensorflow decision forests version: 1.6.0
Python version: Python 3.11.6 | packaged by conda-forge | (main, Oct 3 2023, 10:40:35) [GCC 12.3.0] on linux

@rstz
Collaborator

rstz commented Oct 17, 2023

Hi,

This is a known issue, but there is a workaround for most cases.

If your model does not use Keras-specific features (e.g. neural-network preprocessing or model composition), you can create the inspector directly from the saved model on disk:

import os
import tensorflow_decision_forests as tfdf

# Let's save the model first:
path_to_model = "/tmp/mymodel"
model.save(path_to_model)
# Now create an inspector from the raw YDF model stored in the
# SavedModel's assets directory
insp = tfdf.inspector.make_inspector(os.path.join(path_to_model, "assets"))

@rstz rstz closed this as completed Oct 21, 2023