@uschmidt83
Last active September 29, 2023 15:43

This note shows how to create a new Python environment containing TensorFlow 1.x, load a model based on CSBDeep, and then export it via export_TF(). Please see this for an explanation of why you might want to do this.

As an example, we load a StarDist model and export it from the new environment, but the same should also work for all other CSBDeep model types.

Assuming that you use conda to manage your environments (if not, e.g. read this), do the following:

  1. Create a new Python environment and install TensorFlow 1.x, CSBDeep and other necessary packages (e.g. StarDist):
conda create -y --name tf1_model_export python=3.7
conda activate tf1_model_export
# note: gpu support is not necessary for tensorflow
pip install "tensorflow<2"
pip install "csbdeep[tf1]"
# also install stardist in this example
pip install "stardist[tf1]"
  2. Make a new Python script to export the model:
from stardist.models import StarDist2D
# load the trained model from the folder 'my_model' in the current directory
model = StarDist2D(None, name='my_model', basedir='.')
# export the model as a TensorFlow SavedModel
model.export_TF()
  3. Run the script to export the model (see the example run below).
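For reference, here is a minimal sketch of such a run, assuming the script from step 2 was saved as export_model.py (a hypothetical file name) in the directory that contains the my_model folder:
conda activate tf1_model_export
# run the export script; export_TF() should write the exported model into the model folder
python export_model.py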
@dpoburko

Hello. Sometime in January 2023, this workaround worked without issue. I have come back to it to convert a new StarDist model from TensorFlow 2 to TensorFlow 1.14, and I am running into a series of errors. I am not sure if maybe the version of Keras is incompatible. If you have any suggestions for a starting place to troubleshoot, I would greatly appreciate it. I am running Python from an Anaconda terminal with the environment created as noted above, specifying TF version 1.14.

The error message is:

Using TensorFlow backend.
WARNING:tensorflow:From C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\keras\backend\tensorflow_backend.py:4070: The name tf.nn.max_pool is deprecated. Please use tf.nn.max_pool2d instead.

Loading network weights from 'weights_best.h5'.
Traceback (most recent call last):
  File "tf2totf1.py", line 3, in <module>
    model = StarDist2D(None, name='rebuildingPhagocytes', basedir='G:\\My Drive\\Colab Notebooks\\models')
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\stardist\models\model2d.py", line 292, in __init__
    super().__init__(config, name=name, basedir=basedir)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\stardist\models\base.py", line 223, in __init__
    super().__init__(config=config, name=name, basedir=basedir)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\csbdeep\models\base_model.py", line 113, in __init__
    self._find_and_load_weights()
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\csbdeep\models\base_model.py", line 32, in wrapper
    return f(*args, **kwargs)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\csbdeep\models\base_model.py", line 168, in _find_and_load_weights
    self.load_weights(weights_chosen.name)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\csbdeep\models\base_model.py", line 32, in wrapper
    return f(*args, **kwargs)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\csbdeep\models\base_model.py", line 185, in load_weights
    self.keras_model.load_weights(str(self.logdir/name))
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\keras\engine\saving.py", line 492, in load_wrapper
    return load_function(*args, **kwargs)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\keras\engine\network.py", line 1230, in load_weights
    f, self.layers, reshape=reshape)
  File "C:\Users\abc123\anaconda3\envs\tf1_model_export3\lib\site-packages\keras\engine\saving.py", line 1183, in load_weights_from_hdf5_group
    original_keras_version = f.attrs['keras_version'].decode('utf8')
AttributeError: 'str' object has no attribute 'decode'

The installed package list is below for reference:

(tf1_model_export3) C:\Users\abc123>pip list
Package              Version
-------------------- ---------
absl-py              1.4.0
astor                0.8.1
certifi              2022.12.7
colorama             0.4.6
csbdeep              0.7.3
cycler               0.11.0
fonttools            4.38.0
gast                 0.5.4
google-pasta         0.2.0
grpcio               1.54.2
h5py                 3.8.0
imageio              2.31.0
importlib-metadata   6.6.0
Keras                2.3.1
Keras-Applications   1.0.8
Keras-Preprocessing  1.1.2
kiwisolver           1.4.4
llvmlite             0.39.1
Markdown             3.4.3
MarkupSafe           2.1.3
matplotlib           3.5.3
networkx             2.6.3
numba                0.56.4
numpy                1.21.6
packaging            23.1
Pillow               9.5.0
pip                  22.3.1
protobuf             3.20.3
pyparsing            3.0.9
python-dateutil      2.8.2
PyWavelets           1.3.0
PyYAML               6.0
scikit-image         0.19.3
scipy                1.7.3
setuptools           65.6.3
six                  1.16.0
stardist             0.8.3
tensorboard          1.14.0
tensorflow           1.14.0
tensorflow-estimator 1.14.0
termcolor            2.3.0
tifffile             2021.11.2
tqdm                 4.65.0
typing_extensions    4.6.3
Werkzeug             2.2.3
wheel                0.38.4
wincertstore         0.2
wrapt                1.15.0
zipp                 3.15.0

@uschmidt83
Author

Hi @dpoburko, it seems that downgrading h5py will solve the issue (cf. stardist/stardist#236).
The correct version of h5py should've been automatically installed when you do pip install "csbdeep[tf1]" or pip install "stardist[tf1]".
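In the meantime, a minimal sketch of the downgrade in the affected environment (the exact version pin is an assumption based on the linked issue):
conda activate tf1_model_export3
# Keras 2.3.1 still reads HDF5 weight files with the h5py 2.x API, so pin h5py below version 3
pip install "h5py<3"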

@uschmidt83
Author

The correct version of h5py should've been automatically installed when you do pip install "csbdeep[tf1]" or pip install "stardist[tf1]".

Sorry, just seeing now that this has changed in our code, but we haven't released it yet.
