Hello,
I am struggling with exporting a TensorFlow model in order to serve it with TensorFlow Serving.
After training the model with “Keras Network Learner”, the model is converted with “Keras to TensorFlow Network Converter” and finally written to disk using “TensorFlow Network Writer”.
The chosen name in this case is “model-tf.zip”. Unzipping the archive “model-tf.zip” yields a folder named “2d667448-fc0f-446b-b20e-d3bef4214ed3”, which contains the file “saved_model.pb” as well as the folder “variables”.
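As an aside, TensorFlow Serving expects each model under a numeric version directory (e.g. model-tf/1/saved_model.pb). A minimal Python sketch (the function name and paths are my own, not from KNIME) that copies the unzipped export into that layout:

```python
import shutil
from pathlib import Path

def arrange_for_serving(unzipped_dir: str, serving_root: str, version: int = 1) -> Path:
    """Copy an unzipped SavedModel folder (saved_model.pb + variables/) into
    the <model>/<version>/ layout that TensorFlow Serving expects."""
    src = Path(unzipped_dir)          # e.g. the UUID-named folder from the zip
    dst = Path(serving_root) / str(version)
    dst.mkdir(parents=True, exist_ok=True)
    for item in src.iterdir():        # saved_model.pb and the variables/ folder
        if item.is_dir():
            shutil.copytree(item, dst / item.name, dirs_exist_ok=True)
        else:
            shutil.copy2(item, dst / item.name)
    return dst
```

For example, `arrange_for_serving("2d667448-fc0f-446b-b20e-d3bef4214ed3", "model-tf")` produces `model-tf/1/saved_model.pb`.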
For application or deployment of the model, the usual way is TensorFlow's serving option via Docker: https://github.com/tensorflow/serving
After installing and testing Docker, the TensorFlow documentation describes loading a previously trained model with the following command (on an Ubuntu machine in my case):
docker run -t --rm -p 8501:8501 \
-v "$TESTDATA/saved_model_half_plus_two_cpu:/models/half_plus_two" \
-e MODEL_NAME=half_plus_two \
tensorflow/serving &
Adapted to the name of my model, this becomes:
sudo docker run -t --rm -p 8501:8501 \
-v "$TESTDATA/model-tf:/models/model-tf" \
-e MODEL_NAME=model-tf \
tensorflow/serving &
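For context: once the container is up, predictions go to TensorFlow Serving's REST API on port 8501 (`POST /v1/models/<name>:predict`). A small sketch of building such a request (the helper name and example instances are mine):

```python
import json

def predict_request(model_name: str, instances, host: str = "localhost", port: int = 8501):
    """Build the URL and JSON body for TensorFlow Serving's REST predict endpoint."""
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances})
    return url, body

# Send the body to the URL with any HTTP client, e.g. requests.post(url, data=body)
url, body = predict_request("model-tf", [[1.0, 2.0, 3.0]])
```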
Executing the command in the terminal throws the following errors:
sudo docker run -t --rm -p 8500:8500 -v "$TESTDATA/model-tf:/models/model-tf" -e MODEL_NAME=model-tf tensorflow/serving
2020-01-21 08:40:25.042162: I tensorflow_serving/model_servers/server.cc:86] Building single TensorFlow model file config: model_name: model-tf model_base_path: /models/model-tf
2020-01-21 08:40:25.042310: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
2020-01-21 08:40:25.042339: I tensorflow_serving/model_servers/server_core.cc:573] (Re-)adding model: model-tf
2020-01-21 08:40:25.142767: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: model-tf version: 1}
2020-01-21 08:40:25.142809: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: model-tf version: 1}
2020-01-21 08:40:25.142819: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: model-tf version: 1}
2020-01-21 08:40:25.142835: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /models/model-tf/1
2020-01-21 08:40:25.144402: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
2020-01-21 08:40:25.144847: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:333] SavedModel load for tags { serve }; Status: fail: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`. Took 2008 microseconds.
2020-01-21 08:40:25.144912: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: model-tf version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: `saved_model_cli`
Analyzing the model with the recommended “saved_model_cli” gives the following result:
saved_model_cli show --dir /home/mat/serving/tensorflow_serving/servables/tensorflow/testdata/model-tf/1
The given SavedModel contains the following tag-sets:
knime
I suspect the model cannot be loaded for deployment/testing because the export sets the tag-set to “knime”, while TensorFlow Serving looks for the “serve” tag.
How can one change the tag-set in KNIME, or do I have to change something on the TensorFlow side?
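If the tag cannot be changed on the KNIME side, one workaround might be to re-export the SavedModel under the “serve” tag using the TF 1.x compat API. This is only a sketch (the function name is mine, and I have not verified it against an actual KNIME export):

```python
import tensorflow as tf

def retag_saved_model(src_dir: str, dst_dir: str,
                      src_tag: str = "knime", dst_tag: str = "serve") -> None:
    """Load a SavedModel under its original tag and re-export it to dst_dir
    under a new tag, keeping the signature definitions intact."""
    with tf.compat.v1.Session(graph=tf.Graph()) as sess:
        # Restores graph and variables into the session under the source tag
        meta_graph = tf.compat.v1.saved_model.load(sess, [src_tag], src_dir)
        builder = tf.compat.v1.saved_model.Builder(dst_dir)
        builder.add_meta_graph_and_variables(
            sess, [dst_tag],
            signature_def_map=dict(meta_graph.signature_def))
        builder.save()
```

Afterwards, `saved_model_cli show --dir <dst_dir>` should report the “serve” tag-set, so TensorFlow Serving can pick the model up.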
Any help appreciated.
Best regards