How to serve a retrained Inception model with TensorFlow Serving?

So, I retrained the Inception model to recognize flowers following this guide: https://www.tensorflow.org/versions/r0.8/how_tos/image_retraining/index.html

    bazel build tensorflow/examples/image_retraining:retrain
    bazel-bin/tensorflow/examples/image_retraining/retrain --image_dir ~/flower_photos

To classify an image through the command line, I can do this:

    bazel build tensorflow/examples/label_image:label_image && \
    bazel-bin/tensorflow/examples/label_image/label_image \
    --graph=/tmp/output_graph.pb --labels=/tmp/output_labels.txt \
    --output_layer=final_result \
    --image=$HOME/flower_photos/daisy/21652746_cc379e0eea_m.jpg

But how do I serve this graph with TensorFlow Serving?

The TensorFlow Serving basic tutorial ( https://tensorflow.imtqy.com/serving/serving_basic ) does not explain how to serve the retrained graph (output_graph.pb). The server expects a different file format:

    $ ls /tmp/mnist_model/00000001
    checkpoint  export-00000-of-00001  export.meta
3 answers

In order to serve the graph after training it, you will need to export it using this API: https://www.tensorflow.org/versions/r0.8/api_docs/python/train.html#export_meta_graph

That API generates a MetaGraphDef, which is what the serving code needs (it produces the .meta file you are asking about).

In addition, you need to save a checkpoint using Saver.save(), that is, the Saver class: https://www.tensorflow.org/versions/r0.8/api_docs/python/train.html#Saver

Once you have done both, you will have the MetaGraphDef and the checkpoint files needed to restore the graph.
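A minimal sketch of that flow, assuming `sess` is the session that holds your retrained graph (the export paths below are placeholders):

    import tensorflow as tf

    # Assumes the retrained graph is built in the default graph and `sess`
    # holds the trained variable values.
    saver = tf.train.Saver()

    # Writes the checkpoint files that serving restores the variables from.
    saver.save(sess, '/tmp/flower_export/export')

    # Writes the MetaGraphDef, i.e. the .meta file the serving code needs.
    tf.train.export_meta_graph(filename='/tmp/flower_export/export.meta')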


Take a look at this example of how to load the .pb output graph into a session:

https://github.com/eldor4do/Tensorflow-Examples/blob/master/retraining-example.py
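In essence, the script parses the frozen GraphDef and imports it into a session, roughly like this (the test image path is a placeholder):

    import tensorflow as tf

    # Parse the retrained frozen graph produced by retrain.py.
    with tf.gfile.FastGFile('/tmp/output_graph.pb', 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        tf.import_graph_def(graph_def, name='')

    with tf.Session() as sess:
        # Feed raw JPEG bytes to the decode node and read the retrained head.
        image_data = tf.gfile.FastGFile('/tmp/daisy.jpg', 'rb').read()
        softmax_tensor = sess.graph.get_tensor_by_name('final_result:0')
        predictions = sess.run(softmax_tensor,
                               {'DecodeJpeg/contents:0': image_data})
        print(predictions)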


You need to export the model. I have a PR that exports the model during retraining. The gist of it is below:

    import tensorflow as tf

    def export_model(sess, architecture, saved_model_dir):
      # The name of the input tensor depends on the base architecture.
      if architecture == 'inception_v3':
        input_tensor = 'DecodeJpeg/contents:0'
      elif architecture.startswith('mobilenet_'):
        input_tensor = 'input:0'
      else:
        raise ValueError('Unknown architecture', architecture)
      in_image = sess.graph.get_tensor_by_name(input_tensor)
      inputs = {'image': tf.saved_model.utils.build_tensor_info(in_image)}

      # 'final_result:0' is the retrained classification head.
      out_classes = sess.graph.get_tensor_by_name('final_result:0')
      outputs = {
          'prediction': tf.saved_model.utils.build_tensor_info(out_classes)
      }

      # Build a prediction signature so clients know the input/output names.
      signature = tf.saved_model.signature_def_utils.build_signature_def(
          inputs=inputs,
          outputs=outputs,
          method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

      legacy_init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')

      # Save out the SavedModel.
      builder = tf.saved_model.builder.SavedModelBuilder(saved_model_dir)
      builder.add_meta_graph_and_variables(
          sess, [tf.saved_model.tag_constants.SERVING],
          signature_def_map={
              tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                  signature
          },
          legacy_init_op=legacy_init_op)
      builder.save()
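For example, called at the end of retraining (the architecture string and the versioned output path here are illustrative):

    # Hypothetical call site; `sess` is the training session and the `1`
    # subdirectory is the version number TensorFlow Serving expects.
    export_model(sess, 'inception_v3', '/path/to/saved_models/1')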

Running the export creates a variables directory and a saved_model.pb file. If you place them inside a parent directory named after the version number (e.g. 1/), you can then start the model server with:

    tensorflow_model_server --port=9000 --model_name=inception \
      --model_base_path=/path/to/saved_models/
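You can then query the server over gRPC. A minimal client sketch, assuming the tensorflow-serving-api Python package is installed (the image path is a placeholder):

    import tensorflow as tf
    from grpc.beta import implementations
    from tensorflow_serving.apis import predict_pb2
    from tensorflow_serving.apis import prediction_service_pb2

    channel = implementations.insecure_channel('localhost', 9000)
    stub = prediction_service_pb2.beta_create_PredictionService_stub(channel)

    # Build a request against the 'image' -> 'prediction' signature
    # exported above; the Inception input is a scalar string of JPEG bytes.
    request = predict_pb2.PredictRequest()
    request.model_spec.name = 'inception'
    with open('/tmp/daisy.jpg', 'rb') as f:
        image_bytes = f.read()
    request.inputs['image'].CopyFrom(
        tf.contrib.util.make_tensor_proto(image_bytes, shape=[]))

    result = stub.Predict(request, 10.0)  # 10-second timeout
    print(result.outputs['prediction'])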
