I am trying to serve a retrained Inception model using the TensorFlow Serving tools configured for the Inception model. I followed the tutorial here. I was able to retrain the model, and I added the following code at the end of retrain.py to export it:
from tensorflow_serving.session_bundle import exporter  # exporter used below

export_path = "/tmp/export"
export_version = 1
# init_op runs the table initializers when the server restores the bundle
init_op = tf.group(tf.initialize_all_tables(), name='init_op')
saver = tf.train.Saver(sharded=True)
model_exporter = exporter.Exporter(saver)
signature = exporter.classification_signature(input_tensor=jpeg_data_tensor, scores_tensor=final_tensor)
model_exporter.init(sess.graph.as_graph_def(), init_op=init_op, default_graph_signature=signature)
model_exporter.export(export_path, tf.constant(export_version), sess)
print('Successfully exported model to %s' % export_path)
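For reference, the exporter writes each model version into a numbered subdirectory of the export path, which is why the server log below mentions a path like /tmp/exportdir/00000001. A minimal sketch of how that version directory name is derived (the 8-digit zero padding is an assumption based on the log output, not something I verified in the exporter source):

```python
import os


def version_export_dir(export_path, export_version):
    """Return the per-version subdirectory the exporter writes to.

    The server watches export_path and loads the highest-numbered
    version subdirectory (zero-padded to 8 digits in the logs).
    """
    return os.path.join(export_path, '%08d' % int(export_version))


print(version_export_dir('/tmp/export', 1))  # -> /tmp/export/00000001
```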
Currently, I only have 4 classes. I created the model with plain TensorFlow (not TensorFlow Serving) and was able to check that it works on test images. Now I'm trying to serve it. I started the server on the model using the following command:
bazel-bin/tensorflow_serving/example/inception_inference --port=9000 /tmp/export/ &> retrain.log &
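Before starting the server, it can help to sanity-check that the export step actually produced the files the SessionBundle loader looks for. The file names below (an export.meta graph file plus sharded export-?????-of-????? checkpoint files) are what I believe the session_bundle Exporter writes with a sharded Saver; treat the layout as an assumption:

```python
import glob
import os


def check_session_bundle(version_dir):
    """Report whether a version directory looks like a SessionBundle export.

    Assumes the session_bundle layout: a graph file 'export.meta' plus
    sharded checkpoint files named like 'export-00000-of-00001'.
    """
    meta = os.path.join(version_dir, 'export.meta')
    shards = glob.glob(os.path.join(version_dir, 'export-?????-of-?????'))
    return os.path.exists(meta), sorted(shards)


has_meta, shards = check_session_bundle('/tmp/export/00000001')
print('export.meta present:', has_meta)
print('checkpoint shards:', shards)
```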
I get the following output: it continuously prints the last two lines.
I tensorflow_serving/core/basic_manager.cc:189] Using InlineExecutor for BasicManager.
I tensorflow_serving/example/inception_inference.cc:383] Waiting for models to be loaded...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] Aspiring version for servable default from path: /tmp/exportdir/00000001
I tensorflow_serving/session_bundle/session_bundle.cc:130] Attempting to load a SessionBundle from: /tmp/exportdir/00000001
I tensorflow_serving/session_bundle/session_bundle.cc:107] Running restore op for SessionBundle
I tensorflow_serving/session_bundle/session_bundle.cc:184] Done loading SessionBundle
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] Aspiring version for servable default from path: /tmp/exportdir/00000001
I tensorflow_serving/example/inception_inference.cc:349] Running...
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] Aspiring version for servable default from path: /tmp/exportdir/00000001
I tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:147] Aspiring version for servable default from path: /tmp/exportdir/00000001
Then, when I run the Python client, I get the following error:
vagrant@ubuntu:~/serving$ bazel-bin/tensorflow_serving/example/inception_client
Traceback (most recent call last):
File "/home/vagrant/serving/bazel bin/tensorflow_serving/example/inception_client.runfiles/tf_serving/tensorflow_serving/example/inception_client.py", line 57, in <module>
tf.app.run()
File "/home/vagrant/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tf/tensorflow/python/platform/app.py", line 30, in run
sys.exit(main(sys.argv))
File "/home/vagrant/serving/bazel-bin/tensorflow_serving/example/inception_client.runfiles/tf_serving/tensorflow_serving/example/inception_client.py", line 51, in main
result = stub.Classify(request, 5.0)
File "/usr/local/lib/python2.7/dist-packages/grpc/framework/crust/implementations.py", line 75, in __call__
protocol_options, metadata, request)
File "/usr/local/lib/python2.7/dist-packages/grpc/framework/crust/_calls.py", line 109, in blocking_unary_unary
return next(rendezvous)
File "/usr/local/lib/python2.7/dist-packages/grpc/framework/crust/_control.py", line 415, in next
raise self._termination.abortion_error
grpc.framework.interfaces.face.face.NetworkError: NetworkError(code=StatusCode.INTERNAL, details="FetchOutputs node : not found")
What am I doing wrong? I'm new to TensorFlow.