vishal sharma (vishal2612200)
vishal2612200 / export_inference_graph_unfrozen.py
Created June 19, 2019 06:36 — forked from dnlglsn/export_inference_graph_unfrozen.py
Export a checkpointed object_detection model for serving with TensorFlow Serving
"""
References:
https://github.com/tensorflow/models/blob/master/object_detection/g3doc/exporting_models.md
https://github.com/tensorflow/models/issues/1988
Unfortunately, the tutorial for saving a model for inference "freezes" the
variables in place, making them unservable by tensorflow_serving.
export_inference_graph.py exports an empty "variables" directory, which needs to
be populated.