The saved_model_cli can't handle string input with the --input_examples flag (see tensorflow/tensorflow#27662):
saved_model_cli run \
--dir . \
--tag_set serve \
--signature_def predict \
--input_examples 'examples=[{"menu_item":["this is a sentence"]}]'
  File ".../saved_model_cli.py", line 489, in _create_example_string
    feature_list)
TypeError: 'this is a sentence' has type str, but expected one of: bytes
Here is a hack that builds the serialized Example protos manually (what --input_examples SHOULD be doing):
import tensorflow as tf

def serialize_example_string(strings):
    serialized_examples = []
    for s in strings:
        # Feature values must be bytes, so encode each string first.
        try:
            value = [bytes(s, "utf-8")]
        except TypeError:  # python 2
            value = [bytes(s)]

        example = tf.train.Example(
            features=tf.train.Features(
                feature={
                    "menu_item": tf.train.Feature(bytes_list=tf.train.BytesList(value=value))
                }
            )
        )
        serialized_examples.append(example.SerializeToString())

    # repr() of a bytes object uses single quotes; swap them for double quotes
    # so the result can be pasted straight into the CLI (assumes the input
    # strings themselves contain no quote characters).
    return "examples=" + repr(serialized_examples).replace("'", "\"")

strings = ["kouign amann", "pot a feu", "truffles"]
print(serialize_example_string(strings))
This prints a list of serialized Example protos:
examples=[b"\n\x1f\n\x1d\n\tmenu_item\x12\x10\n\x0e\n\x0ckouign amann", b"\n\x1c\n\x1a\n\tmenu_item\x12\r\n\x0b\n\tpot a feu", b"\n\x1b\n\x19\n\tmenu_item\x12\x0c\n\n\n\x08truffles"]
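If you want to sanity-check one of these byte strings, you can parse it back into an Example; a quick sketch using the first proto from the output above ("menu_item" is the feature name used in the code):

import tensorflow as tf

# Parse the serialized proto back into an Example and read the feature out.
example = tf.train.Example.FromString(
    b"\n\x1f\n\x1d\n\tmenu_item\x12\x10\n\x0e\n\x0ckouign amann"
)
print(example.features.feature["menu_item"].bytes_list.value)  # [b'kouign amann']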
Since the protos were built by hand, you can pass them with the --input_exprs flag instead:
saved_model_cli run \
--dir . \
--tag_set serve \
--signature_def predict \
--input_exprs='examples=[b"\n\x1f\n\x1d\n\tmenu_item\x12\x10\n\x0e\n\x0ckouign amann", b"\n\x1c\n\x1a\n\tmenu_item\x12\r\n\x0b\n\tpot a feu", b"\n\x1b\n\x19\n\tmenu_item\x12\x0c\n\n\n\x08truffles"]'
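Alternatively, you can skip the CLI altogether and call the signature from Python. A rough sketch, assuming TF 2.x, that the SavedModel lives in the current directory, and that the "predict" signature takes a batch of serialized Examples under the input key "examples" (the same key used on the command line above):

import tensorflow as tf

def make_example(s):
    # Same proto construction as serialize_example_string above.
    return tf.train.Example(
        features=tf.train.Features(
            feature={
                "menu_item": tf.train.Feature(
                    bytes_list=tf.train.BytesList(value=[s.encode("utf-8")])
                )
            }
        )
    ).SerializeToString()

# Load the SavedModel and call the "predict" signature directly.
loaded = tf.saved_model.load(".")
predict_fn = loaded.signatures["predict"]

serialized = [make_example(s) for s in ["kouign amann", "pot a feu", "truffles"]]
outputs = predict_fn(examples=tf.constant(serialized))
print(outputs)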