# TensorFlow optimize_for_inference: Not able to restore graph properly after optimize_for_inference script

I am facing problems trying to port a TensorFlow model to Android. I am able to freeze the graph to a .pb file, and the frozen graph (frozen_graph.pb) gives correct predictions when I evaluate it (using tf.import_graph_def). The problem starts after running the optimize_for_inference script on the frozen graph: the optimized graph (optimized_graph.pb) gives me the following error:
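For context, the evaluation step that works on frozen_graph.pb follows the usual import_graph_def pattern. A minimal sketch of that step (the toy graph and tensor names below are illustrative stand-ins, not my actual model):

```python
import numpy as np
import tensorflow as tf

tf1 = tf.compat.v1  # TF1-style graph APIs

# Stand-in for the frozen model: a tiny graph with named input/output
g = tf1.Graph()
with g.as_default():
    x = tf1.placeholder(tf.float32, shape=[None, 3], name="input")
    w = tf1.constant(np.ones((3, 2), dtype=np.float32))
    tf1.matmul(x, w, name="output")
graph_def = g.as_graph_def()  # in practice: parse frozen_graph.pb into a GraphDef

# Re-import the GraphDef into a fresh graph and run a prediction
g2 = tf1.Graph()
with g2.as_default():
    tf1.import_graph_def(graph_def, name="")
with tf1.Session(graph=g2) as sess:
    out = sess.run("output:0",
                   feed_dict={"input:0": np.ones((1, 3), np.float32)})
print(out)  # [[3. 3.]]
```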

```
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/framework/importer.py", line 365, in import_graph_def
    node, 'Input tensor %r %s' % (input_name, te)))
ValueError: graph_def is invalid at node u'fifo_queue_Dequeue': Input tensor 'inputs/prefetch_queue/fifo_queue:0' Cannot convert a tensor of type float32 to an input of type string_ref.
```
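This failure can be reproduced on a toy graph by doing to a GraphDef what optimize_for_inference apparently did to mine: rewriting the queue node into a float Placeholder while leaving the dequeue op untouched. A sketch (in current TF the queue handle is a resource rather than a string_ref, but the failure mode is the same):

```python
import tensorflow as tf

tf1 = tf.compat.v1

# A queue feeding a dequeue op, mirroring the input pipeline in my graph
g = tf1.Graph()
with g.as_default():
    q = tf1.FIFOQueue(2, tf.float32, name="inputs/prefetch_queue/fifo_queue")
    q.dequeue(name="fifo_queue_Dequeue")
gd = g.as_graph_def()

# Mimic the optimization: replace the queue node with a float Placeholder,
# but leave the dequeue op (and its input reference) alone
for n in gd.node:
    if n.name == "inputs/prefetch_queue/fifo_queue":
        n.op = "Placeholder"
        n.ClearField("attr")
        n.attr["dtype"].type = tf.float32.as_datatype_enum
        n.attr["shape"].shape.CopyFrom(tf.TensorShape([2]).as_proto())

# Importing the rewritten GraphDef now fails: the dequeue op expects a
# queue handle as input, but gets a plain float tensor
err = None
try:
    with tf1.Graph().as_default():
        tf1.import_graph_def(gd, name="")
except ValueError as e:
    err = e
print(err)  # "graph_def is invalid at node ... Cannot convert a tensor of type float32 ..."
```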

If I inspect the input node in the frozen graph (using the method explained here), I get this:

```
>>> [n for n in g.node if n.name.find("inputs") != -1]
[name: "inputs/prefetch_queue/fifo_queue"
op: "FIFOQueue"
attr { key: "_output_shapes" value { list { shape { dim { size: 2 } } } } }
attr { key: "capacity" value { i: 2 } }
attr { key: "component_types" value { list { type: DT_FLOAT type: DT_FLOAT } } }
attr { key: "container" value { s: "" } }
attr { key: "shapes" value { list { shape { dim { size: 128 } dim { size: 224 } dim { size: 224 } dim { size: 3 } } shape { dim { size: 128 } dim { size: 2 } } } } }
attr { key: "shared_name" value { s: "" } }
]
```

Inspecting the same input node in optimized_graph.pb gives me this:

```
>>> [n for n in g2.node if n.name.find("inputs") != -1]
[name: "inputs/prefetch_queue/fifo_queue"
op: "Placeholder"
attr { key: "_output_shapes" value { list { shape { dim { size: 2 } } } } }
attr { key: "dtype" value { type: DT_FLOAT } }
]
```
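As an aside, this node-filtering check can be reproduced on any GraphDef; a minimal sketch (the toy graph below is an illustrative stand-in whose node name merely mirrors mine):

```python
import tensorflow as tf

tf1 = tf.compat.v1

# Toy GraphDef whose node name mirrors the one being inspected
g = tf1.Graph()
with g.as_default():
    tf1.placeholder(tf.float32, shape=[2],
                    name="inputs/prefetch_queue/fifo_queue")
gd = g.as_graph_def()  # in practice: parse optimized_graph.pb

# Same comprehension as above: filter nodes by name, then check the op type
matches = [n for n in gd.node if n.name.find("inputs") != -1]
for n in matches:
    print(n.name, n.op)  # inputs/prefetch_queue/fifo_queue Placeholder
```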

I have unsuccessfully looked here for a solution. I have also tried the graph_transforms tool, as suggested on GitHub by @concretevitamin, but the issue is not solved. Any help in this regard would be greatly appreciated.