TFLite Converter, Easy Converter?

It looks easy when you read the tutorial about converting a model: just two lines of code.
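Those two lines are the converter call and the conversion itself. A minimal sketch, where the tiny `Doubler` module is just a made-up stand-in so the snippet is self-contained (the tutorial assumes you already have a SavedModel):

```python
import tempfile
import tensorflow as tf

# A tiny stand-in model so the example is self-contained.
class Doubler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([1, 4], tf.float32)])
    def __call__(self, x):
        return x * 2.0

saved_model_dir = tempfile.mkdtemp()
tf.saved_model.save(Doubler(), saved_model_dir)

# The two lines from the tutorial:
converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
```

The result is the serialized TFLite flatbuffer as bytes, which you write to a `.tflite` file.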


Yes, it works, but only under certain conditions.

In that article, we learned that the TFLite converter doesn’t support string and float16, at least not yet.

There are some tutorials about text classification that use string as the input type at the input layer, for example this tutorial provided by TensorFlow. For now, you cannot convert the model from that tutorial to TFLite.

But you can still do text classification by encoding the text strings into floats or integers.
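A minimal sketch of that encoding idea, using a plain word-to-id vocabulary (the helper names here are hypothetical, not from any library):

```python
def build_vocab(texts):
    # Reserve 0 for padding and 1 for out-of-vocabulary words.
    vocab = {"<pad>": 0, "<unk>": 1}
    for text in texts:
        for word in text.lower().split():
            vocab.setdefault(word, len(vocab))
    return vocab

def encode(text, vocab, max_len=8):
    # Map each word to its integer id, then pad/truncate to max_len.
    ids = [vocab.get(w, vocab["<unk>"]) for w in text.lower().split()]
    return (ids + [vocab["<pad>"]] * max_len)[:max_len]

vocab = build_vocab(["good movie", "bad movie"])
print(encode("good film", vocab))  # [2, 1, 0, 0, 0, 0, 0, 0] — "film" is unknown
```

Once every input sentence is a fixed-length integer vector like this, the model's input layer is numeric and the converter has nothing to complain about.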

If you don’t want to get your hands dirty encoding the text input into floats or integers so the converter can handle the model, you can create a TFLite model for text classification with TensorFlow Lite Model Maker.

TensorFlow provides several Model Makers you can use: Image Classification, Text Classification, and Question Answer.

Here is the snippet for full integer quantization:

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset_gen():
    # Get sample input data as numpy arrays in a method of your choosing.
    for input_value in representative_data:
        yield [input_value]

converter.representative_dataset = representative_dataset_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8  # or tf.uint8
converter.inference_output_type = tf.int8  # or tf.uint8
tflite_quant_model = converter.convert()
  • This results in a smaller model and increased inference speed, which is valuable for low-power devices such as microcontrollers. This data format is also required by integer-only accelerators such as the Edge TPU.
  • This converts the input and output types from float32 to uint8.
  • You cannot convert float64 inputs to uint8.
  • Firebase ML uses uint8 to produce TFLite models.
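The float32-to-uint8 conversion in the points above is affine quantization. A minimal sketch of the arithmetic, with made-up example values for scale and zero_point (in a real model the converter derives them from the representative dataset):

```python
def quantize(x, scale, zero_point):
    # q = round(x / scale) + zero_point, clamped to the uint8 range.
    q = round(x / scale) + zero_point
    return max(0, min(255, q))

def dequantize(q, scale, zero_point):
    # Recover the approximate float value from its uint8 code.
    return (q - zero_point) * scale

scale, zero_point = 0.25, 128  # example parameters, not from a real model
q = quantize(1.0, scale, zero_point)   # 132
x = dequantize(q, scale, zero_point)   # 1.0, recovered within scale precision
```

Each float is stored as a single byte, which is where the 4x size reduction and the integer-only execution come from.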

Still believe we can change the world with code..