• 【tensorflow-v2.0】How to convert a model into a TFLite model


    Preface

    TensorFlow Lite provides all the tools needed to convert TensorFlow models and to run them on mobile, embedded, and IoT devices. To deploy a TensorFlow model on such devices, it first has to be converted into a TFLite model.

    Implementation

    1. The converter entry points differ slightly depending on the model format

    # Converting a SavedModel to a TensorFlow Lite model.
    converter = lite.TFLiteConverter.from_saved_model(saved_model_dir)
    tflite_model = converter.convert()
    
    # Converting a tf.Keras model to a TensorFlow Lite model.
    converter = lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()
    
    # Converting ConcreteFunctions to a TensorFlow Lite model.
    converter = lite.TFLiteConverter.from_concrete_functions([func])
    tflite_model = converter.convert()
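
    For the tf.Keras path, here is a minimal end-to-end sketch (assuming TF 2.x; MobileNetV2 is only a stand-in for your own tf.keras model and is not part of the original post):

    import tensorflow as tf

    # Any tf.keras model works here; MobileNetV2 is just an illustration.
    model = tf.keras.applications.MobileNetV2(weights=None)

    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    tflite_model = converter.convert()

    with open('keras_model.tflite', 'wb') as f:
        f.write(tflite_model)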

    2. A complete implementation

    import tensorflow as tf
    saved_model_dir = './mobilenet/'
    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    # Opt in to the new (MLIR-based) converter instead of the legacy TOCO converter.
    converter.experimental_new_converter = True
    tflite_model = converter.convert()
    open('model_tflite.tflite', 'wb').write(tflite_model)
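
    To sanity-check the converted file, it can be loaded with the TFLite interpreter and its input/output tensors inspected. A minimal sketch (the file name matches the script above):

    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path='model_tflite.tflite')
    interpreter.allocate_tensors()
    # Print the input/output tensor names, shapes and dtypes of the converted model.
    print(interpreter.get_input_details())
    print(interpreter.get_output_details())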

    where the signature of from_saved_model() is:

    @classmethod
    from_saved_model(
        cls,
        saved_model_dir,
        signature_keys=None,
        tags=None
    )
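
    For example, the signature key and tag set can be passed explicitly. A sketch using the mobilenet directory from above (the values shown here are also the defaults):

    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model(
        './mobilenet/',
        signature_keys=['serving_default'],  # default: all keys in saved_model.signatures
        tags=['serve'])                      # default: the SERVING tag set
    tflite_model = converter.convert()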

    In addition, the TensorFlow documentation notes:

    For more complex SavedModels, the optional parameters that can be passed into TFLiteConverter.from_saved_model() are input_arrays, input_shapes, output_arrays, tag_set and signature_key. Details of each parameter are available by running help(tf.lite.TFLiteConverter).

    (Note that those parameter names belong to the TF 1.x converter; the TF 2.x from_saved_model() shown above only accepts signature_keys and tags, as the help output below confirms.)

    For how to inspect which operations (ops) a model uses, see here.
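
    One way to do this programmatically (a sketch, assuming TF 2.x and the SavedModel directories used in this post) is to walk the graph of the serving signature:

    import tensorflow as tf

    model = tf.saved_model.load('./model/detector/')
    func = model.signatures['serving_default']
    # Distinct TensorFlow op types used by the serving graph.
    print(sorted({op.type for op in func.graph.get_operations()}))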

    Output of help(tf.lite.TFLiteConverter):

    Help on class TFLiteConverterV2 in module tensorflow.lite.python.lite:
    
    class TFLiteConverterV2(TFLiteConverterBase)
     |  TFLiteConverterV2(funcs, trackable_obj=None)
     |  
     |  Converts a TensorFlow model into TensorFlow Lite model.
     |  
     |  Attributes:
     |    allow_custom_ops: Boolean indicating whether to allow custom operations.
     |      When false any unknown operation is an error. When true, custom ops are
     |      created for any op that is unknown. The developer will need to provide
     |      these to the TensorFlow Lite runtime with a custom resolver.
     |      (default False)
     |    target_spec: Experimental flag, subject to change. Specification of target
     |      device.
     |    optimizations: Experimental flag, subject to change. A list of optimizations
     |      to apply when converting the model. E.g. `[Optimize.DEFAULT]
     |    representative_dataset: A representative dataset that can be used to
     |      generate input and output samples for the model. The converter can use the
     |      dataset to evaluate different optimizations.
     |    experimental_enable_mlir_converter: Experimental flag, subject to change.
     |      Enables the MLIR converter instead of the TOCO converter.
     |  
     |  Example usage:
     |  
     |    ```python
     |    # Converting a SavedModel to a TensorFlow Lite model.
     |    converter = lite.TFLiteConverter.from_saved_model(saved_model_dir)
     |    tflite_model = converter.convert()
     |  
     |    # Converting a tf.Keras model to a TensorFlow Lite model.
     |    converter = lite.TFLiteConverter.from_keras_model(model)
     |    tflite_model = converter.convert()
     |  
     |    # Converting ConcreteFunctions to a TensorFlow Lite model.
     |    converter = lite.TFLiteConverter.from_concrete_functions([func])
     |    tflite_model = converter.convert()
     |    ```
     |  
     |  Method resolution order:
     |      TFLiteConverterV2
     |      TFLiteConverterBase
     |      builtins.object
     |  
     |  Methods defined here:
     |  
     |  __init__(self, funcs, trackable_obj=None)
     |      Constructor for TFLiteConverter.
     |      
     |      Args:
     |        funcs: List of TensorFlow ConcreteFunctions. The list should not contain
     |          duplicate elements.
     |        trackable_obj: tf.AutoTrackable object associated with `funcs`. A
     |          reference to this object needs to be maintained so that Variables do not
     |          get garbage collected since functions have a weak reference to
     |          Variables. This is only required when the tf.AutoTrackable object is not
     |          maintained by the user (e.g. `from_saved_model`).
     |  
     |  convert(self)
     |      Converts a TensorFlow GraphDef based on instance variables.
     |      
     |      Returns:
     |        The converted data in serialized format.
     |      
     |      Raises:
     |        ValueError:
     |          Multiple concrete functions are specified.
     |          Input shape is not specified.
     |          Invalid quantization parameters.
     |  
     |  ----------------------------------------------------------------------
     |  Class methods defined here:
     |  
     |  from_concrete_functions(funcs) from builtins.type
     |      Creates a TFLiteConverter object from ConcreteFunctions.
     |      
     |      Args:
     |        funcs: List of TensorFlow ConcreteFunctions. The list should not contain
     |          duplicate elements.
     |      
     |      Returns:
     |        TFLiteConverter object.
     |      
     |      Raises:
     |        Invalid input type.
     |  
     |  from_keras_model(model) from builtins.type
     |      Creates a TFLiteConverter object from a Keras model.
     |      
     |      Args:
     |        model: tf.Keras.Model
     |      
     |      Returns:
     |        TFLiteConverter object.
     |  
     |  from_saved_model(saved_model_dir, signature_keys=None, tags=None) from builtins.type
     |      Creates a TFLiteConverter object from a SavedModel directory.
     |      
     |      Args:
     |        saved_model_dir: SavedModel directory to convert.
     |        signature_keys: List of keys identifying SignatureDef containing inputs
     |          and outputs. Elements should not be duplicated. By default the
     |          `signatures` attribute of the MetaGraphdef is used. (default
     |          saved_model.signatures)
     |        tags: Set of tags identifying the MetaGraphDef within the SavedModel to
     |          analyze. All tags in the tag set must be present. (default set(SERVING))
     |      
     |      Returns:
     |        TFLiteConverter object.
     |      
     |      Raises:
     |        Invalid signature keys.
     |  
     |  ----------------------------------------------------------------------
     |  Data descriptors inherited from TFLiteConverterBase:
     |  __dict__
     |      dictionary for instance variables (if defined)
     |  
     |  __weakref__
     |      list of weak references to the object (if defined)

    Problem:

    Converting the mobilenet SavedModel generated with tf_saved_model to TFLite succeeds, so why does converting another, custom-designed model fail?

    Traceback (most recent call last):
      File "pb2tflite.py", line 9, in <module>
        tflite_model = converter.convert()
      File "~/.local/lib/python3.7/site-packages/tensorflow_core/lite/python/lite.py", line 428, in convert
        "invalid shape '{1}'.".format(_get_tensor_name(tensor), shape_list))
    ValueError: None is only supported in the 1st dimension. Tensor 'images' has invalid shape '[None, None, None, None]'.
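
    The error can be confirmed by printing the input shapes of the serving signature before conversion (a small diagnostic sketch): only the batch (first) dimension may be None, but here H, W, and C are None as well.

    import tensorflow as tf

    model = tf.saved_model.load('./model/detector/')
    func = model.signatures['serving_default']
    for t in func.inputs:
        print(t.name, t.shape)  # e.g. images:0 (None, None, None, None)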

    The faceboxes model signature (saved_model_cli output):

    (tf_test) ~/workspace/test_code/github_test/faceboxes-tensorflow$ saved_model_cli show --dir model/detector/ --all
    
    MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:
    
    signature_def['__saved_model_init_op']:
      The given SavedModel SignatureDef contains the following input(s):
      The given SavedModel SignatureDef contains the following output(s):
        outputs['__saved_model_init_op'] tensor_info:
            dtype: DT_INVALID
            shape: unknown_rank
            name: NoOp
      Method name is: 
    
    signature_def['serving_default']:
      The given SavedModel SignatureDef contains the following input(s):
        inputs['images'] tensor_info:
            dtype: DT_FLOAT
            shape: (-1, -1, -1, -1)
            name: serving_default_images:0
      The given SavedModel SignatureDef contains the following output(s):
        outputs['boxes'] tensor_info:
            dtype: DT_FLOAT
            shape: (-1, 100, 4)
            name: StatefulPartitionedCall:0
        outputs['num_boxes'] tensor_info:
            dtype: DT_INT32
            shape: (-1)
            name: StatefulPartitionedCall:1
        outputs['scores'] tensor_info:
            dtype: DT_FLOAT
            shape: (-1, 100)
            name: StatefulPartitionedCall:2
      Method name is: tensorflow/serving/predict

    The mobilenet model signature:

    ~/workspace/test_code/github_test/faceboxes-tensorflow/mobilenet$ saved_model_cli show --dir ./ --all
    
    MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:
    
    signature_def['__saved_model_init_op']:
      The given SavedModel SignatureDef contains the following input(s):
      The given SavedModel SignatureDef contains the following output(s):
        outputs['__saved_model_init_op'] tensor_info:
            dtype: DT_INVALID
            shape: unknown_rank
            name: NoOp
      Method name is: 
    
    signature_def['serving_default']:
      The given SavedModel SignatureDef contains the following input(s):
        inputs['input_1'] tensor_info:
            dtype: DT_FLOAT
            shape: (-1, 224, 224, 3)
            name: serving_default_input_1:0
      The given SavedModel SignatureDef contains the following output(s):
        outputs['act_softmax'] tensor_info:
            dtype: DT_FLOAT
            shape: (-1, 1000)
            name: StatefulPartitionedCall:0
      Method name is: tensorflow/serving/predict

    A helpful hint from an expert (thanks!): TFLite uses a static graph, so the H/W/C dimensions of the input must be fixed. That raises the next question: how do we specify H, W, and C?

    import tensorflow as tf
    saved_model_dir = './model/detector/'
    model = tf.saved_model.load(saved_model_dir)
    # Take the serving signature and pin its input to a fully static shape (N, H, W, C).
    concrete_func = model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    concrete_func.inputs[0].set_shape([1, 512, 512, 3])
    converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
    # converter.experimental_new_converter = True
    tflite_model = converter.convert()
    open('model_tflite_facebox.tflite', 'wb').write(tflite_model)

    Error:

    Some of the operators in the model are not supported by the standard TensorFlow Lite runtime. If those are native TensorFlow operators, you might be able to use the extended runtime by passing --enable_select_tf_ops, or by setting target_ops=TFLITE_BUILTINS,SELECT_TF_OPS when calling tf.lite.TFLiteConverter(). Otherwise, if you have a custom implementation for them you can disable this error with --allow_custom_ops, or by setting allow_custom_ops=True when calling tf.lite.TFLiteConverter(). Here is a list of builtin operators you are using: ADD, AVERAGE_POOL_2D, CONCATENATION, CONV_2D, MAXIMUM, MINIMUM, MUL, NEG, PACK, RELU, RESHAPE, SOFTMAX, STRIDED_SLICE, SUB, UNPACK. Here is a list of operators for which you will need custom implementations: TensorListFromTensor, TensorListReserve, TensorListStack, While.

    TensorFlow Lite already ships with a large and growing set of built-in operators, but some TensorFlow operators are still not natively supported by TensorFlow Lite. These unsupported operators create friction when converting a model to TensorFlow Lite.

    import tensorflow as tf
    saved_model_dir = './model/detector/'
    model = tf.saved_model.load(saved_model_dir)
    concrete_func = model.signatures[tf.saved_model.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    concrete_func.inputs[0].set_shape([1, 512, 512, 3])
    converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_func])
    # Fall back to select TensorFlow ops for operators that have no TFLite builtin equivalent.
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS, tf.lite.OpsSet.SELECT_TF_OPS]
    # converter.experimental_new_converter = True
    tflite_model = converter.convert()
    open('model_tflite_facebox.tflite', 'wb').write(tflite_model)

    There are still some issues, though...
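
    As a follow-up check, the converted file can at least be exercised with the TFLite interpreter from the full TensorFlow pip package, which in recent TF versions bundles the Flex delegate needed at runtime for models converted with SELECT_TF_OPS. A rough smoke-test sketch (not from the original post):

    import numpy as np
    import tensorflow as tf

    # The interpreter in the full TensorFlow pip package includes the Flex delegate,
    # which is required to run models converted with SELECT_TF_OPS.
    interpreter = tf.lite.Interpreter(model_path='model_tflite_facebox.tflite')
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    interpreter.set_tensor(inp['index'], np.random.rand(1, 512, 512, 3).astype(np.float32))
    interpreter.invoke()

    for out in interpreter.get_output_details():
        print(out['name'], interpreter.get_tensor(out['index']).shape)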

    References

    1. tf.lite.TFLiteConverter

    2. stackoverflow: how-to-create-a-tflite-file-from-saved-model-ssd-mobilenet

    3. TF v1 model file conversion

    4. github_keras_lstm

    5. tf_saved_model

    6. tf_tflite_get_start

    7. tflite_convert_python_api

    8. ops_select
