am15h / tflite_flutter_plugin

TensorFlow Lite Flutter Plugin

Home Page: https://pub.dev/packages/tflite_flutter

License: Apache License 2.0

Java 1.42% Swift 1.35% Objective-C 0.41% Dart 51.97% Ruby 1.45% Shell 1.09% C 1.41% Batchfile 0.56% CMake 13.54% C++ 26.79%

tflite_flutter_plugin's Introduction




Announcement

Update: 26 April, 2023

The TensorFlow team has officially migrated this project to a new repository, deprecating this one. We will be focusing on getting the plugin to a stable and usable state to help our developers add robust machine learning features to their Flutter apps. PRs and contributions are more than welcome there, though please be mindful that this is a work in progress, so some things may be broken for a while :)

We do want to say a huge thank you to Amish for working on this initial plugin, and we're excited to keep it progressing.

Feel free to reach out to me with questions until then.

Thanks!

Overview

The TensorFlow Lite Flutter plugin provides a flexible and fast solution for accessing the TensorFlow Lite interpreter and performing inference. The API is similar to the TFLite Java and Swift APIs. It binds directly to the TFLite C API, making it efficient (low latency). It offers acceleration through NNAPI and GPU delegates on Android, Metal and CoreML delegates on iOS, and the XNNPack delegate on desktop platforms.

Key Features

  • Multi-platform support for Android, iOS, Windows, Mac, Linux.
  • Flexibility to use any TFLite model.
  • Acceleration using multi-threading and delegate support.
  • Similar structure to the TensorFlow Lite Java API.
  • Inference speeds close to native Android apps built using the Java API.
  • You can choose to use any TensorFlow version by building binaries locally.
  • Run inference in different isolates to prevent jank in the UI thread (see the sketch after this list).
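A minimal sketch of one way to run inference in a background isolate (assuming the Interpreter.fromAddress constructor and the address getter are available in your plugin version; the function names below are just for illustration):

import 'package:flutter/foundation.dart' show compute;
import 'package:tflite_flutter/tflite_flutter.dart';

// Entry point for the background isolate. Only the raw interpreter address is
// passed, because the Interpreter itself (a dart:ffi object) cannot be sent
// across isolates.
List<List<double>> _inferInBackground(Map<String, Object> args) {
  final interpreter = Interpreter.fromAddress(args['address'] as int);
  final input = args['input'] as List<List<double>>;
  final output = List.generate(1, (_) => List.filled(2, 0.0));
  interpreter.run(input, output);
  return output;
}

// Called from the UI isolate; compute() spawns a short-lived isolate.
Future<List<List<double>>> runOffUiThread(
    Interpreter interpreter, List<List<double>> input) {
  return compute(_inferInBackground, {
    'address': interpreter.address,
    'input': input,
  });
}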

(Important) Initial setup: Add dynamic libraries to your app

Android

  1. Place the script install.sh (Linux/Mac) or install.bat (Windows) at the root of your project.

  2. Execute sh install.sh (Linux/Mac) / install.bat (Windows) at the root of your project to automatically download and place the binaries in the appropriate folders.

    Note: The binaries installed will not include support for GpuDelegateV2 and NnApiDelegate; however, InterpreterOptions().useNnApiForAndroid can still be used.

  3. Use sh install.sh -d (Linux/Mac) or install.bat -d (Windows) instead if you wish to use GpuDelegateV2 and NnApiDelegate.

These scripts install pre-built binaries based on the latest stable TensorFlow release. For info on using other TensorFlow versions, follow the instructions in the wiki.

iOS

  1. Download TensorFlowLiteC.framework. To build a custom version of TensorFlow, follow the instructions in the wiki.
  2. Place the TensorFlowLiteC.framework in the pub-cache folder of this package.

Pub-Cache folder location: (ref)

  • ~/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-<plugin-version>/ios/ (Linux/ Mac)
  • %LOCALAPPDATA%\Pub\Cache\hosted\pub.dartlang.org\tflite_flutter-<plugin-version>\ios\ (Windows)

Desktop

Follow the instructions in this guide to build and use desktop binaries.

TFLite Flutter Helper Library

A dedicated library with a simple architecture for processing and manipulating the input and output of TFLite models. The API design and documentation are identical to the TensorFlow Lite Android Support Library. It is strongly recommended for use with tflite_flutter_plugin. Learn more.
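For example, a typical image pre/post-processing flow with the helper library looks roughly like this (a minimal sketch using the helper's TensorImage, ImageProcessorBuilder, ResizeOp and TensorBuffer classes; the classify name is just for illustration):

import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

TensorBuffer classify(Interpreter interpreter, TensorImage image) {
  // Resize the input image to the height/width the model expects.
  final inputShape = interpreter.getInputTensor(0).shape;
  final processor = ImageProcessorBuilder()
      .add(ResizeOp(inputShape[1], inputShape[2], ResizeMethod.NEAREST_NEIGHBOUR))
      .build();
  final processed = processor.process(image);

  // Allocate an output buffer that matches the model's output tensor.
  final outputBuffer = TensorBuffer.createFixedSize(
      interpreter.getOutputTensor(0).shape, interpreter.getOutputTensor(0).type);

  // Pass the underlying buffers to the interpreter.
  interpreter.run(processed.buffer, outputBuffer.getBuffer());
  return outputBuffer;
}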

Examples

Title Code Demo Blog
Text Classification App Code Blog/Tutorial
Image Classification App Code -
Object Detection App Code Blog/Tutorial
Reinforcement Learning App Code Blog/Tutorial

Import

import 'package:tflite_flutter/tflite_flutter.dart';

Usage instructions

Creating the Interpreter

  • From asset

    Place your_model.tflite in the assets directory. Make sure to include the assets in pubspec.yaml.

    final interpreter = await tfl.Interpreter.fromAsset('your_model.tflite');

Refer to the documentation for info on creating an interpreter from a buffer or file.
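For reference, a hedged sketch of the other two creation paths (assuming the Interpreter.fromFile and Interpreter.fromBuffer constructors referred to above; the file and asset paths are placeholders):

import 'dart:io';

import 'package:flutter/services.dart' show rootBundle;
import 'package:tflite_flutter/tflite_flutter.dart' as tfl;

Future<void> createInterpreters() async {
  // From a file on disk, e.g. a model downloaded at runtime.
  final fromFile =
      tfl.Interpreter.fromFile(File('/path/to/your_model.tflite'));

  // From an in-memory buffer, here loaded out of the asset bundle.
  final data = await rootBundle.load('assets/your_model.tflite');
  final fromBuffer = tfl.Interpreter.fromBuffer(
      data.buffer.asUint8List(data.offsetInBytes, data.lengthInBytes));

  fromFile.close();
  fromBuffer.close();
}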

Performing inference

See TFLite Flutter Helper Library for easy processing of input and output.

  • For single input and output

    Use void run(Object input, Object output).

    // Example: if the input tensor shape is [1,5] and type is float32
    var input = [[1.23, 6.54, 7.81, 3.21, 2.22]];
    
    // if the output tensor shape is [1,2] and type is float32
    var output = List.filled(1*2, 0.0).reshape([1,2]);
    
    // inference
    interpreter.run(input, output);
    
    // print the output
    print(output);
  • For multiple inputs and outputs

    Use void runForMultipleInputs(List<Object> inputs, Map<int, Object> outputs).

    var input0 = [1.23];  
    var input1 = [2.43];  
    
    // input: List<Object>
    var inputs = [input0, input1, input0, input1];  
    
    var output0 = List<double>.filled(1, 0);  
    var output1 = List<double>.filled(1, 0);
    
    // output: Map<int, Object>
    var outputs = {0: output0, 1: output1};
    
    // inference  
    interpreter.runForMultipleInputs(inputs, outputs);
    
    // print outputs
    print(outputs);
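When sizing the input and output containers in the examples above, it helps to inspect the model's tensors first; a small sketch using the tensor accessors that appear elsewhere on this page:

// Inspect tensor shapes and types to size input/output containers correctly.
final inputTensor = interpreter.getInputTensor(0);
final outputTensor = interpreter.getOutputTensor(0);
print('input:  ${inputTensor.shape} ${inputTensor.type}');
print('output: ${outputTensor.shape} ${outputTensor.type}');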

Closing the interpreter

interpreter.close();
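In a Flutter app, a natural place to release the interpreter is a State object's dispose() method, for example:

@override
void dispose() {
  // Free the native interpreter resources when the widget goes away.
  interpreter.close();
  super.dispose();
}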

Improve performance using delegate support

Note: This feature is under testing and could be unstable with some builds and on some devices.
  • NNAPI delegate for Android

    var interpreterOptions = InterpreterOptions()..useNnApiForAndroid = true;
    final interpreter = await Interpreter.fromAsset('your_model.tflite',
        options: interpreterOptions);
    

    or

    var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
    final interpreter = await Interpreter.fromAsset('your_model.tflite',
        options: interpreterOptions);
    
  • GPU delegate for Android and iOS

    • Android GpuDelegateV2

      final gpuDelegateV2 = GpuDelegateV2(
              options: GpuDelegateOptionsV2(
              false,
              TfLiteGpuInferenceUsage.fastSingleAnswer,
              TfLiteGpuInferencePriority.minLatency,
              TfLiteGpuInferencePriority.auto,
              TfLiteGpuInferencePriority.auto,
          ));
      
      var interpreterOptions = InterpreterOptions()..addDelegate(gpuDelegateV2);
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
    • iOS Metal Delegate (GpuDelegate)

      final gpuDelegate = GpuDelegate(
            options: GpuDelegateOptions(true, TFLGpuDelegateWaitType.active),
          );
      var interpreterOptions = InterpreterOptions()..addDelegate(gpuDelegate);
      final interpreter = await Interpreter.fromAsset('your_model.tflite',
          options: interpreterOptions);
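  • XNNPack delegate for Desktop

    A minimal sketch, assuming your plugin version exposes an XNNPackDelegate class (check the API docs for the exact options):

    final interpreterOptions = InterpreterOptions()..addDelegate(XNNPackDelegate());
    final interpreter = await Interpreter.fromAsset('your_model.tflite',
        options: interpreterOptions);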

Refer to the tests to see more example code for each method.

Credits

  • Tian LIN, Jared Duke, Andrew Selle, YoungSeok Yoon, Shuangfeng Li from the TensorFlow Lite Team for their invaluable guidance.
  • Authors of dart-lang/tflite_native.

tflite_flutter_plugin's People

Contributors

aadilmaan, am15h, captaindario, dcharkes, devoncarew, juliangeissler, kevmoo, lambdabaa, mannprerak2, mattsday, mgalgs, mit-mit, paultr, sjindel-google, truongsinh, windmaple, yingshaoxo, yunhankyu


tflite_flutter_plugin's Issues

Old iOS TFLite-Framework version - Didn't find op for builtin opcode 'RESIZE_NEAREST_NEIGHBOR' version '3'

While our TFLite model runs perfectly on Android, on iOS it cannot be interpreted, failing with the following error:
Didn't find op for builtin opcode 'RESIZE_NEAREST_NEIGHBOR' version '3'

We're using an UpSampling layer, which relies on the operation shown above.

I guess the reason is that the iOS side of the plugin does not ship the newest TensorFlowLiteC framework release, since the TensorFlow team released the RESIZE_NEAREST_NEIGHBOR operation only a few months ago.

Could this be fixed by adding the new version of the TensorFlowLiteC framework for iOS in the near future?

Cheers :)

W/FlutterEngine( 8943): Tried to automatically register plugins with FlutterEngine (io.flutter.embedding.engine.FlutterEngine@17540cb) but could not find and invoke the GeneratedPluginRegistrant. Error connecting to the service protocol: failed to connect to

Launching lib/main.dart on HUAWEI MT7 TL10 in debug mode...
Running Gradle task 'assembleDebug'...
✓ Built build/app/outputs/apk/debug/app-debug.apk.
W/FlutterEngine( 8943): Tried to automatically register plugins with FlutterEngine (io.flutter.embedding.engine.FlutterEngine@17540cb) but could not find and invoke the GeneratedPluginRegistrant.
Error connecting to the service protocol: failed to connect to http://127.0.0.1:57991/4YEMNcZHU3A=/

Audio Support

Does this Flutter package support audio models? (E.g., I'd like my app to listen for short commands and then execute a specific action based on the command.)

Missing License

Hi, very nice effort! My team is also using bindings from the tflite_native package in Flutter, but your additions to this package are very interesting. Could you add a license so we can evaluate whether we can use it and possibly contribute to it?

How to use your own custom built TensorFlowLiteC.framework on iOS?

I'm building my own TFLite libraries because I need a more recent version than what is currently released. For Android I'm using a Docker-based build environment from the TF repo and build commands from this repo, and that's working great 🎉

For iOS I have the libraries built but can't seem to use them in my project. I'm following these instructions, briefly:

cd <tensorflow_source>
./configure (answer yes when it asks about iOS support)
bazel build --config=ios_fat -c opt //tensorflow/lite/ios:TensorFlowLiteC_framework

This gives me a TensorFlowLiteC_framework.zip, which I don't know exactly what to do with. I've tried replacing ios/Pods/TensorFlowLiteC/Frameworks/TensorFlowLiteC.framework in my app with my freshly built one, but that didn't seem to get picked up and gets wiped when I do a flutter clean...

So I tried replacing the one in pub-cache:

cd /Users/mgalgs/development/flutter/.pub-cache/hosted/pub.dartlang.org/tflite_flutter-0.5.0/ios
mv TensorFlowLiteC.framework{,.sav}
unzip ~/development/tensorflow/bazel-bin/tensorflow/lite/ios/TensorFlowLiteC_framework.zip

but then when I build my app with flutter build ios I get linker errors:

Build output
$ flutter build ios
Running "flutter pub get" in RoutespotterApp...                     0.6s
Building com.mgalgs.routespotter for device (ios-release)...
Automatically signing iOS for device deployment using specified development team in Xcode project: PH2D8HJC83
Running Xcode build...
Xcode build done.                                            3.6s
Failed to build iOS app
Error output from Xcode build:
↳
    ** BUILD FAILED **


Xcode's output:
↳
      "tflite::tensor_utils::MeanStddevNormalization(float const*, float*, int, int)", referenced from:
          l1728 in TensorFlowLiteC
          l1729 in TensorFlowLiteC
      "tflite::tensor_utils::VectorVectorDotProduct(float const*, float const*, int)", referenced from:
          l1416 in TensorFlowLiteC
          l1417 in TensorFlowLiteC
      "tflite::tensor_utils::IsZeroVector(float const*, int)", referenced from:
          l1254 in TensorFlowLiteC
          l1417 in TensorFlowLiteC
          l1722 in TensorFlowLiteC
          l1724 in TensorFlowLiteC
          l4108 in TensorFlowLiteC
      "tflite::tensor_utils::ApplyLayerNormFloat(short const*, short const*, int, int, int const*, int, int, short*)", referenced from:
          l1727 in TensorFlowLiteC
      "tflite::Subgraph::ModifyGraphWithDelegate(TfLiteDelegate*)", referenced from:
          l1451 in TensorFlowLiteC
          l1470 in TensorFlowLiteC
      "tflite::tensor_utils::MatrixScalarMultiplyAccumulate(signed char const*, int, int, int, int*)", referenced from:
          l764 in TensorFlowLiteC
          l1981 in TensorFlowLiteC
      "tflite::Subgraph::SetInputs(std::__1::vector<int, std::__1::allocator<int> >)", referenced from:
          l1447 in TensorFlowLiteC
          l3454 in TensorFlowLiteC
      "tflite::tensor_utils::ReductionSumVector(float const*, float*, int, int)", referenced from:
          l1416 in TensorFlowLiteC
          l1417 in TensorFlowLiteC
      "tflite::tensor_utils::MatrixBatchVectorMultiplyAccumulate(signed char const*, int const*, signed char const*, int, int, int, int, int, int, int*, short*,
      tflite::CpuBackendContext*)", referenced from:
          l1726 in TensorFlowLiteC
          l1730 in TensorFlowLiteC
      "tflite::tensor_utils::SymmetricQuantizeFloats(float const*, int, signed char*, float*, float*, float*)", referenced from:
          l1254 in TensorFlowLiteC
          l1417 in TensorFlowLiteC
          l1724 in TensorFlowLiteC
          l4108 in TensorFlowLiteC
          l4542 in TensorFlowLiteC
          l4551 in TensorFlowLiteC
          l4837 in TensorFlowLiteC
          ...
      "tflite::Subgraph::AllocateTensors()", referenced from:
          l1450 in TensorFlowLiteC
          l3006 in TensorFlowLiteC
          l3008 in TensorFlowLiteC
          l4741 in TensorFlowLiteC
          l5015 in TensorFlowLiteC
    ld: symbol(s) not found for architecture arm64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    note: Using new build system
    note: Building targets in parallel
    note: Planning build
    note: Constructing build description
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'image_picker' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'video_player' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'url_launcher' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'shared_preferences' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'path_provider' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'package_info' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'flutter_isolate' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'camera' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.3.99. (in target
    'Flutter' from project 'Pods')

Encountered error while building for device.

What's the proper way to integrate my custom built TensorFlowLiteC.framework into my app?

Thanks!

Can't run app on iOS

When I build the project, I get this message:

 ld: warning: Could not find or use auto-linked library 'swiftSwiftOnoneSupport'
               ld: warning: Could not find or use auto-linked library 'swiftCoreFoundation'
               ld: warning: Could not find or use auto-linked library 'swiftCompatibility50'
               ld: warning: Could not find or use auto-linked library 'swiftObjectiveC'
               ld: warning: Could not find or use auto-linked library 'swiftUIKit'
               ld: warning: Could not find or use auto-linked library 'swiftDarwin'
               ld: warning: Could not find or use auto-linked library 'swiftQuartzCore'
               ld: warning: Could not find or use auto-linked library 'swiftCore'
               ld: warning: Could not find or use auto-linked library 'swiftCoreGraphics'
               ld: warning: Could not find or use auto-linked library 'swiftFoundation'
               ld: warning: Could not find or use auto-linked library 'swiftCoreImage'
               ld: warning: Could not find or use auto-linked library 'swiftCompatibilityDynamicReplacements'
               ld: warning: Could not find or use auto-linked library 'swiftMetal'
               ld: warning: Could not find or use auto-linked library 'swiftDispatch'
               ld: warning: Could not find or use auto-linked library 'swiftCoreMedia'
               ld: warning: Could not find or use auto-linked library 'swiftCoreAudio'
               Undefined symbols for architecture arm64:
                 "value witness table for Builtin.UnknownObject", referenced from:
                     full type metadata for tflite_flutter.SwiftTfliteFlutter in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "__swift_FORCE_LOAD_$_swiftCompatibilityDynamicReplacements", referenced from:
                     __swift_FORCE_LOAD_$_swiftCompatibilityDynamicReplacements_$_tflite_flutter in libtflite_flutter.a(SwiftTfliteFlutter.o)
                    (maybe you meant: __swift_FORCE_LOAD_$_swiftCompatibilityDynamicReplacements_$_tflite_flutter)
                 "_swift_allocObject", referenced from:
                     @objc tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "static (extension in Foundation):Swift.String._unconditionallyBridgeFromObjectiveC(__C.NSString?) -> Swift.String", referenced from:
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_unknownObjectRelease", referenced from:
                     static tflite_flutter.SwiftTfliteFlutter.register(with: __C.FlutterPluginRegistrar) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     @nonobjc __C.FlutterMethodChannel.__allocating_init(name: Swift.String, binaryMessenger: __C.FlutterBinaryMessenger) -> __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     @objc static tflite_flutter.SwiftTfliteFlutter.register(with: __C.FlutterPluginRegistrar) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     reabstraction thunk helper from @escaping @callee_unowned @convention(block) (@unowned Swift.AnyObject?) -> () to @escaping @callee_guaranteed (@in_guaranteed Any?) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_release", referenced from:
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     ___swift_destroy_boxed_opaque_existential_0 in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     @objc tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_deallocObject", referenced from:
                     l_objectdestroy in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_retain", referenced from:
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_unknownObjectRetain", referenced from:
                     @objc static tflite_flutter.SwiftTfliteFlutter.register(with: __C.FlutterPluginRegistrar) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "Swift._bridgeAnythingToObjectiveC<A>(A) -> Swift.AnyObject", referenced from:
                     reabstraction thunk helper from @escaping @callee_unowned @convention(block) (@unowned Swift.AnyObject?) -> () to @escaping @callee_guaranteed (@in_guaranteed Any?) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "(extension in Foundation):Swift.String._bridgeToObjectiveC() -> __C.NSString", referenced from:
                     @nonobjc __C.FlutterMethodChannel.__allocating_init(name: Swift.String, binaryMessenger: __C.FlutterBinaryMessenger) -> __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_getObjCClassFromMetadata", referenced from:
                     @nonobjc __C.FlutterMethodChannel.__allocating_init(name: Swift.String, binaryMessenger: __C.FlutterBinaryMessenger) -> __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "static Swift.String.+ infix(Swift.String, Swift.String) -> Swift.String", referenced from:
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "type metadata for Swift.String", referenced from:
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "__swift_FORCE_LOAD_$_swiftCompatibility50", referenced from:
                     __swift_FORCE_LOAD_$_swiftCompatibility50_$_tflite_flutter in libtflite_flutter.a(SwiftTfliteFlutter.o)
                    (maybe you meant: __swift_FORCE_LOAD_$_swiftCompatibility50_$_tflite_flutter)
                 "_swift_bridgeObjectRelease", referenced from:
                     @nonobjc __C.FlutterMethodChannel.__allocating_init(name: Swift.String, binaryMessenger: __C.FlutterBinaryMessenger) -> __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_getObjCClassMetadata", referenced from:
                     type metadata accessor for __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     @objc static tflite_flutter.SwiftTfliteFlutter.register(with: __C.FlutterPluginRegistrar) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "_swift_getInitializedObjCClass", referenced from:
                     type metadata accessor for __C.FlutterMethodChannel in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     type metadata accessor for tflite_flutter.SwiftTfliteFlutter in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                 "Swift.String.init(_builtinStringLiteral: Builtin.RawPointer, utf8CodeUnitCount: Builtin.Word, isASCII: Builtin.Int1) -> Swift.String", referenced from:
                     static tflite_flutter.SwiftTfliteFlutter.register(with: __C.FlutterPluginRegistrar) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
                     tflite_flutter.SwiftTfliteFlutter.handle(_: __C.FlutterMethodCall, result: (Any?) -> ()) -> () in libtflite_flutter.a(SwiftTfliteFlutter.o)
               ld: symbol(s) not found for architecture arm64

Usage of deprecated API

Hi,
Thanks for your valuable efforts.
Recently Android Studio complains about your package, saying:
Note: C:\flutter\.pub-cache\hosted\pub.dartlang.org\tflite_flutter-0.5.0\android\src\main\java\com\tfliteflutter\tflite_flutter_plugin\TfliteFlutterPlugin.java uses or overrides a deprecated API. Note: Recompile with -Xlint:deprecation for details.
I think this won't cause an issue soon, but you may consider improving it for better support in the future.

Unhandled Exception: Exception: FormatException: Missing extension byte (at offset 2610)

Hi,

I am getting the above error in my loadModel() block. Here is the full error stack:

I/flutter (21928): Interpreter Created Successfully
I/flutter (21928): input_shape:
I/flutter (21928): [1, 224, 224, 3]
I/flutter (21928): TfLiteType.uint8
I/flutter (21928): output_shape:
I/flutter (21928): [1, 965]
I/flutter (21928): output_type:
I/flutter (21928): TfLiteType.uint8
E/flutter (21928): [ERROR:flutter/lib/ui/ui_dart_state.cc(166)] Unhandled Exception: Exception: FormatException: Missing extension byte (at offset 2610)
E/flutter (21928): #0      _Utf8Decoder.convertSingle (dart:convert-patch/convert_patch.dart:1783:7)
E/flutter (21928): #1      Utf8Decoder.convert (dart:convert/utf.dart:321:42)
E/flutter (21928): #2      Utf8Codec.decode (dart:convert/utf.dart:62:20)
E/flutter (21928): #3      AssetBundle._utf8decode (package:flutter/src/services/asset_bundle.dart:81:17)
E/flutter (21928): #4      _IsolateConfiguration.apply (package:flutter/src/foundation/_isolates_io.dart:83:34)
E/flutter (21928): #5      _spawn.<anonymous closure> (package:flutter/src/foundation/_isolates_io.dart:90:65)
E/flutter (21928): #6      Timeline.timeSync (dart:developer/timeline.dart:163:22)
E/flutter (21928): #7      _spawn (package:flutter/src/foundation/_isolates_io.dart:87:35)
E/flutter (21928): #8      _startIsolate.<anonymous closure> (dart:isolate-patch/isolate_patch.dart:304:17)
E/flutter (21928): #9      _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:168:12)
E/flutter (21928): 

And below is my code:

class _HomeState extends State<Home> {
  Interpreter interpreter;
  List<int> _inputShape;
  List<int> _outputShape;
  TfLiteType _outputType = TfLiteType.uint8;
  TensorImage _inputImageTensor;
  TensorBuffer _outputBuffer;
  List<String> labels;
  final int _labelsLength = 965;
  final String _labelsFileName = 'assets/aiy_birds_V1_labelmap.txt';
  String modelName = 'aiy_vision_birds.tflite';

  bool _predicting = false;
  final picker = ImagePicker();
  File _imageFile;
  Category category;

  @override
  void initState(){
    super.initState();
    loadStuff();
  }

  loadStuff() async{
    await loadModel();
    await loadLabels();
  }

  Future<void> loadModel() async {
    try {
      interpreter = await Interpreter.fromAsset(modelName);
      print('Interpreter Created Successfully');

      _inputShape = interpreter.getInputTensor(0).shape;
      print('input_shape:');
      print(_inputShape);
      print(interpreter.getInputTensor(0).type);
      _outputShape = interpreter.getOutputTensor(0).shape;
      print('output_shape:');
      print(_outputShape);
      _outputType = interpreter.getOutputTensor(0).type;
      print('output_type:');
      print(_outputType);

      _outputBuffer = TensorBuffer.createFixedSize(_outputShape, _outputType);

    } catch (e) {
      print('Unable to create interpreter, Caught Exception: ${e.toString()}');
    }
  }

I have also run the required install.bat, the output of which is below:

C:\AndroidStudioProjects\flutter_app_tfliter>install.bat
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   652  100   652    0     0    652      0  0:00:01 --:--:--  0:00:01  1605
100 1270k  100 1270k    0     0  1270k      0  0:00:01  0:00:01 --:--:-- 1270k
        1 file(s) moved.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   654  100   654    0     0    654      0  0:00:01 --:--:--  0:00:01  1901
100 1843k  100 1843k    0     0  1843k      0  0:00:01  0:00:01 --:--:-- 3080k
        1 file(s) moved.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   652  100   652    0     0    652      0  0:00:01 --:--:--  0:00:01  1738
100 4774k  100 4774k    0     0  2387k      0  0:00:02  0:00:02 --:--:-- 6254k
        1 file(s) moved.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   655  100   655    0     0    655      0  0:00:01 --:--:--  0:00:01  1824
100 4732k  100 4732k    0     0  2366k      0  0:00:02  0:00:02 --:--:-- 5031k
        1 file(s) moved.

Please suggest what may be causing this.

Thank you

App crashes when interpreter is run

Hi,

My model loads correctly and also gives the input/output details, but when I run the interpreter.run command the app crashes with the following errors/messages:

F/libc    (18866): /buildbot/src/android/ndk-release-r18/external/libcxx/../../external/libcxxabi/src/abort_message.cpp:73: abort_message: assertion "terminating with uncaught exception of type std::bad_alloc: std::bad_alloc" failed
F/libc    (18866): Fatal signal 6 (SIGABRT), code -1 (SI_QUEUE) in tid 18941 (1.ui), pid 18866 (z.mytflite_app)
*** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
Build fingerprint: 'google/sdk_gphone_x86/generic_x86:10/QSR1.190920.001/5891938:user/release-keys'
Revision: '0'
ABI: 'x86'
Timestamp: 2021-02-06 22:41:55+0000
pid: 18866, tid: 18941, name: 1.ui  >>> com.companyName.mytflite_app <<<
uid: 10140
signal 6 (SIGABRT), code -1 (SI_QUEUE), fault addr --------
Abort message: '/buildbot/src/android/ndk-release-r18/external/libcxx/../../external/libcxxabi/src/abort_message.cpp:73: abort_message: assertion "terminating with uncaught exception of type std::bad_alloc: std::bad_alloc" failed'
    eax 00000000  ebx 000049b2  ecx 000049fd  edx 00000006
    edi ec18933e  esi bfafbf50
    ebp ed317ad0  esp bfafbef8  eip ed317ad9
backtrace:
      #00 pc 00000ad9  [vdso] (__kernel_vsyscall+9)
      #01 pc 00092328  /apex/com.android.runtime/lib/bionic/libc.so (syscall+40) (BuildId: 76290498408016ad14f4b98c3ab6c65c)
      #02 pc 000ad651  /apex/com.android.runtime/lib/bionic/libc.so (abort+193) (BuildId: 76290498408016ad14f4b98c3ab6c65c)
      #03 pc 000adb88  /apex/com.android.runtime/lib/bionic/libc.so (__assert2+56) (BuildId: 76290498408016ad14f4b98c3ab6c65c)
      #04 pc 0042d524  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #05 pc 0042d637  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #06 pc 0042a7e9  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #07 pc 0042a0de  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #08 pc 0042a033  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #09 pc 0042d008  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #10 pc 000b2fd6  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #11 pc 000b9f39  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #12 pc 000b8af3  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #13 pc 000b7348  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #14 pc 000ac3f0  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #15 pc 000aafab  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #16 pc 000e8a9b  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #17 pc 000e71b9  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #18 pc 000a9988  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #19 pc 00209dd0  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #20 pc 0020dd66  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so
      #21 pc 00011513  /data/app/com.companyName.mytflite_app-4kMl80Ab1ad_sBxcgJCUwQ==/lib/x86/libtensorflowlite_c.so (TfLiteInterpreterInvoke+35)
      #22 pc 00003f2f  <anonymous:bf300000>
Lost connection to device.

Below are the input details of my model:

var interpreterOptions = InterpreterOptions()..threads = 4;
    _interpreter = await Interpreter.fromAsset(
      'model_float32.tflite',
      options: interpreterOptions,
    );
    var _inputShape = _interpreter.getInputTensor(0).shape;    // [1, 512, 512, 3]
    var outputTensors = _interpreter.getOutputTensors();
    var _inputType = _interpreter.getInputTensor(0).type;       //Tflite.Float32
    _outputShapes = [];
    _outputTypes = [];
    outputTensors.forEach((tensor) {
      _outputShapes.add(tensor.shape);
      _outputTypes.add(tensor.type);
    });

And outputTensors looks like

image_2021-02-06_225102

So after searching for this problem I came across this issue: #29, and I believe my model is very similar, so I created the input and output tensors as follows:

// Create TensorImage from image
    TensorImage inputImage = TensorImage(TfLiteType.float32);
    inputImage.loadImage(image);
    inputImage = getProcessedImage(inputImage);
    // Use [TensorImage.buffer] or [TensorBuffer.buffer] to pass by reference
    List<Object> input = [inputImage.buffer];
    TensorBuffer d0 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(0).shape, _interpreter.getOutputTensor(0).type);
    TensorBuffer d1 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(1).shape, _interpreter.getOutputTensor(1).type);
    TensorBuffer d2 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(2).shape, _interpreter.getOutputTensor(2).type);
    TensorBuffer d3 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(3).shape, _interpreter.getOutputTensor(3).type);
    TensorBuffer d4 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(4).shape, _interpreter.getOutputTensor(4).type);
    TensorBuffer d5 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(5).shape, _interpreter.getOutputTensor(5).type);
    TensorBuffer d6 = TensorBuffer.createFixedSize(_interpreter.getOutputTensor(6).shape, _interpreter.getOutputTensor(6).type);


    // Outputs map
    Map<int, Object> outputs = {
      0: d0.buffer,
      1: d1.buffer,
      2: d2.buffer,
      3: d3.buffer,
      4: d4.buffer,
      5: d5.buffer,
      6: d6.buffer
    };
    //run inference
    _interpreter.runForMultipleInputs(input, outputs);

But my app crashes with the above error :(

This TFLite model works without any problems in Python and produces the expected results.

What can be the problem? Why is the app crashing?

Thank you

ios build fails with error: conflicting names: tensorflowlitec.framework.

I'm having an issue building for iOS. When I run "pod install", I get the error:
-> Installing tflite_flutter (0.1.0)
- Running pre install hooks
[!] The 'Pods-Runner' target has frameworks with conflicting names: tensorflowlitec.framework.

/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:67:in
'verify_no_duplicate_names'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:54:in
'block (2 levels) in verify_no_duplicate_framework_and_library_names'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:48:in
'each_key'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:48:in
'block in verify_no_duplicate_framework_and_library_names'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:47:in
'each'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:47:in
'verify_no_duplicate_framework_and_library_names'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer/xcode/target_validator.rb:37:in
'validate!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer.rb:595:in 'validate_targets'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/installer.rb:162:in 'install!'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/command/install.rb:52:in 'run'
/Library/Ruby/Gems/2.6.0/gems/claide-1.0.3/lib/claide/command.rb:334:in 'run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/lib/cocoapods/command.rb:52:in 'run'
/Library/Ruby/Gems/2.6.0/gems/cocoapods-1.10.0/bin/pod:55:in '<top (required)>'
/usr/local/bin/pod:23:in 'load'
/usr/local/bin/pod:23:in '<main>'

Could not find com.android.tools.build:gradle:5.4.1.

I don't know why my build is failing. I've run out of solutions to try and would appreciate any help; I've tried everything I could find online.

FAILURE: Build failed with an exception.

  • What went wrong:
    A problem occurred configuring the project ':tflite_flutter'.

Could not resolve all artifacts for configuration ':tflite_flutter:classpath'.
Could not find com.android.tools.build:gradle:5.4.1.
Searched in the following locations:
- https://dl.google.com/dl/android/maven2/com/android/tools/build/gradle/5.4.1/gradle-5.4.1.pom
- https://dl.google.com/dl/android/maven2/com/android/tools/build/gradle/5.4.1/gradle-5.4.1.jar
- https://jcenter.bintray.com/com/android/tools/build/gradle/5.4.1/gradle-5.4.1.pom
- https://jcenter.bintray.com/com/android/tools/build/gradle/5.4.1/gradle-5.4.1.jar
Required by:
project :tflite_flutter
Could not get unknown property 'android' for project ':tflite_flutter' of type org.gradle.api.Project.

Below is the content of build.gradle file in the android folder:
buildscript {
ext.kotlin_version = '1.3.50'
repositories {
google()
jcenter()
}

dependencies {
    classpath 'com.android.tools.build:gradle:3.6.0'
    classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
}

}

allprojects {
repositories {
google()
jcenter()
}
}

rootProject.buildDir = '../build'
subprojects {
project.buildDir = "${rootProject.buildDir}/${project.name}"
}
subprojects {
project.evaluationDependsOn(':app')
}

task clean(type: Delete) {
delete rootProject.buildDir
}

Below is the content of wrapper file
distributionUrl=https://services.gradle.org/distributions/gradle-5.6.4-all.zip

When I run "gradlew build --stacktrace" in the terminal, I get the following:
FAILURE: Build failed with an exception.

  • What went wrong:
    Could not initialize class org.codehaus.groovy.runtime.InvokerHelper

I'm using Windows 10; here is some additional information, just in case it helps:
Kotlin: 1.3.41
Groovy: 2.5.4
Ant: Apache Ant(TM) version 1.9.14 compiled on March 12 2019
JVM: 14.0.2 (Oracle Corporation 14.0.2+12-46)
OS: Windows 10 10.0 amd64

I don't know what to do anymore. I keep getting the same error after doing the initial setup of running "install.bat".

Didn't find op for builtin opcode 'RESIZE_NEAREST_NEIGHBOR' version '3'

While trying to create an Interpreter from an asset the following error is caught:

E/tflite  (24363): Didn't find op for builtin opcode 'RESIZE_NEAREST_NEIGHBOR' version '3'
E/tflite  (24363): Registration failed.
I/flutter (24363): Error while creating interpreter: Invalid argument(s): Unable to create interpreter.

The asset is a custom YOLO model exported to TFLite using these instructions.
It has 1 input float32[1,640,640,3] and 3 outputs float32[1,3,6400,85], float32[1,3,1600,85] and float32[1,3,400,85]

sh install.sh error

I used flutter pub get to download the package, but somehow the TensorImage class cannot be used. So I copied and ran the bash file in the project root, but it gives these errors:

readlink: illegal option -- f
usage: readlink [-n] [file ...]
install.sh: line 29: wget: command not found
mv: rename libtensorflowlite_c_arm.so to android/app/src/main/jniLibs/armeabi-v7a/libtensorflowlite_c.so: No such file or directory
install.sh: line 29: wget: command not found
mv: rename libtensorflowlite_c_arm64.so to android/app/src/main/jniLibs/arm64-v8a/libtensorflowlite_c.so: No such file or directory
install.sh: line 29: wget: command not found
mv: rename libtensorflowlite_c_x86.so to android/app/src/main/jniLibs/x86/libtensorflowlite_c.so: No such file or directory
install.sh: line 29: wget: command not found
mv: rename libtensorflowlite_c_x86_64.so to android/app/src/main/jniLibs/x86_64/libtensorflowlite_c.so: No such file or directory

Prediction using tflite_flutter takes too long (8 seconds) while same model in Kotlin predicts in 200ms??

Hi @am15h ,

I'm at my wits' end and can't figure out how else to optimize my model, but in Flutter, prediction takes about 8 to 9 seconds, which is very long. I thought something was wrong with my model, but when I tried the same model in Kotlin, it gave the result in under 200 ms.

I'm only taking into account the interpreter.run() command and using Stopwatch() to keep track of it.

 timer.start();
_interpreter.run(inputIds, predictions);
print('inference done in ' + timer.elapsedMilliseconds.toString());
timer.reset();

I'm initializing the model like:

var interpreterOptions = InterpreterOptions()..threads = NUM_LITE_THREADS;
    _interpreter = await Interpreter.fromAsset(
      modelFile,
      options: interpreterOptions,
    );

I'm not using NNAPI as it does not improve the inference speed, and I can't use the GPU delegate as it fails to initialize the model.

My input is of shape [1, 32] and is of type int8. My outputs are of shape [1, 32, 50527] and are of type float32.

I thought this was an error in my model but when I ran the same model in Kotlin using:

tflite.runForMultipleInputsOutputs(arrayOf(inputIds), outputs)

I get the same prediction in under 200 ms. The Kotlin model is initialized on the CPU just like the Flutter one:

private suspend fun loadModel(): Interpreter = withContext(Dispatchers.IO) {
        val assetFileDescriptor = getApplication<Application>().assets.openFd(MODEL_PATH)
        assetFileDescriptor.use {
            val fileChannel = FileInputStream(assetFileDescriptor.fileDescriptor).channel
            val modelBuffer = fileChannel.map(FileChannel.MapMode.READ_ONLY, it.startOffset, it.declaredLength)

            val opts = Interpreter.Options()
            opts.setNumThreads(NUM_LITE_THREADS)
            return@use Interpreter(modelBuffer, opts)
        }
    }

Is there any reason why the model is performing so poorly in Flutter? What can I change to fix it? Any thoughts on this would be very helpful.

Thank you

[Feature] Support for FlexDelegate

Hi,

Thanks for the useful plugin. If my custom model graph has TensorFlow ops which are not standard built-in ops in TensorFlow Lite, then we can use FlexDelegate to include them. But I don't see that this is currently supported in this plugin. Do you have any plans to add it?

How to get values from a model producing output of type int64?

Hi,

I have a model with the following details:

var _inputShape = _interpreter.getInputTensor(0).shape;    //[256, 256]
    var _inputType = _interpreter.getInputTensor(0).type;    //float32
    _outputShape = _interpreter.getOutputTensor(0).shape;   //[256, 256]
    _outputType = _interpreter.getOutputTensor(0).type;    /// int64

I tried using the tflite_flutter_helper package but cannot, as I get the error: TensorBuffer does not support type int64. So I am handling the input and output of my model as follows:

img.Image originalImage = img.decodeImage(File(imgFile.path).readAsBytesSync());
    img.Image imgResize = img.copyResize(originalImage, width:256, height:256);
    Uint8List inputImage = _imageToByteListUInt8(imgResize, 256, 0, 255);

    var predictions = List.filled(256, List.filled(256, 0));        // create empty list of size [256, 256] with 0 fill
    _interpreter.run(inputImage, predictions);

    img.Image outputMask = _convertArrayToImage(predictions, 256);

The helper functions used above are:

Uint8List _imageToByteListUInt8(
      img.Image image,
      int inputSize,
      double mean,
      double std,
      ) {
    var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
    var buffer = Float32List.view(convertedBytes.buffer);
    int pixelIndex = 0;

    for (var i = 0; i < inputSize; i++) {
      for (var j = 0; j < inputSize; j++) {
        var pixel = image.getPixel(j, i);
        buffer[pixelIndex++] = (img.getRed(pixel) - mean) / std;
        buffer[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
        buffer[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
      }
    }
    return convertedBytes.buffer.asUint8List();
  }

The _interpreter runs but I cannot extract data from the predictions.

The model is a segmentation model and the output is a color code for each detected region; e.g. in Python the output from the interpreter is 0=ground, 1=sky, etc.

How do I extract this data from the predictions?

I tried to convert the output Array to image using:

img.Image _convertArrayToImage(List<List<int>> imageArray, int inputSize) {
    img.Image image = img.Image.rgb(inputSize, inputSize);
    for (var x = 0; x < imageArray.length; x++) {
      for (var y = 0; y < imageArray[0].length; y++) {
        var r = (imageArray[x][y]).toInt();
        var g = (imageArray[x][y+1]).toInt();
        var b = (imageArray[x][y+2]).toInt();
        var a = (imageArray[x][y+3]).toInt();
        image.setPixelRgba(x, y, r, g, b);
      }
    }
    return image;
  }

But I get the error: Unhandled Exception: RangeError (index): Invalid value: Not in inclusive range 0..255: 256. This error shows that my method is wrong; I don't want Uint8 values, I want the int64 color codes.

Thanks

How to use within an Isolate?

Hi,

Congrats for this plugin!

I've been doing some tests to pass my predict function to a separate thread using an Isolate.

My first approach was loading your plugin within an Isolate, but that's not possible as it requires the initialization of the Bindings.

The second approach was to pass the Interpreter to the Isolate, but that is also not possible. It only accepts primitive values, and the interpreter is a dart:ffi object.

Do you have any hints on how to accomplish such a task?

Unhandled Exception: Bad state: failed precondition

Hello, I am trying to use the MediaPipe FaceMesh model, which is this one in TF.js format, but I can't get it to work. I have no idea what I'm doing wrong; I'd appreciate any help, thanks.

Model: https://github.com/google/mediapipe/tree/master/mediapipe/models#face-mesh
File: https://github.com/google/mediapipe/blob/master/mediapipe/models/face_landmark.tflite

Code:

Future _loadModel() async {
    _interpreter = await Interpreter.fromAsset('models/face_landmark.tflite');
    print('Interpreter loaded successfully');
  }

 _onStream() async {
    final CameraDescription description =
        await ScannerUtils.getCamera(_direction);
    controller = CameraController(description, ResolutionPreset.ultraHigh,
        enableAudio: false);
    await controller.initialize();
    setState(() {});
    await controller.startImageStream((CameraImage img) async {
      if (!_isDetecting) {
        var input = img.planes.map((plane) {return plane.bytes;}).toList();
        int list0 = 0;
        var output = new Map<int, Object>.from({0: list0});
        _interpreter.runForMultipleInputs(input, output);

        print(output.toString());

        _isDetecting = false;
      }
    });
  }

Error:

2020-07-10 18:59:50.076078-0500 Runner[24953:4058491] flutter: Interpreter loaded successfully 2020-07-10 18:59:50.683677-0500 Runner[24953:4058491] [VERBOSE-2:ui_dart_state.cc(166)] Unhandled Exception: Bad state: failed precondition #0 ced (PXe:71) #1 jca.Tkd (XYe:150) #2 gca.Fkd (UYe:183) #3 _ho._cxb.<anonymous closure> (DPe:68) #4 _ho._cxb.<anonymous closure> (DPe:63) #5 am.yjb.<anonymous closure> (HOe:412) #6 _fma (tLe:1198) #7 _qc.wza (tLe:1100) #8 _qc.Bza (tLe:1005) #9 _BufferingStreamSubscription._Pya (qLe:357) #10 _Kb.Rza (qLe:611) #11 _Nb.TBa (qLe:730) #12 _Ib.RBa.<anonymous closure> (qLe:687) #13 _ema (tLe:1182) #14 _qc.vza (tLe:1093) #15 _qc.Aza (tLe:997) #16 _qc.Iza.<anonymous closure> (tLe:1037) #17 _ema (tLe:1190) #18 _qc.vza (tLe:1093) #19 _qc.Aza (tLe:997) #20 _qc.Iza.<anonymous closure> (tLe:1037) #21 _Mla (oLe:41) #22 _Nla (oLe:50)

Outputs from model:

[
      {
        faceInViewConfidence: 1, // The probability of a face being present.
        boundingBox: { // The bounding box surrounding the face.
          topLeft: [232.28, 145.26],
          bottomRight: [449.75, 308.36],
        },
        mesh: [ // The 3D coordinates of each facial landmark.
          [92.07, 119.49, -17.54],
          [91.97, 102.52, -30.54],
          ...
        ],
        scaledMesh: [ // The 3D coordinates of each facial landmark, normalized.
          [322.32, 297.58, -17.54],
          [322.18, 263.95, -30.54]
        ],
        annotations: { // Semantic groupings of the `scaledMesh` coordinates.
          silhouette: [
            [326.19, 124.72, -3.82],
            [351.06, 126.30, -3.00],
            ...
          ],
          ...
        }
      }
    ]

Info: https://www.npmjs.com/package/@tensorflow-models/facemesh

UPDATE

I update my code to this:

try {
      Interpreter interpreter =
          await Interpreter.fromAsset("models/face_detection_front.tflite");

      var _inputShape = interpreter.getInputTensor(0).shape;
      var _outputShape = interpreter.getOutputTensor(0).shape;
      var _outputType = interpreter.getOutputTensor(0).type;

      ImageProcessor imageProcessor = ImageProcessorBuilder()
          .add(ResizeOp(
              _inputShape[1], _inputShape[2], ResizeMethod.NEAREST_NEIGHBOUR))
          .build();
      var file = File(
          (await picker.getImage(source: ImageSource.gallery, maxWidth: 560))
              .path);
      TensorImage tensorImage = TensorImage.fromFile(file);
      tensorImage = imageProcessor.process(tensorImage);

      var _outputBuffer =
          TensorBuffer.createFixedSize(_outputShape, _outputType);
      print(_outputShape.toString());
      interpreter.run(tensorImage.buffer, _outputBuffer.getBuffer());
      print(_outputBuffer);
    } catch (e) {
      print(e);
    }

But I always get:
flutter: Bad state: failed precondition

Setting Input Tensors

Hi, cool package!
I was wondering if there is a set tensor method or an equivalent approach to setting the input tensors before calling invoke.
Thank you
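For context, interpreter.run() wraps a lower-level flow. A rough sketch of what an explicit set-tensor sequence could look like, assuming the Tensor setTo()/copyTo() helpers and interpreter.allocateTensors()/invoke() are exposed by your plugin version:

// Hypothetical explicit flow; run() performs these steps internally.
final input = [[1.23, 6.54, 7.81, 3.21, 2.22]];
final output = List.filled(1 * 2, 0.0).reshape([1, 2]);

interpreter.allocateTensors();
interpreter.getInputTensor(0).setTo(input);    // write the input tensor
interpreter.invoke();                          // execute the graph
interpreter.getOutputTensor(0).copyTo(output); // read the output tensor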

Unable to loading a Custom TFlite Model

I followed along with this Colab to train a custom model.

Conversion process Colab

After completing the training process I converted the .pb to .tflite and I got these files. When I loaded the model into the official Android demo I got the following error:

 java.lang.RuntimeException: Unable to start activity ComponentInfo{com.example.sd_detect/com.example.sd_detect.MainActivity}: java.lang.IllegalStateException: This model does not contain associated files, and is not a Zip file.

The solution to that error is discussed in this issue

I followed the solution:

Installing the metadata library with:

pip install tflite-support

and then executing the following commands in Python:

>>> from tflite_support import metadata as _metadata
>>> populator = _metadata.MetadataPopulator.with_model_file('final_model.tflite')
>>> populator.load_associated_files(["final_model.txt"])
>>> populator.populate() 

And I got the following warning:

/home/username/.local/lib/python3.8/site-packages/tensorflow_lite_support/metadata/python/metadata.py:342: UserWarning: File, 'final_model.txt', does not exsit in the metadata. But packing it to tflite model is still allowed.
  warnings.warn(

Back in the Android Studio example, I was able to run the model successfully by just modifying this information:

private static final String TF_OD_API_MODEL_FILE = "final_model.tflite";  
private static final boolean IS_MODEL_QUANTIZED = false;
private static final String TF_OD_API_LABELS_FILE = "final_model.txt";  

But when I bring my model into my Flutter application, I get the following exception:

/// Throws a [StateError] if the given [expression] is `false`.
void checkState(bool expression, {message}) {
  if (!expression) {
    throw StateError(_resolveMessage(message, 'failed precondition'));
  }
}

The flutter project is fully working with coco_ssd_mobilenet_v1_1.0_quant_2018_06_29

I also tried to re-run the metadata commands on my flutter application assets but still the same issue.

What am I missing?

Edit:
In Android Studio (the Android app) I set the variable IS_MODEL_QUANTIZED to false; is the issue related to this?

Recompile Binaries using flex delegates

Hi,
If we need to use the flex delegate with our models, do I simply need to recompile with the flex delegate option

bazel build -c opt --config=ios --ios_multi_cpus=armv7,arm64,x86_64
//tensorflow/lite/ios:TensorFlowLiteSelectTfOps_framework

As such? Thank you

IMPORTANT: breaks on iOS due to ffmpeg duplicates

When adding this plugin to my app, the following error occurs.

I use flutter_sound, which also uses mobile-ffmpeg-full.

When running on iOS, the following error appears:

   duplicate symbol '_oc_frag_copy' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_restore_fpu' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_inter' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_copy_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_restore_fpu_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_inter_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_intra_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_inter2_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_intra' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_frag_recon_inter2' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(fragment.o)
    duplicate symbol '_oc_idct8x8_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(idct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(idct.o)
    duplicate symbol '_oc_idct8x8' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(idct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(idct.o)
    duplicate symbol '_th_comment_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(info.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(info.o)
    duplicate symbol '_th_info_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(info.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(info.o)
    duplicate symbol '_th_comment_clear' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(info.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(info.o)
    duplicate symbol '_th_info_clear' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(info.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(info.o)
    duplicate symbol '_oc_ycbcr_buffer_flip' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_th_version_string' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_oc_free_2d' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_oc_malloc_2d' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_MB_MAP_IDXS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_MB_MAP_NIDXS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_DCT_TOKEN_EXTRA_BITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_MB_MAP' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_IZIG_ZAG' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_FZIG_ZAG' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_OC_SET_CHROMA_MVS_TABLE' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(internal.o)
    duplicate symbol '_oc_dequant_tables_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(quant.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(quant.o)
    duplicate symbol '_oc_state_frag_copy_list' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_loop_filter_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_borders_fill_rows' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_loop_filter_frag_rows' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_get_mv_offsets' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_borders_fill_caps' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_clear' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_th_granule_frame' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_frag_copy_list_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_vtable_init_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_loop_filter_frag_rows_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_state_frag_recon_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoradec.framework/libtheoradec(state.o)
    duplicate symbol '_oc_frag_copy' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_restore_fpu' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_inter' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_copy_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_restore_fpu_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_inter_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_intra_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_inter2_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_intra' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_frag_recon_inter2' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fragment.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fragment.o)
    duplicate symbol '_oc_idct8x8_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(idct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(idct.o)
    duplicate symbol '_oc_idct8x8' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(idct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(idct.o)
    duplicate symbol '_oc_ycbcr_buffer_flip' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_th_version_string' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_oc_free_2d' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_oc_malloc_2d' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_MB_MAP_IDXS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_MB_MAP_NIDXS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_DCT_TOKEN_EXTRA_BITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_MB_MAP' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_IZIG_ZAG' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_FZIG_ZAG' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_OC_SET_CHROMA_MVS_TABLE' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(internal.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(internal.o)
    duplicate symbol '_oc_state_frag_copy_list' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_loop_filter_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_borders_fill_rows' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_loop_filter_frag_rows' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_get_mv_offsets' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_borders_fill_caps' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_clear' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_th_granule_frame' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_frag_copy_list_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_vtable_init_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_loop_filter_frag_rows_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_state_frag_recon_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(state.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(state.o)
    duplicate symbol '_oc_dequant_tables_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(quant.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(quant.o)
    duplicate symbol '_oc_mode_scheme_chooser_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(analyze.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(analyze.o)
    duplicate symbol '_oc_enc_analyze_inter' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(analyze.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(analyze.o)
    duplicate symbol '_oc_enc_analyze_intra' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(analyze.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(analyze.o)
    duplicate symbol '_oc_enc_fdct8x8_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fdct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fdct.o)
    duplicate symbol '_oc_enc_fdct8x8' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(fdct.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(fdct.o)
    duplicate symbol '_oc_enc_frag_recon_inter' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_satd_thresh' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_hadamard_sad_thresh' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_satd2_thresh' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_intra_satd' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sad' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_satd_thresh_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sad_thresh_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_satd2_thresh_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sad2_thresh_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_intra_satd_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sad_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sub_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sub_128_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_copy2_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sub' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_recon_intra' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_sub_128' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_enc_frag_copy2' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encfrag.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encfrag.o)
    duplicate symbol '_oc_state_flushheader' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encinfo.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encinfo.o)
    duplicate symbol '_th_encode_packetout' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_th_encode_flushheader' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_th_encode_ycbcr_in' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_th_encode_ctl' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_th_encode_free' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_th_encode_alloc' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_oc_enc_vtable_init_c' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_OC_MV_BITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_OC_MODE_BITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_OC_BLOCK_RUN_CODE_NBITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_OC_SB_RUN_CODE_NBITS' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_TH_DEF_QUANT_INFO' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_TH_VP31_QUANT_INFO' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_OC_SB_RUN_VAL_MIN' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(encode.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(encode.o)
    duplicate symbol '_oc_enquant_tables_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(enquant.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(enquant.o)
    duplicate symbol '_oc_enquant_qavg_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(enquant.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(enquant.o)
    duplicate symbol '_oc_quant_params_pack' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(enquant.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(enquant.o)
    duplicate symbol '_oc_huff_codes_pack' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(huffenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(huffenc.o)
    duplicate symbol '_TH_VP31_HUFF_CODES' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(huffenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(huffenc.o)
    duplicate symbol '_oc_bexp64' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mathops.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mathops.o)
    duplicate symbol '_oc_ilog64' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mathops.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mathops.o)
    duplicate symbol '_oc_blog64' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mathops.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mathops.o)
    duplicate symbol '_oc_ilog32' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mathops.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mathops.o)
    duplicate symbol '_oc_mcenc_refine4mv' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mcenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mcenc.o)
    duplicate symbol '_oc_mcenc_refine1mv' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mcenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mcenc.o)
    duplicate symbol '_oc_mcenc_search' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mcenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mcenc.o)
    duplicate symbol '_oc_mcenc_search_frame' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(mcenc.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(mcenc.o)
    duplicate symbol '_oc_enc_rc_2pass_out' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_rc_state_init' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_rc_state_clear' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_rc_2pass_in' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_select_qi' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_rc_resize' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_update_rc_state' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_calc_lambda' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(rate.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(rate.o)
    duplicate symbol '_oc_enc_tokenize_dc_frag_list' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)
    duplicate symbol '_oc_enc_tokenize_start' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)
    duplicate symbol '_oc_enc_pred_dc_frag_rows' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)
    duplicate symbol '_oc_enc_tokenlog_rollback' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)
    duplicate symbol '_oc_enc_tokenize_finish' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)
    duplicate symbol '_oc_enc_tokenize_ac' in:
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheora.framework/libtheora(tokenize.o)
        /Path/to/MyApp/ios/Pods/mobile-ffmpeg-full/libtheoraenc.framework/libtheoraenc(tokenize.o)

String utils

Hi!

As practice, I wanted to try converting the Android implementation of Smart Reply to Flutter, but I'm having trouble with this function:

// Predict with TfLite model.
void ExecuteTfLite(const std::string& sentence,
                   ::tflite::Interpreter* interpreter,
                   std::map<std::string, float>* response_map) {
  {
    TfLiteTensor* input = interpreter->tensor(interpreter->inputs()[0]);
    tflite::DynamicBuffer buf;
    buf.AddString(sentence.data(), sentence.length());
    buf.WriteToTensorAsVector(input);
    interpreter->AllocateTensors();

    interpreter->Invoke();

    TfLiteTensor* messages = interpreter->tensor(interpreter->outputs()[0]);
    TfLiteTensor* confidence = interpreter->tensor(interpreter->outputs()[1]);

    for (int i = 0; i < confidence->dims->data[0]; i++) {
      float weight = confidence->data.f[i];
      auto response_text = tflite::GetString(messages, i);
      if (response_text.len > 0) {
        (*response_map)[string(response_text.str, response_text.len)] += weight;
      }
    }
  }
}

I don't think the model has a fixed input size, and all the README suggests is that the input string is converted to tensors. From this .h file, it looks like the input eventually ends up in the following format for the string 'AB' after calling buf.WriteToTensorAsVector:

// [
//   2, 0, 0, 0,     # 2 strings.
//   16, 0, 0, 0,    # 0-th string starts from index 16.
//   18, 0, 0, 0,    # 1-st string starts from index 18.
//   18, 0, 0, 0,    # total length of array.
//   'A', 'B',       # 0-th string [16..17]: "AB"
// ]                 # 1-th string, empty

Is it possible to add bindings for these string utils, or to have something built into this library to help with this?

Sorry if I'm mistaking some things - definitely not a C++ programmer.
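
There's no built-in helper for this in the plugin at the moment, but the byte layout shown in the comment above is simple enough to build by hand in Dart. A minimal, unofficial sketch (assuming little-endian int32 headers, exactly as in the example above; packStringTensor is just an illustrative name):

    import 'dart:convert';
    import 'dart:typed_data';

    /// Packs strings into the raw TFLite string-tensor layout described above:
    /// int32 count, one int32 start offset per string, int32 total byte length,
    /// then the concatenated UTF-8 bytes.
    Uint8List packStringTensor(List<String> strings) {
      final encoded = strings.map((s) => utf8.encode(s)).toList();
      // Header: count + one offset per string + total length, 4 bytes each.
      final headerLen = 4 * (strings.length + 2);
      final totalLen =
          headerLen + encoded.fold<int>(0, (sum, bytes) => sum + bytes.length);
      final out = Uint8List(totalLen);
      final header = ByteData.view(out.buffer);

      header.setInt32(0, strings.length, Endian.little); // number of strings
      var offset = headerLen;
      for (var i = 0; i < encoded.length; i++) {
        header.setInt32(4 * (i + 1), offset, Endian.little); // start of i-th string
        out.setRange(offset, offset + encoded[i].length, encoded[i]);
        offset += encoded[i].length;
      }
      header.setInt32(4 * (strings.length + 1), totalLen, Endian.little); // total length

      return out;
    }

For ['AB', ''] this produces exactly the bytes in the comment above; how best to feed them into the interpreter's string-typed input tensor is a separate question and depends on the plugin version.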

How to improve latency of object detection model ssd_mobilenet_v3_small_coco?

First of all thanks for this awesome plugin!

I am using this plugin to run inference with the model 'ssd_mobilenet_v3_small_coco', which can be found here.

This is how I invoke the interpreter:

interpreter.runForMultipleInputs( [input], {0: output1, 1: output2, 2: output3, 3: output4} );

I get good predictions, but inference takes longer than expected. According to the model zoo, 'ssd_mobilenet_v3_small_coco' needs 43 ms on a Pixel 1. I am testing on a OnePlus 7 Pro, where it takes ~250 ms (without delegates). Shouldn't I be able to achieve similar, if not better, latency, since the OnePlus should have better processing units? Or does the Pixel 1 have some special processing unit that enables faster inference?

When I try to use

var interpreterOptions = InterpreterOptions()..useNnApiForAndroid = true;

I get the error

I/TypeManager(13204): Failed to read /vendor/etc/nnapi_extensions_app_allowlist ; No app allowlisted for vendor extensions use.

and inference takes around 2 seconds.

Using

    final gpuDelegateV2 = GpuDelegateV2(
        options: GpuDelegateOptionsV2(
          true,
          TfLiteGpuInferenceUsage.fastSingleAnswer,
          TfLiteGpuInferencePriority.minLatency,
          TfLiteGpuInferencePriority.minLatency,
          TfLiteGpuInferencePriority.minLatency,
        ));

does not improve the latency.

Is there anything I can do to improve the latency?
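
Not a definitive answer, but one cheap thing to try before delegates is plain CPU multi-threading, and timing the invoke call in isolation so image pre-processing isn't counted as inference. A minimal sketch (the thread count and asset name are assumptions; the input/output buffers are assumed to be prepared as in the snippet above):

    Future<void> timedRun() async {
      // Thread count (4) and asset name are assumptions for illustration.
      final options = InterpreterOptions()..threads = 4;
      final interpreter = await Interpreter.fromAsset(
        'ssd_mobilenet_v3_small_coco.tflite',
        options: options,
      );

      // Time only the inference call, not pre/post-processing.
      final stopwatch = Stopwatch()..start();
      interpreter.runForMultipleInputs(
          [input], {0: output1, 1: output2, 2: output3, 3: output4});
      stopwatch.stop();
      print('inference took ${stopwatch.elapsedMilliseconds} ms');
    }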

Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.

Hi,

I'm getting the following error when initializing interpreter:

I/tflite  ( 6266): Initialized TensorFlow Lite runtime.
E/tflite  ( 6266): Regular TensorFlow ops are not supported by this interpreter. Make sure you apply/link the Flex delegate before inference.
E/tflite  ( 6266): Node number 0 (FlexPlaceholder) failed to prepare.
E/flutter ( 6266): [ERROR:flutter/lib/ui/ui_dart_state.cc(177)] Unhandled Exception: Bad state: failed precondition
E/flutter ( 6266): #0      checkState (package:quiver/check.dart:73:5)
E/flutter ( 6266): #1      Interpreter.allocateTensors (package:tflite_flutter/src/interpreter.dart:150:5)
E/flutter ( 6266): #2      new Interpreter._ (package:tflite_flutter/src/interpreter.dart:31:5)
E/flutter ( 6266): #3      new Interpreter._create (package:tflite_flutter/src/interpreter.dart:42:24)
E/flutter ( 6266): #4      new Interpreter.fromBuffer (package:tflite_flutter/src/interpreter.dart:91:37)
E/flutter ( 6266): #5      Interpreter.fromAsset (package:tflite_flutter/src/interpreter.dart:114:24)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #6      _MyHomePageState.loadModel (package:text_gen_gpu/main.dart:321:20)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #7      _MyHomePageState.init (package:text_gen_gpu/main.dart:299:5)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): #8      _MyHomePageState.initState.<anonymous closure> (package:text_gen_gpu/main.dart)
E/flutter ( 6266): <asynchronous suspension>
E/flutter ( 6266): 

I'm initializing on CPU:

var interpreterOptions = InterpreterOptions()..threads = NUM_LITE_THREADS;
_interpreter = await Interpreter.fromAsset(
  modelFile,
  options: interpreterOptions,
);

Duplicate symbols preventing launch on iOS

Hey,
for days I've been trying to fix this issue, but to be honest I've run out of things to try. When I run flutter run, it gives me this output:

Launching lib/main.dart on iPad in debug mode...
 
Automatically signing iOS for device deployment using specified development team in Xcode project: XXXXXXX
Running pod install...                                              5.0s
Running Xcode build...                                                  
 └─Compiling, linking and signing...                         3.5s
Xcode build done.                                           15.3s
Failed to build iOS app
Error output from Xcode build:
↳
    ** BUILD FAILED **


Xcode's output:
↳
    7 warnings generated.
    In file included from /Users/username/Documents/Flutter/flutter/.pub-cache/hosted/pub.dartlang.org/firebase_crashlytics-0.2.1/ios/Classes/FLTFirebaseCrashlyticsPlugin.m:7:
    /Users/username/Desktop/projectgit/project_flutter/ios/Pods/Headers/Public/Firebase/Firebase.h:75:10: warning: "FirebaseAnalytics.framework is not included in your target. Please add `Firebase/Analytics` to your Podfile or add FirebaseAnalytics.framework to
    your project to ensure Firebase Messaging works as intended." [-W#warnings]
            #warning "FirebaseAnalytics.framework is not included in your target. Please add \
             ^
    1 warning generated.
    duplicate symbol '_AnnotateRWLockDestroy' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_ValgrindSlowdown' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateEnableRaceDetection' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateIgnoreWritesBegin' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateIgnoreReadsBegin' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateRWLockCreate' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateThreadName' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateBenignRace' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_RunningOnValgrind' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateIgnoreWritesEnd' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateIgnoreReadsEnd' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateMemoryIsUninitialized' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateMemoryIsInitialized' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateBenignRaceSized' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateRWLockReleased' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    duplicate symbol '_AnnotateRWLockAcquired' in:
        /Users/username/Desktop/projectgit/tflite_flutter_plugin/ios/TensorFlowLiteC.framework/TensorFlowLiteC(dynamic_annotations_c9bf866fe89c02b98f86dda9d34be0c4.o)
        /Users/username/Desktop/projectgit/project_flutter/build/ios/Debug-iphoneos/abseil.framework/abseil(dynamic_annotations.o)
    ld: 16 duplicate symbols for architecture arm64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    note: Using new build system
    note: Building targets in parallel
    note: Planning build
    note: Constructing build description
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'GoogleUtilities' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'nanopb' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'Reachability' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'Protobuf' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'PromisesObjC' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'GoogleDataTransport' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'webview_flutter' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'shared_preferences' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'rate_my_app' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'permission_handler' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'path_provider' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'device_info' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'keyboard_visibility' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'connectivity' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'GoogleDataTransportCCTSupport' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseCoreDiagnostics' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseCore' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseInstallations' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseInstanceID' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseCrashlytics' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseMessaging' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'firebase_core' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'firebase_messaging' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'firebase_crashlytics' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseCoreDiagnosticsInterop' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseFirestore' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'Flutter' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'FirebaseAnalyticsInterop' from project 'Pods')
    warning: The iOS deployment target 'IPHONEOS_DEPLOYMENT_TARGET' is set to 8.0, but the range of supported deployment target versions is 9.0 to 14.0.99. (in target 'Firebase' from project 'Pods')

Could not build the precompiled application for the device.

Error launching application on iPad 

My flutter doctor output:

[✓] Flutter (Channel stable, 1.22.0, on Mac OS X 10.15.6 19G71a, locale en-GB)
    • Flutter version 1.22.0 at /Users/username/Documents/Flutter/flutter
    • Framework revision d408d302e2 (6 days ago), 2020-09-29 11:49:17 -0700
    • Engine revision 5babba6c4d
    • Dart version 2.10.0

[✗] Android toolchain - develop for Android devices
    ✗ Unable to locate Android SDK.
      Install Android Studio from:
      https://developer.android.com/studio/index.html
      On first launch it will assist you in installing the Android SDK
      components.
      (or visit https://flutter.dev/docs/get-started/install/macos#android-setup
      for detailed instructions).
      If the Android SDK has been installed to a custom location, set
      ANDROID_SDK_ROOT to that location.
      You may also want to add it to your PATH environment variable.


[✓] Xcode - develop for iOS and macOS (Xcode 12.0.1)
    • Xcode at /Applications/Xcode.app/Contents/Developer
    • Xcode 12.0.1, Build version 12A7300
    • CocoaPods version 1.9.3

[!] Android Studio (not installed)
    • Android Studio not found; download from
      https://developer.android.com/studio/index.html
      (or visit https://flutter.dev/docs/get-started/install/macos#android-setup
      for detailed instructions).

[✓] Connected device (1 available)
    • iPad (mobile) • 6b98f33877e0cab838d2ffafbc6aec62c7e4b98b • ios •
      iOS 14.0

! Doctor found issues in 2 categories.

I've tried literally every cleanup step (flutter clean, pod deintegrate + pod setup, cleaning the project in Xcode, ...) and nothing helped, so corrupt build files don't seem to be the cause. I also tried the solutions proposed in #18, but no luck there either.

I'd be very grateful for any ideas on how to resolve this.

Failed to load tensorflowlib when firebase_admob is used

I really appreciate your nice library; it makes using TF much easier.

I am developing a mobile app for Android and I am using tflite_flutter for some image processing tasks. Everything was fine until I added firebase_admob to my project.
It seems that initializing firebase_admob prevents tflite_flutter from loading the TensorFlow Lite library!

Here is the runtime error I get if I initialize firebase_admob in advance, like this:
FirebaseAdMob.instance.initialize(appId: "APP ID");

ArgumentError (Invalid argument(s): Failed to load dynamic library (dlopen failed: library "libtensorflowlite_c.so" not found))

By disabling firebase_admob, tflite_flutter works fine. I am not sure whether the problem is with firebase_admob or tflite_flutter, but I came across a similar issue for libsqlite:
simolus3/drift#420
According to the link, it seems that ffi is the root of the problem.

I could reproduce the error with two versions of the tflite_flutter lib: tflite_flutter: ^0.1.3 and tflite_flutter: ^0.5.0.

Here is also the flutter doctor output:
[✓] Flutter (Channel stable, v1.17.3, on Mac OS X 10.14.5 18F203, locale en-DE)
• Flutter version 1.17.3 at /Users/q416892/Documents/tt/flutter
• Framework revision b041144f83 (7 weeks ago), 2020-06-04 09:26:11 -0700
• Engine revision ee76268252
• Dart version 2.8.4

[✓] Android toolchain - develop for Android devices (Android SDK version 29.0.3)
• Android SDK at /Users/q416892/Library/Android/sdk
• Platform android-29, build-tools 29.0.3
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b3-6222593)
• All Android licenses accepted.

[✗] Xcode - develop for iOS and macOS
✗ Xcode installation is incomplete; a full installation is necessary for iOS development.
Download at: https://developer.apple.com/xcode/download/
Or install Xcode via the App Store.
Once installed, run:
sudo xcode-select --switch /Applications/Xcode.app/Contents/Developer
sudo xcodebuild -runFirstLaunch
✗ CocoaPods not installed.
CocoaPods is used to retrieve the iOS and macOS platform side's plugin code that responds to your plugin usage on the Dart side.
Without CocoaPods, plugins will not work on iOS or macOS.
For more info, see https://flutter.dev/platform-plugins
To install:
sudo gem install cocoapods

[✓] Android Studio (version 4.0)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin version 46.0.2
• Dart plugin version 193.7361
• Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b3-6222593)

[✓] Connected device (1 available)
• Android SDK built for x86 • emulator-5554 • android-x86 • Android 5.0.2 (API 21) (emulator)

! Doctor found issues in 1 category.

Thanks in advance.

ios: Failed to lookup symbol (dlsym(RTLD_DEFAULT, TfLiteGpuDelegateOptionsV2Default): symbol not found)

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 1.22.4, on macOS 11.0.1 20B50 darwin-x64, locale zh-Hans-CN)

[!] Android toolchain - develop for Android devices (Android SDK version 30.0.2)
    ! Some Android licenses not accepted.  To resolve this, run: flutter doctor --android-licenses
[✓] Xcode - develop for iOS and macOS (Xcode 12.3)
[!] Android Studio (version 4.0)
    ✗ Flutter plugin not installed; this adds Flutter specific functionality.
    ✗ Dart plugin not installed; this adds Dart specific functionality.
[✓] VS Code (version 1.52.1)
[✓] Connected device (2 available)

! Doctor found issues in 2 categories.
    final gpuDelegate = GpuDelegateV2();
    var interpreterOptions = InterpreterOptions()..addDelegate(gpuDelegate);
    interpreter = await Interpreter.fromAsset(
      'ml/yolov4-tiny-1280-final.tflite',
      options: interpreterOptions,
    );

HELP PLEASE!
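
A hedged note: GpuDelegateV2 binds to the Android GPU delegate symbols, which are not part of the iOS TensorFlowLiteC binary; on iOS the plugin exposes a Metal-based GpuDelegate instead. A minimal sketch of selecting the delegate per platform (default delegate options are assumed here):

    import 'dart:io' show Platform;
    import 'package:tflite_flutter/tflite_flutter.dart';

    Future<Interpreter> loadWithGpu() async {
      final options = InterpreterOptions();
      if (Platform.isAndroid) {
        options.addDelegate(GpuDelegateV2()); // Android GPU delegate
      } else if (Platform.isIOS) {
        options.addDelegate(GpuDelegate()); // Metal-based delegate on iOS
      }
      return Interpreter.fromAsset(
        'ml/yolov4-tiny-1280-final.tflite',
        options: options,
      );
    }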

Add support for Web builds

It would be really nice if this plugin could somehow be used for exporting to web.
Maybe tfjs could be used for that.
Are there any plans for supporting web builds?

Slow input tensor preparation

Hello, first of all - thank you for that plugin!

It works perfectly and inference runs near real time for image processing, but I found that preparing the input tensors takes a huge amount of time: for example, for shape [1, 257, 257, 3] it is about 350 ms in release mode (inference itself runs in 45-50 ms). Reshaping the input list to that shape alone takes about 60 ms.

This is the bottleneck in the runForMultipleInputs method:

var inputTensors = getInputTensors();
    for (var i = 0; i < inputs.length; i++) {
      if (inputTensors[i].shape != (inputs[i] as List).shape) {
        resizeInputTensor(i, (inputs[i] as List).shape);
        allocateTensors();
        inputTensors = getInputTensors();
      }
      inputTensors[i].setTo(inputs[i]);
    }
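One way to sidestep the nested-List traversal that setTo performs on every call is to keep the input as a flat Float32List and hand its ByteBuffer to the interpreter, so only a byte copy happens per frame (the same pattern the helper library uses with TensorImage.buffer). A minimal sketch, assuming a float32 [1, 257, 257, 3] input; the output length here is hypothetical and should be taken from the model's output tensor shape:

import 'dart:typed_data';

import 'package:tflite_flutter/tflite_flutter.dart';

void runFrame(Interpreter interpreter, Float32List pixels) {
  // `pixels` is already laid out as 1 * 257 * 257 * 3 floats, so no
  // nested-List reshape is needed.
  // Hypothetical output size; read interpreter.getOutputTensor(0).shape
  // for the real one.
  final output = Float32List(1 * 257 * 257 * 21);
  // Passing ByteBuffers lets the plugin copy raw bytes directly.
  interpreter.run(pixels.buffer, output.buffer);
}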

Output is null

Hello,
I am working with a single input and multiple outputs. I wrote some code to do resizing and normalization (I didn't find them in this repo), but when I run the model in Flutter, the outputs keep being null. Here is the code:
import 'dart:io';
import 'dart:typed_data';

import 'package:flutter/services.dart';
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:image/image.dart' as img;

class MyClassifier {
  final _modelFile = 'page2.tflite';

  final int inputSize = 224;

  Interpreter _interpreter;

  MyClassifier() {
    _loadModel();
  }

  img.Image resizeImage(ByteBuffer imageBytes, int inputSize) {
    //var imageBytes = (await rootBundle.load(image.path)).buffer;
    img.Image oriImage = img.decodeJpg(imageBytes.asUint8List());
    img.Image resizedImage =
        img.copyResize(oriImage, height: inputSize, width: inputSize);
    return resizedImage;
  }

  Float32List imageToByteListFloat32(
      img.Image image, int inputSize, double mean, double std) {
    var convertedBytes = Float32List(1 * inputSize * inputSize * 3);
    var buffer = Float32List.view(convertedBytes.buffer);
    int pixelIndex = 0;
    for (var i = 0; i < inputSize; i++) {
      for (var j = 0; j < inputSize; j++) {
        var pixel = image.getPixel(j, i);
        buffer[pixelIndex++] = (img.getRed(pixel) - mean) / std;
        buffer[pixelIndex++] = (img.getGreen(pixel) - mean) / std;
        buffer[pixelIndex++] = (img.getBlue(pixel) - mean) / std;
      }
    }
    print("===============");
    print(buffer);
    return convertedBytes.buffer.asFloat32List();
  }

  Uint8List imageToByteListUint8(img.Image image, int inputSize) {
    var convertedBytes = Uint8List(1 * inputSize * inputSize * 3);
    var buffer = Uint8List.view(convertedBytes.buffer);
    int pixelIndex = 0;
    for (var i = 0; i < inputSize; i++) {
      for (var j = 0; j < inputSize; j++) {
        var pixel = image.getPixel(j, i);
        buffer[pixelIndex++] = img.getRed(pixel);
        buffer[pixelIndex++] = img.getGreen(pixel);
        buffer[pixelIndex++] = img.getBlue(pixel);
      }
    }
    return convertedBytes.buffer.asUint8List();
  }

  void _loadModel() async {
    _interpreter = await Interpreter.fromAsset(_modelFile);
    print('Interpreter loaded successfully');
  }

  classify(File image) async {
    //var image = await ImagePicker.pickImage(source: ImageSource.gallery);
    //Uint8List bytes = file.readAsBytesSync();
    var imageBytes = image.readAsBytesSync().buffer;
    var resizedImage = resizeImage(imageBytes, inputSize);
    print("&&&&&&&&&&&&&&&&&&&&&&&");
    var input = imageToByteListFloat32(resizedImage, inputSize, 125.5, 255)
        .reshape([1, 3, 224, 224]);
    print(input);
    var output0 = List(1 * 48 * 14 * 14).reshape([1, 48, 14, 14]);
    var output1 = List(1 * 24 * 14 * 14).reshape([1, 24, 14, 14]);
    var output2 = List(1 * 1008 * 14 * 14).reshape([1, 1008, 14, 14]);
    var output3 = List(1 * 336 * 14 * 14).reshape([1, 336, 14, 14]);
    // output: Map<int, Object>
    var outputs = {0: output0, 1: output1, 2: output2, 3: output3};

    _interpreter.run(input, outputs);

    print(outputs[0]);
    print(outputs[1]);
    print(outputs[2]);
    print(outputs[3]);

    return outputs;
  }
}
and the outputs are null.
Any help will be appreciated!
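One thing worth checking: run() in this plugin takes a single output buffer (it is forwarded as output index 0), so passing a Map of four outputs to run() will not fill them; runForMultipleInputs is the call that accepts a Map<int, Object>. A minimal sketch that also sizes each buffer from the interpreter's own output tensors instead of hard-coding shapes (assumes float32 outputs):

Map<int, Object> runAllOutputs(Interpreter interpreter, Object input) {
  final outputs = <int, Object>{};
  final outputTensors = interpreter.getOutputTensors();
  for (var i = 0; i < outputTensors.length; i++) {
    final shape = outputTensors[i].shape;
    final size = shape.reduce((a, b) => a * b);
    // Assumes float32 outputs; adjust the fill value/type to the tensor type.
    outputs[i] = List.filled(size, 0.0).reshape(shape);
  }
  interpreter.runForMultipleInputs([input], outputs);
  return outputs;
}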

install.bat and install.sh have an inaccessible download URL in them

Hi, thanks for your awesome plugin. I wanted to let you know that I was able to find the binaries from an older version of the project; however, you should be aware that the download URL listed in the install.bat and install.sh files is not currently publicly accessible (i.e. "set URL=https://github.com/am15h/tflite_flutter_plugin/releases/download/" displays a GitHub 404 screen). As mentioned, the workaround is easy enough, but I figured that you might want to know about this issue and adjust accordingly. Thanks.

NoSuchMethodError: The method 'run' was called on null even though install.bat is run

Hi,

I'm teaching myself how to use TFLite in Dart using your awesome library, but I am having trouble making a basic example work.

I am trying to use the DeepLabv3 model to segment an image. I am following your provided example as a guide. Below is my code for a simple process, in order to see what the output will be like.

class SegmentImage{
  Interpreter interpreter;
  InterpreterOptions _interpreterOptions;
  List<int> _inputShape;
  List<int> _outputShape;
  String modelName = 'deeplabv3_257_mv_gpu.tflite';

  TensorImage _inputImage;
  TensorBuffer _outputBuffer;

  // from example code, apparently this is how I call the NormalizeOp function
  NormalizeOp get preProcessNormalizeOp => NormalizeOp(127.5, 127.5);

  TfLiteType _outputType = TfLiteType.uint8;

  Future<void> loadModel() async {
    try {
      final interpreter = await Interpreter.fromAsset(modelName);
      print('Interpreter Created Successfully');

      _inputShape = interpreter.getInputTensor(0).shape;    // [1, 257, 257, 3]
      print('input_shape:');
      print(_inputShape);
      print(interpreter.getInputTensor(0).type);    //TfLiteType.float32
      _outputShape = interpreter.getOutputTensor(0).shape;
      print('output_shape:');
      print(_outputShape);
      _outputType = interpreter.getOutputTensor(0).type;
      print(_outputType);

      _outputBuffer = TensorBuffer.createFixedSize(_outputShape, _outputType);

    } catch (e) {
      print('Unable to create interpreter, Caught Exception: ${e.toString()}');
    }
  }

  Future<void> predict(File image) async{
    //read the image as bytes for TensorImage
    img.Image imageInput = img.decodeImage(image.readAsBytesSync());
    //this will be the tensor that will be used for prediction
    _inputImage = TensorImage.fromImage(imageInput);
    _inputImage = _preProcess();
    interpreter.run(_inputImage.buffer, _outputBuffer.getBuffer());
    print('output buffer shape and type');
    print(_outputBuffer.getShape());
    print(_outputBuffer.getDataType());
  }

  TensorImage _preProcess() {
    int cropSize = min(_inputImage.height, _inputImage.width);
    return ImageProcessorBuilder()
        .add(ResizeWithCropOrPadOp(cropSize, cropSize))
        .add(ResizeOp(
        _inputShape[1], _inputShape[2], ResizeMethod.NEAREST_NEIGHBOUR))
        .add(preProcessNormalizeOp)
        .build()
        .process(_inputImage);
  }

}

and this is my getImage() function that calls all the above:

  Future getImage() async {
    final pickedFile = await picker.getImage(source: ImageSource.gallery);
    _image = File(pickedFile.path);
    await segmentImage.loadModel();
    await segmentImage.predict(_image);
  }

But I am getting this error on run:

E/flutter (15207): [ERROR:flutter/lib/ui/ui_dart_state.cc(166)] Unhandled Exception: NoSuchMethodError: The method 'run' was called on null.
E/flutter (15207): Receiver: null
E/flutter (15207): Tried calling: run(Instance of '_ByteBuffer', Instance of '_ByteBuffer')
E/flutter (15207): #0      Object.noSuchMethod (dart:core-patch/object_patch.dart:51:5)
E/flutter (15207): #1      SegmentImage.predict (package:easy_bg_changer/main.dart:102:17)
E/flutter (15207): #2      _HomePageState.getImage (package:easy_bg_changer/main.dart:37:24)
E/flutter (15207): <asynchronous suspension>
E/flutter (15207): #3      _InkResponseState._handleTap (package:flutter/src/material/ink_well.dart:992:19)
E/flutter (15207): #4      _InkResponseState.build.<anonymous closure> (package:flutter/src/material/ink_well.dart:1098:38)
E/flutter (15207): #5      GestureRecognizer.invokeCallback (package:flutter/src/gestures/recognizer.dart:184:24)
E/flutter (15207): #6      TapGestureRecognizer.handleTapUp (package:flutter/src/gestures/tap.dart:524:11)
E/flutter (15207): #7      BaseTapGestureRecognizer._checkUp (package:flutter/src/gestures/tap.dart:284:5)
E/flutter (15207): #8      BaseTapGestureRecognizer.acceptGesture (package:flutter/src/gestures/tap.dart:256:7)
E/flutter (15207): #9      GestureArenaManager.sweep (package:flutter/src/gestures/arena.dart:158:27)
E/flutter (15207): #10     GestureBinding.handleEvent (package:flutter/src/gestures/binding.dart:224:20)
E/flutter (15207): #11     GestureBinding.dispatchEvent (package:flutter/src/gestures/binding.dart:200:22)
E/flutter (15207): #12     GestureBinding._handlePointerEvent (package:flutter/src/gestures/binding.dart:158:7)
E/flutter (15207): #13     GestureBinding._flushPointerEventQueue (package:flutter/src/gestures/binding.dart:104:7)
E/flutter (15207): #14     GestureBinding._handlePointerDataPacket (package:flutter/src/gestures/binding.dart:88:7)
E/flutter (15207): #15     _rootRunUnary (dart:async/zone.dart:1206:13)
E/flutter (15207): #16     _CustomZone.runUnary (dart:async/zone.dart:1100:19)
E/flutter (15207): #17     _CustomZone.runUnaryGuarded (dart:async/zone.dart:1005:7)
E/flutter (15207): #18     _invoke1 (dart:ui/hooks.dart:267:10)
E/flutter (15207): #19     _dispatchPointerDataPacket (dart:ui/hooks.dart:176:5)
E/flutter (15207): 

I have seen this closed issue: #15, and I have run the install.bat script. If I run it again it asks me if I want to overwrite said files, so the binaries have been installed.

What am I doing wrong?

P.S. An additional thought after reading this issue on your other library (am15h/tflite_flutter_helper#15): are image outputs not yet supported?

Thanks
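Judging from the listing above, the likely cause is that loadModel assigns the interpreter to a local variable (final interpreter = ...) instead of the class field, so the field is still null when predict() calls interpreter.run. A minimal sketch of a corrected loadModel, reusing the fields from the listing:

  Future<void> loadModel() async {
    try {
      // Assign to the class field; `final interpreter = ...` would shadow it
      // and leave the field null when predict() runs.
      interpreter = await Interpreter.fromAsset(modelName);
      print('Interpreter Created Successfully');

      _inputShape = interpreter.getInputTensor(0).shape;
      _outputShape = interpreter.getOutputTensor(0).shape;
      _outputType = interpreter.getOutputTensor(0).type;

      _outputBuffer = TensorBuffer.createFixedSize(_outputShape, _outputType);
    } catch (e) {
      print('Unable to create interpreter, Caught Exception: ${e.toString()}');
    }
  }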

[Question] Why not make bindings for common.h?

Hi!

I'm curious why bindings were never made for common.h. For example, instead of using the TfLiteTensor defined in common.h, the library defines its own Tensor class that calls out to methods in the c_api. Does the TFLite library only intend for people to use the c_api?

Unable to create interpreter

I/tflite  (10801): Created TensorFlow Lite delegate for GPU.
I/tflite  (10801): Initialized TensorFlow Lite runtime.
E/tflite  (10801): Following operations are not supported by GPU delegate:
E/tflite  (10801): EXP: Operation is not supported.
E/tflite  (10801): SPLIT: Operation is not supported.
E/tflite  (10801): SPLIT_V: Operation is not supported.
E/tflite  (10801): 80 operations will run on the GPU, and the remaining 64 operations will run on the CPU.
E/tflite  (10801): Can not open OpenCL library on this device - dlopen failed: library "libOpenCL.so" not found
E/tflite  (10801): Falling back to OpenGL
D/        (10801): HostConnection::get() New Host Connection established 0x7fb7d140, tid 10858
D/EGL_emulation(10801): eglCreateContext: 0x85aa7220: maj 3 min 0 rcv 3
E/EGL_emulation(10801): rcMakeCurrent returned EGL_FALSE
E/EGL_emulation(10801): tid 10858: eglMakeCurrent(1590): error 0x3006 (EGL_BAD_CONTEXT)
E/libEGL  (10801): eglMakeCurrent:1062 error 3006 (EGL_BAD_CONTEXT)
E/tflite  (10801): TfLiteGpuDelegate Init: No EGL error, but eglMakeCurrent failed.
I/tflite  (10801): Created 0 GPU delegate kernels.
E/tflite  (10801): TfLiteGpuDelegate Prepare: delegate is not initialized
E/tflite  (10801): Node number 144 (TfLiteGpuDelegateV2) failed to prepare.
E/tflite  (10801): Restored original execution plan after delegate application failure.

Any solution?
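For context, the log shows the GPU delegate failing to initialize on an emulator (no OpenCL, EGL_BAD_CONTEXT) and TFLite restoring the original execution plan. If interpreter creation throws in this situation, one workable pattern is to try the GPU delegate first and fall back to a plain CPU interpreter; a minimal sketch with a hypothetical model name:

Future<Interpreter> createInterpreter() async {
  try {
    // Attempt GPU acceleration first.
    final options = InterpreterOptions()..addDelegate(GpuDelegateV2());
    return await Interpreter.fromAsset('model.tflite', options: options);
  } catch (e) {
    // Emulators and some devices lack OpenCL/GL support; fall back to CPU.
    print('GPU delegate unavailable, falling back to CPU: $e');
    return await Interpreter.fromAsset('model.tflite');
  }
}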

Can't add my own teachablemachine.withgoogle.com models

When I try to add my own model I get this error. What might it be?

[ERROR:flutter/runtime/dart_isolate.cc(993)] Unhandled exception:
E/flutter ( 6540): RangeError (index): Invalid value: Only valid value is 0: 1
E/flutter ( 6540): #0  List.[] (dart:core-patch/growable_array.dart:183:60)
E/flutter ( 6540): #1  Classifier.predict (package:object_detection/tflite/classifier.dart:118:65)
E/flutter ( 6540): #2  IsolateUtils.entryPoint (package:object_detection/utils/isolate_utils.dart:45:51)

How to perform inference in isolates?

Hi,

In the key features it's mentioned:

- Run inference in different isolates to prevent jank in UI thread.

But I am having trouble achieving this, and I can't find any example of it in the documentation or examples. Can you please provide a very simple example of how this would work?

I have a very simple app that runs inference on a list of ints. Since it's a very small app, I'm doing everything in main.dart.

The reason I am having trouble with isolates is that I initialize the Interpreter _interpreter in my main and then I can't pass this _interpreter to the isolate, as the isolate function has to be completely separate. I tried declaring the Interpreter globally above main, but that also didn't work.

I don't want to initialize the _interpreter in the isolate, as that would load the model for every inference.

How would I run inference in an isolate, please?

Thanks
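For reference, the Interpreter object itself cannot be sent to an isolate, but its native address can: the plugin exposes interpreter.address and Interpreter.fromAddress, which re-wrap the same native interpreter without reloading the model. A minimal sketch; the message class, input layout, and [1, 10] output shape are hypothetical:

import 'dart:isolate';

import 'package:tflite_flutter/tflite_flutter.dart';

class InferenceRequest {
  final int interpreterAddress; // address of the interpreter created in main
  final Object input;           // e.g. a nested List of doubles
  final SendPort responsePort;
  InferenceRequest(this.interpreterAddress, this.input, this.responsePort);
}

// Entry point running in the background isolate (must be top-level or static).
void inferenceEntryPoint(InferenceRequest request) {
  // Re-wraps the existing native interpreter; the model is NOT reloaded.
  final interpreter = Interpreter.fromAddress(request.interpreterAddress);
  final output = List.filled(10, 0.0).reshape([1, 10]); // hypothetical shape
  interpreter.run(request.input, output);
  request.responsePort.send(output);
}

Future<Object> runInIsolate(Interpreter interpreter, Object input) async {
  final responsePort = ReceivePort();
  await Isolate.spawn(
    inferenceEntryPoint,
    InferenceRequest(interpreter.address, input, responsePort.sendPort),
  );
  return await responsePort.first;
}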

Tflite regression model gives different output while input is the same

I have a TFLite regression model. I tested its output in Python and it gives the same result as the .h5 model. However, the tflite_flutter interpreter gives me wrong results. I have randomly checked 10 different pixels and the input values are exactly the same. So the input is the same and the models are the same, but the tflite_flutter interpreter gives a different result than the TensorFlow TFLite interpreter.

Here is my code:
var interpreterOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
final interpreter =
    await Interpreter.fromAsset('modelPath', options: interpreterOptions);
var imageBytes = (await rootBundle.load('imagePath')).buffer;
imageLib.Image oriImage = imageLib.decodePng(imageBytes.asUint8List());
imageLib.Image copyImage = imageLib.copyCrop(oriImage, 60, 0, 60, 30);
var resizedImage = copyImage.getBytes(format: imageLib.Format.rgb);
var input = [];
for (int i = 0; i < resizedImage.length; i++) {
  input.add(resizedImage[i].toDouble() / 255);
}
input = input.reshape([1, 30, 60, 3]);
var output = List(1).reshape([1, 1]);
interpreter.run(input, output);

Have you tested the package for DL regression models? I'm using a Deep CNN regressor.
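One thing worth ruling out before suspecting the plugin: the snippet applies NnApiDelegate, and NNAPI can run the model with different kernels and precision than the reference CPU path. A minimal sketch comparing the same input with and without the delegate (the 'modelPath' placeholder is taken from the snippet above):

Future<void> compareDelegates(Object input) async {
  // Plain CPU interpreter (reference path).
  final cpu = await Interpreter.fromAsset('modelPath');
  // Same model with the NNAPI delegate applied.
  final nnapiOptions = InterpreterOptions()..addDelegate(NnApiDelegate());
  final nnapi = await Interpreter.fromAsset('modelPath', options: nnapiOptions);

  final cpuOut = List.filled(1, 0.0).reshape([1, 1]);
  final nnapiOut = List.filled(1, 0.0).reshape([1, 1]);
  cpu.run(input, cpuOut);
  nnapi.run(input, nnapiOut);
  print('cpu: $cpuOut  nnapi: $nnapiOut');
}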

null even though install.bat is run

I'm really confused about what I'm doing wrong; my model is returning 0.0% and a null prediction.

import 'dart:io';
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';

class Classifier {
  Classifier();

  classifyImage(var image) async {
    var inputImage = File(image.path);

    ImageProcessor imageProcessor = ImageProcessorBuilder()
        .add(ResizeOp(300, 300, ResizeMethod.BILINEAR))
        .add(NormalizeOp(0, 255))
        .build();

    TensorImage tensorImage = TensorImage.fromFile(inputImage);
    tensorImage = imageProcessor.process(tensorImage);

    TensorBuffer probabilityBuffer =
        TensorBuffer.createFixedSize(<int>[1, 7], TfLiteType.float32);

    try {
      Interpreter interpreter = await Interpreter.fromAsset("model.tflite");
      interpreter.run(tensorImage.buffer, probabilityBuffer.buffer);
    } catch (e) {
      print("Error loading or running model");
    }

    List<String> labels = await FileUtil.loadLabels("assets/labels.txt");
    TensorProcessor probabilityProcessor = TensorProcessorBuilder().build();
    TensorLabel tensorLabel = TensorLabel.fromList(
        labels, probabilityProcessor.process(probabilityBuffer));

    Map labeledProb = tensorLabel.getMapWithFloatValue();
    double highestProb = 0;
    String check;

    labeledProb.forEach((Done, probability) {
      if (probability * 100 > highestProb) {
        highestProb = probability * 100;
        check = Done;
      }
    });

    var outputProb = highestProb.toStringAsFixed(1);
    return [check, outputProb];
  }
}

Making sense of outputs from runForMultipleInputs

First, thank you for the excellent library!

I'm trying to get a custom object detection model working in Flutter, and I feel like I'm right on the finish line, but the outputs aren't really making sense. I'm not sure if I'm messing up my usage of your library or if I made a mistake earlier (in training or something), so let me know if I should post this to SO or file an issue somewhere else. I am able to use my custom model under full TensorFlow using the Python bindings, so I think my model is good...

Summary:

My "detect" method:

  detect(Image image) async {
    TensorImage timg = TensorImage.fromImage(image);
    timg = _preProcess(timg);

    _interpreter.runForMultipleInputs([timg.buffer], _outputBuffers);

    for (int i = 0; i < _outputTensorBuffers.length; i++) {
      TensorBuffer buffer = _outputTensorBuffers[i];
      print("${_outputTensorNames[i]}: ${buffer.getDoubleList()}");
    }
  }

(full source listing in Listing A below.)

results in the following output:

I/flutter (12516): StatefulPartitionedCall:5: [100.0]
I/flutter (12516): StatefulPartitionedCall:2: [5.0]
I/flutter (12516): StatefulPartitionedCall:7: [0.003288567066192627, 0.00007053170702420175, 0.00040328502655029297, 0.00022339820861816406, 0.0009590387344360352, 0.00682443380355835, 0.006241738796234131, 0.004405707120895386, 0.006697863340377808, 0.0030375123023986816, 0.0009462237358093262, 0.0014250576496124268, 0.0004329383373260498, 0.00403711199760437, 0.001086801290512085, 0.003039032220840454, 0.0006709098815917969, 0.003288090229034424, 0.0037768781185150146, 0.00008151835208991542, 0.0005600154399871826, 0.0005548596382141113, 0.0011392831802368164, 0.008495986461639404, 0.005149126052856445, 0.004283487796783447, 0.007070749998092651, 0.0038556158542633057, 0.00033274292945861816, 0.0002855658531188965, 0.000373154878616333, 0.0014458894729614258, 0.0006978213787078857, 0.0024288594722747803, 0.0006158053874969482, 0.0030787885189056396, 0.0032474100589752197, 0.0006365478038787842, 0.00037983059883117676, 0.0007419586181640625, 0.0025945305824279785, 0.0018936693668365479, 0.0027012526988983154, 0.00430268049240
I/flutter (12516): StatefulPartitionedCall:4: [0.03188779950141907]
I/flutter (12516): StatefulPartitionedCall:6: [-0.009938137605786324, -0.011441117152571678, 0.05219971388578415, 0.04861987382173538, -0.016602735966444016, -0.010687697678804398, 0.0830603837966919, 0.07274940609931946, -0.007124748080968857, -0.009804163128137589, 0.03815270960330963, 0.04826470836997032, -0.011091910302639008, -0.008176783099770546, 0.06167861819267273, 0.06926850229501724, -0.005919046700000763, -0.013556137681007385, 0.05681357532739639, 0.03645564243197441, -0.02163616754114628, -0.011458851397037506, 0.0747147724032402, 0.06488804519176483, -0.0017826557159423828, -0.02006278745830059, 0.08506569266319275, 0.07615071535110474, -0.016631372272968292, -0.019205741584300995, 0.13718745112419128, 0.0941062793135643, -0.012604944407939911, -0.009613707661628723, 0.06616833060979843, 0.0632333755493164, -0.01230534166097641, -0.006209280341863632, 0.10614445805549622, 0.07859446108341217, 0.016275424510240555, -0.02164752036333084, 0.07109864056110382, 0.07278952747583389, 0.003784455358982086, -0.0303153768
I/flutter (12516): StatefulPartitionedCall:3: [0.002476513385772705]
I/flutter (12516): StatefulPartitionedCall:1: [0.07430143654346466]
I/flutter (12516): StatefulPartitionedCall:0: [40274.0]

I've been unable to make sense of those outputs...

Any idea what I'm doing wrong here?

Figure A: Input and Output Shapes

(screenshot of the model's input and output tensor shapes)

Listing A: object_detect.dart

import 'dart:math';
import 'dart:typed_data';

import 'package:flutter/services.dart' show rootBundle;
import 'package:tflite_flutter/tflite_flutter.dart';
import 'package:tflite_flutter_helper/tflite_flutter_helper.dart';
import 'package:image/image.dart';

class ObjectDetector {
  // should live in assets/. Don't include assets/ in the name.
  final String modelName;
  ObjectDetector(this.modelName);

  Interpreter _interpreter;
  List<int> _inputShape;
  Map<int, ByteBuffer> _outputBuffers = new Map<int, ByteBuffer>();
  Map<int, TensorBuffer> _outputTensorBuffers = new Map<int, TensorBuffer>();
  Map<int, String> _outputTensorNames = new Map<int, String>();

  initialize() async {
    _interpreter = await Interpreter.fromAsset(this.modelName);
    _inputShape = _interpreter.getInputTensor(0).shape;

    var outputTensors = _interpreter.getOutputTensors();

    outputTensors.asMap().forEach((i, tensor) {
      TensorBuffer output =
          TensorBuffer.createFixedSize(tensor.shape, tensor.type);
      _outputTensorBuffers[i] = output;
      _outputBuffers[i] = output.buffer;
      _outputTensorNames[i] = tensor.name;
    });
  }

  TensorImage _preProcess(TensorImage timg) {
    int cropSize = min(timg.height, timg.width);
    ImageProcessor processor = ImageProcessorBuilder()
        .add(ResizeWithCropOrPadOp(cropSize, cropSize))
        .add(ResizeOp(
        _inputShape[1], _inputShape[2], ResizeMethod.NEAREST_NEIGHBOUR))
        .add(NormalizeOp(0, 1))
        .build();
    return processor.process(timg);
  }

  detect(Image image) async {
    TensorImage timg = TensorImage.fromImage(image);
    timg = _preProcess(timg);

    _interpreter.runForMultipleInputs([timg.buffer], _outputBuffers);

    for (int i = 0; i < _outputTensorBuffers.length; i++) {
      TensorBuffer buffer = _outputTensorBuffers[i];
      print("${_outputTensorNames[i]}: ${buffer.getDoubleList()}");
    }
  }
}
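A small diagnostic that may help here: print each output tensor's shape next to its name, so the StatefulPartitionedCall:N outputs can be matched to the expected detection outputs (boxes, classes, scores, count) by shape rather than by index. A minimal sketch using the same API as the listing, e.g. called at the end of initialize():

  void describeOutputs() {
    final outputTensors = _interpreter.getOutputTensors();
    for (var i = 0; i < outputTensors.length; i++) {
      final t = outputTensors[i];
      // Shapes usually disambiguate the outputs, e.g. [1, N, 4] for boxes.
      print('output $i  name=${t.name}  shape=${t.shape}  type=${t.type}');
    }
  }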
