Flutter plugin for OnnxRuntime with FFI

Provides an easy, flexible, and fast Dart API to integrate ONNX models in Flutter apps across mobile and desktop platforms.
- Multi-platform support for Android and iOS.
- Flexibility to use any ONNX model.
- Acceleration using multi-threading.
- Similar structure to the OnnxRuntime Java and C# APIs.
- Inference speed comparable to native Android/iOS apps built with the Java/Objective-C API.
- Runs inference in a separate isolate to prevent jank on the UI thread.
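To illustrate the last point, here is a minimal sketch contrasting blocking and non-blocking inference. It assumes `session` and `inputs` have already been created as in the usage example below; the background-isolate behavior of `runAsync` is the package's stated design, not verified here.

```dart
// Sketch: blocking vs. non-blocking inference.
// `session` and `inputs` are assumed to exist (see the usage example below).
final runOptions = OrtRunOptions();

// run() blocks the calling isolate until inference finishes,
// which can cause dropped frames if called on the UI isolate.
final syncOutputs = session.run(runOptions, inputs);

// runAsync() performs inference off the UI isolate and
// completes a Future, so the UI stays responsive.
final asyncOutputs = await session.runAsync(runOptions, inputs);
```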
In your Flutter project, add the dependency to `pubspec.yaml`:

```yaml
dependencies:
  ...
  onnxruntime: x.y.z
```
```dart
import 'dart:typed_data';

import 'package:flutter/services.dart' show rootBundle;
import 'package:onnxruntime/onnxruntime.dart';

// Initialize the OnnxRuntime environment once per app.
OrtEnv.instance.init();

// Create a session from a model bundled as a Flutter asset.
final sessionOptions = OrtSessionOptions();
const assetFileName = 'assets/models/test.onnx';
final rawAssetFile = await rootBundle.load(assetFileName);
final bytes = rawAssetFile.buffer.asUint8List();
final session = OrtSession.fromBuffer(bytes, sessionOptions);

// Build the input tensor; the data length must match the shape (1 * 2 * 3 = 6).
final shape = [1, 2, 3];
final data = Float32List.fromList([1, 2, 3, 4, 5, 6]); // example values
final inputOrt = OrtValueTensor.createTensorWithDataList(data, shape);
final inputs = {'input': inputOrt};

// Run inference asynchronously so the UI thread is not blocked.
final runOptions = OrtRunOptions();
final outputs = await session.runAsync(runOptions, inputs);

// Release native resources when they are no longer needed.
inputOrt.release();
runOptions.release();
outputs?.forEach((element) {
  element?.release();
});
session.release();
OrtEnv.instance.release();
```
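Before releasing the output tensors, their contents can be read back. The sketch below assumes the package exposes the tensor data through a `value` getter on `OrtValue` (returned as nested Dart lists for a multi-dimensional tensor); check the package API docs for the exact accessor.

```dart
// Hedged sketch: read the first output tensor's data *before* calling
// release() on it. `outputs` comes from session.runAsync(...) above.
// `value` is assumed to return the tensor contents as (nested) lists.
final firstOutput = outputs?.first;
final tensorData = firstOutput?.value;
print(tensorData);
```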