I'm trying to use the apple_vision_flutter library with the camera plugin's (camera: ^0.10.5+9) controller image stream, like this:
await controller.startImageStream((image) async {
  // image.planes[0].bytes is already a Uint8List of raw pixel data
  final Uint8List imageBytes = image.planes[0].bytes;
  final recognizeResult = await AppleVisionFlutter().recognizeText(imageBytes);
  print('Recognize result: ${recognizeResult.observations.map((o) => o.textOptions.join('\n')).join('\n')}');
});
But the recognized text is always empty.
When I just take a picture with the camera, everything works fine:
await controller.takePicture().then((file) async {
  final Uint8List imageData = await file.readAsBytes();
  final recognizeResult = await AppleVisionFlutter().recognizeText(imageData);
  print('Recognize result: ${recognizeResult.observations.map((o) => o.textOptions.join('\n')).join('\n')}');
});
Do I have to convert the image somehow before passing the bytes to the AppleVisionFlutter().recognizeText() method? Thanks for the help!
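For context on what I suspect is going on: takePicture() hands back an encoded JPEG file, while startImageStream() delivers raw, uncompressed pixel planes (BGRA8888 on iOS when the controller is created with imageFormatGroup: ImageFormatGroup.bgra8888, YUV420 otherwise), so the plane bytes carry no image header the recognizer can parse. A conversion along these lines is what I'd try; this is only a sketch, assuming package:image ^4.x and a BGRA8888 stream, and bgraFrameToJpeg is my own hypothetical helper, not part of either plugin:

```dart
import 'dart:typed_data';
import 'package:camera/camera.dart';
import 'package:image/image.dart' as img;

/// Wraps a raw BGRA8888 camera frame into an encoded JPEG, so the
/// bytes look like what takePicture() produces.
/// Assumes the CameraController was created with
/// imageFormatGroup: ImageFormatGroup.bgra8888 (single plane on iOS).
Uint8List bgraFrameToJpeg(CameraImage frame) {
  // Caveat: planes[0].bytesPerRow may include row padding on some
  // devices; if it differs from width * 4, the rows need de-padding
  // before this conversion.
  final img.Image decoded = img.Image.fromBytes(
    width: frame.width,
    height: frame.height,
    bytes: frame.planes[0].bytes.buffer,
    order: img.ChannelOrder.bgra, // the camera delivers BGRA, not RGBA
  );
  return img.encodeJpg(decoded);
}
```

The idea would then be to call recognizeText(bgraFrameToJpeg(image)) inside the stream callback, and to throttle it (e.g. process every Nth frame), since encoding a JPEG per frame is expensive. Is this the right direction, or does the library accept raw frames some other way?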