I'm trying to build a live camera feed with face detection in Kotlin, using CameraX together with MediaPipe. Unfortunately, I get a "Buffer not large enough for pixels" RuntimeException on every frame.
The setup happens in a setUpCamera() function inside my CameraScreen composable:
fun setUpCamera() {
    val cameraProviderFuture = ProcessCameraProvider.getInstance(context)
    cameraProviderFuture.addListener({
        val cameraProvider: ProcessCameraProvider = cameraProviderFuture.get()
        val cameraSelector = CameraSelector.DEFAULT_FRONT_CAMERA

        val screenSize = Size(640, 480)
        val resolutionSelector = ResolutionSelector.Builder()
            .setResolutionStrategy(
                ResolutionStrategy(screenSize, ResolutionStrategy.FALLBACK_RULE_NONE)
            )
            .build()

        // Build and bind the camera use cases
        val preview = Preview.Builder()
            .setResolutionSelector(resolutionSelector)
            .build()
            .also {
                it.setSurfaceProvider(previewView.surfaceProvider)
            }

        val imageAnalyzer = ImageAnalysis.Builder()
            .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
            .build()
            .also {
                it.setAnalyzer(executor, faceDetectorHelper::detectLivestreamFrame)
            }

        cameraProvider.unbindAll()
        try {
            cameraProvider.bindToLifecycle(lifecycleOwner, cameraSelector, preview, imageAnalyzer)
        } catch (exc: Exception) {
            android.util.Log.e("CameraFragment", "Use case binding failed", exc)
        }
    }, ContextCompat.getMainExecutor(context))
}
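For context, setUpCamera captures everything it uses (context, lifecycleOwner, executor, previewView, faceDetectorHelper) from the surrounding CameraScreen composable. Simplified, the wiring looks roughly like this (the real composable has more going on):

import androidx.camera.view.PreviewView
import androidx.compose.runtime.Composable
import androidx.compose.runtime.LaunchedEffect
import androidx.compose.runtime.remember
import androidx.compose.ui.platform.LocalContext
import androidx.compose.ui.platform.LocalLifecycleOwner
import androidx.compose.ui.viewinterop.AndroidView
import java.util.concurrent.Executors

@Composable
fun CameraScreen(faceDetectorHelper: FaceDetectorHelper) {
    val context = LocalContext.current
    val lifecycleOwner = LocalLifecycleOwner.current

    // Background executor that the analyzer callback runs on
    val executor = remember { Executors.newSingleThreadExecutor() }

    // The view that setUpCamera attaches the preview's surface provider to
    val previewView = remember { PreviewView(context) }

    fun setUpCamera() {
        // ... exactly the function shown above ...
    }

    // Host the PreviewView inside Compose
    AndroidView(factory = { previewView })

    LaunchedEffect(Unit) { setUpCamera() }
}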
Meanwhile, in my FaceDetectorHelper I have this:
fun detectLivestreamFrame(imageProxy: ImageProxy) {
    if (runningMode != RunningMode.LIVE_STREAM) {
        throw IllegalArgumentException(
            "Attempting to call detectLivestreamFrame" +
                " while not using RunningMode.LIVE_STREAM"
        )
    }

    val frameTime = SystemClock.uptimeMillis()

    // Copy out RGB bits from the frame to a bitmap buffer
    val bitmapBuffer =
        Bitmap.createBitmap(
            imageProxy.width,
            imageProxy.height,
            Bitmap.Config.ARGB_8888
        )
    // use {} closes the ImageProxy once the block completes
    imageProxy.use { bitmapBuffer.copyPixelsFromBuffer(imageProxy.planes[0].buffer) }

    // Rotate the frame received from the camera to match the orientation it'll be shown in
    val matrix =
        Matrix().apply {
            postRotate(imageProxy.imageInfo.rotationDegrees.toFloat())

            // postScale is used here because we're forcing the front camera lens.
            // This could be put behind a Boolean flag if the camera becomes
            // toggleable (see the sketch after this function).
            // Without postScale, the front camera's horizontal axis is mirrored.
            postScale(
                -1f,
                1f,
                imageProxy.width.toFloat(),
                imageProxy.height.toFloat()
            )
        }
    val rotatedBitmap =
        Bitmap.createBitmap(
            bitmapBuffer,
            0,
            0,
            bitmapBuffer.width,
            bitmapBuffer.height,
            matrix,
            true
        )

    // Convert the input Bitmap to an MPImage object to run inference
    val mpImage = BitmapImageBuilder(rotatedBitmap).build()
    detectAsync(mpImage, frameTime)
}
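As mentioned in the comment, the flip could be gated behind a flag once the camera becomes toggleable. A sketch of that variant (isFrontCamera is a hypothetical property, not something in my current code):

// Hypothetical flag-gated variant of the matrix setup above
val matrix =
    Matrix().apply {
        postRotate(imageProxy.imageInfo.rotationDegrees.toFloat())
        if (isFrontCamera) {
            // Mirror horizontally; the front lens is otherwise flipped
            postScale(-1f, 1f, imageProxy.width.toFloat(), imageProxy.height.toFloat())
        }
    }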
The exception is thrown when this line executes:

imageProxy.use { bitmapBuffer.copyPixelsFromBuffer(imageProxy.planes[0].buffer) }
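For what it's worth: an ARGB_8888 bitmap needs width × height × 4 bytes, so at 640 × 480 that's 1,228,800 bytes per frame, and copyPixelsFromBuffer throws this exact exception when the source buffer holds fewer bytes than that. As far as I understand, ImageAnalysis delivers YUV_420_888 frames by default, in which case planes[0] would only be the Y plane (roughly width × height bytes), but I haven't confirmed what I'm actually receiving. A quick logging sketch I could drop in right before the copy:

// Diagnostic sketch: compare what the bitmap expects with what the frame holds
val expectedBytes = imageProxy.width * imageProxy.height * 4 // ARGB_8888 = 4 bytes/pixel
val actualBytes = imageProxy.planes[0].buffer.remaining()
android.util.Log.d(
    "FaceDetectorHelper",
    "format=${imageProxy.format}, planes=${imageProxy.planes.size}, " +
        "rowStride=${imageProxy.planes[0].rowStride}, " +
        "expected=$expectedBytes, actual=$actualBytes"
)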
Can someone help me with this? If you need any further information about the code, let me know and I'll add it to the post.
Thank you in advance!