How do you paint from screen touch events (at scale) using Jetpack Compose on Android


There are only a couple of guides to using screen touch events (or MotionEvents) for painting on a screen using Jetpack Compose. All the guides I have found involve:

  1. linking a touch event to a Jetpack Compose State so a recomposition is triggered on every new touch action (a bit like calling invalidate() in the old view-based Android approach);
  2. saving that event change to a remembered list;
  3. and then redrawing the entire list on every recomposition.
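In code, the pattern those guides use looks roughly like this (a minimal sketch of my understanding; all names are mine, not taken from any particular guide):

```kotlin
import androidx.compose.foundation.Canvas
import androidx.compose.foundation.gestures.detectDragGestures
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.*
import androidx.compose.ui.Modifier
import androidx.compose.ui.geometry.Offset
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.input.pointer.pointerInput

@Composable
fun DrawingCanvas() {
    // Each completed stroke is a list of points; mutating this state
    // triggers a recomposition (the Compose analogue of invalidate()).
    val strokes = remember { mutableStateListOf<List<Offset>>() }
    var currentStroke by remember { mutableStateOf(listOf<Offset>()) }

    Canvas(
        modifier = Modifier
            .fillMaxSize()
            .pointerInput(Unit) {
                detectDragGestures(
                    onDragStart = { currentStroke = listOf(it) },
                    onDrag = { change, _ ->
                        change.consume()
                        currentStroke = currentStroke + change.position
                    },
                    onDragEnd = {
                        strokes += currentStroke
                        currentStroke = emptyList()
                    }
                )
            }
    ) {
        // Every recomposition redraws EVERY stroke from scratch --
        // this is the part that gets slower as the list grows.
        (strokes + listOf(currentStroke)).forEach { stroke ->
            stroke.zipWithNext { a, b ->
                drawLine(Color.Black, a, b, strokeWidth = 4f)
            }
        }
    }
}
```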

The best guide I have found is this one: Code Simple Android Jetpack Compose Drawing App, which has complete code here.

With only a few hundred touch events, the algorithm is surprisingly responsive. But once there are thousands of events in the list, it starts to behave like Schlemiel the Painter's Algorithm: every new stroke redraws everything that came before it, so it gets slower and slower until it eventually stops responding.

In the old View-based Android approach, the usual workaround is to paint into a Canvas that is backed by a Bitmap, so you don't need to redraw the entire screen on each invalidate(). But I can't find a Canvas in Jetpack Compose that seamlessly saves its contents to a bitmap.
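For reference, this is roughly what that old View-based workaround looks like (a simplified sketch from memory; a real version would also copy the old bitmap across in onSizeChanged so rotation doesn't erase the drawing):

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.PointF
import android.view.MotionEvent
import android.view.View

class PaintView(context: Context) : View(context) {
    private lateinit var bitmap: Bitmap
    private lateinit var bitmapCanvas: Canvas
    private val paint = Paint().apply { strokeWidth = 4f; color = Color.BLACK }
    private var last: PointF? = null

    override fun onSizeChanged(w: Int, h: Int, oldw: Int, oldh: Int) {
        bitmap = Bitmap.createBitmap(w, h, Bitmap.Config.ARGB_8888)
        bitmapCanvas = Canvas(bitmap)
    }

    override fun onTouchEvent(event: MotionEvent): Boolean {
        when (event.action) {
            MotionEvent.ACTION_DOWN -> last = PointF(event.x, event.y)
            MotionEvent.ACTION_MOVE -> {
                // Each new segment is painted into the bitmap ONCE...
                last?.let { bitmapCanvas.drawLine(it.x, it.y, event.x, event.y, paint) }
                last = PointF(event.x, event.y)
                invalidate()
            }
            MotionEvent.ACTION_UP -> last = null
        }
        return true
    }

    override fun onDraw(canvas: Canvas) {
        // ...and onDraw only blits the cached bitmap, no matter how
        // many thousands of segments it already contains.
        if (::bitmap.isInitialized) canvas.drawBitmap(bitmap, 0f, 0f, null)
    }
}
```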

Is there a way in Jetpack Compose to simply remember the contents of a Composable Canvas, so the entire Canvas doesn't need to be redrawn on each recomposition?

Failing that, is there a way to back a Jetpack Compose Canvas with a bitmap, like you can in the old Canvas?
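Something like the following is what I am imagining, pieced together from the ImageBitmap and androidx.compose.ui.graphics.Canvas APIs (untested, and I don't know whether this is the intended approach; the fixed bitmap size and the `tick` counter are my own hacks):

```kotlin
import androidx.compose.foundation.Canvas
import androidx.compose.foundation.gestures.detectDragGestures
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.runtime.*
import androidx.compose.ui.Modifier
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.graphics.ImageBitmap
import androidx.compose.ui.graphics.Paint
import androidx.compose.ui.input.pointer.pointerInput
import androidx.compose.ui.graphics.Canvas as BitmapCanvas

@Composable
fun BitmapBackedCanvas() {
    // Fixed size for the sketch; real code would size this to the canvas.
    val imageBitmap = remember { ImageBitmap(1080, 1920) }
    val bitmapCanvas = remember { BitmapCanvas(imageBitmap) }
    val paint = remember { Paint().apply { color = Color.Black; strokeWidth = 4f } }
    var tick by remember { mutableStateOf(0) } // hack to force a redraw per segment

    Canvas(
        modifier = Modifier
            .fillMaxSize()
            .pointerInput(Unit) {
                detectDragGestures { change, _ ->
                    change.consume()
                    // Paint each new segment into the bitmap once...
                    bitmapCanvas.drawLine(change.previousPosition, change.position, paint)
                    tick++
                }
            }
    ) {
        tick // read the state so this draw block re-runs
        // ...and only blit the bitmap on each recomposition.
        drawImage(imageBitmap)
    }
}
```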

As far as I can tell, the Jetpack Compose Painter class is what we are supposed to use for this, but I cannot for the life of me figure out how.

In a Painter subclass I can't work out how to get hold of a DrawScope in which to issue dynamic drawing commands like drawLine and drawCircle. All I can find is the DrawScope.onDraw() override, but I don't see how to use that for what I want.
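This is as far as my understanding of Painter goes (a sketch; StrokesPainter is my own made-up name). As written, onDraw() still replays the whole list on every repaint, so I don't see how it helps with caching:

```kotlin
import androidx.compose.ui.geometry.Offset
import androidx.compose.ui.geometry.Size
import androidx.compose.ui.graphics.Color
import androidx.compose.ui.graphics.drawscope.DrawScope
import androidx.compose.ui.graphics.painter.Painter

class StrokesPainter(private val strokes: List<List<Offset>>) : Painter() {
    override val intrinsicSize: Size = Size.Unspecified

    // onDraw() is the only place a DrawScope is available, and it is
    // re-invoked on every repaint -- nothing drawn here is persisted.
    override fun DrawScope.onDraw() {
        strokes.forEach { stroke ->
            stroke.zipWithNext { a, b ->
                drawLine(Color.Black, a, b, strokeWidth = 4f)
            }
        }
    }
}
```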

Any help pointing me in the right direction would be terrific!
