We know that we can detect whether magnification is enabled via the triple-tap gesture with:

```kotlin
Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_enabled", 0)
```

But can we also detect it when it is activated from the navigation-bar shortcut or the other accessibility shortcuts?

I tried both

```kotlin
Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_enabled", 0)
```

and

```kotlin
Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_scale", 0)
```

but I always get zero.
Yes, but with caution. I created a `SharedFlow` to listen to the zoom value.
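Stripped down, it looks roughly like this. The class name and the coroutine wiring are illustrative (a sketch rather than the exact code), and the settings key is the same `accessibility_display_magnification_scale` you already tried:

```kotlin
import android.content.Context
import android.database.ContentObserver
import android.os.Handler
import android.os.Looper
import android.provider.Settings
import kotlinx.coroutines.channels.BufferOverflow
import kotlinx.coroutines.flow.MutableSharedFlow
import kotlinx.coroutines.flow.SharedFlow

// Sketch: re-emits the persisted magnification scale whenever the secure
// setting changes. Names are illustrative, not part of any Android API.
class MagnificationScaleObserver(private val context: Context) {

    private val _scale = MutableSharedFlow<Float>(
        replay = 1,
        onBufferOverflow = BufferOverflow.DROP_OLDEST
    )
    val scale: SharedFlow<Float> = _scale

    private val observer = object : ContentObserver(Handler(Looper.getMainLooper())) {
        override fun onChange(selfChange: Boolean) {
            // The scale is persisted as a float, so it has to be read with getFloat().
            val value = Settings.Secure.getFloat(
                context.contentResolver,
                "accessibility_display_magnification_scale",
                1.0f // 1.0 == no zoom; the real default may differ per device
            )
            _scale.tryEmit(value)
        }
    }

    fun start() {
        context.contentResolver.registerContentObserver(
            Settings.Secure.getUriFor("accessibility_display_magnification_scale"),
            false,
            observer
        )
        observer.onChange(true) // push the current value straight away
    }

    fun stop() {
        context.contentResolver.unregisterContentObserver(observer)
    }
}
```

Collect `scale` wherever you need it and call `stop()` when you are done.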
From what I can see, the value is only "saved" when the user is not using the "temporary" mode (temporary zoom is done with a tap and hold, as opposed to just a tap), so even if you could detect the setting, you wouldn't be able to rely on it across the different modes.
I think the big question comes down to motivation. Since this type of data could be considered health data, you definitely don't want it for analytics, because there are a host of legal compliance issues you would need to contend with.
I found an article on detecting screen readers that lists several reasons why detecting assistive technology is considered bad practice.
I know we're capable of detecting screen readers on Android, but that's because we sometimes need to in order to adapt the experience for the user. Given the variety of mechanisms available to users, it quickly becomes a maintenance nightmare.
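For context, the usual screen-reader check doesn't go through secure settings at all; it's a couple of `AccessibilityManager` calls, roughly like this:

```kotlin
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Rough check for a running screen reader such as TalkBack: accessibility is
// enabled and touch exploration is active. Use it to adapt behaviour for the
// user, not to log or report that they use assistive tech.
fun isScreenReaderActive(context: Context): Boolean {
    val manager =
        context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    return manager.isEnabled && manager.isTouchExplorationEnabled
}
```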
However, if it's really necessary, I wrote a little something to help you out.
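In rough outline it just reads the two settings with the right types; the names below are illustrative, and it only covers what is actually persisted:

```kotlin
import android.content.ContentResolver
import android.provider.Settings

// Illustrative sketch, not a complete solution.
data class MagnificationSnapshot(val tripleTapEnabled: Boolean, val savedScale: Float)

fun readMagnificationSettings(resolver: ContentResolver): MagnificationSnapshot {
    // This key is an int (0/1) and only reflects the triple-tap setting, so
    // shortcut/navbar activations may not show up in it at all.
    val tripleTapEnabled = Settings.Secure.getInt(
        resolver,
        "accessibility_display_magnification_enabled",
        0
    ) == 1

    // The scale is stored as a float string, so it must be read with getFloat();
    // reading it with getInt() just returns the supplied default, which is one
    // reason the checks in the question always came back as 0.
    val savedScale = Settings.Secure.getFloat(
        resolver,
        "accessibility_display_magnification_scale",
        1.0f
    )

    return MagnificationSnapshot(tripleTapEnabled, savedScale)
}
```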
Be aware: as noted above, the value is only saved outside of the temporary (tap-and-hold) mode, so this will not catch every way magnification can be activated.