Is there a way to detect the magnification accessibility feature from all shortcuts on Android?


We know that we can detect magnification enabled via the triple-tap gesture with:

Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_enabled", 0)

But can we also detect it when it is enabled through the navigation bar shortcut or the other shortcuts?

I tried

Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_enabled", 0)

and

Settings.Secure.getInt(getAppContext()?.getContentResolver(), "accessibility_display_magnification_scale", 0)

but I always get zero.

There is 1 answer below.


Yes, but with caution. One thing first: accessibility_display_magnification_scale stores a float, so Settings.Secure.getInt can't parse it and silently returns your default of 0; read it with getFloat instead. I created a SharedFlow to poll the zoom value:

private val magnificationScale: Float
    get() =
        Settings.Secure.getFloat(
            contentResolver,
            "accessibility_display_magnification_scale",
            1.0f // 1.0f means "no magnification"
        )

private val magnifierFlow = MutableSharedFlow<Float>()

// a lazily created, lazily started job: the original `get() = GlobalScope.launch { ... }`
// would launch a brand-new coroutine on every property read
private val magnifierJob: Job by lazy {
    GlobalScope.launch(start = CoroutineStart.LAZY) {
        while (isActive) {
            // you could emit only when the value changes
            magnifierFlow.emit(magnificationScale)
            delay(250)
        }
    }
}
...
// in onResume() / onCreate()
magnifierJob.start()
...

// collect from a coroutine; lifecycleScope already runs on the main
// dispatcher, so no runOnUiThread is needed
lifecycleScope.launch {
    magnifierFlow.collect { value ->
        checkBox.isChecked = value != 1.0f // anything other than the 1.0f default
        checkBox.text = "magnification: [$value]"
    }
}
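
If you'd rather not keep a GlobalScope coroutine polling after the UI is gone, a lifecycle-aware variant is a small change. This is only a sketch, assuming an AppCompatActivity and the androidx.lifecycle:lifecycle-runtime-ktx artifact:

import android.os.Bundle
import androidx.lifecycle.Lifecycle
import androidx.lifecycle.lifecycleScope
import androidx.lifecycle.repeatOnLifecycle
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    lifecycleScope.launch {
        // the loop runs only while the activity is at least STARTED and is
        // cancelled/restarted automatically, so no manual start()/cancel()
        repeatOnLifecycle(Lifecycle.State.STARTED) {
            while (true) {
                magnifierFlow.emit(magnificationScale)
                delay(250)
            }
        }
    }
}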

From what I can see, the value is only "saved" when a user is not using the "temporary" mode (temporary zoom is done with a tap-and-hold, as opposed to just a tap), so even if you could detect it, you wouldn't be able to across the different modes.

I think the big question comes down to motivation. Since this type of data could be considered health data, you definitely don't want it for analytics, because there are a host of legal compliance issues you would need to contend with.

I found an article on detecting screen readers, and it lists some reasons why detecting assistive technology is a bad practice:

  • detection is eerily similar to the browser-sniffing technique, which has proven to be a poor practice
  • maintaining separate channels of code is a nightmare; developers are already overloaded with supporting multiple browsers, devices... and if done, it will often become outdated if not entirely forgotten about
  • why [any assistive technology] detection? If you follow that logic, then detection should be provided for screen readers, screen magnifiers, braille output devices, onscreen keyboards, voice recognition, etc. That's just [overwhelming to maintain]

I know we're capable of detecting screen readers on Android, but that's because we might need to adapt the experience for the user. Given the variety of assistive mechanisms users can enable, detection really does become a maintenance nightmare.
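
For comparison, screen reader detection doesn't need an undocumented settings key at all; the public AccessibilityManager API covers it. A minimal sketch:

import android.accessibilityservice.AccessibilityServiceInfo
import android.content.Context
import android.view.accessibility.AccessibilityManager

fun isScreenReaderActive(context: Context): Boolean {
    val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE) as AccessibilityManager
    // spoken-feedback services (TalkBack and the like) report FEEDBACK_SPOKEN
    return am.isEnabled &&
        am.getEnabledAccessibilityServiceList(AccessibilityServiceInfo.FEEDBACK_SPOKEN)
            .isNotEmpty()
}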

However, if it's really necessary, I wrote a little something to help you out:

val couldHaveMagnification: Boolean
    get() {
        // the magnification target is assigned to the accessibility button in the nav bar
        val hasAccessibilityMagnificationButton = (Settings.Secure.getString(
            contentResolver,
            "accessibility_button_targets") ?: "")
                .contains("com.android.server.accessibility.MagnificationController")

        // magnification is assigned to the accessibility shortcut (e.g. the volume-key shortcut)
        val hasAccessibilityMagnificationShortcut = (Settings.Secure.getString(
            contentResolver,
            "accessibility_shortcut_target_service") ?: "")
                .contains("com.android.server.accessibility.MagnificationController")

        // the triple-tap-to-zoom gesture is enabled
        val hasTripleTapToZoomOn = Settings.Secure.getInt(
            contentResolver,
            "accessibility_display_magnification_enabled", 0) == 1

        return hasAccessibilityMagnificationButton ||
            hasAccessibilityMagnificationShortcut ||
            hasTripleTapToZoomOn
    }
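
You could then gate the polling above on this check, for example (keeping in mind these settings keys are undocumented):

// only start polling the zoom level if some magnification shortcut is enabled
if (couldHaveMagnification) {
    magnifierJob.start()
}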

Be aware:

  • this was only tested on a Pixel 7 phone
  • the user could have magnification enabled, but we can't detect whether they are actively using it