How Google Makes Android Apps, And The World’s Information, Universally Accessible To Everyone

Bill Mount

Google’s Android Studio now helps developers scan for accessibility issues in their apps. (Image: Google)

As iPhone and Apple Watch are the standard-bearers in their respective product categories, so too is Apple the standard-bearer when it comes to designing and shipping best-of-breed assistive technologies. The Cupertino company has long been lauded by those in the disability community as creating the best accessibility software, just as iPhone is the best smartphone and Apple Watch the best smartwatch.

Still, where Apple leads, it is incumbent on its contemporaries to follow. Maintaining good accessibility practices is obviously not exclusive to one company, nor should it be. Indeed, Apple’s Big Tech peers Amazon, Google, and Microsoft all do admirable work in their own right to champion accessibility’s importance in technology and to raise awareness of disabled people. In particular, the Mountain View-based Google has made significant strides in recent times to make accessibility on Android and its other properties better in various ways. Additionally, the company recently began airing a heartfelt ad called “A CODA Story”, which spotlights how Google tech such as Live Transcribe enables children of deaf adults to communicate with their parents.

“Google’s mission is to organize the world’s information and make it universally accessible,” Casey Burkhardt, a staff software engineer on Google’s Accessibility team, said in a recent interview with me conducted over email. “With one billion people in the world who have a disability, bringing that mission to life means leveraging what many people may already have in their pockets—a smartphone—to make both the physical and digital worlds more accessible.”

To Burkhardt, Google’s mission to make the world’s information accessible to everyone has special meaning. He is legally Blind, so not only is he intimately involved in building the tools that make his company’s software accessible, he uses those same tools to have easier access to the world. The advent of the smartphone quite literally changed his life, as he no longer had to tote a physical magnifier with him to read small-print items in the real world, from his schoolwork to mail and more. “I can still recall the moment I discovered a free digital magnifier app that outperformed the specialized assistive hardware, which has sat in a drawer ever since,” he said.

For Google, the work on accessibility partly stems from the notion that “mobile devices have also become gateways to the digital world and an increasing amount of what we do and how we interact is app-based,” Burkhardt told me. The company declined to share what percentage of Android users use accessibility features, but Burkhardt did say much of the inspiration for what comes out of the development process is the needs of team members internally. Many have disabilities themselves, and they contribute ideas based on what they need from their devices. One example Burkhardt cited is the TalkBack Braille Keyboard. It was conceived and developed by Daniel Dalton, a software engineer at Google who is Blind, who “wanted to provide people who use braille with a fast way of communicating on their Android devices.” Another example is Live Transcribe, developed by engineers Chet Gnegy and Dimitri Kanevsky. Kanevsky is Deaf, and he and Gnegy wanted to create something that provided “additional avenues for making conversations more accessible.”

“[As] a team we approach the impact of accessibility features not only by thinking of the number of people we reach, but also the degree to which we can help make an impact on their lives,” Burkhardt said.

Beyond internal inspiration, Google gets feedback from users on how accessibility features help them and enable them to be productive. Burkhardt said the company uses these stories as fuel for building such technologies; these lived experiences are key to “reflecting on the impact that our tools can have on someone’s life,” he said. One anecdote Burkhardt shared was about Matthew Johnston. A self-professed digital accessibility advocate who is profoundly Deaf, Johnston wrote a guest post on Google’s Keyword blog in which he explained how Live Captions for Calls on his Pixel phone allowed him to talk on the phone with his 23-year-old son for the first time. “Thanks to this feature, when Harry spoke his words were instantly converted to text,” Johnston wrote. “I was able to simultaneously read what Harry was saying and respond to him in a way that was natural and fluid.” Johnston added he’s been able to call his “bank manager, handyman, colleagues, family, and friends” as well.

At a technical level, what makes these Android accessibility features possible is a set of accessibility APIs, Burkhardt said, that allow Google and third-party developers to make their apps accessible. Google recently added an Accessibility Scanner feature to Android Studio, the integrated development environment in which developers create apps for Android. The Scanner, itself part of the Layout Editor function, helps developers scan for accessibility issues and provides feedback along the way. Burkhardt said getting this feedback early in an app’s development cycle can help developers—who may not be as savvy when it comes to accessibility—incorporate it as best they can into their own software.
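To make that concrete, a sketch of the kind of issue the Scanner surfaces in the Layout Editor: an image-based control with no text alternative, which leaves TalkBack users with nothing to hear. The layout fragment below is illustrative (the IDs and drawable names are hypothetical), but missing `contentDescription` is one of the classic findings, and the fix is a single attribute.

```xml
<!-- Hypothetical layout fragment; resource names are made up for illustration. -->

<!-- Before: TalkBack has nothing to announce for this button,
     so a Blind user cannot tell what it does. -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play" />

<!-- After: contentDescription gives screen readers a spoken label. -->
<ImageButton
    android:id="@+id/play_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:src="@drawable/ic_play"
    android:contentDescription="@string/play" />
```

Surfacing this as an inline warning while the developer is still laying out the screen, rather than after a Play Store pre-launch report, is precisely the shift Burkhardt describes.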

“Before now, developers needed to proactively review our accessibility reports by integrating their tests with our test framework, running Accessibility Scanner, or viewing their Play Store Pre-Launch Report, all of which require either additional effort or knowledge of our tools, and which surface findings later in the development lifecycle,” he said. “By meeting developers where they’re at in Android Studio’s Layout Editor, we can get concrete suggestions in front of many developers who might otherwise be unaware of, or know how, to approach accessibility.”
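The test-framework route Burkhardt mentions can be sketched as follows. This assumes the `androidx.test.espresso.accessibility` artifact and runs only on a device or emulator, so it is a fragment rather than a standalone program; the class name is hypothetical.

```kotlin
// Sketch: turning on automated accessibility checks in Espresso
// instrumentation tests (androidx.test.espresso.accessibility artifact).
import androidx.test.espresso.accessibility.AccessibilityChecks
import org.junit.BeforeClass

class AccessibilityCheckedTest {
    companion object {
        @BeforeClass
        @JvmStatic
        fun enableAccessibilityChecks() {
            // After this call, every ViewAction Espresso performs also runs
            // the Accessibility Test Framework checks against the view
            // hierarchy, failing the test when a violation is found.
            AccessibilityChecks.enable()
                .setRunChecksFromRootView(true)
        }
    }

    // ...ordinary Espresso tests go here; no per-test changes needed.
}
```

This is exactly the “additional effort” he describes: it catches problems only once tests exist and run, whereas the Layout Editor integration flags issues while the UI is being designed.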

Building the Scanner was a collaborative effort between the Android Studio Design Tools and Accessibility Developer Infrastructure teams. The tool is in its infancy—it is currently available in the Android Studio “Arctic Fox” beta—but Google welcomes feedback from developers. Burkhardt added Google is “excited” to integrate the tool into Android Studio and get accessibility to the forefront of app development.

Android Studio’s Accessibility Scanner functionality is similar to the Accessibility Inspector feature in Apple’s Xcode. The tool, which debuted at WWDC 2019, does more or less the same job as Accessibility Scanner. The means are different, the platforms obviously are different, but the end goal of making accessibility a first-class citizen during development is the same. Silicon Valley, and the tech industry at large, needs this kind of commitment—disabled people deserve this kind of recognition.
