According to Google, those who have speech or physical difficulties will now be able to operate their Android-powered devices without touching them.
They need only raise an eyebrow or flash a grin to be understood. Two new features under development use machine learning and smartphones' front-facing cameras to detect and track movements of the face and eyes.
Users can scan through the options shown on their phone screen and select one using a variety of gestures, such as a smile, raised eyebrows, an open mouth, or a glance to the left, right, or up. To make Android more accessible to everyone, Google is introducing “new tools that make it easier to manage your phone and communicate using face motions,” according to the company.
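The scanning interaction described above can be pictured as a simple loop: one gesture advances a highlight through the on-screen options, another selects the highlighted one. The sketch below is purely illustrative; the gesture names and functions are hypothetical and not part of Google's actual Camera Switches API, which would receive gestures from an on-device ML model rather than a list.

```python
# Toy switch-scanning loop: maps facial gestures (as detected elsewhere,
# e.g. by an on-device face-tracking model) to "next" and "select" actions.
# All names here are hypothetical, for illustration only.

GESTURE_ACTIONS = {
    "raise_eyebrows": "next",  # move the highlight to the next option
    "smile": "select",         # activate the highlighted option
}

def run_scanner(options, gestures):
    """Step through `options`, acting on each recognized gesture.

    Returns the option that was selected, or None if no selection occurred.
    """
    index = 0
    for gesture in gestures:
        action = GESTURE_ACTIONS.get(gesture)
        if action == "next":
            index = (index + 1) % len(options)
        elif action == "select":
            return options[index]
    return None

# Two eyebrow raises move the highlight to the third option; a smile selects it.
selected = run_scanner(
    ["Open Messages", "Make a call", "Play phrase"],
    ["raise_eyebrows", "raise_eyebrows", "smile"],
)
print(selected)  # "Play phrase"
```

In the real feature, users can also adjust how sensitive each gesture is and how long it must be held, but the gesture-to-action mapping above captures the basic idea.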
According to a blog post published by the technology giant, people navigate their phones every day using voice commands such as ‘Hey Google’ or their hands. This is not always possible, however, for people who have severe motor and speech impairments, the post notes.
The changes come from two new features. The first, called “Camera Switches,” lets people interact with their phones using their faces rather than swipes and taps. The second is the Activate app, which lets people use such gestures to trigger an action on an Android device, such as playing a recorded phrase, sending a text message, or placing a phone call.
The company announced in a statement that “it is now feasible for anyone to manage their phone using eye motions and facial gestures that are tailored to their range of movement – without using their hands or speaking.”
The free Activate app can be downloaded from the Google Play store and is currently available in Australia, the United Kingdom, Canada, and the United States.