How do Google's Pixel smartphones accurately detect the movements of a finger touching the display?



Most smartphones use a touch display, which allows intuitive operation with a finger. However, there are many kinds of finger operations, such as taps, double taps, long presses, pinch-ins, pinch-outs, drags, and flicks, and the smartphone needs to correctly identify which operation was performed. With an update delivered in March 2020, Google's Pixel smartphones improved the accuracy with which touch operations are distinguished, and researchers on Google's Android UX team have explained what kind of mechanism was added.

Google AI Blog: Sensing Force-Based Gestures on the Pixel 4

https://ai.googleblog.com/2020/06/sensing-force-based-gestures-on-pixel-4.html

Capacitive touch sensors, which are commonly used in smartphone touch displays, consist of drive electrodes, sense electrodes, and a non-conductive dielectric such as glass sandwiched between them. Each pair of drive and sense electrodes is extremely small, and together they form tiny capacitor cells that hold a charge. When a conductive finger approaches a cell, part of the charge is drawn away and the capacitance drops slightly; detecting where this drop occurs is how the capacitive method locates the finger on the display.
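The following is a minimal Python sketch of the principle described above. The grid size, baseline value, and Gaussian-shaped dip are illustrative assumptions for this sketch, not how a real touch controller is implemented.

```python
# Illustrative sketch of mutual-capacitance touch sensing: a finger drains a
# little charge from nearby drive/sense crossings, so the measured capacitance
# dips below its baseline around the touch point.
import numpy as np

ROWS, COLS = 32, 15           # grid on the same scale as the Pixel 4's sensor
BASELINE = 100.0              # arbitrary untouched capacitance value per cell

def touched_frame(finger_row, finger_col, spread=1.5, depth=20.0):
    """Return a sensor frame with a capacitance dip around the finger position."""
    r = np.arange(ROWS)[:, None]
    c = np.arange(COLS)[None, :]
    dist2 = (r - finger_row) ** 2 + (c - finger_col) ** 2
    dip = depth * np.exp(-dist2 / (2 * spread ** 2))   # Gaussian-shaped dip
    return BASELINE - dip

frame = touched_frame(finger_row=10.0, finger_col=7.0)
# The touched cell is wherever the drop from baseline is largest.
drop = BASELINE - frame
row, col = np.unravel_index(np.argmax(drop), drop.shape)
print(f"touch detected at cell ({row}, {col})")
```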



The cells of a capacitive touch sensor are packed closely across the display, but they are still far coarser than the display's pixels. For example, the Pixel 4's display has a resolution of 2280 x 1080 pixels, while its touch sensor has only 32 x 15 cells.
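Even such a coarse grid can localize a touch more finely than a single cell by interpolating across neighboring cells. The sketch below uses a signal-weighted centroid, a common technique for this purpose (not necessarily the method Pixel phones use); the grid and display dimensions follow the figures above.

```python
# Estimate a sub-cell touch position from a coarse capacitance frame by taking
# the signal-weighted centroid of the per-cell drops, then scaling to pixels.
import numpy as np

SENSOR_ROWS, SENSOR_COLS = 32, 15
DISPLAY_H, DISPLAY_W = 2280, 1080

def touch_centroid(drop):
    """drop: (32, 15) array of per-cell signal drops; returns display (x, y) or None."""
    total = drop.sum()
    if total <= 0:
        return None                     # no touch present
    rows = np.arange(SENSOR_ROWS)[:, None]
    cols = np.arange(SENSOR_COLS)[None, :]
    row_c = (drop * rows).sum() / total  # weighted centroid in cell coordinates
    col_c = (drop * cols).sum() / total
    # Map cell coordinates to display pixels (cell centres at +0.5).
    y = (row_c + 0.5) / SENSOR_ROWS * DISPLAY_H
    x = (col_c + 0.5) / SENSOR_COLS * DISPLAY_W
    return x, y
```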

The image below (a GIF animation in the original post) visualizes the readings from the touch sensor cells during, from left to right, a tap, a press-and-hold, and a flick. Although there are subtle differences between the movements, the touch sensor's resolution is coarse, so it can be difficult to correctly distinguish the various gestures.



Touch pressure has attracted attention as one answer to the question of how to identify gestures correctly. However, the dedicated hardware sensors needed to measure touch pressure are expensive to design and integrate, and it is difficult for people to control touch pressure precisely, so the approach has so far not made its way into smartphones.

A capacitive touch sensor does not respond to changes in pressure itself, but it is tuned to be extremely sensitive to changes in the finger's distance within a few millimeters of the display. As a result, when a finger touches the display, the cells near the center of the contact area saturate, while the cells around the edge of the contact area retain a high dynamic range.

When you press your finger firmly against the display, the soft tissue of the fingertip deforms and spreads out. The nature of this deformation depends on the size and shape of the user's finger and its angle to the screen, and it also changes dynamically with the finger's movement, rotation, and applied force. In other words, by reading and analyzing how the fingertip deforms with a capacitive touch sensor, the pressure and movement of the user's finger can be sensed indirectly.
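To make the idea concrete, the sketch below derives a rough "force proxy" from how far the touch blob spreads across the sensor cells over time. The noise floor and the moment-based spread measure are illustrative choices for this sketch, not Google's published algorithm.

```python
# As the finger presses harder its contact patch spreads, so the spatial spread
# of the cells it covers can serve as an indirect force signal.
import numpy as np

def contact_spread(drop, noise_floor=2.0):
    """Weighted spatial spread of the touch blob in one (rows, cols) frame."""
    mask = drop > noise_floor
    if not mask.any():
        return 0.0
    rows, cols = np.nonzero(mask)
    w = drop[mask]
    r0 = np.average(rows, weights=w)
    c0 = np.average(cols, weights=w)
    # Weighted variance of cell positions around the blob centroid.
    var = np.average((rows - r0) ** 2 + (cols - c0) ** 2, weights=w)
    return float(np.sqrt(var))

def force_trace(frames):
    """Track how the spread evolves over a sequence of frames (one gesture)."""
    return [contact_spread(f) for f in frames]
```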



Google therefore designed a machine learning algorithm that analyzes and classifies these dynamic changes. The outline of Google's gesture classification model is as follows: it combines a convolutional neural network (CNN), which focuses on the spatial features in the signals observed by the sensor, with a recurrent neural network (RNN), which focuses on their temporal features.
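As a rough illustration of such an architecture, here is a hedged Keras sketch that applies a small per-frame CNN to each touch-sensor frame and feeds the resulting feature sequence to an LSTM. The layer sizes, the 30-frame window, and the number of gesture classes are assumptions for this sketch, not the dimensions of Google's model.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

FRAMES, ROWS, COLS = 30, 32, 15   # a short window of 32x15 touch-sensor frames
NUM_GESTURES = 5                  # e.g. tap, scroll, drag, long press, light long press

def build_model():
    inputs = layers.Input(shape=(FRAMES, ROWS, COLS, 1))
    # Per-frame CNN extracts spatial features from each capacitance image.
    x = layers.TimeDistributed(layers.Conv2D(16, 3, activation="relu", padding="same"))(inputs)
    x = layers.TimeDistributed(layers.MaxPooling2D(2))(x)
    x = layers.TimeDistributed(layers.Conv2D(32, 3, activation="relu", padding="same"))(x)
    x = layers.TimeDistributed(layers.GlobalAveragePooling2D())(x)
    # RNN models how those per-frame features evolve over the gesture.
    x = layers.LSTM(64)(x)
    outputs = layers.Dense(NUM_GESTURES, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

Wrapping the convolutional layers in TimeDistributed applies the same filters to every frame, which keeps the spatial (CNN) and temporal (RNN) parts of the model cleanly separated.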



This gesture classification model was trained on input datasets covering gestures such as long press, light-force long press, tap, scroll, and drag. It was rolled out to the Pixel series in the second Pixel feature drop in March 2020, making it possible to identify gestures more accurately than before, even with a capacitive sensor.

Google's Android UX team said, 'By integrating machine learning algorithms and careful interaction design, we were able to provide a more expressive touch experience for Pixel users. We will continue to research and develop these capabilities to explore new touch interactions.'

in Mobile, Software, Posted by log1i_yk