Touch sensors in mobile phones, which detect finger input on the touch screen, have not changed much since their introduction in the mid-2000s. In contrast, the screens of smartphones and tablets have continuously improved in visual quality: higher color fidelity, higher resolution, and crisper contrast.
Researchers have now developed an artificial intelligence (AI) algorithm that gives touch sensors super-resolution, an eightfold increase in effective resolution. The development allows the sensors to detect reliably when and where fingers touch the display surface, with much higher accuracy than current devices.
The researchers developed the AI for capacitive touch screens, the type used in virtually all mobile phones, tablets, and laptops. These sensors locate fingers because the electrical field between the sensor lines changes when a finger approaches the screen surface. The AI increases the sensors' low native resolution and precisely infers the contact area between finger and display surface from the capacitive measurements.
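CapContact itself is a learned neural network, but the underlying idea, upsampling a coarse capacitance grid and thresholding it to estimate the finger's contact patch, can be sketched with classical interpolation. The function names, the 4×4 patch size, and the relative 0.5 threshold below are illustrative assumptions, not details from the research:

```python
import numpy as np

def upsample_bilinear(grid, factor=8):
    """Bilinearly upsample a 2D capacitance grid by `factor` in each axis."""
    h, w = grid.shape
    ys = np.linspace(0, h - 1, h * factor)   # fractional row coordinates
    xs = np.linspace(0, w - 1, w * factor)   # fractional column coordinates
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    # Blend the four surrounding sensor readings for every output pixel.
    top = grid[np.ix_(y0, x0)] * (1 - wx) + grid[np.ix_(y0, x1)] * wx
    bot = grid[np.ix_(y1, x0)] * (1 - wx) + grid[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def contact_mask(grid, factor=8, threshold=0.5):
    """Threshold the upsampled grid to estimate the contact area
    (a stand-in for the learned prediction; threshold is illustrative)."""
    hi = upsample_bilinear(grid, factor)
    return hi >= threshold * hi.max()
```

A real learned model can recover detail that plain interpolation cannot, which is the point of training on ground-truth contact maps; this sketch only shows where the 8× resolution gain plugs into the pipeline.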
To train the AI, the researchers built a custom apparatus that simultaneously records capacitive intensities, the same measurements our phones and tablets take, and accurate contact maps captured with a high-resolution optical pressure sensor. From a large number of touches by several test participants they assembled a training dataset, from which “CapContact” learned to predict super-resolution contact areas from the coarse, low-resolution sensor data of today’s touch devices.
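The training setup described above can be mimicked with a deliberately simplified stand-in: pairs of low-resolution capacitance frames and high-resolution ground-truth contact maps, fit here with plain least-squares regression rather than the researchers' neural network. All shapes (a 4×4 sensor patch, 8× upsampling) and the synthetic data generator are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
LOW, FACTOR = 4, 8          # hypothetical sensor patch size and upsampling factor
HIGH = LOW * FACTOR

def make_pair():
    """Synthesize one training pair: a low-res capacitance frame and a
    high-res 'ground-truth' contact map (a nearest-neighbor blow-up here,
    standing in for the optical pressure-sensor recording)."""
    low = rng.random((LOW, LOW))
    high = np.kron(low, np.ones((FACTOR, FACTOR)))
    return low.ravel(), high.ravel()

# Assemble a small training set of (capacitance, contact-map) pairs.
pairs = [make_pair() for _ in range(64)]
X = np.stack([p[0] for p in pairs])   # 64 x 16 low-res inputs
Y = np.stack([p[1] for p in pairs])   # 64 x 1024 high-res targets

# Fit a linear super-resolution model W (the real system learns a network).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
mse = np.mean((X @ W - Y) ** 2)
```

Because the synthetic targets here are an exactly linear function of the inputs, least squares fits them almost perfectly; real contact maps are not linear in the capacitance readings, which is why a learned nonlinear model is needed.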