Touch sensors in mobile phones that detect finger input on the touch screen have not changed much since their first release in the mid-2000s. In contrast, the screens of smartphones and tablets continue to offer improved visual quality: higher color fidelity, higher resolution, crisper contrast.
Researchers have now developed an artificial intelligence (AI) algorithm that gives touch sensors super-resolution, increasing their effective resolution eightfold. The development allows the sensors to reliably detect when and where fingers touch the display surface, with much higher accuracy than current devices.
The researchers developed the AI for capacitive touch screens, the type used in virtually all mobile phones, tablets, and laptops. These sensors detect the position of fingers because the electrical field between the sensor lines changes when a finger comes close to the screen surface. The AI increases the low native resolution of the sensors and precisely infers the respective contact area between the finger and the display surface from the capacitive measurements.
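The overall pipeline can be illustrated with a minimal sketch: a coarse capacitance frame is upsampled 8x, then thresholded to estimate the contact area and touch position. Here a simple nearest-neighbor upsampling (`np.kron`) stands in as a placeholder for the learned super-resolution model, and the sensor values are invented for illustration; neither reflects the researchers' actual method or data.

```python
import numpy as np

# Hypothetical low-resolution capacitance frame (illustrative values only):
# a faint "blob" where a finger nears a coarse 8x8 sensor grid.
low_res = np.zeros((8, 8))
low_res[3:5, 4:6] = [[0.6, 0.9], [0.8, 0.7]]

# Upsample 8x. Nearest-neighbor expansion is used here purely as a
# placeholder for the learned super-resolution model (assumption).
high_res = np.kron(low_res, np.ones((8, 8)))
print(high_res.shape)  # (64, 64)

# Estimate the contact area by thresholding the upsampled map, and the
# touch position as the capacitance-weighted centroid of that area.
mask = high_res > 0.5
ys, xs = np.nonzero(mask)
weights = high_res[mask]
cy = (ys * weights).sum() / weights.sum()
cx = (xs * weights).sum() / weights.sum()
print(f"contact pixels: {mask.sum()}, centroid: ({cy:.1f}, {cx:.1f})")
```

A real model would be trained to recover fine contact-area detail that interpolation cannot, but the input/output shapes and the downstream centroid step would look much the same.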