Photoplethysmography imaging (PPGI) sensors have attracted considerable attention because of their capacity to measure heart rates (HRs) without contact with human skin. A PPGI sensor uses a camera capable of face detection to record images of facial skin, whose appearance reflects the change in arterial blood volume between the systolic and diastolic phases of the cardiac cycle. These sensors therefore enable remote HR monitoring without any device worn on the finger or wrist.
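The underlying principle can be illustrated with a minimal sketch: a tiny periodic fluctuation in average skin intensity, at the cardiac frequency, can be recovered by spectral analysis. The frame rate, signal amplitudes, and noise level below are synthetic assumptions for illustration, not values from the work described here.

```python
import numpy as np

# Assumed camera frame rate and a synthetic 10 s recording.
fps = 30.0
t = np.arange(0, 10, 1 / fps)
true_hr_bpm = 72.0

# Simulated mean green-channel intensity of a facial skin region:
# a large DC level plus a tiny cardiac pulsation and sensor noise.
rng = np.random.default_rng(0)
signal = (120.0
          + 0.5 * np.sin(2 * np.pi * (true_hr_bpm / 60.0) * t)
          + 0.05 * rng.standard_normal(t.size))

# Recover the HR as the dominant frequency in a plausible cardiac band.
spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(t.size, d=1 / fps)
band = (freqs >= 0.7) & (freqs <= 4.0)   # 42-240 bpm
est_hr_bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
print(round(est_hr_bpm, 1))  # close to the true 72 bpm
```

The pulsatile component here is two orders of magnitude smaller than the DC skin intensity, which is why the spectral peak, rather than the raw waveform, is the practical quantity to track.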
Researchers have developed an algorithm that provides accurate HR estimation and can run in real time using vision and robot-manipulation algorithms. They mounted photoplethysmography imaging sensors on a robot for active and autonomous HR (R-AAH) estimation. This mobile approach allows the robot to monitor HRs and thereby provide active medical services, such as reporting HR information to people in its vicinity.
The proposed R-AAH navigates a specific physical space while avoiding obstacles. It recognizes human faces and records images of facial skin while in motion. The algorithm then converts these images into PPGI signals to estimate the person’s HR. More specifically, R-AAH involves six stages: simultaneous localization and mapping (SLAM), robot navigation, face detection, facial skin extraction, PPGI signal conversion, and HR estimation.
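The six stages above can be sketched as a pipeline of placeholder functions. Every function body here is an assumption standing in for the real SLAM, navigation, face-detection, and signal-processing components; only the overall data flow follows the description in the text.

```python
import numpy as np

def slam_update(pose, scan):
    """Stage 1: SLAM -- fuse sensor data into a map and pose (stubbed)."""
    return pose  # placeholder: real SLAM would refine the pose estimate

def navigate(pose, goal):
    """Stage 2: navigation -- take a bounded step toward the goal (stubbed)."""
    return pose + np.clip(goal - pose, -0.5, 0.5)

def detect_face(frame):
    """Stage 3: face detection -- return a bounding box (assumed central)."""
    h, w = frame.shape[:2]
    return (w // 4, h // 4, w // 2, h // 2)

def extract_skin(frame, box):
    """Stage 4: facial-skin extraction -- crop the detected face region."""
    x, y, w, h = box
    return frame[y:y + h, x:x + w]

def to_ppgi_signal(skin_frames):
    """Stage 5: PPGI conversion -- mean green intensity per frame."""
    return np.array([f[..., 1].mean() for f in skin_frames])

def estimate_hr(ppgi, fps):
    """Stage 6: HR estimation -- dominant frequency in the cardiac band."""
    spec = np.abs(np.fft.rfft(ppgi - ppgi.mean()))
    freqs = np.fft.rfftfreq(ppgi.size, d=1 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)   # 42-240 bpm
    return 60.0 * freqs[band][np.argmax(spec[band])]

# Usage on synthetic frames whose green channel pulses at 1.5 Hz (90 bpm).
fps = 30.0
t = np.arange(300) / fps
frames = [np.full((8, 8, 3), 120.0) for _ in t]
for f, ti in zip(frames, t):
    f[..., 1] += 0.5 * np.sin(2 * np.pi * 1.5 * ti)
skin = [extract_skin(f, detect_face(f)) for f in frames]
print(round(estimate_hr(to_ppgi_signal(skin), fps), 1))  # about 90 bpm
```

The sketch separates the mobility stages (1-2) from the sensing stages (3-6), reflecting that the robot navigates and acquires facial images concurrently while the signal-processing stages run on the recorded frames.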