The article, "Automatic Contact-less Monitoring of Breathing Rate and Heart Rate utilizing the Fusion of mmWave Radar and Camera Steering System," has been accepted for publication in the IEEE Sensors Journal (2022).

Khushi Gupta, Srinivas M. B., Soumya J, Om Jee Pandey, and Linga Reddy Cenkeramaddi, "Automatic Contact-less Monitoring of Breathing Rate and Heart Rate utilizing the Fusion of mmWave Radar and Camera Steering System," IEEE Sensors Journal, 2022 (in press).

Keywords: Radar, Heart rate, Monitoring, Sensors, Cameras, Radar measurements, Personnel

Abstract: The demand for noncontact breathing and heart rate measurement is increasing. In addition, because of the high demand for medical services and the scarcity of on-site personnel, the measurement process must be automated under unsupervised conditions with high reliability and accuracy. In this article, we propose a novel automated process for measuring breathing rate and heart rate with mmWave radar and classifying these two vital signs with machine learning. A frequency-modulated continuous-wave (FMCW) mmWave radar is integrated with a pan, tilt, and zoom (PTZ) camera to automate camera steering and direct the radar toward the person facing the camera. The obtained signals are then fed into a deep convolutional neural network, which classifies them as breathing or heart signals, each labeled low, normal, or high, yielding six classes in total. Medical personnel can use this classification for diagnostics. The average classification accuracy obtained is 87%, with a precision, recall, and F1 score of 0.93.
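For illustration, below is a minimal Keras sketch of a 1-D convolutional classifier of the kind the abstract describes, mapping a radar-derived vital-sign waveform to one of the six classes. The framework choice, sequence length, layer sizes, and training setup are assumptions made for the sketch, not the architecture from the paper.

```python
# Minimal sketch: 1-D CNN classifying radar-derived vital-sign waveforms
# into six classes ({breathing, heart} x {low, normal, high}).
# SEQ_LEN and all layer sizes are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 6   # breathing/heart, each low/normal/high (from the abstract)
SEQ_LEN = 512     # assumed samples per radar waveform

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, 1)),
    layers.Conv1D(16, 7, activation="relu", padding="same"),
    layers.MaxPooling1D(2),
    layers.Conv1D(32, 5, activation="relu", padding="same"),
    layers.MaxPooling1D(2),
    layers.Conv1D(64, 3, activation="relu", padding="same"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy batch to confirm shapes: 8 waveforms -> 8 class-probability vectors.
x = np.random.randn(8, SEQ_LEN, 1).astype("float32")
print(model(x).shape)  # (8, 6)
```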

More details: DOI: 10.1109/JSEN.2022.3210256

The article, “Deep Learning based Sign Language Digits Recognition from Thermal Images with Edge Computing System,” has been accepted for publication in the IEEE Sensors Journal.

Daniel S. Breland, Simen B. Skriubakken, Aveen Dayal, Ajit Jha, Phaneendra K. Yalavarthy, and Linga R. Cenkeramaddi, "Deep Learning based Sign Language Digits Recognition from Thermal Images with Edge Computing System," IEEE Sensors Journal, 2021 (in press).

Keywords: Gesture recognition, Cameras, Pins, Integrated circuit modeling, Assistive technology, Three-dimensional displays, Lighting

Abstract: Sign language digits based on hand gestures are used in various applications such as human-computer interaction, robotics, health and medical systems, health assistive technologies, automotive user interfaces, crisis management and disaster relief, entertainment, and contactless communication in smart devices. Color and depth cameras are commonly deployed for hand gesture recognition, but robust classification of hand gestures under varying illumination remains a challenging task. This work presents the design and deployment of a complete end-to-end edge computing system that accurately classifies hand gestures captured in thermal images. A thermal dataset of 3200 images was created, with 320 thermal images per sign language digit. The solution presented here feeds live images from a low-resolution 32×32-pixel thermal camera into a novel lightweight deep learning model based on bottleneck blocks motivated by deep residual learning to classify the hand gestures. The edge computing system utilizes a Raspberry Pi with a thermal camera, making it highly portable. The designed system achieves an accuracy of 99.52% on the test dataset, with the added advantage that the accuracy is invariant to background lighting conditions because it is based on thermal imaging.
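As a rough illustration of the bottleneck idea from deep residual learning mentioned in the abstract, the Keras sketch below applies a lightweight bottleneck residual block to 32×32 single-channel thermal frames with ten output classes (sign language digits 0-9). The framework, layer widths, and squeeze factor are assumptions for the sketch; this is not the exact model from the paper.

```python
# Minimal sketch: lightweight bottleneck residual classifier for 32x32
# single-channel thermal images, ten digit classes. Widths and the
# squeeze factor are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def bottleneck_block(x, filters, squeeze=4):
    """1x1 reduce -> 3x3 conv -> 1x1 restore, with a skip connection."""
    shortcut = x
    y = layers.Conv2D(filters // squeeze, 1, activation="relu", padding="same")(x)
    y = layers.Conv2D(filters // squeeze, 3, activation="relu", padding="same")(y)
    y = layers.Conv2D(filters, 1, padding="same")(y)
    if shortcut.shape[-1] != filters:
        # Project the shortcut when channel counts differ.
        shortcut = layers.Conv2D(filters, 1, padding="same")(shortcut)
    return layers.ReLU()(layers.Add()([shortcut, y]))

inputs = layers.Input(shape=(32, 32, 1))   # one low-resolution thermal frame
x = layers.Conv2D(32, 3, activation="relu", padding="same")(inputs)
x = bottleneck_block(x, 32)
x = layers.MaxPooling2D(2)(x)
x = bottleneck_block(x, 64)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)  # digits 0-9

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The small parameter count of such a bottleneck model is what makes deployment on a Raspberry Pi class device plausible.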

More details: DOI: 10.1109/JSEN.2021.3061608