Kumuda D K, Vandana G S, Bethi Pardhasaradhi, B S Raghavendra, Pathipati Srihari, and Linga Reddy Cenkeramaddi, “Multi Target Detection and Tracking by Mitigating Spot Jammer Attack in 77GHz mm-Wave Radars: An Experimental Evaluation,” has been accepted for publication in the IEEE Sensors Journal (2022).
Abstract: Small form factor radar sensors at millimeter wavelengths find numerous applications in the industrial and automotive sectors. These radar sensors provide improved range resolution, good angular resolution, and enhanced Doppler resolution at short and ultrashort ranges. However, it is challenging to detect and track targets accurately when one radar is interfered with by another. This article presents an experimental evaluation of a 77-GHz IWR1642 radar sensor in the presence of a second 77-GHz AWR1642 radar sensor acting as a spot jammer. A real-time experiment is carried out with five targets of various cross sections: a car, a larger motorcycle, a smaller motorcycle, a cyclist, and a pedestrian. The collected real-time data are processed by four different constant false alarm rate (CFAR) detectors: cell averaging (CA)-CFAR, ordered statistics (OS)-CFAR, greatest of CA (GOCA)-CFAR, and smallest of CA (SOCA)-CFAR. Detections from these schemes are then fed into two clustering algorithms, density-based spatial clustering of applications with noise (DBSCAN) and K-means, followed by an extended Kalman filter (EKF)-based tracker with global nearest neighbor (GNN) data association, which provides tracks of the various targets with and without the jammer. Furthermore, four metrics [tracks reported (TR), track segments (TSs), false tracks (FTs), and track loss (TL)] are used to evaluate the tracks generated by the two clustering algorithms under the four detection schemes. The experimental results show that the DBSCAN clustering algorithm outperforms the K-means clustering algorithm in many cases.
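To illustrate the detection stage discussed above, here is a minimal one-dimensional cell-averaging (CA)-CFAR sketch in NumPy. The training/guard cell counts, false-alarm rate, and the synthetic range profile are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

def ca_cfar(signal, num_train=8, num_guard=2, pfa=1e-3):
    """Cell-averaging CFAR: for each cell under test (CUT), estimate the
    noise level from training cells on both sides (excluding guard cells)
    and declare a detection when the CUT exceeds a scaled threshold."""
    n = len(signal)
    num_cells = 2 * num_train  # total training cells
    # threshold scaling for the desired probability of false alarm
    alpha = num_cells * (pfa ** (-1.0 / num_cells) - 1.0)
    half = num_train + num_guard
    detections = []
    for cut in range(half, n - half):
        lead = signal[cut - half : cut - num_guard]          # leading cells
        lag = signal[cut + num_guard + 1 : cut + half + 1]   # lagging cells
        noise = (lead.sum() + lag.sum()) / num_cells
        if signal[cut] > alpha * noise:
            detections.append(cut)
    return detections

# synthetic range profile: exponential noise floor plus two strong targets
rng = np.random.default_rng(0)
profile = rng.exponential(1.0, 200)
profile[60] += 40.0
profile[140] += 40.0
hits = ca_cfar(profile)
```

GOCA- and SOCA-CFAR differ only in combining the leading and lagging averages with max() or min() instead of their mean, and OS-CFAR ranks the training cells instead of averaging them.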
Rakesh Reddy Yakkati, Pardhasaradhi Bethi, Sreenivasa Reddy Yeduri, Om Jee Pandey, Linga Reddy Cenkeramaddi, “Power Transmission Line Classification From Images Using Pre-Trained Deep Learning Models,” accepted for publication in IEEE International Symposium on Smart Electronic Systems (IEEE – iSES, formerly IEEE – iNIS) 2022, 19-21 December 2022, Warangal, India.
Ajit Jha, Linga Reddy Cenkeramaddi, Santiago Royo, “Frequency Domain Analysis and Filter Design of Continuous Wave Frequency Modulated Optical Feedback Signal for Photonic Sensing,” accepted for publication in IEEE International Symposium on Smart Electronic Systems (IEEE – iSES, formerly IEEE – iNIS) 2022, 19-21 December 2022, Warangal, India.
Rakesh Reddy Yakkati, Anurag Gade, Balu Harshavardan Koduru, Pardhasaradhi Bethi, Linga Reddy Cenkeramaddi, “Classification of UAVs Using Time-Frequency Analysis of Remote Control Signals and CNN,” accepted for publication in IEEE International Symposium on Smart Electronic Systems (IEEE – iSES, formerly IEEE – iNIS) 2022, 19-21 December 2022, Warangal, India.
Suryateja Manikanta Chakkapalli, Srinivas Boppu, Linga Reddy Cenkeramaddi, Barathram Ramkumar, “Hand Gesture Recognition System in the Complex Background for Edge Computing Devices,” accepted for publication in IEEE International Symposium on Smart Electronic Systems (IEEE – iSES, formerly IEEE – iNIS) 2022, 19-21 December 2022, Warangal, India.
Rakesh Reddy Yakkati, Pardhasaradhi Bethi, Jing Zhou, Linga Reddy Cenkeramaddi, “A Machine Learning Based GNSS Signal Classification,” accepted for publication in IEEE International Symposium on Smart Electronic Systems (IEEE – iSES, formerly IEEE – iNIS) 2022, 19-21 December 2022, Warangal, India.
B. Pardhasaradhi, R. R. Yakkati, and L. R. Cenkeramaddi, “Machine Learning based Screening and Measurement to Measurement Association for Navigation in GNSS Spoofing Environment,” has been accepted for publication in the IEEE Sensors Journal (2022).
Abstract: The global navigation satellite system (GNSS) provides reliable positioning across the globe. However, GNSS is vulnerable to deliberate interference such as spoofing, which can cause fake navigation. This article proposes navigation in a GNSS spoofing environment that takes the received power, correlation function distortion, and pseudorange measurement observation space into account. In the proposed approach, both authentic and interference measurements are treated as a single measurement set. Machine learning screens the authentic measurements from this set using parameters such as received power and correlation function distortion. To maintain the track and navigate the GNSS receiver's time-varying kinematics, we use a combination of gating within the Kalman filter framework and logic-based track management. Machine learning classifiers such as support vector machines (SVMs), neural networks (NNs), ensembles, nearest neighbor, and decision trees are explored, and we observe that the linear SVM and the NN provide a test accuracy of 98.20%. A time-varying position pull-off strategy is considered, and metrics such as position RMSE and track failure are compared with the conventional M-best algorithm. The results show that with four authentic measurements and spoof injections there are only a few track failures, whereas with six authentic measurements track failures remain zero even as spoof injections increase.
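The gating step within a Kalman filter can be sketched as follows: a candidate measurement is kept only if its normalized innovation squared falls inside a chi-square gate. This is a generic illustration, not the authors' implementation; the gate threshold and the example numbers are assumptions.

```python
import numpy as np

GATE = 9.21  # chi-square threshold for 2 degrees of freedom (~99%)

def gate_measurements(z_pred, S, measurements, gate=GATE):
    """Keep measurements whose innovation z - z_pred lies within the
    ellipsoidal gate defined by the innovation covariance S."""
    S_inv = np.linalg.inv(S)
    accepted = []
    for z in measurements:
        nu = z - z_pred               # innovation
        d2 = float(nu @ S_inv @ nu)   # normalized innovation squared
        if d2 <= gate:
            accepted.append(z)
    return accepted

# predicted measurement and innovation covariance from a Kalman filter step
z_pred = np.array([100.0, 50.0])
S = np.diag([4.0, 4.0])
candidates = [np.array([101.0, 49.5]),   # authentic-looking measurement
              np.array([130.0, 80.0])]   # spoof-like outlier
kept = gate_measurements(z_pred, S, candidates)
```

Measurements rejected by the gate never reach the filter update, which is what limits the influence of spoofed pseudoranges on the navigation solution.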
Sindhusha Jeeru, Arun Kumar Sivapuram, David Gonzalez Leon, Gröli Jade, Sreenivasa Reddy Yeduri, Linga Reddy Cenkeramaddi, “Depth Camera based Dataset of Hand Gestures,” has been accepted for publication in Data in Brief (2022).
Keywords: Video hand gestures, RGB image, Depth image, RGB-D Camera, Machine learning
Abstract: The dataset contains RGB and depth video frames of various hand gestures captured with the Intel RealSense Depth Camera D435. The camera has two channels, collecting RGB and depth frames simultaneously. A large dataset has been created for accurate classification of hand gestures under complex backgrounds. It comprises 29718 RGB and depth frames corresponding to various hand gestures from different people, collected at different times against complex backgrounds. The gestures include scroll-right, scroll-left, scroll-up, scroll-down, zoom-in, and zoom-out. Each sequence contains 40 frames, and the dataset has a total of 662 sequences per gesture. To capture all the variations, the hand is oriented in various ways during recording.
INCAPS – the 2nd Indo-Norway Workshop on Smart Sensing, Communication and Machine Learning for Autonomous and Cyber Physical Systems (IN-SSCOM) is organized at the Indian Institute of Technology Hyderabad (IITH) in hybrid mode from Oct 14 to Oct 16, 2022 (a three-day workshop). This workshop is an outcome of multiple ongoing collaborations between the Indian Institute of Technology Hyderabad (IITH), India, and the University of Agder (UiA), Norway. These include the project Low Altitude UAV Communication and Tracking (LUCAT), funded by the Department of Science and Technology (International Bilateral Cooperation Division), Government of India, and the Research Council of Norway, and the project Indo-Norwegian Collaboration in Autonomous Cyber-Physical Systems (INCAPS), funded by the Research Council of Norway and Diku (the Norwegian agency for international cooperation and quality enhancement in higher education). An MoU between IITH and UiA further promotes this research and collaboration. Through IN-SSCOM 2022, we aim to bring together researchers, industries, and practitioners to present and discuss their latest achievements and innovations in the areas of smart sensing, communications, and machine learning for autonomous cyber-physical systems. The workshop will be an experience-sharing forum, and we expect to attract a good mix of academic and industry researchers, practitioners, and students working in this area.
The Indo-Norwegian Collaboration in Autonomous Cyber-Physical Systems (INCAPS) aims to establish long-term collaboration in world-class research and education between highly reputed Indian universities, including the Indian Institute of Science (IISc), Bangalore, the Indian Institute of Technology Hyderabad (IITH), the International Institute of Information Technology Hyderabad (IIITH), and the Birla Institute of Technology and Science (BITS), Hyderabad, and Norwegian institutions, the University of Agder (UiA), the Norwegian University of Science and Technology (NTNU), and the Norwegian Institute for Water Research (NIVA). INCAPS covers broad research areas including smart sensing for autonomous cyber-physical systems, mmWave sensor-based system design, decentralized wireless communications, in-network processing and intelligence for heterogeneous wireless sensor and communication networks, machine learning and deep learning for autonomous systems, data analytics, energy-harvesting-based smart electronic systems, smart water networks and inference methods for timely detection and prediction, and cognitive control and adaptive learning in autonomous cyber-physical systems.
The key objectives of the INCAPS project are to: I). Strengthen the collaborative network between industry (public and private enterprises, small and medium-sized enterprises, and multinational companies) and academia. II). Increase value creation and enhance innovation by using smart sensing, machine learning, and deep learning techniques in autonomous cyber-physical systems. III). Facilitate education and knowledge sharing through better mobility for students and researchers. IV). Create an arena for the generation of research and innovation projects. V). Increase the utilization of research and educational infrastructure in both Norway and India. VI). Integrate professionals from industry and academia through workshops, seminars, webinars, and summer/winter schools.
The 1st Indo-Norway Workshop on unmanned Aerial VEhicles (IN-WAVE) is organized in virtual mode from April 30 to May 2, 2021 (a three-day workshop). This workshop is an outcome of multiple ongoing collaborations between the Indian Institute of Technology Hyderabad (IITH), India, and the University of Agder (UiA), Norway. These include the project Low Altitude UAV Communication and Tracking (LUCAT), funded by the Department of Science and Technology (International Bilateral Cooperation Division), Government of India, and the Research Council of Norway, and the project Indo-Norwegian Collaboration in Autonomous Cyber-Physical Systems (INCAPS), funded by the Research Council of Norway and Diku (the Norwegian agency for international cooperation and quality enhancement in higher education). An MoU between IITH and UiA further promotes this research and collaboration. Through IN-WAVE 2021, we aim to bring together researchers, industries, and practitioners to present and discuss their latest achievements and innovations in the area of UAV communication and tracking. The workshop will be an experience-sharing forum, and we expect to attract a good mix of academic and industry researchers, practitioners, and students working in this area.
Khushi Gupta, Srinivas M. B., Soumya J, Om Jee Pandey, Linga Reddy Cenkeramaddi, “Automatic Contact-less Monitoring of Breathing Rate and Heart Rate utilizing the Fusion of mmWave Radar and Camera Steering System,” has been accepted for publication in the IEEE Sensors Journal (2022).
Abstract: The demand for noncontact breathing and heart rate measurement is increasing. In addition, because of the high demand for medical services and the scarcity of on-site personnel, the measurement process must be automated under unsupervised conditions with high reliability and accuracy. In this article, we propose a novel automated process for measuring breathing rate and heart rate with mmWave radar and classifying these two vital signs with machine learning. A frequency-modulated continuous-wave (FMCW) mmWave radar is integrated with a pan, tilt, and zoom (PTZ) camera to automate camera steering and direct the radar toward the person facing the camera. The obtained signals are then fed into a deep convolutional neural network that classifies the breathing and heart signals as individually low, normal, or high, yielding six classes in combination. This classification can be used in medical diagnostics by medical personnel. The average classification accuracy obtained is 87%, with precision, recall, and F1 score of 0.93.
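As a rough illustration of why breathing and heart rates can be separated from one displacement signal (a simplified sketch, not the paper's radar pipeline), the two vital signs occupy distinct frequency bands. The sample rate, band edges, and the synthetic tone frequencies below are assumptions.

```python
import numpy as np

fs = 50.0                      # sample rate in Hz (illustrative)
t = np.arange(0, 30, 1 / fs)   # 30-second observation window

# synthetic chest displacement: a slow breathing tone plus a weaker,
# faster heartbeat tone
breath_hz, heart_hz = 0.30, 1.20
chest = 1.0 * np.sin(2 * np.pi * breath_hz * t) \
      + 0.1 * np.sin(2 * np.pi * heart_hz * t)

spectrum = np.abs(np.fft.rfft(chest))
freqs = np.fft.rfftfreq(len(chest), 1 / fs)

def peak_in_band(lo, hi):
    """Frequency of the largest spectral peak inside [lo, hi] Hz."""
    band = (freqs >= lo) & (freqs <= hi)
    return freqs[band][np.argmax(spectrum[band])]

# assumed bands: breathing ~0.1-0.6 Hz, heartbeat ~0.8-2.5 Hz
breathing_rate = peak_in_band(0.1, 0.6) * 60   # breaths per minute
heart_rate = peak_in_band(0.8, 2.5) * 60       # beats per minute
```

With the synthetic tones above, the band-limited peaks recover 18 breaths/min and 72 beats/min; real radar returns would of course be noisier and require the classification stage described in the abstract.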
Venkata Satya Chidambara Swamy Vaddadi, Saidi Reddy Parne, Vijeesh V. P., Suman Gandi, Saran Srihari Sripada Panda and Linga Reddy Cenkeramaddi, “Design and Fabrication of Liquid Pressure Sensor using FBG Sensor through Seesaw Hinge Mechanism,” has been accepted for publication in IEEE Photonics Journal (2022).
Keywords: Fiber gratings, Temperature Measurement, Temperature sensors, Sensitivity, Strain, Pressure measurement, Optical fiber sensors
Abstract: Pressure sensors are used in various industrial applications, helping to prevent unintended disasters. This paper presents the design and fabrication of a novel seesaw device incorporating a diaphragm and a fiber Bragg grating (FBG) sensor to measure the pressure of liquids. The designed sensor has been tested in a static water column. The proposed design enables the user to easily make and modify the diaphragm for the required pressure range without interfering with the FBG sensor. The developed pressure sensor provides improved accuracy and sensitivity to applied liquid pressure in both low- and high-pressure ranges without requiring sophisticated sensor construction. A finite element analysis has been performed on the diaphragm and on the entire structure at 10 bar pressure; the deformation of the diaphragm is comparable to theoretical deformation levels. A copper diaphragm with a thickness of 0.25 mm is used in the experiments, all of which are performed in the elastic region of the diaphragm. Based on the experiments, a sensitivity of 19.244 nm/MPa with a linearity of 99.64% is obtained. The proposed sensor's performance is also compared with recently reported pressure sensors.
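Given the reported sensitivity and high linearity, converting a measured Bragg-wavelength shift into pressure reduces to a linear relation. The sketch below assumes operation in the linear elastic region; the example shift value is hypothetical.

```python
# Experimentally reported sensitivity of the seesaw-hinge FBG sensor
SENSITIVITY_NM_PER_MPA = 19.244

def pressure_from_shift(delta_lambda_nm):
    """Liquid pressure in MPa inferred from the Bragg wavelength shift
    (in nm), assuming the linear response reported in the paper."""
    return delta_lambda_nm / SENSITIVITY_NM_PER_MPA

# hypothetical example: a 9.622 nm shift corresponds to 0.5 MPa
p = pressure_from_shift(9.622)
```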
Ashish Reddy Bommana, Susheel Ujwal Siddamshetty, Dhilleswararao Pudi, Arvind T. K. R, Srinivas Boppu, M Sabarimalai Manikandan, Linga Reddy Cenkeramaddi, “Design of Synthesis-time Vectorized Arithmetic Hardware for Tapered Floating-point Addition and Subtraction,” has been accepted for publication in ACM Transactions on Design Automation of Electronic Systems (2022).
Abstract: Energy efficiency has become the new performance criterion in this era of pervasive embedded computing; thus, accelerator-rich multiprocessor system-on-chips are commonly used in embedded computing hardware. Computationally intensive machine learning applications have gained much traction and are now deployed in many application domains thanks to abundant and cheap computational capacity. In addition, there is a growing trend toward developing hardware accelerators for machine learning on embedded edge devices, where performance and energy efficiency are critical. Although these hardware accelerators frequently use floating-point operations for accuracy, reduced-width floating-point formats are also used to reduce hardware complexity and thus power consumption while maintaining accuracy. Vectorization can further improve performance, energy efficiency, and memory bandwidth. In this article, we propose the design of a vectorized floating-point adder/subtractor that supports arbitrary-length floating-point formats with varying exponent and mantissa widths. Compared to existing designs in the literature, the proposed design is 2.57× more area-efficient and 1.56× more power-efficient, and it supports true vectorization with no restrictions on exponent and mantissa widths.
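To illustrate what "varying exponent and mantissa widths" means, the software sketch below decodes a bit pattern under an arbitrary IEEE-style format. This is only an illustration of the number formats involved (normalized values, no subnormal/infinity/NaN handling), not the proposed hardware design.

```python
def decode_float(bits, exp_width, man_width):
    """Decode an unsigned integer 'bits' as a custom floating-point value
    with 1 sign bit, 'exp_width' exponent bits, and 'man_width' mantissa
    bits, using the usual IEEE-style bias (normalized numbers only)."""
    sign = (bits >> (exp_width + man_width)) & 1
    exp = (bits >> man_width) & ((1 << exp_width) - 1)
    man = bits & ((1 << man_width) - 1)
    bias = (1 << (exp_width - 1)) - 1
    value = (1 + man / (1 << man_width)) * 2.0 ** (exp - bias)
    return -value if sign else value

# a half-precision-like format: 5 exponent bits, 10 mantissa bits
one = decode_float(0b0_01111_0000000000, 5, 10)    # 1.0
three = decode_float(0b0_10000_1000000000, 5, 10)  # 3.0
```

A vectorized adder for such formats must align mantissas and normalize results for whatever exponent/mantissa split each lane carries, which is what the article's synthesis-time parameterization addresses in hardware.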
B. H. Shekar, Shazia Mannan, Habtu Hailu, C. Krishna Mohan and C. Linga Reddy, “Performance Analysis of Deep Neural Networks for Covid-19 Detection from Chest Radiographs,” has been accepted for publication in the 15th International Conference on Machine Vision (ICMV 2022).
Kenneth Bremnes, Rebecca Moen, Sreenivasa Reddy Yeduri, Rakesh Reddy Yakkati, and Linga Reddy Cenkeramaddi, “Classification of UAVs utilizing Fixed Boundary Empirical Wavelet Subbands of RF Fingerprints and Deep Convolutional Neural Network,” has been accepted for publication in IEEE Sensors Journal (2022).
Abstract: Unmanned aerial vehicle (UAV) classification and identification have many applications in a variety of fields, including UAV tracking systems, antidrone systems, intrusion detection systems, the military, space research, product delivery, agriculture, search and rescue, and internet carriers. It is challenging to identify a specific drone and/or its type in critical scenarios, such as an intrusion. In this article, a UAV classification method is proposed that utilizes fixed boundary empirical wavelet sub-bands of radio frequency (RF) fingerprints and a deep convolutional neural network (CNN). In the proposed method, RF fingerprints collected from UAV receivers are decomposed into 16 fixed boundary empirical wavelet sub-band signals. These sub-band signals are then fed into a lightweight deep CNN model to classify various types of UAVs. Using the proposed method, we classify a total of 15 different commercially available UAVs with an average testing accuracy of 97.25%. The proposed model is also tested with various numbers of sampling points in the signal. Furthermore, the proposed method is compared with recently reported works on classifying UAVs from remote controller RF signals.
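A simplified way to picture fixed-boundary sub-band decomposition is to slice the signal's one-sided spectrum into 16 equal-width bands and invert each slice; the paper's empirical wavelet construction differs in its filter shapes, so this is only an illustrative sketch.

```python
import numpy as np

def fixed_boundary_subbands(signal, num_bands=16):
    """Split a real signal into 'num_bands' sub-band signals by keeping
    one equal-width slice of its one-sided spectrum per band and zeroing
    the rest. Because the slices partition the spectrum, the sub-bands
    sum back to the original signal."""
    spectrum = np.fft.rfft(signal)
    edges = np.linspace(0, len(spectrum), num_bands + 1).astype(int)
    subbands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.zeros_like(spectrum)
        masked[lo:hi] = spectrum[lo:hi]
        subbands.append(np.fft.irfft(masked, n=len(signal)))
    return np.array(subbands)

# a hypothetical RF fingerprint segment, stood in for by white noise
rng = np.random.default_rng(1)
x = rng.standard_normal(1024)
bands = fixed_boundary_subbands(x, 16)
```

In the paper's pipeline, such per-band signals (rather than the raw fingerprint) form the input channels to the lightweight CNN classifier.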