Ongoing Research

Health Care Applications with BLE 5.0.

• Development of a BLE 5.0 enabled ultra-low-power wrist-worn sensor system for streaming physiological sensor signals to the user end and the server. Feasibility evaluation of the sensor system's capability to monitor daily health from the wrist.

• Implementation of machine learning and deep learning models on the nRF52840, the embedded processor of the BLE chip.

Vision Assistance for the Blind.

• Recognition of on-path obstacles from an egocentric camera placed at eye level.

• Implementation of two neural networks on an embedded ARM Cortex-M4 processor for recognizing the wearer's voice commands and US dollar bills for counting.

Machine Vision on a Multisensory Smart Wheelchair.

• Recognition of on-path obstacles from a quadrascopic camera mounted on a powered wheelchair. Customization of the quadrascopic camera for an improved field of view (FOV) and localization.

• Implementation of deep neural networks on the smart wheelchair's embedded NVIDIA Jetson Nano processor for recognizing user intention and providing locomotion support.

Contactless Fingerprint and Child Iris Recognition System.

• Development of an adjustable stereoscopic camera for capturing iris images of children of all ages. Integration of a deep neural network on the embedded Jetson Nano processor to provide real-time feedback ensuring high-quality image capture.

• Development of a contactless fingerprint acquisition system for both adults and children to capture the hand and inner knuckles in combination.

Study of the Psychophysics of Balance.

• Development of innovative sensor-based methods for studying postural stability and possible mechanisms of balance control.

• Determination of the relationship between the acceleration threshold and the length and duration of the translation in neurologically intact young adults, neurologically intact older adults without diabetes, and older adults with diabetes.

• Determination of perception thresholds and quasi-static posturography using psychophysical methods.

 

Collaborative Research

 

Personal Automatic Cigarette Tracker 2.0. Collaboration with the University of Alabama.

• Design and development of multi-sensory wearable sensor systems to objectively monitor the behavioral and physiological manifestations of cigarette smoking in free-living conditions. Major developments include a low-power chest device, a hand device, an instrumented lighter, an egocentric camera, a Raspberry Pi based smart IoT charger, etc.

• Application of computational intelligence to extract information on smoking habits. Major accomplishments include the development of an SVM-based machine learning model to detect smoking from heart rate parameters, an RCNN-based deep learning model to recognize smoking events from full-day images, an image classifier to categorize the smoking environment and smoking context, SVM and CNN-LSTM based models to classify smoking and non-smoking hand gestures and smoke inhalations, the extraction of smoking metrics, etc.

Automatic Ingestion Monitor 2.0. Collaboration with the University of Alabama.

• Design and development of multi-sensory wearable camera-based sensor systems to objectively monitor an individual's eating behavior in free-living conditions. Major developments include a low-power egocentric eyeglass camera, an ear-mounted sensor, etc.

• Application of computational intelligence to extract information on food intake. Major accomplishments include the development of an SVM-based machine learning model to detect eating episodes and determine the type of food consumed.

• Design and development of a low-power stereo camera to monitor cooking and eating environments in rural areas.

Monitoring of Infant Feeding. Collaboration with the University of Alabama.

• Design and development of an intelligent infant bottle (comprising an OmniVision camera, an IMU, a pressure sensor, etc.) to monitor an infant's milk intake.

• Development of computer algorithms to assess the nutritive sucking patterns of infants.

• Automatic extraction of breastfeeding statistics from egocentric images captured by the mother's eyeglass camera.

Personalized Prosthesis Controller Design. Collaboration with the University of Alabama.

• Design and development of a measurement exoskeleton to support the development of robotic lower-limb prostheses. Development of computer algorithms to assess human gait and walking speed from wearable and exoskeleton sensors.

• Development of a YOLO-based deep learning model to recognize obstacles and stair engagement during walking from eye-level egocentric images.

• Development of a LIDAR-based sensor to obtain ambulatory velocity profiles.

• Development of an ANN to predict treadmill walking speed from a chest-mounted IMU.