Omar Costilla-Reyes has an M.Sc. degree in Electrical Engineering from the University of North Texas, U.S.A., where he was a research assistant on projects funded by the National Science Foundation (NSF) and the National Aeronautics and Space Administration (NASA). His M.Sc. dissertation was on dynamic indoor positioning systems using wireless sensor networks. He is currently a research associate and PhD candidate in Electrical and Electronics Engineering at the University of Manchester, U.K. He has published papers on applications of machine learning to gait analysis in security and healthcare. His research interests lie in applications of machine learning with sensor systems for security and healthcare. He received the Best Student Paper Award in optical sensing applications at the 2015 IEEE Sensors Conference.
Visualization of Deep Neural Networks for Learning Spatio-Temporal Features from Tomography Sensors
This presentation demonstrates accurate spatio-temporal gait classification from raw tomography sensor data without the need to reconstruct images. The approach is a simple yet efficient machine learning methodology: a convolutional neural network architecture that learns spatio-temporal features automatically, end-to-end, from raw sensor data. In a case study on a floor pressure tomography sensor, experimental results show an effective gait pattern classification F-score of 97.88 ± 1.70%. Automatic extraction of classification features from raw data leads to substantially better performance than features derived by shallow machine learning models that use the reconstructed images as input, implying that for the purpose of automatic decision-making the image reconstruction step can be eliminated.
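The abstract does not specify the network architecture, so the following is only a minimal NumPy sketch of the general idea: raw tomography frames are fed directly to a convolutional layer, the resulting feature maps are pooled over space, and a classifier operates on the per-frame (temporal) feature vector. All shapes, the 3×3 kernel, and the three gait classes are illustrative assumptions, not the authors' actual model, and the weights here are random rather than learned.

```python
import numpy as np

def conv2d_valid(x, k):
    """Valid-mode 2-D cross-correlation of a single-channel map with kernel k."""
    kh, kw = k.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_gait(frames, kernel, W, b):
    """Forward pass: raw frames -> conv features -> temporal pooling -> scores.

    frames : (T, H, W) stack of raw tomography frames over time
    kernel : (kh, kw) spatial filter (stand-in for a learned conv layer)
    W, b   : weights of a linear classifier over the per-frame features
    """
    feats = np.stack([conv2d_valid(f, kernel) for f in frames])  # (T, oh, ow)
    feats = np.maximum(feats, 0.0)          # ReLU non-linearity
    pooled = feats.mean(axis=(1, 2))        # global average pool -> (T,)
    return softmax(W @ pooled + b)          # class probabilities

# Toy demo (hypothetical sizes): 10 frames of an 8x8 pressure grid, 3 classes
rng = np.random.default_rng(0)
frames = rng.random((10, 8, 8))
kernel = rng.standard_normal((3, 3))
W, b = rng.standard_normal((3, 10)), np.zeros(3)
probs = classify_gait(frames, kernel, W, b)
```

Note that no image reconstruction appears anywhere in this pipeline: the convolution operates on the raw sensor frames themselves, which is the point the abstract makes about eliminating the reconstruction step.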