Authors: Dawar, Neha; Kehtarnavaz, Nasser
Date accessioned: 2019-06-28
Date available: 2019-06-28
Date issued: 2018-06-12
ISBN: 9781538660898
Handle: https://hdl.handle.net/10735.1/6624
Access note: Full text access from Treasures at UT Dallas is restricted to current UTD affiliates.

Abstract: This paper presents a convolutional neural network-based sensor fusion system to monitor six transition movements as well as falls in healthcare applications by simultaneously using a depth camera and a wearable inertial sensor. Weighted depth motion map images and inertial signal images are fed as inputs into two convolutional neural networks running in parallel, one for each sensing modality. Detection and thus monitoring of the transition movements and falls are achieved by fusing the movement scores generated by the two convolutional neural networks. The results obtained for both subject-generic and subject-specific testing indicate the effectiveness of this sensor fusion system for monitoring these transition movements and falls. © 2018 IEEE.

Language: en
Rights: ©2018 IEEE
Subjects: Convolutions (Mathematics)--Computer programs; Medical care; Neural networks (Computer science)--Models; Patient monitoring; Inertial navigation systems; Wearable sensors
Title: A Convolutional Neural Network-Based Sensor Fusion System for Monitoring Transition Movements in Healthcare Applications
Type: article
Citation: Dawar, N., and N. Kehtarnavaz. 2018. "A convolutional neural network-based sensor fusion system for monitoring transition movements in healthcare applications." IEEE 14th International Conference on Control and Automation (ICCA): 482-485, doi:10.1109/ICCA.2018.8444326
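The fusion step described in the abstract — combining per-class movement scores produced by the depth CNN and the inertial CNN — can be sketched as score-level fusion. The weighted-averaging rule, the class labels, and all numeric scores below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of score-level fusion for two modality-specific CNNs.
# The fusion rule (weighted averaging), the class labels, and the score
# values are illustrative assumptions; the paper's exact method may differ.

def fuse_scores(depth_scores, inertial_scores, w_depth=0.5):
    """Fuse per-class scores from the two networks by weighted averaging."""
    assert len(depth_scores) == len(inertial_scores)
    w_inertial = 1.0 - w_depth
    return [w_depth * d + w_inertial * i
            for d, i in zip(depth_scores, inertial_scores)]

def predict(depth_scores, inertial_scores, labels, w_depth=0.5):
    """Return the label whose fused score is highest."""
    fused = fuse_scores(depth_scores, inertial_scores, w_depth)
    return labels[max(range(len(fused)), key=fused.__getitem__)]

labels = ["sit-to-stand", "stand-to-sit", "fall"]  # illustrative subset
depth = [0.2, 0.1, 0.7]     # hypothetical depth-CNN class scores
inertial = [0.3, 0.2, 0.5]  # hypothetical inertial-CNN class scores
print(predict(depth, inertial, labels))  # prints "fall"
```

With equal weights, each modality contributes equally to the decision; tilting `w_depth` toward one modality is a simple way to favor the more reliable sensor for a given movement class.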