Research on the Use of Multimodal Data for Detecting Emergency Situations Involving Elderly People Living Alone 


Vol. 14, No. 11, pp. 950-959, Nov. 2025
https://doi.org/10.3745/TKIPS.2025.14.11.950


  Abstract

This study developed a multimodal anomaly detection model for monitoring the safety of elderly people living alone in an ageing society, and compared its performance with that of single-modality models. The experiments used a risk detection dataset for elderly care provided by AI Hub, which covers three sensor modalities: biosignals, emergency requests, and infrared images. On biosignal data, the 1D-CNN model performed best (accuracy 99.74%, F1-score 0.9975), while on emergency request data, the LSTM model recorded the highest performance (accuracy 99.99%, F1-score 0.9750). The infrared image-only model performed comparatively poorly (accuracy 88.01%, F1-score 0.8668), but applying a multimodal early fusion model significantly improved anomaly detection based on infrared images (accuracy 94.52%, F1-score 0.9475). These results demonstrate experimentally that multimodal fusion effectively compensates for the limitations of a single sensor and improves detection performance on image data.
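The paper itself does not reproduce code here, but as a rough illustration of what an early-fusion classifier over the three modalities could look like, the following is a minimal PyTorch sketch. All feature dimensions, layer sizes, and names (EarlyFusionNet, bio_dim, req_dim, img_dim) are assumptions for illustration, not the authors' actual architecture.

```python
# Minimal early-fusion sketch (hypothetical shapes and layer sizes;
# not the architecture from the paper).
import torch
import torch.nn as nn

class EarlyFusionNet(nn.Module):
    """Concatenates all modality inputs into a single vector before any
    modeling ("early fusion"), then classifies normal vs. emergency."""

    def __init__(self, bio_dim=128, req_dim=8, img_dim=64 * 64, hidden=256):
        super().__init__()
        fused_dim = bio_dim + req_dim + img_dim
        self.classifier = nn.Sequential(
            nn.Linear(fused_dim, hidden),
            nn.ReLU(),
            nn.Dropout(0.3),
            nn.Linear(hidden, 2),  # two classes: normal vs. emergency
        )

    def forward(self, bio, req, img):
        # img: (B, 1, 64, 64) infrared frame, flattened before fusion
        fused = torch.cat([bio, req, img.flatten(start_dim=1)], dim=1)
        return self.classifier(fused)

if __name__ == "__main__":
    model = EarlyFusionNet()
    bio = torch.randn(4, 128)          # biosignal feature vector
    req = torch.randn(4, 8)            # emergency-request features
    img = torch.randn(4, 1, 64, 64)    # infrared image
    print(model(bio, req, img).shape)  # torch.Size([4, 2])
```

The key property of early fusion, as opposed to late fusion, is that the modalities are combined before classification, so the weaker infrared signal can be interpreted in the context of the stronger biosignal and emergency-request features.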

  Cite this article

[IEEE Style]

S. Lim, S. Baik, and Y. Hong, "Research on the Use of Multimodal Data for Detecting Emergency Situations Involving Elderly People Living Alone," The Transactions of the Korea Information Processing Society, vol. 14, no. 11, pp. 950-959, 2025. DOI: https://doi.org/10.3745/TKIPS.2025.14.11.950.

[ACM Style]

Suyeon Lim, Seongbok Baik, and Yong-Geun Hong. 2025. Research on the Use of Multimodal Data for Detecting Emergency Situations Involving Elderly People Living Alone. The Transactions of the Korea Information Processing Society 14, 11 (2025), 950-959. DOI: https://doi.org/10.3745/TKIPS.2025.14.11.950.