Health Evaluation for Collaborative Robot Using Vision-Based Motion Data 


Vol. 15, No. 2, pp. 146-159, Feb. 2026
https://doi.org/10.3745/TKIPS.2026.15.2.146


  Abstract

Collaborative robots require health evaluation to prevent human injury during human–robot collaboration. However, their programmable nature leads to diverse tasks and complex data patterns, and existing methods that rely on internal sensor data are limited by differences in sensor types and units across devices. This study proposes a vision-based health evaluation method that accommodates task diversity and does not depend on internal sensor data. The proposed method collects normal-state motion data by running a workspace-based test program designed for the target collaborative robot, trains an LSTM-based prediction model on these data, and assesses robot health by measuring the similarity between actual and predicted motion signals. Applied to Neuromeka's Indy7 collaborative robot, the method achieved 87.79% accuracy in distinguishing normal from degraded states. Tuning the normal-range boundary parameters yielded true positive rates of up to 100% and true negative rates of up to 99.5%, confirming the method's capability for quantitative, user-oriented health evaluation.
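To make the prediction-and-similarity pipeline concrete, the Python/PyTorch sketch below shows one plausible realization. It is a minimal sketch under stated assumptions: the MotionPredictor module, the 6-dimensional joint signal, the 50-frame window, the health_score function, the cosine-similarity metric, and the 0.9 threshold are all illustrative choices, since the abstract does not specify the paper's actual architecture, similarity measure, or tuned normal-range boundaries.

import torch
import torch.nn as nn

class MotionPredictor(nn.Module):
    """Predict the next motion-signal frame from a window of past frames."""
    def __init__(self, n_features: int = 6, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, window: torch.Tensor) -> torch.Tensor:
        # window: (batch, seq_len, n_features); return the predicted next frame
        out, _ = self.lstm(window)
        return self.head(out[:, -1, :])

def health_score(model: MotionPredictor, windows: torch.Tensor,
                 targets: torch.Tensor) -> torch.Tensor:
    """Mean cosine similarity between predicted and actual frames.

    The model is trained only on normal-state motion, so a score near 1
    indicates behavior close to the learned normal state, and a lower
    score suggests degradation.
    """
    model.eval()
    with torch.no_grad():
        pred = model(windows)
    return nn.functional.cosine_similarity(pred, targets, dim=1).mean()

if __name__ == "__main__":
    # Dummy data standing in for vision-extracted joint trajectories:
    # 32 sliding windows of 50 frames, 6 joint coordinates per frame.
    windows = torch.randn(32, 50, 6)
    targets = torch.randn(32, 6)
    score = health_score(MotionPredictor(), windows, targets).item()
    THRESHOLD = 0.9  # illustrative normal-range boundary, tuned per robot
    state = "normal" if score > THRESHOLD else "degraded"
    print(f"similarity={score:.3f} -> {state}")

In the paper's setting, the predictor would be trained on motion data collected while the robot executes the workspace-based test program in a known-good state, so a drop in similarity at evaluation time flags deviation from that learned normal behavior.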

  Cite this article

[IEEE Style]

H. C. Yang, M. S. Choi, J. S. Kim, and J. W. Lee, "Health Evaluation for Collaborative Robot Using Vision-Based Motion Data," The Transactions of the Korea Information Processing Society, vol. 15, no. 2, pp. 146-159, 2026. DOI: https://doi.org/10.3745/TKIPS.2026.15.2.146.

[ACM Style]

Hui Chan Yang, Min Seo Choi, Jin Se Kim, and Jung Won Lee. 2026. Health Evaluation for Collaborative Robot Using Vision-Based Motion Data. The Transactions of the Korea Information Processing Society, 15, 2 (2026), 146-159. DOI: https://doi.org/10.3745/TKIPS.2026.15.2.146.