Efficient Limb Range of Motion Analysis from a Monocular Camera for Edge Devices.

dc.citation.issue: 3
dc.citation.volume: 25
dc.contributor.author: Yan X
dc.contributor.author: Zhang L
dc.contributor.author: Liu B
dc.contributor.author: Qu G
dc.contributor.editor: Amerini I
dc.contributor.editor: Russo P
dc.contributor.editor: Di Ciaccio F
dc.coverage.spatial: Switzerland
dc.date.accessioned: 2025-03-03T01:02:09Z
dc.date.available: 2025-03-03T01:02:09Z
dc.date.issued: 2025-01-22
dc.description.abstract: Traditional limb kinematic analysis relies on manual goniometer measurements. With computer vision advancements, integrating RGB cameras can minimize manual labor. Although deep learning-based cameras aim to offer the same ease as manual goniometers, previous approaches have prioritized accuracy over efficiency and cost on PC-based devices. Nevertheless, healthcare providers require a high-performance, low-cost, camera-based tool for assessing upper and lower limb range of motion (ROM). To address this, we propose a lightweight, fast, deep learning model to estimate a human pose and utilize predicted joints for limb ROM measurement. Furthermore, the proposed model is optimized for deployment on resource-constrained edge devices, balancing accuracy and the benefits of edge computing like cost-effectiveness and localized data processing. Our model uses a compact neural network architecture with 8-bit quantized parameters for enhanced memory efficiency and reduced latency. Evaluated on various upper and lower limb tasks, it runs 4.1 times faster and is 15.5 times smaller than a state-of-the-art model, achieving satisfactory ROM measurement accuracy and agreement with a goniometer. We also conduct an experiment on a Raspberry Pi, illustrating that the method can maintain accuracy while reducing equipment and energy costs. This result indicates the potential for deployment on other edge devices and provides the flexibility to adapt to various hardware environments, depending on diverse needs and resources.
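The abstract describes measuring limb ROM from the joints predicted by a pose-estimation model. A common way to do this (a minimal sketch, not the paper's exact method) is to compute the angle at a middle keypoint from the two limb segments it joins, e.g. shoulder-elbow-wrist for elbow ROM; the keypoint coordinates below are hypothetical:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.
    Points are (x, y) pixel coordinates, e.g. shoulder, elbow, wrist."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to [-1, 1] to guard against floating-point drift before acos
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

# Hypothetical predicted keypoints (pixels): shoulder, elbow, wrist
shoulder, elbow, wrist = (100, 100), (100, 200), (200, 200)
print(joint_angle(shoulder, elbow, wrist))  # 90.0
```

A monocular camera gives only 2D keypoints, so an angle computed this way is the projection of the true joint angle onto the image plane; keeping the limb parallel to the camera, as in a typical goniometer assessment pose, keeps that projection close to the actual ROM.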
dc.description.confidential: false
dc.edition.edition: February-1 2025
dc.format.pagination: 627-
dc.identifier.author-url: https://www.ncbi.nlm.nih.gov/pubmed/39943266
dc.identifier.citation: Yan X, Zhang L, Liu B, Qu G. (2025). Efficient Limb Range of Motion Analysis from a Monocular Camera for Edge Devices. Sensors (Basel). 25. 3. (pp. 627-).
dc.identifier.doi: 10.3390/s25030627
dc.identifier.eissn: 1424-8220
dc.identifier.elements-type: journal-article
dc.identifier.issn: 1424-8220
dc.identifier.number: 627
dc.identifier.pii: s25030627
dc.identifier.uri: https://mro.massey.ac.nz/handle/10179/72554
dc.language: eng
dc.publisher: MDPI (Basel, Switzerland)
dc.publisher.uri: http://mdpi.com/1424-8220/25/3/627
dc.relation.isPartOf: Sensors (Basel)
dc.rights: CC BY 4.0
dc.rights: (c) 2025 The Author/s
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: RGB camera
dc.subject: clinical assessment
dc.subject: edge device
dc.subject: fast deep learning model
dc.subject: joint range of motion
dc.subject: pose estimation
dc.subject: Humans
dc.subject: Range of Motion, Articular
dc.subject: Biomechanical Phenomena
dc.subject: Deep Learning
dc.subject: Neural Networks, Computer
dc.subject: Lower Extremity
dc.title: Efficient Limb Range of Motion Analysis from a Monocular Camera for Edge Devices.
dc.type: Journal article
pubs.elements-id: 499722
pubs.organisational-group: Other
Files
Original bundle
- 499722 PDF.pdf (676.36 KB, Adobe Portable Document Format) — Description: Evidence
License bundle
- license.txt (9.22 KB, Plain Text)