Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/52548
Full metadata record
DC Field | Value | Language
dc.contributor.author | Yuphawadee Nathasitsophon | en_US
dc.contributor.author | Sansanee Auephanwiriyakul | en_US
dc.contributor.author | Nipon Theera-Umpon | en_US
dc.date.accessioned | 2018-09-04T09:26:58Z | -
dc.date.available | 2018-09-04T09:26:58Z | -
dc.date.issued | 2013-08-22 | en_US
dc.identifier.other | 2-s2.0-84881631183 | en_US
dc.identifier.other | 10.1109/ISIE.2013.6563711 | en_US
dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84881631183&origin=inward | en_US
dc.identifier.uri | http://cmuir.cmu.ac.th/jspui/handle/6653943832/52548 | -
dc.description.abstract | One of the health issues facing elderly people is injury from falls, and some of these injuries can lead to death. A good fall detection algorithm is therefore needed to reduce the time it takes for a helper to reach the victim. In this paper, we develop a fall detection algorithm using a linear prediction model with a tri-axis accelerometer. We test the algorithm on a data set of 11 activities (standing, walking, jumping, falling, running, lying, sitting, getting up (from lying to standing or from sitting to standing), going down (from standing to sitting), accelerating, and decelerating) collected from 17 subjects. The results show that we detect all fall activities in both the training and blind test data sets, with precisions of 90.72% and 93.69%, respectively. The results also show that we correctly detect 89.77% and 93.27% of the other activities. Although some false alarms occur, the false alarm rate is small. © 2013 IEEE. | en_US
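The abstract describes the approach only at a high level. As an illustration of the general idea, a minimal sketch of a linear-prediction-based detector on an accelerometer magnitude signal might look like the following; the model order, the error threshold, and the synthetic trace are assumptions for illustration, not the paper's actual parameters or data:

```python
import numpy as np

def fit_lp_coeffs(signal, order=4):
    # Linear prediction: fit coefficients by least squares so that
    # x[t] is predicted from the previous `order` samples.
    X = np.array([signal[t - order:t] for t in range(order, len(signal))])
    y = signal[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

def prediction_error(signal, coeffs):
    # Absolute residual between each sample and its linear prediction.
    order = len(coeffs)
    preds = np.array([signal[t - order:t] @ coeffs
                      for t in range(order, len(signal))])
    return np.abs(signal[order:] - preds)

# Synthetic accelerometer-magnitude trace (in g): quiet standing (~1 g)
# followed by a brief spike standing in for a fall impact.
rng = np.random.default_rng(0)
quiet = 1.0 + 0.02 * rng.standard_normal(200)
signal = np.concatenate([quiet, [3.5, 0.2, 1.0],
                         1.0 + 0.02 * rng.standard_normal(50)])

coeffs = fit_lp_coeffs(quiet)          # model fitted on "normal" motion only
err = prediction_error(signal, coeffs)
alarm = err > 0.5                      # assumed threshold, not from the paper
flagged = np.flatnonzero(alarm) + len(coeffs)
print("fall-like samples flagged at indices:", flagged)
```

The sketch flags samples whose prediction residual is large, i.e. samples the normal-motion model cannot explain; the spike around the simulated impact exceeds the threshold while the quiet segments do not.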
dc.subject | Engineering | en_US
dc.title | Fall detection algorithm using linear prediction model | en_US
dc.type | Conference Proceeding | en_US
article.title.sourcetitle | IEEE International Symposium on Industrial Electronics | en_US
article.stream.affiliations | Chiang Mai University | en_US
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.