Wearable-Gait-Analysis-Based Activity Recognition: A Review

Published online: 04 Jan 2023
Volume & Issue: Volume 15 (2022), Issue 1 (January 2022)
Submitted: 31 Mar 2022
Introduction

Human activity recognition systems use data obtained from sensors to identify the actions carried out by people. These systems have a wide range of applications in fields such as occupational safety [12, 13, 16, 44], personal fitness and sports [68], rehabilitation [24, 67], elderly care [41, 46, 47, 56], telemedicine [11], and human-computer interaction [18].

Human activity recognition systems are broadly classified into two main groups based on their data collection approaches: vision-based methods [53] and sensor-based methods [36]. Vision-based activity recognition uses images or videos captured by optical sensors to recognize human activities. However, these methods are affected by poor lighting conditions and changing environments [36]. To avoid the shortcomings of vision-based methods, methods that rely on sensors such as pressure sensors, temperature sensors, sound sensors, and radar sensors embedded in the environment were proposed. The most common example of such applications is the implementation of smart homes for assisted living [19, 48]. However, these methods require a large number of sensors, which makes them workable in controlled environments but impractical in free-living environments. This limits the application of activity recognition in daily life.

In recent years, with the development of sensing and computing electronics, activity sensing systems that can be worn on the human body have been proposed. With the help of these wearable activity sensing systems, many disciplines have expanded from controlled environments to free-living environments, which contributes to the recognition of activities in daily life. One such discipline is gait analysis. Gait analysis is the systematic study of human locomotion [57]. It generally involves the measurement, estimation, and analysis of measurable parameters including absolute and relative body angles, positions, movement patterns, and joints’ range of motion [55]. It also includes the gait study of different activities, which can contribute directly to activity recognition.

Since gait analysis is traditionally performed in controlled environments, it is mainly used for the diagnosis of locomotory-related abnormalities, but not for activity recognition purposes. With the help of wearable activity sensing systems, motion data for gait analysis can be collected in free-living environments. This makes it possible to use gait analysis to contribute to activity recognition in daily life.

Many surveys have been conducted over the years on wearable activity recognition systems. However, there is currently no existing review of wearable-gait-analysis-based (WGA-based) activity recognition systems. This study aims to close this gap by highlighting WGA-based activity recognition systems.

Google Scholar was used to search for all papers included in this study. Different combinations of the following keywords were employed in the search: “activity recognition”, “wearable sensors”, “sensors”, “gait analysis”, “gait”, “toe off”, and “heel strike”. The results from the keyword searches were then filtered to include articles published from 2012 to 2022. The top 150 relevant articles from each search were selected, from which wearable-gait-analysis-based activity recognition methods that met the selection criteria were chosen. The complete selection process, as well as the selection criteria, are shown in Fig. 2. As shown in Fig. 3, interest in this field has grown over the past 10 years.

Related Work

A number of surveys have been conducted on activity recognition systems in past years. There have been broader surveys on both vision-based and wearable activity recognition systems covering a range of topics including the type of sensors used, activities recognized, segmentation approaches, classification algorithms, applications, and advantages of the two approaches [6, 21, 37, 50, 58, 62]. Other surveys focused on either vision-based activity recognition methods [4, 7, 9, 54, 60, 65]; sensor-based activity recognition methods which include the use of environment embedded sensors, smartphones, and wearable sensors [17, 49]; or solely on wearable activity recognition methods.

Similar to broader activity recognition surveys, review studies that focus solely on wearable-based activity recognition methods have covered a wide range of topics in this field. Mukhopadhyay et al., for instance, discussed challenges and advancements in wearable activity recognition systems, with a primary focus on the sensors employed by such systems [40]. Abdel-Salam et al. employed a standardized evaluation benchmark to evaluate various wearable activity recognition methods on six publicly available data sets [3]. Similarly, Lara et al. evaluated twenty-eight wearable activity recognition systems in terms of recognition performance, energy consumption, obtrusiveness, and flexibility [25].

With growing interest in the application of wearable activity recognition systems in the healthcare sector, some reviews have highlighted wearable activity recognition methods with applications in the healthcare field. Liu et al., for instance, focused on wearable human activity recognition systems with applications in healthcare [26]. Topics such as sensor types, numbers, and placements, as well as classification algorithms, were discussed in that review.

The use of machine learning algorithms for activity recognition has gained much attention in recent years. As such, a number of review studies have highlighted the various machine learning methods employed by wearable activity recognition systems. Zhang et al., for example, focused on the deep learning methods used by wearable activity recognition systems [64]. Their review highlights the current advancements, developing trends, and major challenges in this field.

Although a wide range of topics has been covered by existing literature reviews, there is currently no review study focused on wearable activity recognition systems that employ knowledge from the field of gait analysis in the activity recognition process. This study seeks to close this gap by highlighting the ways in which gait analysis is employed by current activity recognition systems, including the wearable sensor types and placements commonly used to realize gait-analysis-based segmentation and to extract gait-analysis-based features that distinguish various human activities.

WGA-based activity recognition

As shown in Fig. 1, WGA-based activity recognition systems generally involve four main processes: data collection with wearable sensors, data segmentation, feature extraction, and activity classification. WGA-based activity recognition techniques incorporate gait analysis in one or more of these processes. In this section, WGA-based activity recognition techniques are discussed in terms of these four processes.

Figure 1

Four main steps for WGA-based activity recognition. The pressure sensors and IMU in the “Data Collection” section represent the wearable sensors commonly used in WGA-based activity recognition systems. The plots in the “Data Segmentation” section represent the gait cycle-based method, which segments data by detecting gait cycles, and the fixed non-overlapping sliding window approach, which segments data using fixed time windows. To extract features for activity recognition, knowledge-driven features and data-driven features are frequently used. The icons in the “Classification” section represent examples of activities that can be recognized during the classification phase.

Figure 2

Flow Chart of the Article Selection Process.

Figure 3

Distribution of the WGA-Based Activity Recognition Publications Over Time.

Wearable Sensors

Wearable sensors are sensors that can be worn on various parts of the human body, such as the feet, knees, or hips, to measure data related to human subjects. For wearable gait analysis and wearable activity recognition, wearable sensors that can measure human motion are necessary. As shown in Table 1, the two most frequently used sensor types are inertial measurement units (IMUs) and pressure sensors.

IMUs are the most popular sensors in WGA-based activity recognition systems. They are small, lightweight, and inexpensive, which makes them suitable to be worn on the human body. IMUs are mostly made up of a 3-axis accelerometer, a 3-axis gyroscope, and/or a 3-axis magnetometer. Accelerometers measure acceleration, from which velocity and distance can be calculated. Gyroscopes measure angular velocity and orientation. Magnetometers measure magnetic fields or magnetic dipole moments [5]. By fusing the data from accelerometers, gyroscopes, and/or magnetometers, human motion can be measured with good accuracy.
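As an illustration of this kind of fusion, the following is a minimal sketch that estimates the pitch angle by blending accelerometer and gyroscope readings with a complementary filter; the sampling rate, axis conventions, and filter coefficient are illustrative assumptions, not values taken from the reviewed systems.

```python
import numpy as np

def complementary_pitch(acc, gyro_y, fs=100.0, alpha=0.98):
    """Estimate pitch (deg) by fusing accelerometer and gyroscope data.

    acc    : (N, 3) array of accelerations [ax, ay, az] in m/s^2
    gyro_y : (N,) array of angular velocity about the pitch axis in deg/s
    fs     : sampling rate in Hz (assumed)
    alpha  : weight given to the integrated gyroscope estimate (assumed)
    """
    dt = 1.0 / fs
    # Pitch inferred from the gravity direction: noisy but drift-free.
    acc_pitch = np.degrees(np.arctan2(-acc[:, 0],
                                      np.hypot(acc[:, 1], acc[:, 2])))
    pitch = np.empty(len(gyro_y))
    pitch[0] = acc_pitch[0]
    for k in range(1, len(gyro_y)):
        # Integrate the gyroscope (smooth but drifts), then correct
        # the estimate with the accelerometer reading.
        pitch[k] = alpha * (pitch[k - 1] + gyro_y[k] * dt) \
                   + (1.0 - alpha) * acc_pitch[k]
    return pitch
```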

Different studies employed varying numbers of IMUs and positioned them at different locations. The number of IMUs used by the studies in Table 1 ranged from 1 to 3. Locations on the lower limb, including the thigh, shank, ankle, and foot, are popular choices for many WGA-based activity recognition systems, generally because most human activities are performed with the support of the lower limbs. The necessary data for gait analysis and activity recognition can therefore be acquired with IMUs placed on the lower limbs. For example, as shown in Fig. 4b, Lopez-Nava et al. proposed the use of one IMU placed on the right ankle to collect data for the detection of gait events, such as toe-off and heel-strike, and for the recognition of activities such as level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent [29]. Similarly, the use of a single IMU worn on the ankle to collect data for the recognition of walking, stair ascent, and stair descent activities was presented by McCalmont et al. [35]. As shown in Fig. 4a, Martinez-Hernandez et al. [32] proposed using data from three IMUs attached to the thigh, shank, and foot for the recognition of level-ground walking, ramp ascent, and ramp descent activities.

Figure 4

Commonly used IMU sensor positions. a) Three IMU sensors positioned at the thigh, shank, and foot to capture data for activity recognition [32]. b) A single IMU sensor worn at the ankle for activity recognition [29].

Table 1: Summary of WGA-based Activity Recognition Techniques

| References | Recognized Activities | Wearable Sensors | Data Segmentation | Extracted Features | Classification Method |
| --- | --- | --- | --- | --- | --- |
| Martinez et al. [32] | Level-ground walking, ramp ascent, and ramp descent | 3-axis gyroscope and pressure sensors | Gait cycle-based method | Time-domain features | Adaptive Bayesian inference method |
| McCalmont et al. [35] | Slow walking, normal walking, fast walking, stair ascent, and stair descent | 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, and pressure sensor array | Gait cycle-based method | Time-domain features and gait-based features | Artificial neural network, k-nearest neighbour (KNN), and Random Forest |
| Ng et al. [42] | Walking, sitting, lying, and falling | Sensor tags | Gait cycle-based method | Raw sensor data | KNN and Random Forest |
| Lopez et al. [29] | Level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer | Gait cycle-based method | Time-domain features and frequency-domain features | KNN |
| Chen et al. [14] | Walking, running, standing, sitting, stair ascent, and stair descent | 3-axis accelerometer, 3-axis gyroscope, and pressure sensor array | Gait cycle-based method | Gait-based features | Support vector machine (SVM) |
| Jeong et al. [23] | Level-ground walking, stair ascent, and stair descent | Pressure sensors | Gait cycle-based method | Raw sensor data | SVM |
| Truong et al. [59] | Level-ground walking, stair ascent, and stair descent | Pressure sensors | Gait cycle-based method | Time-domain features | SVM |
| Martinez et al. [33] | Level-ground walking, ramp ascent, and ramp descent | 3-axis accelerometer, 3-axis gyroscope, and pressure sensors | Gait cycle-based method | Time-domain features | Bayesian formulation-based approach |
| Achkar et al. [38] | Level-ground walking, standing, sitting, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, pressure sensors, and barometric sensor | Gait cycle-based method | Gait-based features | Rule-based method |
| Zhao et al. [66] | Level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent | Pressure sensors and electromyography sensors | Gait cycle-based method | Time-domain features | SVM |
| Mazumder et al. [34] | Level-ground walking, fast walking, standing, sitting, stair ascent, stair descent, and ramp ascent | 3-axis accelerometer, 3-axis gyroscope, and pressure sensors | Gait cycle-based method | Time-domain features; polynomial coefficients extracted from hip angle trajectory and centre-of-pressure (CoP) trajectory | SVM |
| Camargo et al. [10] | Level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer, 3-axis gyroscope, goniometer, and electromyography sensor | Gait cycle-based method | Time-domain features and frequency-domain features | Dynamic Bayesian network |
| Ershadi et al. [20] | Toe level-ground walking, normal level-ground walking, sitting, and standing | Pressure sensors | Gait cycle-based method | Time-domain features | Rule-based method |
| Martindale et al. [31] | Level-ground walking, sitting, stair ascent, stair descent, jogging, running, cycling, and jumping | 3-axis accelerometer, 3-axis gyroscope, and pressure sensors | Gait cycle-based method | Raw sensor data | Convolutional neural network (CNN) and recurrent neural network (RNN) |
| Benson et al. [8] | Normal running and fast running | 3-axis accelerometer and 3-axis gyroscope | Gait cycle-based method | Time-domain features, frequency-domain features, and wavelet-based features | SVM |
| Hamdi et al. [22] | Level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer and 3-axis gyroscope | Gait cycle-based method | Gait-based features, time-domain features, frequency-domain features, and wavelet-based features | Random Forest |
| Achkar et al. [39] | Level-ground walking, standing, sitting, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer, 3-axis gyroscope, 3-axis magnetometer, pressure sensors, and barometric sensor | Gait cycle-based method | Gait-based features and time-domain features | Rule-based method |
| Liu et al. [27] | Level-ground walking, ramp ascent, and ramp descent | 3-axis accelerometer, 3-axis gyroscope, and pressure sensors | Gait cycle-based method | Gait-based features | Class-incremental learning method |
| Ngo et al. [2] | Level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent | 3-axis accelerometer and 3-axis gyroscope | Gait cycle-based method | Time-domain features | KNN and SVM |

Pressure sensors are another commonly used sensor type in WGA-based activity recognition systems. They are usually placed beneath the foot and are used to capture foot plantar pressure during the execution of activities. Variations in plantar pressure during various activities provide insights for gait analysis and activity recognition [14, 23, 45].

As with IMUs, varying sensor numbers and positions have been employed in existing works. For example, as shown in Fig. 5a, Chen et al. [14] proposed the use of an insole-shaped pressure sensor array, with 96 pressure sensors evenly distributed on it, to capture plantar pressure with high spatial resolution. The plantar pressure data were used to calculate 26 gait parameters and recognize 6 daily activities. As shown in Fig. 5b, Jeong et al. [23] and Truong et al. [59] proposed using eight pressure sensors distributed at the big toe, metatarsal, and heel positions to collect data for detecting gait cycles and recognizing level-ground walking, stair ascent, and stair descent activities. Mazumder et al., as shown in Fig. 5c, proposed the use of five pressure sensors placed at the heel, toe, and metatarsal positions for detecting gait cycles and recognizing level-ground walking, fast walking, standing, sitting, stair ascent, stair descent, and ramp ascent activities [34]. Similarly, Martinez-Hernandez et al. proposed using four pressure sensors embedded in insoles to detect gait cycles for the recognition of level-ground walking, ramp ascent, and ramp descent activities [33].
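Pressure arrays also yield derived quantities such as the centre-of-pressure (CoP) trajectory, which Mazumder et al. [34] used as a feature source (Table 1). A minimal sketch of CoP computation as the pressure-weighted mean of sensor coordinates is given below; the five-sensor insole layout and its coordinates are hypothetical.

```python
import numpy as np

# Hypothetical insole layout: (x, y) coordinates in cm for five sensors
# at heel, metatarsal, and toe positions (cf. Mazumder et al. [34]).
SENSOR_XY = np.array([[2.0, 1.0],    # heel
                      [2.5, 11.0],   # 1st metatarsal
                      [4.5, 11.0],   # 5th metatarsal
                      [3.5, 12.0],   # mid metatarsal
                      [3.0, 17.0]])  # big toe

def center_of_pressure(pressure):
    """pressure: (N, 5) array of per-sensor readings; returns (N, 2) CoP."""
    pressure = np.asarray(pressure, dtype=float)
    total = pressure.sum(axis=1, keepdims=True)
    total[total == 0] = np.nan          # swing phase: no ground contact
    # Weighted mean of sensor coordinates, one CoP point per sample.
    return (pressure @ SENSOR_XY) / total
```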

Figure 5

Different numbers and locations of pressure sensors used in WGA-based activity recognition systems. a) A pressure sensor array with 96 pressure sensors evenly distributed on it [14]. b) Eight pressure sensors distributed at the big toe, metatarsal, and heel [23, 59]. c) Five pressure sensors placed at the toe, metatarsal, and heel [34].

Data Segmentation

Sensor data segmentation is an important step in the activity recognition process. It can influence the real-time performance and accuracy of activity recognition systems. There are generally two types of data segmentation techniques: the sliding window method and the gait cycle-based method.

The sliding window method is one of the most popular segmentation approaches, especially in non-WGA-based activity recognition methods. It uses a fixed or dynamic time interval to segment time-series sensor data. However, with the sliding window approach, there is a higher tendency to capture two different activities in one data segment, especially during transition phases. When this happens, the data segment will most likely be wrongly labeled as a single activity, reducing the accuracy of the activity recognition system.
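For reference, a minimal sketch of the fixed non-overlapping sliding window approach is given below; the 100 Hz sampling rate and 2 s window length are assumed values, not settings prescribed by the reviewed systems.

```python
import numpy as np

def sliding_windows(signal, fs=100.0, win_s=2.0):
    """Split a (N, channels) signal into fixed, non-overlapping windows.

    A 2 s window at 100 Hz is a common choice, not a value prescribed
    by the reviewed systems. Any activity transition that falls inside
    a window ends up mixed into a single segment.
    """
    win = int(fs * win_s)
    n = len(signal) // win          # discard the incomplete tail
    return [signal[i * win:(i + 1) * win] for i in range(n)]
```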

To address this problem, most WGA-based activity recognition systems use gait cycles to segment sensor data [14, 23, 35, 42]. One full gait cycle begins with a repetitive gait event (e.g., heel-strike or toe-off) and continues until the same gait event occurs again on the same foot. For activities performed with both feet, a human subject can perform only one activity during one gait cycle, so the gait cycle serves as a natural unit for different activities [14]. Therefore, the gait cycle can be used to cleanly separate different activities. Toe-off and heel-strike are the two most frequently used gait events for gait cycle detection.

Both IMUs and pressure sensors can be used to detect gait cycles. With pressure sensors, since the plantar pressure increases sharply at heel-strike and drops sharply at toe-off, a threshold can be used to recognize these two gait events and thereby detect gait cycles. For example, Martinez-Hernandez et al. [32] proposed a threshold-crossing method for recognizing toe-off and heel-strike events and detecting gait cycles. Acceleration data captured by IMUs placed on the ankle can also be used to detect the toe-off and heel-strike events of the gait cycle: the toe-off event can be detected as the initial acceleration in a characteristic peak of the x-axis or z-axis acceleration, and the heel-strike as the deceleration in the characteristic peak of the x-axis acceleration [29, 30].
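A minimal sketch of such threshold-based gait event detection and gait cycle segmentation is shown below, assuming a single summed plantar pressure channel; the threshold and the 0.3 s refractory period are illustrative choices, not values from the cited works.

```python
import numpy as np

def detect_heel_strikes(pressure, threshold, fs=100.0, refractory_s=0.3):
    """Return sample indices where summed plantar pressure rises
    through `threshold` (heel-strike candidates).

    A refractory period suppresses double detections within one step;
    both the threshold and the 0.3 s refractory time are assumptions.
    """
    rising = np.where((pressure[:-1] < threshold)
                      & (pressure[1:] >= threshold))[0] + 1
    events, last = [], -np.inf
    for idx in rising:
        if idx - last >= refractory_s * fs:
            events.append(idx)
            last = idx
    return events

def gait_cycle_segments(signal, heel_strikes):
    """Segment a synchronized sensor signal into one slice per gait cycle
    (heel-strike to the next heel-strike of the same foot)."""
    return [signal[s:e] for s, e in zip(heel_strikes[:-1], heel_strikes[1:])]
```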

Feature Extraction

After segmenting the sensor data, features usually need to be extracted for the classification of activities. Generally, there are two main feature categories: data-driven features and knowledge-driven features.

Data-driven features are the most frequently used features for activity recognition and can be extracted either automatically or manually. Deep learning algorithms can extract data-driven features automatically. For example, Convolutional Neural Networks (CNN) are able to learn complex structures and patterns from the segmented data and automatically extract features for the recognition of activities. Manually extracted data-driven features are also widely used. Such features include time-domain features and frequency-domain features, mostly based on characteristic differences, such as differences in frequency, acceleration, and cycle time, between the activities to be recognized [61]. Examples of time-domain features include the mean value, variance, median, skewness, percentiles, and interquartile range. McCalmont et al. [35] used 30 features, consisting of time-domain features such as the mean and standard deviation of the acceleration and angular velocity signals, in the recognition of slow walking, normal walking, fast walking, stair ascent, and stair descent activities. Frequency-domain features include the signal power, spectral entropy, auto-correlation coefficients, mean frequency, and median frequency. Lopez-Nava et al. [29] extracted the power of the acceleration signal, a frequency-domain feature, from the segmented sensor data to assist in the recognition of level-ground walking, stair ascent, stair descent, ramp ascent, and ramp descent activities. Although data-driven features have proved to be effective in controlled research environments, their performance relies heavily on the collected training dataset. It is challenging for data-driven features to recognize activity varieties that are not included in the training dataset.
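To make this concrete, the sketch below computes a handful of the time- and frequency-domain features named above for a one-dimensional data segment; the exact feature set varies across the reviewed systems.

```python
import numpy as np
from scipy import stats

def segment_features(x, fs=100.0):
    """Compute common time- and frequency-domain features for a 1-D segment."""
    feats = {
        # time-domain features
        "mean": np.mean(x),
        "variance": np.var(x),
        "median": np.median(x),
        "skewness": stats.skew(x),
        "iqr": np.percentile(x, 75) - np.percentile(x, 25),
    }
    # frequency-domain features from the power spectrum of the
    # mean-removed segment
    psd = np.abs(np.fft.rfft(x - np.mean(x))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    p = psd / psd.sum()
    feats["signal_power"] = psd.sum() / len(x)
    feats["spectral_entropy"] = -np.sum(p * np.log2(p + 1e-12))
    feats["mean_frequency"] = np.sum(freqs * p)
    return feats
```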

Knowledge-driven features are complementary to data-driven features. They are representative features extracted based on existing knowledge about the observed data. For activity recognition, knowledge-driven features based on gait analysis have been shown to be effective. Two examples are the “foot contact pitch” and the “double support time”, which can be used to recognize daily activities. The “foot contact pitch” is the pitch angle at the moment the foot initially contacts the ground (i.e., heel-strike). The “double support time” is the period during which both feet are in contact with the ground. According to existing knowledge, the “foot contact pitch” of stair ascent, stair descent, and level-ground walking is −4.7° ± 6.4°, −16.6° ± 4.7°, and 19.0° ± 4.4°, respectively [51]. Therefore, based on the “foot contact pitch”, these three activities can be discriminated. In addition, since the “double support time” of stair ascent, stair descent, level-ground walking, and running accounts for 13.6% ± 1.9%, 11.2% ± 2.3%, 11.1% ± 1.7%, and 0.0% of a whole gait cycle, respectively, it is easy to discriminate running from the other three activities [43, 51]. Studies have demonstrated the effectiveness of knowledge-driven features. Chen et al. [14], for example, extracted two of the three features “foot contact pitch”, “percentage of double support time”, and “pitch angle at midstance” based on knowledge of human gait characteristics to recognize walking, stair ascent, stair descent, and running activities. As shown in Fig. 6a, for walking, the position of the forefoot is significantly higher than the hindfoot during heel-strike. For stair descent (Fig. 6c), the position of the forefoot is significantly lower than the hindfoot. And for stair ascent (Fig. 6b), the foot is almost flat. These posture differences lead to significant differences in the “foot contact pitch”, which enables these three activities to be distinguished. The “percentage of double support time” is the percentage of “double support time” over the total gait cycle time. It can be used to discriminate running from the other three activities because, in contrast to walking activities, running has no “double support time”; instead, it has a phase known as the “double float” phase, during which both feet are off the ground (Fig. 6d). This research achieved 99.8% accuracy in recognizing these activities.
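The sketch below illustrates how such knowledge-driven features could drive a simple rule-based discriminator, using the published foot contact pitch and double support values [43, 51]; the decision thresholds are illustrative midpoints chosen here, not published cut-offs.

```python
def classify_by_gait_knowledge(foot_contact_pitch_deg, double_support_pct):
    """Discriminate four activities with two knowledge-driven gait features.

    Reference values (mean +/- sd) from the literature [43, 51]:
      stair ascent  : pitch -4.7 +/- 6.4 deg, double support 13.6%
      stair descent : pitch -16.6 +/- 4.7 deg, double support 11.2%
      level walking : pitch 19.0 +/- 4.4 deg, double support 11.1%
      running       : no double support (double float instead)
    The thresholds below are illustrative midpoints, not published cut-offs.
    """
    if double_support_pct < 5.0:         # effectively zero: running
        return "running"
    if foot_contact_pitch_deg > 7.0:     # clearly positive pitch
        return "level-ground walking"
    if foot_contact_pitch_deg > -10.0:   # near-flat foot contact
        return "stair ascent"
    return "stair descent"
```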

Figure 6

Foot contact pitch during (a) walking, (b) stair ascent, (c) stair descent, and (d) the double float phase during running. This gait-analysis-based parameter was used by Chen et al. [14] in the recognition of activities.

Classification

Classification is the final step in activity recognition. Artificial intelligence models are the most widely used methods: about 63% of the WGA-based activity recognition systems reviewed in this study employed them. Models such as artificial neural networks (ANN), support vector machines (SVM), k-nearest neighbors (KNN), naive Bayes, and convolutional neural networks (CNN) are among the most common. Lopez-Nava et al. used a KNN classifier to recognize level-ground walking, ramp ascent, ramp descent, stair ascent, and stair descent activities with an accuracy of 85.5% [29]. Jeong et al. employed an SVM classifier to recognize level-ground walking, stair descent, and stair ascent activities [23]; this classifier attained an accuracy of 95.2%. Similarly, Chen et al. achieved an overall accuracy of 99.8% in recognizing walking, running, sitting, standing, stair ascent, and stair descent activities with an SVM classifier [14]. McCalmont et al. conducted a comparative study of ANN, KNN, and Random Forest classifiers [35]: the ANN classifier achieved the highest accuracy of 80%, with both KNN and Random Forest achieving an accuracy of 70% in recognizing slow, normal, and fast walking, stair ascent, and stair descent activities.
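As a concrete illustration of this final step, the sketch below trains and evaluates SVM and KNN classifiers on per-gait-cycle feature vectors using scikit-learn; the feature matrix, labels, and hyperparameters are placeholders, not settings reported by the reviewed studies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier

# X: one feature vector per gait cycle; y: one activity label per cycle.
# Random placeholders stand in for real extracted features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 8))
y = rng.integers(0, 5, size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("SVM", SVC(kernel="rbf", C=1.0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    # Standardize features so distance- and margin-based classifiers
    # are not dominated by large-scale features.
    model = make_pipeline(StandardScaler(), clf)
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", model.score(X_te, y_te))
```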

Discussion

This study demonstrates how gait analysis can be used to contribute to wearable activity recognition. In this section, the limitations of current research and the potential opportunities for future research in the field of WGA-based activity recognition will be discussed.

Wearable Sensor Types, Numbers, and Locations

The most frequently used wearable sensors in WGA-based activity recognition systems are IMUs and pressure sensors; the use of other sensor types has not yet been explored. Although IMUs and pressure sensors have been demonstrated to be effective for recognizing simple daily activities [14, 35], these two sensor types have shortcomings. For example, the accuracy of IMUs suffers from drift [15], especially during long-term measurement, and the accuracy of pressure sensors is influenced by the contact environment (e.g., soft or hard ground).

One way to improve the performance of WGA-based activity recognition systems is to use additional types of wearable sensors. For instance, to capture data for recognizing activities that involve changes in altitude and posture, the use of barometers could be explored. Barometers (Fig. 7a) capture changing atmospheric pressure, from which changing altitude can be detected [52]. Rodriguez-Martin et al. [52], for example, proposed using barometers together with accelerometers for activity recognition; adding the barometers increased the accuracy of detected posture transitions and falls by up to 11%. Another wearable sensor type that can be used in WGA-based activity recognition systems is the strain sensor (Fig. 7b). For wearable activity recognition systems that must be worn for a long time, people prefer sensors embedded into their clothing or accessories rather than wearing the system separately. Strain sensors have gained much attention due to their flexibility, light weight, and ability to be integrated into clothing or mounted directly on the skin [63]. In addition, they can detect elbow, wrist, and finger joint movements and can thus be employed in recognizing activities of higher complexity that involve these body parts.
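As a sketch of how barometric readings map to the altitude changes mentioned above, the snippet below applies the international barometric formula; the sea-level reference pressure is an assumed default, not a value from the cited study.

```python
def pressure_to_altitude(p_hpa, p0_hpa=1013.25):
    """Approximate altitude (m) from barometric pressure (hPa) using the
    international barometric formula; p0_hpa is the sea-level reference
    pressure (assumed standard atmosphere)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# Example: a small pressure drop corresponds to climbing a few metres,
# which is the signal used to detect stair or ramp ascent.
print(pressure_to_altitude(1012.9) - pressure_to_altitude(1013.25))
```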

Figure 7

Other wearable sensor types which can be employed in activity recognition. a) Barometer [1] b) Strain sensor [28].

As discussed in the section “Wearable Sensors”, different studies applied different numbers of sensors on different body locations for activity recognition. However, for wearable systems, low cost, high accuracy, and long battery life are important performance parameters. Currently, there is no standard to follow in terms of the wearable activity recognition system design. More research will be necessary to explore the optimal sensor number and locations to achieve the best performance in cost, accuracy, and battery life.

Gait Features

Gait-related features can be used to effectively recognize human activities [14]. These knowledge-based features help improve the generalization performance of activity recognition models. For example, in most scenarios running is faster than walking, so a data-driven model for discriminating walking from running might put a high weight on speed. However, walking and running are properly discriminated not by speed but by double support time [43]. When tested on new data, the performance of models built with data-driven features might therefore decrease, whereas that of models built with knowledge-based features such as the double support time should not. In recent years, more research has applied the gait cycle for data segmentation. For feature extraction, however, findings from this study indicate that very few existing recognition systems employ gait-analysis-based features. In a study by Chen et al. [14], the use of gait features enabled the recognition of human activities with relatively fewer features than systems employing non-gait-analysis-based features. Considering the advantages of gait features in generalization and efficiency, they can be further explored and applied in the future to contribute to activity recognition-related applications.

Conclusion

In this study, existing WGA-based activity recognition systems from recent years were reviewed. Important topics related to WGA-based activity recognition, including wearable sensors, data segmentation, feature extraction, and classification, were discussed. The ways in which gait analysis can assist activity recognition were summarized and highlighted. Finally, limitations of current research and potential opportunities for future research were discussed to help inform future research endeavors in this field.



high-precision-pressure-sensor-height-sensor-module. https://robu.in/product/gy-63ms5611-01ba03-high-precision-pressure-sensor-height-sensor-module/. high-precision-pressure-sensor-height-sensor-module https://robu.in/product/gy-63ms5611-01ba03-high-precision-pressure-sensor-height-sensor-module/. Search in Google Scholar

“Similar gait action recognition using an inertial sensor”, Pattern Recognition, 48(4):1289–1301, 2015. “Similar gait action recognition using an inertial sensor” Pattern Recognition 48 4 1289 1301 2015 10.1016/j.patcog.2014.10.012 Search in Google Scholar

R. Abdel-Salam, R. Mostafa, and M. Hadhood, “Human activity recognition using wearable sensors: Review, challenges, evaluation benchmark”, ArXiv, abs/2101.01665, 2021. Abdel-SalamR. MostafaR. HadhoodM. “Human activity recognition using wearable sensors: Review, challenges, evaluation benchmark” ArXiv abs/2101.01665, 2021 10.1007/978-981-16-0575-8_1 Search in Google Scholar

J. K. Aggarwal and L. Xia, “Human activity recognition from 3d data: A review”, Pattern Recognition Letters, 48 (Celebrating the life and work of Maria Petrou.):70–80, 2014. AggarwalJ. K. XiaL. “Human activity recognition from 3d data: A review” Pattern Recognition Letters 48 (Celebrating the life and work of Maria Petrou.) 70 80 2014 10.1016/j.patrec.2014.04.011 Search in Google Scholar

N. Ahmad, R. Ariffin Bin Raja Ghazilla, N. Mohd Khairi, and V. Kasi, “Reviews on various inertial measurement unit (imu) sensor applications”, SiPS 2013, 2013. AhmadN. Ariffin Bin Raja GhazillaR. Mohd KhairiN. KasiV. “Reviews on various inertial measurement unit (imu) sensor applications” SiPS 2013 2013 10.12720/ijsps.1.2.256-262 Search in Google Scholar

O. C. Ann and L. B. Theng, “Human activity recognition: A review”, 2014 IEEE International Conference on Control System, Computing and Engineering (ICCSCE 2014), pp. 389–393, 2014. AnnO. C. ThengL. B. “Human activity recognition: A review” 2014 IEEE International Conference on Control System, Computing and Engineering (ICCSCE 2014) 389 393 2014 10.1109/ICCSCE.2014.7072750 Search in Google Scholar

D. J. Beddiar, B. Nini, M. Sabokrou, and A. Hadid, “Vision-based human activity recognition: A survey”, Multimedia Tools Appl., 79(41–42):30509–30555, Nov. 2020. BeddiarD. J. NiniB. SabokrouM. HadidA. “Vision-based human activity recognition: A survey” Multimedia Tools Appl 79 41–42 30509 30555 Nov. 2020 10.1007/s11042-020-09004-3 Search in Google Scholar

L. C. Benson, C. A. Clermont, S. T. Osis, D. Kobsar, and R. Ferber, “Classifying running speed conditions using a single wearable sensor: Optimal segmentation and feature extraction methods”, Journal of Biomechanics, 71:94–99, 2018. BensonL. C. ClermontC. A. OsisS. T. KobsarD. FerberR. “Classifying running speed conditions using a single wearable sensor: Optimal segmentation and feature extraction methods” Journal of Biomechanics 71 94 99 2018 10.1016/j.jbiomech.2018.01.03429454542 Search in Google Scholar

A. Bux, P. Angelov, and Z. Habib, “Vision based human activity recognition: A review”, in Plamen Angelov, Alexander Gegov, Chrisina Jayne, and Qiang Shen, editors, Advances in Computational Intelligence Systems, pp. 341–371, Cham, 2017. BuxA. AngelovP. HabibZ. “Vision based human activity recognition: A review” in AngelovPlamen GegovAlexander JayneChrisina ShenQiang editors, Advances in Computational Intelligence Systems 341 371 Cham 2017 10.1007/978-3-319-46562-3_23 Search in Google Scholar

J. Camargo, W. Flanagan, N. Csomay-Shanklin, B. Kanwar, and A. Young, “A machine learning strategy for locomotion classification and parameter estimation using fusion of wearable sensors”, IEEE Transactions on Biomedical Engineering, 68(5):1569–1578, 2021. CamargoJ. FlanaganW. Csomay-ShanklinN. KanwarB. YoungA. “A machine learning strategy for locomotion classification and parameter estimation using fusion of wearable sensors” IEEE Transactions on Biomedical Engineering 68 5 1569 1578 2021 10.1109/TBME.2021.306580933710951 Search in Google Scholar

D. Castro, W. Coral, C. Rodriguez, J. Cabra, and J. Colorado, “Wearable-based human activity recognition using an iot approach”, Journal of Sensor and Actuator Networks, 6(4), 2017. CastroD. CoralW. RodriguezC. CabraJ. ColoradoJ. “Wearable-based human activity recognition using an iot approach” Journal of Sensor and Actuator Networks 6 4 2017 10.3390/jsan6040028 Search in Google Scholar

D. Chen, G. Asaeikheybari, H. Chen, W. Xu, and M.-C. Huang, “Ubiquitous fall hazard identification with smart insole”, IEEE journal of biomedical and health informatics, 2020. ChenD. AsaeikheybariG. ChenH. XuW. HuangM.-C. “Ubiquitous fall hazard identification with smart insole” IEEE journal of biomedical and health informatics 2020 10.1109/JBHI.2020.304670133351772 Search in Google Scholar

D. Chen, Y. Cai, J. Cui, J. Chen, H. Jiang, and M.-C. Huang, “Risk factors identification and visualization or work-related musculoskeletal disorders with wearable and connected gait analytics system and Kinect skeleton models”, SmartHealth, 7:60–77, 2018. ChenD. CaiY. CuiJ. ChenJ. JiangH. HuangM.-C. “Risk factors identification and visualization or work-related musculoskeletal disorders with wearable and connected gait analytics system and Kinect skeleton models” SmartHealth 7 60 77 2018 10.1016/j.smhl.2018.05.003 Search in Google Scholar

D. Chen, Y. Cai, X. Qian, R. Ansari, W. Xu, K.-C. Chu, and M.-C. Huang, “Bring gait lab to everyday life: Gait analysis in terms of activities of daily living”, IEEE Internet of Things Journal, 7(2):1298–1312, 2020. ChenD. CaiY. QianX. AnsariR. XuW. ChuK.-C. HuangM.-C. “Bring gait lab to everyday life: Gait analysis in terms of activities of daily living” IEEE Internet of Things Journal 7 2 1298 1312 2020 10.1109/JIOT.2019.2954387 Search in Google Scholar

D. Chen, H. Cao, H. Chen, Z. Zhu, X. Qian, W. Xu, and M.-C. Huang, “Smart insole-based indoor localization system for internet of things applications”, IEEE Internet of Things Journal, 6(4):7253–7265, 2019. ChenD. CaoH. ChenH. ZhuZ. QianX. XuW. HuangM.-C. “Smart insole-based indoor localization system for internet of things applications” IEEE Internet of Things Journal 6 4 7253 7265 2019 10.1109/JIOT.2019.2915791 Search in Google Scholar

D. Chen, J. Chen, H. Jiang, and M.-C. Huang, “Risk factors identification for work-related musculoskeletal disorders with wearable and connected gait analytics system”, in 2017 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE), pp. 330–339, IEEE, 2017. ChenD. ChenJ. JiangH. HuangM.-C. “Risk factors identification for work-related musculoskeletal disorders with wearable and connected gait analytics system” in 2017 IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE) 330 339 IEEE 2017 10.1109/CHASE.2017.116 Search in Google Scholar

L. Chen and C. D. Nugent, Sensor-Based Activity Recognition Review, Springer International Publishing, Cham, pp. 23–47, 2019. ChenL. NugentC. D. Sensor-Based Activity Recognition Review Springer International Publishing Cham 23 47 2019 10.1007/978-3-030-19408-6_2 Search in Google Scholar

T. Chu, A. Chua, and E. Secco, “A wearable myo gesture armband controlling sphero bb-8 robot”, HighTech and Innovation Journal, 1, 10, 2020. ChuT. ChuaA. SeccoE. “A wearable myo gesture armband controlling sphero bb-8 robot” HighTech and Innovation Journal 1 10 2020 10.28991/HIJ-2020-01-04-05 Search in Google Scholar

S. Eisa and A. Moreira, “A behaviour monitoring system (bms) for ambient assisted living”, Sensors, 17(9), 2017. EisaS. MoreiraA. “A behaviour monitoring system (bms) for ambient assisted living” Sensors 17 9 2017 10.3390/s17091946562073628837105 Search in Google Scholar

G. Ershadi, M. Gwak, A. Aminian, R. Soangra, M. GrantBeuttler, and M. Sarrafzadeh, “Smart insole: Remote gait detection algorithm using pressure sensors for toe walking rehabilitation”, in 2021 IEEE 7th World Forum on Internet of Things (WF-IoT), pp. 332–337, 2021. ErshadiG. GwakM. AminianA. SoangraR. GrantBeuttlerM. SarrafzadehM. “Smart insole: Remote gait detection algorithm using pressure sensors for toe walking rehabilitation” in 2021 IEEE 7th World Forum on Internet of Things (WF-IoT) 332 337 2021 10.1109/WF-IoT51360.2021.9595676 Search in Google Scholar

A. Gupta, K. Gupta, K. Gupta, and K. Gupta, “A survey on human activity recognition and classification”, in 2020 International Conference on Communication and Signal Processing (ICCSP), pp. 0915–0919, 2020. GuptaA. GuptaK. GuptaK. GuptaK. “A survey on human activity recognition and classification” in 2020 International Conference on Communication and Signal Processing (ICCSP) 0915 0919 2020 Search in Google Scholar

M. M. Hamdi, M. I. Awad, M. M. Abdelhameed, and F. A. Tolbah, “Lower limb gait activity recognition using inertial measurement units for rehabilitation robotics”, in 2015 International Conference on Advanced Robotics (ICAR), pp. 316–322, 2015. HamdiM. M. AwadM. I. AbdelhameedM. M. TolbahF. A. “Lower limb gait activity recognition using inertial measurement units for rehabilitation robotics” in 2015 International Conference on Advanced Robotics (ICAR) 316 322 2015 10.1109/ICAR.2015.7251474 Search in Google Scholar

G.-M. Jeong, P. H. Truong, and S.-I. Choi, “Classification of three types of walking activities regarding stairs using plantar pressure sensors”, IEEE Sensors Journal, 17(9):2638–2639, 2017. JeongG.-M. TruongP. H. ChoiS.-I. “Classification of three types of walking activities regarding stairs using plantar pressure sensors” IEEE Sensors Journal 17 9 2638 2639 2017 10.1109/JSEN.2017.2682322 Search in Google Scholar

E. Kantoch, “Human activity recognition for physical rehabilitation using wearable sensors fusion and artificial neural networks”, in 2017 Computing in Cardiology (CinC), pp. 1–4, 2017. KantochE. “Human activity recognition for physical rehabilitation using wearable sensors fusion and artificial neural networks” in 2017 Computing in Cardiology (CinC) 1 4 2017 10.22489/CinC.2017.296-332 Search in Google Scholar

O. D. Lara and M. A. Labrador, “A survey on human activity recognition using wearable sensors”, IEEE Communications Surveys Tutorials, 15(3):1192–1209, 2013. LaraO. D. LabradorM. A. “A survey on human activity recognition using wearable sensors” IEEE Communications Surveys Tutorials 15 3 1192 1209 2013 10.1109/SURV.2012.110112.00192 Search in Google Scholar

R. Liu, A. A. Ramli, H. Zhang, E. Datta, and X. Liu, “An overview of human activity recognition using wearable sensors: Healthcare and artificial intelligence”, CoRR, abs/2103.15990, 2021. LiuR. RamliA. A. ZhangH. DattaE. LiuX. “An overview of human activity recognition using wearable sensors: Healthcare and artificial intelligence” CoRR abs/2103.15990, 2021 10.1007/978-3-030-96068-1_1 Search in Google Scholar

X. Liu and Q. Wang, “Incrementally classifying different walking activities based on wearable sensors”, in 2021 27th International Conference on Mechatronics and Machine Vision in Practice (M2VIP), pp. 699–704, 2021. LiuX. WangQ. “Incrementally classifying different walking activities based on wearable sensors” in 2021 27th International Conference on Mechatronics and Machine Vision in Practice (M2VIP) 699 704 2021 10.1109/M2VIP49856.2021.9665024 Search in Google Scholar

Y. Liu, J. Huang, G. Ding, and Z. Yang, “High-performance and wearable strain sensors based on graphene microfluidics and serpentine microchannels for human motion detection”, Microelectronic Engineering, 231:111402, 2020. LiuY. HuangJ. DingG. YangZ. “High-performance and wearable strain sensors based on graphene microfluidics and serpentine microchannels for human motion detection” Microelectronic Engineering 231 111402 2020 10.1016/j.mee.2020.111402 Search in Google Scholar

I. H. López-Nava, M. Garcia-Constantino, and J. Favela, “Recognition of gait activities using acceleration data from a smartphone and a wearable device”, in UCAmI, 2019. López-NavaI. H. Garcia-ConstantinoM. FavelaJ. “Recognition of gait activities using acceleration data from a smartphone and a wearable device” in UCAmI 2019 10.3390/proceedings2019031060 Search in Google Scholar

I. H. López-Nava, A. Muñoz-Meléndez, A. I. Pérez Sanpablo, A. Alessi Montero, I. Quiñones Urióstegui, and L. Núñez Carrera, “Estimation of temporal gait parameters using bayesian models on acceleration signals”, Computer Methods in Biomechanics and Biomedical Engineering, 19(4):396–403, 2016. PMID: 25876180 López-NavaI. H. Muñoz-MeléndezA. Pérez SanpabloA. I. Alessi MonteroA. Quiñones UriósteguiI. Núñez CarreraL. “Estimation of temporal gait parameters using bayesian models on acceleration signals” Computer Methods in Biomechanics and Biomedical Engineering 19 4 396 403 2016 PMID: 25876180 10.1080/10255842.2015.103294525876180 Search in Google Scholar

C. F. Martindale, V. Christlein, P. Klumpp, and B. M. Eskofier, “Wearables-based multi-task gait and activity segmentation using recurrent neural networks”, Neurocomputing, 432:250–261, 2021. MartindaleC. F. ChristleinV. KlumppP. EskofierB. M. “Wearables-based multi-task gait and activity segmentation using recurrent neural networks” Neurocomputing 432 250 261 2021 10.1016/j.neucom.2020.08.079 Search in Google Scholar

U. Martinez-Hernandez and A. A. Dehghani-Sanij, “Adaptive bayesian inference system for recognition of walking activities and prediction of gait events using wearable sensors”, Neural Networks, 102:107–119, 2018. Martinez-HernandezU. Dehghani-SanijA. A. “Adaptive bayesian inference system for recognition of walking activities and prediction of gait events using wearable sensors” Neural Networks 102 107 119 2018 10.1016/j.neunet.2018.02.01729567532 Search in Google Scholar

U. Martinez-Hernandez, I. Mahmood, and A. A. Dehghani-Sanij, “Simultaneous bayesian recognition of locomotion and gait phases with wearable sensors”, IEEE Sensors Journal, 18(3):1282–1290, 2018. Martinez-HernandezU. MahmoodI. Dehghani-SanijA. A. “Simultaneous bayesian recognition of locomotion and gait phases with wearable sensors” IEEE Sensors Journal 18 3 1282 1290 2018 10.1109/JSEN.2017.2782181 Search in Google Scholar

O. Mazumder, A. S. Kundu, P. K. Lenka, and S. Bhaumik, “Ambulatory activity classification with dendogram-based support vector machine: Application in lower-limb active exoskeleton”, Gait Posture, 50:53–59, 2016. MazumderO. KunduA. S. LenkaP. K. BhaumikS. “Ambulatory activity classification with dendogram-based support vector machine: Application in lower-limb active exoskeleton” Gait Posture 50 53 59 2016 10.1016/j.gaitpost.2016.08.01027585182 Search in Google Scholar

G. McCalmont, P. Morrow, H. Zheng, A. Samara, S. Yasaei, H. Wang, and S. McClean, “ezigait: Toward an ai gait analysis and assistant system”, in 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM), pp. 2280–2286, 2018. McCalmontG. MorrowP. ZhengH. SamaraA. YasaeiS. WangH. McCleanS. “ezigait: Toward an ai gait analysis and assistant system” in 2018 IEEE International Conference on Bioinformatics and Biomedicine (BIBM) 2280 2286 2018 Search in Google Scholar

L. M. Dang, K. Min, H. Wang, Md. JalilPiran, C. Lee, and H. Moon, “Sensor-based and vision-based human activity recognition: A comprehensive survey”, Pattern Recognition, 108:107561, 2020. DangL. M. MinK. WangH. JalilPiranMd. LeeC. MoonH. “Sensor-based and vision-based human activity recognition: A comprehensive survey” Pattern Recognition 108 107561, 2020 10.1016/j.patcog.2020.107561 Search in Google Scholar

L. M. Dang, K. Min, H. Wang, Md. JalilPiran, C. Lee, and H. Moon, “Sensor-based and vision-based human activity recognition: A comprehensive survey”, Pattern Recognition, 108:107561, 2020. DangL. M. MinK. WangH. JalilPiranMd. LeeC. MoonH. “Sensor-based and vision-based human activity recognition: A comprehensive survey” Pattern Recognition 108 107561, 2020 10.1016/j.patcog.2020.107561 Search in Google Scholar

C. M. el Achkar, C. Lenoble-Hoskovec, A. Paraschiv-Ionescu, K. Major, C. Büla, and K. Aminian, “Physical behavior in older persons during daily life: Insights from instrumented shoes”, Sensors, 16:1225, August 2016. el AchkarC. M. Lenoble-HoskovecC. Paraschiv-IonescuA. MajorK. BülaC. AminianK. “Physical behavior in older persons during daily life: Insights from instrumented shoes” Sensors 16 1225 August 2016 10.3390/s16081225501739027527172 Search in Google Scholar

C. M. el Achkar, C. Lenoble-Hoskovec, A. Paraschiv-Ionescu, K. Major, C. Büla, and K. Aminian, “Instrumented shoes for activity classification in the elderly”, Gait Posture, 44:12–17, 2016. el AchkarC. M. Lenoble-HoskovecC. Paraschiv-IonescuA. MajorK. BülaC. AminianK. “Instrumented shoes for activity classification in the elderly” Gait Posture 44 12 17 2016 10.1016/j.gaitpost.2015.10.01627004626 Search in Google Scholar

S. C. Mukhopadhyay, “Wearable sensors for human activity monitoring: A review”, IEEE Sensors Journal, 15(3):1321–1330, 2015. MukhopadhyayS. C. “Wearable sensors for human activity monitoring: A review” IEEE Sensors Journal 15 3 1321 1330 2015 10.1109/JSEN.2014.2370945 Search in Google Scholar

S. C. Mukhopadhyay and T. Islam, “Wearable sensors; applications, design and implementation”, 2017. MukhopadhyayS. C. IslamT. “Wearable sensors; applications, design and implementation” 2017 Search in Google Scholar

Y. Ng, X. Jiang, Y. Zhang, S. Shin, and R. Ning, “Automated activity recognition with gait positions using machine learning algorithms”, Engineering, Technology Applied Science Research, 9:4554–4560, August 2019. NgY. JiangX. ZhangY. ShinS. NingR. “Automated activity recognition with gait positions using machine learning algorithms” Engineering, Technology Applied Science Research 9 4554 4560 August 2019 10.48084/etasr.2952 Search in Google Scholar

T. F. Novacheck, “The biomechanics of running”, Gait & Posture, 7(1):77–95, 1998. doi: 10.1016/S0966-6362(97)00038-6

C. I. Nwakanma, F. B. Islam, M. P. Maharani, J.-M. Lee, and D.-S. Kim, “Detection and classification of human activity for emergency response in smart factory shop floor”, Applied Sciences, 11(8):3662, 2021. doi: 10.3390/app11083662

M. N. Orlin and T. G. McPoil, “Plantar pressure assessment”, Physical Therapy, 80(4):399–409, 2000. doi: 10.1093/ptj/80.4.399

S. Paraschiakos, R. Cachucho, M. Moed, D. van Heemst, S. Mooijaart, E. Slagboom, A. Knobbe, and M. Beekman, “Activity recognition using wearable sensors for tracking the elderly”, User Modeling and User-Adapted Interaction, 2020. doi: 10.1007/s11257-020-09268-2

X. Qian, H. Cheng, D. Chen, Q. Liu, H. Chen, H. Jiang, and M.-C. Huang, “The smart insole: A pilot study of fall detection”, in EAI International Conference on Body Area Networks, pp. 37–49, 2019. doi: 10.1007/978-3-030-34833-5_4

J. Rafferty, C. D. Nugent, J. Liu, and L. Chen, “From activity recognition to intention recognition for assisted living within smart homes”, IEEE Transactions on Human-Machine Systems, 47(3):368–379, 2017. doi: 10.1109/THMS.2016.2641388

E. Ramanujam, T. Perumal, and S. Padmavathi, “Human activity recognition with smartphone and wearable sensors using deep learning techniques: A review”, IEEE Sensors Journal, 21(12):13029–13040, 2021. doi: 10.1109/JSEN.2021.3069927

S. Ranasinghe, F. Al Machot, and H. C. Mayr, “A review on applications of activity recognition systems with regard to performance and evaluation”, International Journal of Distributed Sensor Networks, 12(8):1550147716665520, 2016. doi: 10.1177/1550147716665520

R. Riener, M. Rabuffetti, and C. Frigo, “Stair ascent and descent at different inclinations”, Gait & Posture, 15(1):32–44, 2002. doi: 10.1016/S0966-6362(01)00162-X

D. Rodríguez-Martín, A. Samà, C. Pérez-López, A. Català, and J. Cabestany, “Posture transition analysis with barometers: Contribution to accelerometer-based algorithms”, Neural Computing and Applications, 32:335–349, 2018. doi: 10.1007/s00521-018-3759-8

A. Sarabu and A. Santra, “Human action recognition in videos using convolution long short-term memory network with spatio-temporal networks”, Emerging Science Journal, 5:25–33, 2021. doi: 10.28991/esj-2021-01254

A. B. Sargana, P. Angelov, and Z. Habib, “Vision based human activity recognition: A review”, vol. 513, pp. 341–371, 2017. doi: 10.1007/978-3-319-46562-3_23

S. Sharif, I. Murray, and G. Lee, “Validation of foot pitch angle estimation using inertial measurement unit against marker-based optical 3D motion capture system”, Biomedical Engineering Letters, 8, 2018. doi: 10.1007/s13534-018-0072-5

N. K. Suryadevara and S. C. Mukhopadhyay, “Assistive Technology for the Elderly”, Academic Press, 2020.

W. Tao, T. Liu, R. Zheng, and H. Feng, “Gait analysis using wearable sensors”, Sensors, 12(2):2255–2283, 2012. doi: 10.3390/s120202255

Tina, A. K. Sharma, S. Tomar, and K. Gupta, “Various approaches of human activity recognition: A review”, in 2021 5th International Conference on Computing Methodologies and Communication (ICCMC), pp. 1668–1676, 2021.

P. H. Truong, S. You, S.-H. Ji, and G.-M. Jeong, “Adaptive accumulation of plantar pressure for ambulatory activity recognition and pedestrian identification”, Sensors, 21(11):3842, 2021. doi: 10.3390/s21113842

M. Vrigkas, C. Nikou, and I. A. Kakadiaris, “A review of human activity recognition methods”, Frontiers in Robotics and AI, 2:28, 2015. doi: 10.3389/frobt.2015.00028

C. Wang, J. Z. Zhang, Z. Wang, and J. Wang, “Position-independent activity recognition model for smartphone based on frequency domain algorithm”, in Proceedings of 2013 3rd International Conference on Computer Science and Network Technology, pp. 396–399, 2013. doi: 10.1109/ICCSNT.2013.6967138

S. K. Yadav, K. Tiwari, H. M. Pandey, and S. Ali Akbar, “A review of multimodal human activity recognition with special emphasis on classification, applications, challenges and future directions”, Knowledge-Based Systems, 223:106970, 2021. doi: 10.1016/j.knosys.2021.106970

S. Yang, C. Li, X. Chen, Y. Zhao, H. Zhang, N. Wen, Z. Fan, and L. Pan, “Facile fabrication of high-performance pen ink-decorated textile strain sensors for human motion detection”, ACS Applied Materials & Interfaces, 12(17):19874–19881, 2020. doi: 10.1021/acsami.9b22534

S. Zhang, Y. Li, S. Zhang, F. Shahabi, S. Xia, Y. Deng, and N. Alshurafa, “Deep learning in human activity recognition with wearable sensors: A review on advances”, Sensors, 22(4):1476, 2022. doi: 10.3390/s22041476

S. Zhang, Z. Wei, J. Nie, L. Huang, S. Wang, and Z. Li, “A review on human activity recognition using vision-based method”, Journal of Healthcare Engineering, 2017:1–31, 2017. doi: 10.1155/2017/3090343

Y. Zhao, J. Wang, Y. Zhang, H. Liu, Z. Chen, Y. Lu, Y. Dai, L. Xu, and S. Gao, “Flexible and wearable EMG and PSD sensors enabled locomotion mode recognition for IoHT-based in-home rehabilitation”, IEEE Sensors Journal, 21(23):26311–26319, 2021. doi: 10.1109/JSEN.2021.3058429

J. Zheng, H. Cao, D. Chen, R. Ansari, K.-C. Chu, and M.-C. Huang, “Designing deep reinforcement learning systems for musculoskeletal modeling and locomotion analysis using wearable sensor feedback”, IEEE Sensors Journal, 20(16):9274–9282, 2020. doi: 10.1109/JSEN.2020.2986768

Z. Zhuang and Y. Xue, “Sport-related human activity detection and recognition using a smartwatch”, Sensors, 19(22):5001, 2019. doi: 10.3390/s19225001
