Core Concepts
Utilizing temporal relations in domain adaptation enhances cross-user activity recognition.
Abstract
The article introduces the Deep Generative Domain Adaptation with Temporal Attention (DGDATA) method for cross-user Human Activity Recognition (HAR). It addresses data distribution discrepancies between users by integrating temporal relations into the domain adaptation process. The method combines generative models with a Temporal Relation Attention mechanism to improve classification performance. Evaluation on three public sensor-based HAR datasets demonstrates the efficacy of DGDATA in recognizing activities across different users.
Structure:
Introduction to Human Activity Recognition (HAR)
Challenges in current HAR methods
Importance of domain adaptation in cross-user HAR
Introduction of DGDATA method
Components of DGDATA: Fine-grained feature representation, Common temporal relations characterization, Classifier learning across users
Experimental setup and comparison with traditional and deep domain adaptation methods
Performance evaluation on OPPT, PAMAP2, and DSADS datasets
Effect of temporal relation knowledge on activity recognition
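To make the "temporal relation" idea above concrete, here is a minimal sketch of softmax attention pooling over the time steps of a sensor feature sequence. This is an illustrative toy, not the paper's actual Temporal Relation Attention mechanism; the function name, the mean-based query, and the scaling choice are all assumptions for demonstration only.

```python
import numpy as np

def temporal_attention(features):
    """Toy softmax attention over the time steps of a feature sequence.

    features: array of shape (T, d) -- T time steps, d features per step.
    Returns an attention-pooled summary vector of shape (d,).
    """
    # Score each time step against the sequence mean (an illustrative query choice).
    query = features.mean(axis=0)                          # (d,)
    scores = features @ query / np.sqrt(features.shape[1])  # (T,)
    # Softmax over time steps, so informative steps get higher weight.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ features                              # (d,)

rng = np.random.default_rng(0)
seq = rng.normal(size=(5, 3))       # 5 time steps, 3 sensor features
summary = temporal_attention(seq)
print(summary.shape)                # (3,)
```

In a full model along DGDATA's lines, a pooled representation like this would feed the downstream classifier, letting the network weight time steps by their relevance rather than treating them uniformly.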
Stats
"A perfect score of 100% in all test scenarios."
"Accuracy above 83%, peaking at 90.29% in the 5 →6 scenario."
"DGDATA consistently achieves the highest scores in all test scenarios."
Quotes
"Our method introduces the generative model in Figure 3 as the foundational network architecture applied to the above-mentioned three components for further model generalization improvement."
"DGDATA effectively understands the fundamental structure with temporal relation knowledge."