Cognitive Effort Comparison of Hand and Stylus Interaction in Touchscreen-Based Educational Games Using fNIRS
Core Concepts
Using hands for touch-based interaction in educational games requires less cognitive effort and is perceived as more user-friendly than using a stylus, despite similar performance outcomes.
Abstract
- Bibliographic Information: Sharmin, S., Bakhshipour, E., Abrar, M. F., Kiafar, B., Kullu, P., Getchell, N., & Barmaki, R. L. (2025). Functional Near-Infrared Spectroscopy (fNIRS) Analysis of Interaction Techniques in Touchscreen-Based Educational Gaming. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '25). ACM, New York, NY, USA, 10 pages. https://doi.org/XXXXXXX.XXXXXXX
- Research Objective: To investigate the impact of hand and stylus interaction on cognitive effort and user experience in touchscreen-based educational games.
- Methodology: A within-subjects study was conducted with 14 participants who played educational quizzes on a touchscreen laptop using both hand and stylus input methods. Hemodynamic responses were measured using fNIRS, and user experience was assessed through SUS and NASA-TLX questionnaires.
- Key Findings: Participants exhibited significantly lower oxygenated hemoglobin (ΔHbO) levels and relative neural involvement (RNI) when using hand compared to stylus, indicating lower cognitive effort. Conversely, hand interaction resulted in higher relative neural efficiency (RNE). User experience scores (SUS and NASA-TLX) were significantly better for hand interaction.
- Main Conclusions: Hand interaction in touchscreen-based educational games requires less cognitive effort and is perceived as more user-friendly than stylus interaction, despite similar performance levels.
- Significance: This study provides valuable insights for designing effective and engaging educational games by highlighting the importance of input modality selection.
- Limitations and Future Research: The study was limited by a small sample size and participants' unfamiliarity with stylus use. Future research should explore these findings with a larger, more diverse sample and investigate the impact of extended stylus training.
Stats
The hand condition showed significantly lower ΔHbO and RNI, and higher RNE, than the stylus condition.
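RNE and RNI are composite measures that combine standardized performance and mental-effort scores. Assuming the common Paas-style formulation (the paper's exact computation may differ), a minimal Python sketch, with illustrative function names:

```python
import math
import statistics

def z_scores(values):
    """Standardize a sample to z-scores (sample standard deviation)."""
    mean = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mean) / sd for v in values]

def rne(z_perf, z_effort):
    """Relative neural efficiency: high performance at low effort raises RNE."""
    return (z_perf - z_effort) / math.sqrt(2)

def rni(z_perf, z_effort):
    """Relative neural involvement: performance and effort jointly raise RNI."""
    return (z_perf + z_effort) / math.sqrt(2)
```

Under this formulation, the hand condition's pattern (similar performance, lower effort) yields a higher RNE and lower RNI than the stylus condition, matching the reported result.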
The average SUS score for the hand condition was 88.76 ± 14.6, while the stylus condition had an average score of 54.64 ± 28.12.
The hand condition had a lower average workload score (M = 19.46 ± 5.22) than the stylus condition (M = 41.43 ± 3.45).
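The SUS averages above follow the standard SUS scoring rule (ten 1–5 Likert items; odd items positively worded, even items negatively worded; the summed contributions are rescaled to 0–100). A sketch of that rule:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 responses.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5
```

A neutral respondent (all 3s) scores 50; scores near the hand condition's mean of 88.76 indicate very high perceived usability.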
Quotes
"Our findings show that the hand condition had a significant lower (ΔHbO) and RNI with higher RNE than the stylus condition indicating the requirement of less cognitive effort."
"When people used the stylus, they had to control the pen and answer questions at the same time that might cause higher cognitive effort."
Deeper Inquiries
How can the design of educational games be further optimized to leverage the natural intuitiveness of hand-based interaction?
This study highlights the intuitive nature of hand-based interaction for touchscreen-based educational games. To further leverage this, game designers can consider the following optimizations:
Gesture-based controls: Integrate intuitive hand gestures like tapping, swiping, dragging, and pinching for navigation, object manipulation, and gameplay mechanics. This minimizes the need for on-screen buttons and leverages the natural mapping between hand movements and actions.
Multi-touch activities: Design collaborative activities that require multiple touchpoints, encouraging learners to use both hands simultaneously. This can enhance engagement and create a more immersive learning experience.
Object manipulation and physics: Incorporate realistic physics engines that allow learners to interact with objects in the game world using natural hand movements. This can facilitate a deeper understanding of concepts related to gravity, momentum, and spatial reasoning.
Handwriting and drawing recognition: Implement robust handwriting and drawing recognition features that allow learners to input answers, solve problems, and express creativity directly on the touchscreen using their fingers or a digital pen.
Haptic feedback integration: Enhance the sense of touch by incorporating haptic feedback for specific actions, providing confirmation and a more engaging experience. For example, a slight vibration could indicate a correct answer or a successful object placement.
By prioritizing natural hand movements and minimizing the cognitive load associated with complex controls, educational games can become more intuitive, engaging, and effective for learners of all ages.
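The gesture-based controls described above amount to a mapping from recognized gestures to game actions. A minimal sketch, with hypothetical gesture and action names (not from the study):

```python
# Hypothetical gesture-to-action table for a touchscreen quiz game.
GESTURE_ACTIONS = {
    "tap": "select_answer",
    "swipe_left": "next_question",
    "swipe_right": "previous_question",
    "drag": "move_object",
    "pinch": "zoom",
}

def handle_gesture(gesture):
    """Resolve a recognized gesture to a game action; unknown gestures are ignored."""
    return GESTURE_ACTIONS.get(gesture, "no_op")
```

Keeping the mapping declarative makes it easy to audit for natural gesture-action correspondence and to swap in accessible alternatives.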
Could the observed differences in cognitive effort between hand and stylus interaction be attributed to the specific design of the educational game used in the study, or are they generalizable to other touchscreen-based applications?
While the study found lower cognitive effort for hand-based interaction than for stylus use in its specific educational game, generalizing these findings to all touchscreen applications requires careful consideration.
Here's why:
Task-specificity: The observed differences could be partly attributed to the game's design and the nature of the tasks involved. The study used a quiz format with dragging, dropping, and clicking. Different applications requiring higher precision (e.g., drawing apps, note-taking) or different motor skills might yield different results.
Stylus familiarity: The study acknowledged that participants had limited prior experience with styluses. Increased familiarity and practice with a stylus could potentially reduce the observed cognitive effort over time.
Individual differences: Motor skills, dexterity, and prior experience with touchscreens vary greatly between individuals. Some users might find stylus use more intuitive or comfortable depending on their individual preferences and abilities.
Therefore, while the study provides valuable insights into cognitive effort in one context, further research is needed to determine the generalizability of these findings. Future studies should investigate a wider range of touchscreen applications, tasks, and user populations with varying levels of stylus familiarity.
What are the implications of these findings for the development of accessible and inclusive learning technologies for individuals with motor impairments or disabilities?
This study's findings have significant implications for developing accessible and inclusive learning technologies, particularly for individuals with motor impairments:
Prioritizing hand-based interaction: Given the lower cognitive effort associated with hand-based interaction, prioritizing this modality can benefit users with motor impairments who may find stylus use challenging or fatiguing.
Flexible input options: Offering both hand and stylus input methods as options allows individuals to choose the modality that best suits their needs and abilities. This promotes inclusivity and caters to a wider range of users.
Adaptive interfaces: Developing adaptive interfaces that adjust to individual user capabilities is crucial. For instance, the interface could offer larger touch targets, adjustable sensitivity settings, and alternative input methods like voice control for users with motor impairments.
Customization and personalization: Providing users with the ability to customize interface elements, such as button size, layout, and gesture shortcuts, can further enhance accessibility and user experience.
Usability testing with diverse users: Involving individuals with motor impairments in the design and testing phases is essential to ensure that learning technologies are truly accessible and meet their specific needs.
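One concrete way an adaptive interface can implement the larger-touch-target recommendation is a simple scaling rule. A sketch with hypothetical names; the 44 px floor follows common accessibility guidance for minimum target size (e.g. WCAG's target-size criterion):

```python
def touch_target_px(base_px, motor_difficulty):
    """Return an adjusted touch-target size in pixels.

    motor_difficulty ranges from 0.0 (no impairment) to 1.0 (severe);
    the target grows linearly up to 2x its base size, and never drops
    below a 44 px accessibility floor.
    """
    if not 0.0 <= motor_difficulty <= 1.0:
        raise ValueError("motor_difficulty must be in [0, 1]")
    scaled = round(base_px * (1.0 + motor_difficulty))
    return max(scaled, 44)
```

The same pattern extends to other adaptive parameters mentioned above, such as touch sensitivity or gesture timeout thresholds.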
By considering these implications, developers can create more inclusive learning environments that empower all learners, regardless of their physical abilities, to engage with educational content effectively and enjoyably.