Recap of HCII 2017 in Vancouver


RASL members Huan Zhao and Dayi Bian each presented a paper at this year’s Human-Computer Interaction International (HCII) conference in Vancouver. Descriptions of both papers are given below.

Zhao, Huan, et al. “Design of a Tablet Game to Assess the Hand Movement in Children with Autism.” International Conference on Universal Access in Human-Computer Interaction. Springer, Cham, 2017.

Abstract: High rates of atypical handedness and motor deficits among children with autism spectrum disorder (ASD) have been repeatedly reported. Tablet-assisted systems are increasingly applied to ASD interventions due to their potential benefits in accessibility, cost, and the ability to engage children with ASD. In this paper, we propose the design of a tablet game system to assess hand usage during movement tasks in children with ASD. Playing the games requires good eye-hand coordination, precise and quick hand movements, and cooperation with a partner. The games can be played by one player using both hands or by two players each using one hand. We present the system design and a small preliminary usability study that verified the system’s ability to record objective performance data for offline analysis of players’ hand usage. Results showed that the proposed system was engaging to children with ASD and their typically developing (TD) peers and could induce collaborative activity between them. The system was also shown to efficiently evaluate users’ dominant- and non-dominant-hand usage. Children with ASD showed different patterns of hand usage from the TD participants when using the system.

Bian, Dayi, et al. “Design of a Multisensory Stimulus Delivery System for Investigating Response Trajectories in Infancy.” International Conference on Universal Access in Human-Computer Interaction. Springer, Cham, 2017.

Abstract: Sensory processing differences across auditory, visual, and tactile modalities are ideal targets for early detection of neurodevelopmental risk. However, existing studies focus on audiovisual paradigms and largely ignore the sense of touch. In this work, we present a multisensory stimulus delivery system that presents audiovisual stimuli and precisely controlled tactile stimuli to infants in a synchronized manner. The system also records multidimensional data, including eye gaze and physiological signals. A pilot study with six 3- to 8-month-old infants was conducted to investigate the tolerability and feasibility of the system. Results showed that the system was well tolerated by the infants and that all data were collected robustly. This work paves the way for future studies charting the meaning of sensory response trajectories in infancy.
