Wenxin Feng

Authored Publications
    HGaze Typing: Head-Gesture Assisted Gaze Typing
    Jiangnan Zou
    Andrew Kurauchi
    Carlos H Morimoto
    Margrit Betke
    The 13th ACM Symposium on Eye Tracking Research and Applications (ETRA 2021)
    Abstract: This paper introduces a bi-modal typing interface, HGaze Typing, which combines the simplicity and accuracy of head gestures with the speed of word typing by gaze swiping to provide efficient and comfortable dwell-free text entry. HGaze Typing uses gaze path information to compute candidate words and allows explicit activation of common text entry commands, such as selection, deletion, and revision, by using head gestures (nodding, shaking, and tilting). By adding a head-based input channel, HGaze Typing reduces the size of the screen regions for cancel and deletion buttons and the word candidate list, which are required by most eye-typing interfaces. A lab study finds HGaze Typing outperforms a dwell-time-based virtual keyboard in efficacy and user satisfaction. These results demonstrate that our method of integrating gaze and head-movement inputs can be an effective text entry method and is robust to unintended selections.
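
The head-gesture channel described above can be illustrated with a short sketch. The snippet below classifies a nod, shake, or tilt from a window of head-pose angles by looking at the dominant axis of motion; the thresholds, window handling, and names are illustrative assumptions and are not taken from the HGaze Typing implementation.

# Hypothetical sketch: classify a head gesture (nod, shake, tilt) from a short
# window of head-pose angles. Thresholds and field names are illustrative,
# not values or code from the HGaze Typing paper.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HeadPose:
    pitch: float  # up/down rotation in degrees (nod axis)
    yaw: float    # left/right rotation in degrees (shake axis)
    roll: float   # sideways tilt in degrees (tilt axis)

def classify_gesture(window: List[HeadPose],
                     nod_thresh: float = 10.0,
                     shake_thresh: float = 12.0,
                     tilt_thresh: float = 8.0) -> Optional[str]:
    """Return 'nod', 'shake', 'tilt', or None for a window of pose samples."""
    if not window:
        return None
    pitch_range = max(p.pitch for p in window) - min(p.pitch for p in window)
    yaw_range = max(p.yaw for p in window) - min(p.yaw for p in window)
    roll_range = max(p.roll for p in window) - min(p.roll for p in window)
    # Pick the axis with the largest motion relative to its threshold, and
    # require it to exceed that threshold so small, unintentional head
    # movements are ignored (the robustness property the abstract mentions).
    candidates = [
        ("nod", pitch_range, nod_thresh),
        ("shake", yaw_range, shake_thresh),
        ("tilt", roll_range, tilt_thresh),
    ]
    name, amplitude, threshold = max(candidates, key=lambda c: c[1] / c[2])
    return name if amplitude >= threshold else None

A recognized gesture would then be mapped to a text entry command such as selecting a candidate word or deleting the last entry.
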
    WATouCH: Enabling Direct Input on Non-touchscreen Using Smartwatch's Photoplethysmogram and IMU Sensor Fusion
    Hui-Shyong Yeo
    Michael Xuelin Huang
    Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, New York, NY, USA, 1–10
    Abstract: Interacting with non-touchscreen displays such as TVs or public displays can be difficult and inefficient. We propose WATouCH, a novel method that localizes a smartwatch on a display and allows direct input by turning the smartwatch into a tangible controller. This low-cost solution leverages sensor fusion of the built-in inertial measurement unit (IMU) and the photoplethysmogram (PPG) sensor that smartwatches use for heart rate monitoring. Specifically, WATouCH tracks the smartwatch's movement using IMU data and corrects the location error caused by drift using the PPG responses to a dynamic visual pattern shown on the display. We conducted a user study on two tasks, point-and-click and line tracing, to evaluate system usability and user performance. The results suggested that our sensor fusion mechanism effectively confined IMU-based localization error and achieved encouraging targeting and tracing precision; the system was well received by participants and thus opens up new opportunities for interaction.
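
The fusion idea described above can be sketched roughly as dead reckoning plus periodic absolute correction: integrate IMU acceleration to track the watch between fixes, and pull the estimate toward a position decoded from the PPG response to the on-screen pattern whenever one is available. The class, blending gain, and update rule below are illustrative assumptions, not the WATouCH algorithm.

# Rough sketch of IMU + PPG fusion under the assumptions stated above.
# The blending gain and update rule are illustrative, not the paper's method.
from typing import Optional, Tuple

Vec2 = Tuple[float, float]

class FusedTracker:
    def __init__(self, alpha: float = 0.3):
        self.pos: Vec2 = (0.0, 0.0)   # estimated on-screen position
        self.vel: Vec2 = (0.0, 0.0)   # estimated velocity
        self.alpha = alpha            # how strongly an absolute fix corrects drift

    def imu_step(self, accel: Vec2, dt: float) -> None:
        """Integrate acceleration to update velocity and position (drifts over time)."""
        self.vel = (self.vel[0] + accel[0] * dt, self.vel[1] + accel[1] * dt)
        self.pos = (self.pos[0] + self.vel[0] * dt, self.pos[1] + self.vel[1] * dt)

    def ppg_correction(self, ppg_fix: Optional[Vec2]) -> None:
        """Blend in an absolute position decoded from the on-screen light pattern."""
        if ppg_fix is None:
            return
        self.pos = (
            (1 - self.alpha) * self.pos[0] + self.alpha * ppg_fix[0],
            (1 - self.alpha) * self.pos[1] + self.alpha * ppg_fix[1],
        )

A larger alpha trusts the PPG-derived fix more aggressively, trading smoothness for faster drift correction.
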
    Designing and Evaluating Head-based Pointing on Smartphones for People with Motor Impairments
    Muratcan Cicek
    Ankit Dave
    Michael Xuelin Huang
    Julia Katherine Haines
    Jeffrey Nichols
    The 22nd International ACM SIGACCESS Conference on Computers and Accessibility, Association for Computing Machinery, New York, NY, USA (2020), pp. 12
    Abstract: Head-based pointing is an alternative input method for people with motor impairments to access computing devices. This paper proposes a calibration-free head-tracking input mechanism for mobile devices that makes use of the front-facing camera standard on most devices. To evaluate our design, we performed two Fitts’ Law studies. First, we compared our method with an existing head-based pointing solution, Eva Facial Mouse, with subjects without motor impairments. Second, we conducted what we believe is the first Fitts’ Law study of a mobile head tracker with subjects with motor impairments. We extend prior studies with a greater range of indices of difficulty (IDs), from 1.62 to 5.20 bits, and achieved promising throughput (0.61 bps on average with motor impairments and 0.90 bps without). Users’ throughput was 0.95 bps on average in our most difficult task (ID: 5.20 bits), which involved selecting a target half the size of the Android touch-target recommendation after moving nearly the full height of the screen, suggesting the system is capable of supporting fine-precision tasks. We summarize our observations and the lessons from our user studies into a set of design guidelines for head-based pointing systems.
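
The Fitts’ Law quantities cited in the abstract follow standard definitions, sketched below: the index of difficulty uses the Shannon formulation, ID = log2(D/W + 1), and throughput is ID divided by movement time. The distance, width, and time values in the example are made up, chosen only to land near the figures reported above.

# Worked example of the Fitts' Law quantities cited in the abstract. The
# Shannon formulation of ID and the simple ID/MT throughput are standard;
# the distance, width, and time values below are illustrative assumptions.
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation: ID = log2(D / W + 1), in bits."""
    return math.log2(distance / width + 1)

def throughput(id_bits: float, movement_time_s: float) -> float:
    """Throughput in bits per second for one trial."""
    return id_bits / movement_time_s

# A hard trial: a small target reached after moving most of the screen height.
ID = index_of_difficulty(distance=1800, width=50)   # ~5.2 bits
tp = throughput(ID, movement_time_s=5.5)            # ~0.95 bps
print(f"ID = {ID:.2f} bits, throughput = {tp:.2f} bps")

For instance, a 50 px target at a distance of 1800 px gives an ID of about 5.2 bits; completing that trial in 5.5 s corresponds to a throughput of roughly 0.95 bps.
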