Consumer electronics increasingly use everyday materials to blend into home environments, often placing LEDs or symbol displays under textile meshes. Our surveys (n=1499 and n=1501) show interest in interactive graphical displays for hidden interfaces; however, covering such displays significantly limits brightness, material possibilities and legibility.
To overcome these limitations, we leverage parallel rendering to enable ultrabright graphics that can pass through everyday materials. We unlock expressive hidden interfaces using rectilinear graphics on low-cost, mass-produced passive-matrix OLED displays. A technical evaluation across materials, shapes and display techniques suggests a 3.6–40X brightness increase compared to more complex active-matrix OLED displays.
We present interactive prototypes that blend into wood, textile, plastic and mirrored surfaces. Survey feedback (n=1572) on our prototypes suggests that smart mirrors are particularly desirable. A lab evaluation (n=11) reinforced these findings and also allowed us to characterize performance from hands-on interaction with different content and materials, and under varying lighting conditions.
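The brightness benefit of parallel rendering can be illustrated with a minimal sketch: a passive-matrix display is normally scanned one row at a time, so each pixel is lit only 1/N of the frame period, but rows that share an identical column pattern can be driven in the same scan step. Rectilinear graphics have few distinct row patterns, which raises each lit pixel's duty cycle. The frame representation and function names below are illustrative assumptions, not the paper's implementation.

```python
# Sketch: group rows of a binary frame by identical column patterns.
# Driving each group in one scan step raises the duty cycle of every
# lit pixel from 1/num_rows to 1/num_groups.

def parallel_scan_groups(frame):
    """Group row indices by identical column bit patterns."""
    groups = {}
    for i, row in enumerate(frame):
        groups.setdefault(tuple(row), []).append(i)
    return list(groups.values())

def brightness_gain(frame):
    """Duty-cycle gain of parallel rendering vs. row-at-a-time scanning."""
    return len(frame) / len(parallel_scan_groups(frame))

# A rectilinear glyph: a filled bar on an 8-row display.
frame = [[0] * 8] * 2 + [[0, 1, 1, 1, 1, 1, 1, 0]] * 4 + [[0] * 8] * 2
print(brightness_gain(frame))  # 8 rows collapse to 2 distinct patterns -> 4.0
```

Arbitrary (non-rectilinear) content tends toward one group per row, so the gain collapses to 1; this is one way to see why the approach is tied to rectilinear graphics.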
Wearable vibrotactile devices have many potential applications, including novel interfaces and sensory substitution for accessibility. Currently, vibrotactile experimentation is done using large lab setups. However, most practical applications require standalone on-body devices and integration into small form factors. Such integration is time-consuming and requires expertise.
To democratize wearable haptics we introduce VHP, a vibrotactile haptics platform. It comprises a low-power, miniature electronics board that can drive up to 12 independent channels of haptic signals with arbitrary waveforms at 2 kHz. The platform can drive vibrotactile actuators including LRAs and voice coils. Each vibrotactile channel has current-based load sensing, allowing for self-testing and auto-adjustment. The hardware is battery-powered and programmable, offers multiple input options, including serial and Bluetooth, and can synthesize haptic signals internally. We conduct technical evaluations to determine power consumption, latency, and the number of actuators that can run simultaneously.
We demonstrate applications in which we integrate the platform into a bracelet and a sleeve to provide an audio-to-tactile wearable interface. To facilitate broader use of the platform, we open-source our design and partner with a distributor to make the hardware widely available. We hope this work will motivate the use and study of all-day wearable vibrotactile devices.
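An "arbitrary waveform at 2 kHz" means the host supplies one amplitude sample per channel every 0.5 ms. A minimal sketch of synthesizing such a buffer is below; the 170 Hz carrier (a typical LRA resonance) and the half-sine envelope are illustrative assumptions, not parameters from the paper.

```python
# Sketch: synthesize an amplitude-enveloped sine burst, one sample per
# tick of the platform's 2 kHz per-channel update rate.
import math

RATE_HZ = 2000  # per-channel update rate reported for the platform

def haptic_pulse(duration_s=0.1, carrier_hz=170.0):
    """Return normalized samples (-1..1) for a fade-in/fade-out vibration burst."""
    n = int(duration_s * RATE_HZ)
    samples = []
    for i in range(n):
        t = i / RATE_HZ
        env = math.sin(math.pi * i / n)  # half-sine envelope: fade in, fade out
        samples.append(env * math.sin(2 * math.pi * carrier_hz * t))
    return samples

buf = haptic_pulse()
print(len(buf))  # 100 ms at 2 kHz -> 200 samples
```

A board like this would scale such normalized samples to its drive range; keeping the carrier near the LRA's resonant frequency maximizes vibration amplitude for a given drive level.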
Conference on Human Factors in Computing Systems (2021)
We present SonicHoop, an augmented aerial hoop with capacitive touch sensing and interactive sonification. SonicHoop is equipped with 42 electrodes, equally distributed over the hoop, which detect touch events between the hoop and the performer's body. We add interactive sonification of the touch events with the goal of, first, providing auditory feedback of the movements, and second, transforming the aerial hoop into a digital musical instrument that can be played by the performer's body. We explored 3 sonification strategies: ambient, lounge and electro dance.
Structured observation with 2 professional aerial hoop performers shows that SonicHoop fundamentally changes their perception and choreographic processes: instead of translating music into movement, they search for bodily expressions to compose music. Different sound designs affect their movement differently, and auditory feedback, regardless of sound type, improves movement quality.
We discuss opportunities for using SonicHoop as a creative object, a pedagogical tool and a digital musical instrument, as well as using interactive sonification in other acrobatic practices to explore full-body vertical interaction.
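Treating the hoop as a musical instrument implies a mapping from the 42 electrode positions to sound events. A minimal sketch of one plausible mapping is below; the pentatonic scale and MIDI note numbers are illustrative assumptions, not the sonification designs (ambient, lounge, electro dance) described above.

```python
# Sketch: map an electrode index (0..41) around the hoop to a MIDI pitch
# on a pentatonic scale, so any touch produces a consonant note.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets within one octave

def electrode_to_midi(electrode, base_note=24, n_electrodes=42):
    """Map an electrode position to a MIDI note, ascending around the hoop."""
    assert 0 <= electrode < n_electrodes
    octave, step = divmod(electrode, len(PENTATONIC))
    return base_note + 12 * octave + PENTATONIC[step]

print(electrode_to_midi(0))  # 24 (C1)
print(electrode_to_midi(7))  # 24 + 12 + 4 = 40
```

A pentatonic mapping is a common choice for body-driven instruments because unplanned contact, which is constant in aerial practice, still yields harmonically coherent output.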
Proceedings of UIST 2020 (ACM Symposium on User Interface Software and Technology), ACM, New York, NY
Today’s wearable and mobile devices typically use separate hardware components for sensing and actuation. In this work, we introduce new opportunities for the Linear Resonant Actuator (LRA), which is ubiquitous in such devices due to its capability for providing rich haptic feedback. By leveraging strategies to enable active and passive sensing capabilities with LRAs, we demonstrate their benefits and potential as self-contained I/O devices. Specifically, we use the back-EMF voltage to classify whether the LRA is tapped or touched, as well as how much pressure is being applied. Back-EMF sensing is already integrated into many motor and LRA drivers. We developed a passive low-power tap-sensing method that uses just 37.7 µA. Furthermore, we developed active touch and pressure sensing, which is low-power, quiet (2 dB), and minimizes vibration. The sensing method works with many types of LRAs. We show applications, such as pressure-sensing side buttons on a mobile phone. We have also implemented our technique directly on an existing mobile phone’s LRA to detect whether the phone is handheld or placed on a soft or hard surface. Finally, we show that this method can be used for haptic devices to determine whether the LRA makes good contact with the skin. Our approach can add rich sensing capabilities to the ubiquitous LRA actuators without requiring additional sensors or hardware.
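The passive tap-sensing idea can be sketched in a few lines: a tap jolts the LRA's spring-mounted mass, and the moving mass induces a back-EMF spike that a simple threshold can catch while the driver is otherwise idle. The threshold value and sample format below are illustrative assumptions, not the paper's calibration.

```python
# Sketch: threshold-based tap detection on back-EMF voltage samples.
# At rest the coil induces essentially no voltage; a mechanical jolt
# moves the mass and produces a brief spike.

TAP_THRESHOLD_V = 0.05  # illustrative; depends on the LRA and ADC front end

def detect_tap(backemf_samples, threshold=TAP_THRESHOLD_V):
    """Return True if any back-EMF sample exceeds the tap threshold."""
    return any(abs(v) > threshold for v in backemf_samples)

idle_trace = [0.001, -0.002, 0.001, 0.000]
tap_trace = [0.001, 0.080, -0.060, 0.020]
print(detect_tap(idle_trace), detect_tap(tap_trace))  # False True
```

The active touch/pressure modes described above go further, driving the LRA briefly and observing how contact damps the response, but the same back-EMF channel serves both.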
Proceedings of UIST 2019 (ACM Symposium on User Interface Software and Technology), ACM, New York, NY
Adding electronics to textiles can be time-consuming and requires technical expertise. We introduce SensorSnaps, low-power wireless sensor nodes that seamlessly integrate into caps of fabric snap fasteners. SensorSnaps provide a new technique to quickly and intuitively augment any location on the clothing with sensing capabilities. SensorSnaps securely attach and detach from ubiquitous commercial snap fasteners. Using inertial measurement units, the SensorSnaps detect tap and rotation gestures, as well as track body motion.
We optimized the power consumption so that SensorSnaps work continuously for 45 minutes, and up to 4 hours in capacitive touch standby mode. We present applications in which SensorSnaps are used as a gestural music player controller, for cursor control, and in a motion-tracking suit. A user study showed that a SensorSnap could be attached in around 71 seconds, similar to attaching off-the-shelf snaps, and participants found the gestures easy to learn and perform. SensorSnaps could allow anyone to effortlessly add sophisticated sensing capabilities to ubiquitous snap fasteners.
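IMU-based tap detection of the kind SensorSnaps perform is commonly implemented as peak detection on accelerometer magnitude with a short refractory window so one physical tap is not counted twice. The sketch below shows that idea; the threshold and window length are illustrative assumptions, not the paper's parameters.

```python
# Sketch: count taps in an accelerometer stream as magnitude peaks above
# a threshold, suppressing repeated detections within a refractory window.
import math

def accel_magnitude(sample):
    """Euclidean magnitude of an (x, y, z) acceleration sample, in g."""
    return math.sqrt(sum(a * a for a in sample))

def count_taps(samples, threshold_g=2.5, refractory=3):
    """Count magnitude peaks above threshold_g, at most one per window."""
    taps, cooldown = 0, 0
    for s in samples:
        if cooldown:
            cooldown -= 1
        elif accel_magnitude(s) > threshold_g:
            taps += 1
            cooldown = refractory
    return taps

stream = [(0, 0, 1.0)] * 5 + [(0.5, 0.2, 3.0)] + [(0, 0, 1.0)] * 5
print(count_taps(stream))  # 1
```

Rotation gestures would use the gyroscope channels of the same IMU instead; the refractory window here plays the same debouncing role as in mechanical switch input.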