Artem Dementyev

Portfolio: http://www.artemdementyev.com/
Authored Publications
    Motivated by the necessity of guiding and monitoring students' progress in real time when assembling circuits during in-class activities, we propose BlinkBoard, an augmented breadboard to enhance offline as well as online physical computing classes. BlinkBoard uses LEDs placed on each row of the breadboard to guide, via four blinking patterns, how to place and connect components and wires. It also uses a set of input/output pins to sense voltage levels at user-specified rows or to generate voltage output. Our hardware uses an open JSON protocol of commands and responses that can be integrated with a graphical application hosted on a computer, ensuring bidirectional communication between each student's breadboard and the instructor's dashboard and slides. The hardware is affordable and simple, partly thanks to a customized circuit, configured via a hardware description language, that handles the LEDs' patterns with minimal load on the Arduino microcontroller. Finally, we briefly show how this hardware made its way into a workshop with high-school students and an undergraduate class in a design department.
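The abstract above mentions an open JSON protocol of commands and responses between the instructor's dashboard and each breadboard. The field names below are hypothetical illustrations, not the actual BlinkBoard schema; a minimal sketch of what such messages could look like:

```python
import json

def make_command(row, pattern):
    """Build a hypothetical command telling a breadboard row's LED to blink.
    The keys ("cmd", "row", "pattern") are illustrative, not BlinkBoard's real schema."""
    return json.dumps({"cmd": "blink", "row": row, "pattern": pattern})

def parse_response(raw):
    """Parse a hypothetical response reporting the voltage sensed at a row."""
    msg = json.loads(raw)
    return msg["row"], msg["voltage"]

# An instructor app might send a command and read back a sensed voltage:
cmd = make_command(12, "fast")
row, volts = parse_response('{"row": 12, "voltage": 3.3}')
```

A text-based JSON protocol like this keeps the hardware easy to integrate: any serial-capable client can drive the board without a custom binary parser.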
    Consumer electronics increasingly use everyday materials to blend into home environments, often placing LEDs or symbol displays under textile meshes. Our surveys (n=1499 and n=1501) show interest in interactive graphical displays for hidden interfaces; however, covering such displays significantly limits brightness, material possibilities, and legibility. To overcome these limitations, we leverage parallel rendering to enable ultrabright graphics that can pass through everyday materials. We unlock expressive hidden interfaces using rectilinear graphics on low-cost, mass-produced passive-matrix OLED displays. A technical evaluation across materials, shapes, and display techniques suggests a 3.6-40X brightness increase compared to more complex active-matrix OLEDs. We present interactive prototypes that blend into wood, textile, plastic, and mirrored surfaces. Survey feedback (n=1572) on our prototypes suggests that smart mirrors are particularly desirable. A lab evaluation (n=11) reinforced these findings and allowed us to characterize performance from hands-on interaction with different content and materials, and under varying lighting conditions.
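The brightness gain from parallel rendering can be sketched with a simple duty-cycle argument: a passive-matrix display scanned one row at a time gives each row only 1/N of the frame as on-time, while driving several rows of a rectilinear glyph simultaneously multiplies that on-time. This idealized model (assuming brightness proportional to duty cycle, with the specific numbers as illustration only) is:

```python
def brightness_gain(total_rows, rows_lit_together):
    """Idealized brightness gain of parallel rendering on a passive-matrix
    display: sequential scanning gives each row 1/total_rows duty cycle;
    lighting rows_lit_together identical rows at once multiplies the on-time."""
    sequential_duty = 1.0 / total_rows
    parallel_duty = rows_lit_together / total_rows
    return parallel_duty / sequential_duty

# e.g., a rectilinear segment spanning 8 of 64 rows, driven together:
gain = brightness_gain(total_rows=64, rows_lit_together=8)
```

Real panels add non-idealities (drive current limits, crosstalk), which is why the paper reports a measured 3.6-40X range rather than a single theoretical factor.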
    SonicHoop: Using Interactive Sonification to Support Aerial Hoop Practices
    Diemo Schwarz
    Frederic Bevilacqua
    Michel Beaudouin-Lafon
    Wanyu Liu (Abby)
    Wendy Mackay
    Conference on Human Factors in Computing Systems (2021)
    We present SonicHoop, an augmented aerial hoop with capacitive touch sensing and interactive sonification. SonicHoop is equipped with 42 electrodes, equally distributed over the hoop, which detect touch events between the hoop and the performer's body. We add interactive sonification of the touch events with the goal of, first, providing auditory feedback of the movements and, second, transforming the aerial hoop into a digital musical instrument that can be played by the performer's body. We explored three sonification strategies: ambient, lounge, and electro dance. Structured observation with two professional aerial hoop performers shows that SonicHoop fundamentally changes their perception and choreographic processes: instead of translating music into movement, they search for bodily expressions to compose music. Different sound designs affect their movement differently, and auditory feedback, regardless of the type of sound, improves movement quality. We discuss opportunities for using SonicHoop as a creative object, a pedagogical tool, and a digital musical instrument, as well as using interactive sonification in other acrobatic practices to explore full-body vertical interaction.
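One natural way to sonify discrete touch electrodes is to map electrode position on the hoop to pitch. The mapping below is purely a hypothetical sketch (the paper does not specify its note assignment): it spreads SonicHoop's 42 electrodes across two octaves of MIDI note numbers.

```python
def electrode_to_midi(electrode, n_electrodes=42, base_note=48, span=24):
    """Hypothetical sonification mapping: spread n_electrodes touch electrodes
    linearly over `span` semitones starting at C3 (MIDI note 48), so touching
    different points on the hoop plays different pitches."""
    return base_note + round(electrode * span / (n_electrodes - 1))

# Electrode 0 maps to the lowest note, electrode 41 to two octaves above.
low = electrode_to_midi(0)
high = electrode_to_midi(41)
```

A performer's body sweeping along the hoop would then produce a glissando, one plausible building block for the ambient or electro-dance strategies described above.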
    VHP: Vibrotactile Haptics Platform for On-body Applications
    Dimitri Kanevsky
    Malcolm Slaney
    UIST, ACM, https://dl.acm.org/doi/10.1145/3472749.3474772 (2021)
    Wearable vibrotactile devices have many potential applications, including novel interfaces and sensory substitution for accessibility. Currently, vibrotactile experimentation is done using large lab setups. However, most practical applications require standalone on-body devices and integration into small form factors. Such integration is time-consuming and requires expertise. To democratize wearable haptics, we introduce VHP, a vibrotactile haptics platform. It comprises a low-power, miniature electronics board that can drive up to 12 independent channels of haptic signals with arbitrary waveforms at 2 kHz. The platform can drive vibrotactile actuators including LRAs and voice coils. Each vibrotactile channel has current-based load sensing, thus allowing for self-testing and auto-adjustment. The hardware is battery-powered, programmable, has multiple input options, including serial and Bluetooth, and can synthesize haptic signals internally. We conduct technical evaluations to determine the power consumption, latency, and the number of actuators that can run simultaneously. We demonstrate applications in which we integrate the platform into a bracelet and a sleeve to provide an audio-to-tactile wearable interface. To facilitate wider use of this platform, we open-source our design and partner with a distributor to make the hardware widely available. We hope this work will motivate the use and study of all-day wearable vibrotactile devices.
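The abstract notes that VHP can synthesize arbitrary haptic waveforms internally at a 2 kHz update rate. As an illustration of what such a waveform looks like (the function and its parameters are hypothetical, not VHP's actual API), here is a sine burst at a typical LRA resonance frequency sampled at that rate:

```python
import math

def sine_burst(freq_hz, duration_s, sample_rate_hz=2000, amplitude=1.0):
    """Synthesize a sine burst, e.g. near a typical LRA resonance (~175 Hz),
    sampled at a 2 kHz update rate as in the VHP platform's spec."""
    n = int(duration_s * sample_rate_hz)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
            for i in range(n)]

# A 100 ms burst at 175 Hz yields 200 samples at 2 kHz.
burst = sine_burst(175, 0.1)
```

Driving an LRA at its resonance maximizes vibration strength per unit of drive power, which matters for the battery-powered, all-day use cases the platform targets.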
    Today’s wearable and mobile devices typically use separate hardware components for sensing and actuation. In this work, we introduce new opportunities for the Linear Resonant Actuator (LRA), which is ubiquitous in such devices due to its capability for providing rich haptic feedback. By leveraging strategies to enable active and passive sensing capabilities with LRAs, we demonstrate their benefits and potential as self-contained I/O devices. Specifically, we use the back-EMF voltage to classify whether the LRA is tapped or touched, as well as how much pressure is being applied. Back-EMF sensing is already integrated into many motor and LRA drivers. We developed a passive low-power tap sensing method that uses just 37.7 µA. Furthermore, we developed active touch and pressure sensing, which is low-power, quiet (2 dB), and minimizes vibration. The sensing method works with many types of LRAs. We show applications, such as pressure-sensing side buttons on a mobile phone. We have also implemented our technique directly on an existing mobile phone's LRA to detect whether the phone is handheld or placed on a soft or hard surface. Finally, we show that this method can be used for haptic devices to determine whether the LRA makes good contact with the skin. Our approach can add rich sensing capabilities to the ubiquitous LRA actuators without requiring additional sensors or hardware.
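The passive tap-sensing idea above rests on a simple physical fact: an external impulse moves the LRA's internal mass, and that motion induces a back-EMF voltage spike that a driver can sample without actuating. The threshold value and function below are illustrative assumptions, not the paper's actual classifier:

```python
def detect_tap(backemf_samples, threshold_v=0.05):
    """Sketch of passive tap sensing on an LRA: flag a tap when the rectified
    back-EMF voltage exceeds a threshold. A tap jolts the LRA's moving mass,
    inducing a voltage spike; at rest, the coil reads near zero."""
    return any(abs(v) > threshold_v for v in backemf_samples)

quiet_trace = [0.001, -0.002, 0.001]        # device at rest
tapped_trace = [0.001, 0.12, -0.08, 0.01]   # impulse excites the mass
```

Because the coil itself generates the signal, this mode needs no drive current, which is consistent with the microamp-level power figure reported above.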
    SensorSnaps: Integrating Wireless Sensor Nodes into Fabric Snap Fasteners for Textile Interfaces
    Tomás Alfonso Vega Gálvez
    Proceedings of UIST 2019 (ACM Symposium on User Interface Software and Technology), ACM, New York, NY
    Adding electronics to textiles can be time-consuming and requires technical expertise. We introduce SensorSnaps, low-power wireless sensor nodes that seamlessly integrate into the caps of fabric snap fasteners. SensorSnaps provide a new technique to quickly and intuitively augment any location on clothing with sensing capabilities. SensorSnaps securely attach to and detach from ubiquitous commercial snap fasteners. Using inertial measurement units, the SensorSnaps detect tap and rotation gestures, as well as track body motion. We optimized the power consumption for SensorSnaps to work continuously for 45 minutes and up to 4 hours in capacitive touch standby mode. We present applications in which SensorSnaps are used as gestural interfaces for a music player controller, cursor control, and a motion-tracking suit. A user study showed that a SensorSnap could be attached in around 71 seconds, similar to attaching off-the-shelf snaps, and participants found the gestures easy to learn and perform. SensorSnaps could allow anyone to effortlessly add sophisticated sensing capabilities to ubiquitous snap fasteners.
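Tap detection from an IMU, as used by SensorSnaps for gesture input, is commonly done by thresholding the acceleration magnitude: a tap appears as a brief spike well above the ~1 g gravity baseline. The function and threshold below are a hypothetical sketch, not the SensorSnaps firmware:

```python
import math

def is_tap(accel_xyz_g, threshold_g=2.5):
    """Sketch of IMU-based tap detection: compute the acceleration magnitude
    (in g) from one accelerometer sample and flag values well above the
    ~1 g gravity baseline as a tap impact. Threshold is an assumed value."""
    ax, ay, az = accel_xyz_g
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > threshold_g

resting = is_tap((0.0, 0.0, 1.0))   # gravity only: no tap
struck = is_tap((0.5, 3.0, 1.0))    # sharp spike: tap
```

A production detector would also debounce over a short window and reject sustained motion, but the magnitude threshold is the core of the technique.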