
The BCNM Guide to CHI 2018

09 Apr, 2018


Overwhelmed by the amazing panels at CHI 2018? Make your CHI experience a BCNM one with our guide to where our students, faculty, and alumni will be presenting!

The ACM CHI Conference on Human Factors in Computing Systems is the premier international conference of Human-Computer Interaction, where researchers and practitioners gather to discuss the latest in interactive technology. We're thrilled so many BCNM members were selected to participate.

See below for a list of workshops, papers, and projects by past and current BCNM members. Don't see your name? Email us at lara [at] berkeley.edu.

Disruptive Improvisations: Making Use of Non-Deterministic Art Practices in HCI

Kristina Andersen, Eindhoven University of Technology, Eindhoven, Netherlands
Laura Devendorf, University of Colorado
James Pierce, University of California, Berkeley
Ron Wakkary, Simon Fraser University; Eindhoven University of Technology
Daniela K Rosner, University of Washington

The goal of this one-day workshop is to open space for disruptive techniques and strategies to be used in the making, prototyping, and conceptualization of the artifacts and systems developed and imagined within HCI. Specifically, this workshop draws on strategies from art, speculative design, and activism, as we aim to productively “trouble” the design processes behind HCI. We frame these explorations as "disruptive improvisations": tactics artists and designers use to make the familiar strange or to creatively problematize in order to foster new insights. The workshop invites participants to inquire through making and to take up key themes as starting points for developing disruptive improvisations for design. These themes include modesty, scarcity, uselessness, no-technology, and failure. The workshop will produce a zine workbook or pamphlet, distributed during the conference, to bring visibility to the role of these tactics of making in creative design practice.

MatchSticks: Woodworking through Improvisational Digital Fabrication

Chair: Laura Devendorf, University of Colorado
Rundong Tian, University of California, Berkeley
Sarah Sterman, University of California, Berkeley
Ethan Chiou, University of California, Berkeley
Jeremy Warner, University of California, Berkeley
Eric Paulos, University of California, Berkeley

Digital fabrication tools have broadened participation in making and enabled new methods of rapid physical prototyping across diverse materials. We present a novel smart tool designed to complement one of the first materials employed by humans - wood - and celebrate the fabrication practice of joinery. Our tool, MatchSticks, is a digital fabrication system tailored for joinery. Combining a portable CNC machine, touchscreen user interface, and parametric joint library, MatchSticks enables makers of varying skill to rapidly explore and create artifacts from wood. Our system embodies tacit woodworking knowledge and distills the distributed workflow of CNC tools into a hand tool; it operates on materials existing machines find difficult, produces assemblies much larger than its workspace, and supports the parallel creation of geometries. We describe the workflow and technical details of our system, present example artifacts produced by our tool, and report results from our user study.
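
A minimal sketch of what one entry in a parametric joint library might look like, assuming a simple box joint parameterized by stock width and finger count (the function name and parameters are hypothetical, not the authors' implementation):

    # Hypothetical sketch of a parametric box-joint entry, in the spirit of a
    # joint library like MatchSticks'. Names and parameters are illustrative.
    def box_joint_fingers(stock_width_mm, num_fingers):
        """Return (start, end) spans, in mm, of the fingers to cut on one edge.

        Alternating fingers are removed on the mating piece so the two edges
        interlock; fingers are assumed equal-width across the stock.
        """
        if num_fingers < 2:
            raise ValueError("a box joint needs at least two fingers")
        finger_w = stock_width_mm / num_fingers
        # Cut every other finger here; the mating edge cuts the complement.
        return [(i * finger_w, (i + 1) * finger_w)
                for i in range(num_fingers) if i % 2 == 0]

    if __name__ == "__main__":
        # 60 mm wide stock, 5 fingers -> cuts at 0-12, 24-36, and 48-60 mm.
        for start, end in box_joint_fingers(60.0, 5):
            print(f"cut finger from {start:.1f} mm to {end:.1f} mm")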

Capturing, Representing, and Interacting with Laughter

Kimiko Ryokai, University of California, Berkeley
Elena Durán López, University of California, Berkeley
Noura Howell, University of California, Berkeley
Jon Gillick, University of California, Berkeley
David Bamman, University of California, Berkeley

We investigate a speculative future in which we celebrate happiness by capturing laughter and representing it in tangible forms. We explored technologies for capturing naturally occurring laughter as well as various physical representations of it. For several weeks, our participants collected audio samples of everyday conversations with their loved ones. We processed those samples through a machine learning algorithm and shared the resulting tangible representations (e.g., physical containers and edible displays) with our participants. In collecting, listening to, interacting with, and sharing their laughter with loved ones, participants described both joy in preserving and interacting with laughter and tension in collecting it. This study revealed that the tangibility of laughter representations matters, especially its symbolism and material quality. We discuss design implications of giving permanent forms to laughter and consider the sound of laughter as a part of our personal past that we might seek to preserve and reflect upon.
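
As a rough illustration of the kind of pipeline the abstract describes (audio in, laughter segments out), here is a hedged sketch that assumes librosa for feature extraction and a scikit-learn classifier trained elsewhere; the model file, window length, and feature choice are placeholders, not details from the paper:

    # Illustrative only: classify short audio windows as laughter / not laughter.
    # Assumes a classifier trained elsewhere and saved as "laughter_clf.joblib".
    import joblib
    import librosa

    WINDOW_S = 1.0  # placeholder window length, in seconds

    def laughter_segments(audio_path, clf_path="laughter_clf.joblib"):
        clf = joblib.load(clf_path)
        y, sr = librosa.load(audio_path, sr=16000, mono=True)
        hop = int(WINDOW_S * sr)
        segments = []
        for start in range(0, max(len(y) - hop, 1), hop):
            window = y[start:start + hop]
            # Mean MFCCs give a simple fixed-length feature vector per window.
            feats = librosa.feature.mfcc(y=window, sr=sr, n_mfcc=13).mean(axis=1)
            if clf.predict(feats.reshape(1, -1))[0] == 1:
                segments.append((start / sr, (start + hop) / sr))
        return segments  # list of (start_s, end_s) spans flagged as laughter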

AlterWear: Battery-Free Wearable Displays for Opportunistic Interactions

Christine Dierk, University of California, Berkeley
Molly Jane Pearce Nicholas, University of California, Berkeley
Eric Paulos, University of California, Berkeley

As the landscape of wearable devices continues to expand, power remains a major issue for adoption, usability, and miniaturization. Users are faced with an increasing number of personal devices to manage, charge, and care for. In this work, we argue that power constraints limit the design space of wearable devices. We present AlterWear: an architecture for new wearable devices that implements a batteryless design using electromagnetic induction via NFC and bistable e-ink displays. Although these displays are active only when in proximity to an NFC-enabled device, this unique combination of hardware enables both quick, dynamic interactions and long-term engagement with persistent visual displays. We demonstrate new wearables enabled through AlterWear with dynamic, fashion-forward, and expressive displays across several form factors, and evaluate them in a user study. By forgoing the need for onboard power, AlterWear expands the ecosystem of functional and fashionable wearable technologies.
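
The display side of such a system can be approximated in a few lines: render a 1-bit frame for a bistable e-ink panel, then hand the raw bytes to whatever display controller the hardware exposes. The sketch below uses Pillow and stops at the byte buffer; the 296x128 resolution and the NFC transfer step are assumptions, not AlterWear's actual hardware interface:

    # Hedged sketch: compose a 1-bit frame for a bistable e-ink display.
    # The resolution and the final transfer step are placeholders; hardware
    # like AlterWear's would receive these bytes using NFC-harvested power.
    from PIL import Image, ImageDraw

    WIDTH, HEIGHT = 296, 128  # assumed panel resolution

    def render_frame(message):
        frame = Image.new("1", (WIDTH, HEIGHT), color=1)  # 1 = white
        draw = ImageDraw.Draw(frame)
        draw.rectangle([2, 2, WIDTH - 3, HEIGHT - 3], outline=0)
        draw.text((10, HEIGHT // 2 - 5), message, fill=0)
        return frame.tobytes()  # packed 1-bit rows for a display driver

    if __name__ == "__main__":
        buf = render_frame("Hello from a battery-free badge")
        print(f"{len(buf)} bytes to push to the panel on the next NFC tap")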

HindSight: Enhancing Spatial Awareness by Sonifying Detected Objects in Real-Time 360-Degree Video

Eldon Schoop, University of California, Berkeley
James Smith, University of California, Berkeley
Bjoern Hartmann, University of California, Berkeley

Our perception of our surrounding environment is limited by the constraints of human biology. The field of augmented perception asks how our sensory capabilities can be usefully extended through computational means. We argue that spatial awareness can be enhanced by exploiting recent advances in computer vision which make high-accuracy, real-time object detection feasible in everyday settings. We introduce HindSight, a wearable system that increases spatial awareness by detecting relevant objects in live 360-degree video and sonifying their position and class through bone conduction headphones. HindSight uses a deep neural network to locate and attribute semantic information to objects surrounding a user through a head-worn panoramic camera. It then uses bone conduction headphones, which preserve natural auditory acuity, to transmit audio notifications for detected objects of interest. We develop an application using HindSight to warn cyclists of approaching vehicles outside their field of view and evaluate it in an exploratory study with 15 users. Participants reported increases in perceived safety and awareness of approaching vehicles when using HindSight.
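
A core piece of that pipeline is turning an object's direction in the panoramic frame into spatialized audio. Below is a small sketch assuming constant-power stereo panning and a per-class tone frequency; the detector is stubbed out to a reported azimuth and label, and none of this is the authors' code:

    # Sketch: sonify a detected object's direction with constant-power panning.
    # The detector is assumed to report an azimuth in degrees (0 = straight
    # ahead, +/-180 = directly behind) plus a class label.
    import numpy as np

    SR = 44100
    CLASS_FREQ = {"car": 440.0, "truck": 330.0, "bicycle": 660.0}  # assumed map

    def sonify(azimuth_deg, label, duration_s=0.3):
        """Return a stereo (N, 2) float array cueing the object's direction."""
        t = np.linspace(0, duration_s, int(SR * duration_s), endpoint=False)
        tone = 0.3 * np.sin(2 * np.pi * CLASS_FREQ.get(label, 550.0) * t)
        # Fold the 360-degree azimuth onto a left-right axis via its sine.
        lr = np.sin(np.radians(azimuth_deg))             # -1 = left, +1 = right
        pan = (lr + 1) / 2 * (np.pi / 2)                 # constant-power angle
        return np.stack([np.cos(pan) * tone, np.sin(pan) * tone], axis=1)

    if __name__ == "__main__":
        cue = sonify(azimuth_deg=150.0, label="car")     # car behind and to the right
        print(cue.shape)                                 # (13230, 2) samples to play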

Tensions of Data-Driven Reflection: A Case Study of Real-Time Emotional Biosensing

Noura Howell, University of California, Berkeley
Laura Devendorf, University of Colorado
Tomás Vega Gálvez, Singapore University of Technology and Design
Rundong Tian, University of California, Berkeley
Kimiko Ryokai, University of California, Berkeley

Biosensing displays, increasingly enrolled in emotional reflection, promise authoritative insight by presenting users’ emotions as discrete categories. Rather than having machines interpret emotions, we sought to explore an alternative: emotional biosensing displays whose wearers formed their own interpretations and felt comfortable critiquing the display. To that end, we designed, implemented, and deployed, as a technology probe, an emotional biosensing display: Ripple, a shirt whose pattern changes color in response to the wearer’s skin conductance, which is associated with excitement. Seventeen participants wore Ripple over two days of daily life. While some participants appreciated the ‘physical connection’ Ripple provided between body and emotion, for others Ripple fostered insecurities about ‘how much’ feeling they had. Despite our design intentions, we found participants rarely questioned the display’s relation to their feelings. Using biopolitics to speculate on Ripple’s surprising authority, we highlight the ethical stakes of biosensory representations for sense of self and ways of feeling.
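
For a sense of how a garment like Ripple might map skin conductance to color, here is a deliberately simple sketch: normalize the electrodermal reading against a baseline and interpolate between a calm color and an excited color. The baseline, range, and colors are invented for illustration:

    # Illustrative only: map a skin-conductance (EDA) reading to an RGB color,
    # roughly in the spirit of a color-changing garment.
    CALM = (70, 130, 180)      # assumed "calm" color
    EXCITED = (220, 60, 60)    # assumed "excited" color

    def eda_to_color(eda_us, baseline_us=2.0, span_us=8.0):
        """Interpolate CALM -> EXCITED as conductance rises above baseline (uS)."""
        level = min(max((eda_us - baseline_us) / span_us, 0.0), 1.0)
        return tuple(round(c + (e - c) * level) for c, e in zip(CALM, EXCITED))

    if __name__ == "__main__":
        for reading in (1.5, 4.0, 9.5):    # made-up samples, in microsiemens
            print(reading, "uS ->", eda_to_color(reading))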

Visualizing API Usage Examples At Scale

Elena L. Glassman, University of California, Berkeley
Tianyi Zhang, University of California, Los Angeles
Bjoern Hartmann, University of California, Berkeley
Miryung Kim, University of California, Los Angeles

Using existing APIs properly is a key challenge in programming, given that libraries and APIs are increasing in number and complexity. Programmers often search for online code examples in Q&A forums and read tutorials and blog posts to learn how to use a given API. However, there is often a massive number of related code examples, and it is difficult for a user to understand the commonalities and variations among them while still being able to drill down to concrete details. We introduce an interactive visualization for exploring a large collection of code examples mined from open-source repositories at scale. This visualization summarizes hundreds of code examples in one synthetic code skeleton with statistical distributions for canonicalized statements and structures enclosing an API call. We implemented this interactive visualization for a set of Java APIs and found that, in a lab study, it helped users (1) answer significantly more API usage questions correctly and comprehensively and (2) explore how other programmers have used an unfamiliar API.
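
The idea of collapsing many examples into one skeleton with statement frequencies can be illustrated with a toy version: canonicalize each example's lines, then report how often each canonical statement appears alongside the focal API call. This is a thumbnail of the concept only; the paper works over mined ASTs, not regex-normalized strings:

    # Toy illustration of summarizing many usage examples of one API call.
    # Real canonicalization operates on ASTs; normalizing string and number
    # literals with regexes is a deliberate oversimplification.
    import re
    from collections import Counter

    def canonicalize(line):
        line = re.sub(r'"[^"]*"', '"..."', line.strip())
        return re.sub(r"\b\d+\b", "N", line)

    def summarize(examples, focal_call):
        counts, total = Counter(), 0
        for ex in examples:
            lines = [canonicalize(l) for l in ex.splitlines() if l.strip()]
            if any(focal_call in l for l in lines):
                total += 1
                counts.update(set(lines))
        for stmt, n in counts.most_common():
            print(f"{n / total:5.0%}  {stmt}")

    if __name__ == "__main__":
        examples = [
            'br = new BufferedReader(new FileReader("a.txt"));\nbr.readLine();\nbr.close();',
            'br = new BufferedReader(new FileReader("b.txt"));\nbr.readLine();',
        ]
        summarize(examples, "readLine")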

Interactive Extraction of Examples from Existing Code

Andrew Head, University of California, Berkeley
Elena L. Glassman, University of California, Berkeley
Bjoern Hartmann, University of California, Berkeley
Marti A. Hearst, University of California, Berkeley

Programmers frequently learn from examples produced and shared by other programmers. However, it can be challenging and time-consuming to produce concise, working code examples. We conducted a formative study where 12 participants made examples based on their own code. This revealed a key hurdle: making meaningful simplifications without introducing errors. Based on this insight, we designed a mixed-initiative tool, CodeScoop, to help programmers extract executable, simplified code from existing code. CodeScoop enables programmers to "scoop" out a relevant subset of code. Techniques include selectively including control structures and recording an execution trace that allows authors to substitute literal values for code and variables. In a controlled study with 19 participants, CodeScoop helped programmers extract executable code examples with the intended behavior more easily than with a standard code editor.
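
One technique the abstract mentions, substituting recorded runtime values for variables, can be sketched in a few lines: trace one execution, record a variable's value, then inline it as a literal so the scooped snippet runs without its original context. This is a toy in Python; CodeScoop itself works on Java code inside an editor:

    # Toy sketch of literal substitution: inline a recorded runtime value so an
    # extracted snippet no longer depends on the code that produced the value.
    import re

    # Pretend one traced execution recorded this binding (hypothetical value).
    recorded = {"path": "/tmp/data"}

    def inline_literals(snippet, bindings):
        for name, value in bindings.items():
            snippet = re.sub(rf"\b{name}\b", repr(value), snippet)
        return snippet

    scooped = "files = list_files(path)"        # the line the author scooped
    print(inline_literals(scooped, recorded))   # files = list_files('/tmp/data')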

LBW628: SoundGlove: Multisensory Exploration of Everyday Objects for Creative

Beomjune Son, University of California, Berkeley; POSTECH
Conner Hunihan, University of California, Berkeley
Soravis Prakkamakul, University of California, Berkeley

SoundGlove is a tool for exploring everyday objects through a tangible, synesthetic experience of sound. The device facilitates this exploration by allowing the user to physically record and mix sounds by grabbing them out of the air and dropping them into a bowl. Sounds deposited into the bowl are mixed together and can be played back, enhancing the user's abilities not only in observation but also in sound creation. This paper describes the design and implementation of the SoundGlove system and considers potential applications suggested by early testing with users.
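
The "mixing sounds in a bowl" interaction ultimately comes down to summing recorded clips. A minimal numpy sketch of that mixing step, with synthesized tones standing in for grabbed sounds:

    # Minimal sketch of mixing "grabbed" sounds: sum the clips, pad to the
    # longest, and normalize the result to avoid clipping on playback.
    import numpy as np

    SR = 44100

    def mix(clips):
        bowl = np.zeros(max(len(c) for c in clips))
        for clip in clips:
            bowl[:len(clip)] += clip
        peak = np.max(np.abs(bowl))
        return bowl / peak if peak > 0 else bowl

    if __name__ == "__main__":
        t = np.linspace(0, 1.0, SR, endpoint=False)
        grabbed = [np.sin(2 * np.pi * f * t) for f in (220.0, 330.0)]  # stand-ins
        print(mix(grabbed).shape)   # one mixed buffer, ready for playback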