Summer Research Reports: Ellie Hoshizaki and the Vibe-o-meter

17 Jan, 2024

We're thrilled to support our students in their summer research. Read about Ellie Hoshizaki and the Vibe-o-meter!

From Ellie Hoshizaki:

This award catalyzed my research into vibes and the creation of the Vibe-o-Meter, leading to beneficial social impact and discoveries in human-computer interaction (HCI) that are still being explored today.


Vibes are defined as a person’s emotional state or the atmosphere of a place, as communicated to and felt by others. However, a concise scientific explanation of how vibes are sensed is not readily available. Research in this field predominantly extrapolates from the sonic vibratory signals of live music to contagion, or collective affect, within a group of concertgoers. In light of this research, the focus of this project became visualizing collective vibes in the context of creative exhibitions. To do this, I invented the Vibe-o-Meter, a low-cost, open-source MIDI controller for video jockey (VJ) interactions. Data is collected, and machine learning (ML) models pick up patterns in order to visualize vibes. A “vibe synthesizer” acts as an input processor, powered by real-time user interaction and by biosensory data gathered during a concert from sensors that measure temperature and electromagnetic fields.
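
To make the pipeline concrete, here is a minimal sketch of how a “vibe synthesizer” input processor might blend sensor readings into a MIDI control-change message that a VJ tool could map to visuals. The sensor ranges, blending weights, and function names are illustrative assumptions, not the project’s actual design.

```python
def normalize(value, lo, hi):
    """Clamp a raw sensor reading into [0.0, 1.0] over an assumed range."""
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def vibe_cc(temp_c, emf_ut, interaction, cc_number=1):
    """Blend temperature (deg C), electromagnetic field strength
    (microtesla), and a 0-1 user-interaction level into a single
    3-byte MIDI control-change (CC) message. All ranges and weights
    below are hypothetical placeholders."""
    t = normalize(temp_c, 15.0, 35.0)       # assumed indoor venue range
    e = normalize(emf_ut, 0.0, 100.0)       # assumed EMF sensor range
    vibe = 0.4 * t + 0.3 * e + 0.3 * interaction  # illustrative weights
    value = round(vibe * 127)               # MIDI data bytes are 0-127
    status = 0xB0                           # CC message, channel 1
    return bytes([status, cc_number, value])

# Example: a warm, moderately active room with high audience interaction.
msg = vibe_cc(temp_c=28.0, emf_ut=40.0, interaction=0.8)
```

The three-byte output follows the standard MIDI control-change format (status byte, controller number, value), so any off-the-shelf VJ software that accepts MIDI input could react to the blended “vibe” value.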


The Vibe-o-Meter was launched at the UC Berkeley Flux design showcase and will next appear at Randyfest 2024, an intimate music and art festival held in memory of Randy Rollog Lee. This project is designed for real-world applications in the field through a variety of engagements and interactions, and it aims to explore the integral role of vibes in the larger significance of nonverbal communication. Future work includes researching vibes in situations beyond live music where nonverbal communication is essential, such as hospice care.


Future research questions that will be explored include: How can sensors detect correlations between perceived emotional affect and the surrounding environment? How can we allow ourselves to be aware of the ambience in a room? How can we let ourselves feel the emotions of the person beside us through haptics, biosensors, and nonverbal communication?