
Regulating Emotion Through Music

Team: Asuka Li, Ameer Husary, Abdullah Hamid, Matthew Joesoep, Joseph Pogue, Elijah Chou

As humans, we experience the world through an array of different sensory percepts: vision, audition, olfaction, gustation, and taction. These senses come together to form the beautiful ensemble of consciousness. However, we live in a time when our senses are bombarded by the distractions of daily life. A barrage of emails, assignments, and text messages clouds our minds with constant stimulation, leaving us unable to fully experience the present moment. As a result, our society faces a mental-health crisis, with roughly 52.9 million individuals in the U.S. experiencing some form of mental illness.


In recent years, many have turned to digital technologies as a solution to this crisis. Principles of digital well-being have been adopted across many domains to help monitor and support the affective experience of users. One such principle is emotion regulation through music. Music interfaces with the human mind through the sensory percept of audition to shape an individual's emotional state. Many studies have also shown that visual imagery can influence emotion. Combined, these two modalities (vision and audition) give an individual a powerful means of influencing their own emotional state.


We sought to apply these principles of digital well-being to help users immerse themselves fully in the present moment. To that end, we built a Java-based audio visualizer that turns listening into an immersive, multisensory journey. The platform enabled users to visualize their music based on their preferred emotional state. By offering a user-friendly way to shape emotion, we sought to alleviate the stresses and anxieties that disrupt daily life.

The Tool

Digital Well-being Software

Using Processing and Java Swing, we created an interactive digital media player that displayed personalized music visualizations. These visualizations were tuned to a brief emotional assessment that each user completed at the start of a session. By mapping colors and audio frequency bands to specific affective states, we created a tool to help users regulate their emotions.
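To make that mapping concrete, here is a minimal sketch of the idea in Processing (Java). It is an illustration under stated assumptions, not KWAi's actual source: it assumes the Processing Sound library, and the song file name, band count, and "calm" color palette are hypothetical placeholders for the assessment-driven values the tool would use.

// Minimal sketch: map FFT band energy to a mood-tuned color palette.
// Requires the Processing Sound library; "song.mp3" and the "calm"
// palette below are illustrative placeholders.
import processing.sound.*;

SoundFile song;
FFT fft;
int bands = 64;
float[] spectrum = new float[bands];

// Hypothetical mood preset chosen from the user's assessment:
// cool hues for a "calm" target state.
color baseColor = color(80, 140, 200);

void setup() {
  size(800, 400);
  song = new SoundFile(this, "song.mp3"); // placeholder audio file
  song.loop();
  fft = new FFT(this, bands);
  fft.input(song);
}

void draw() {
  background(10, 10, 30);
  fft.analyze(spectrum); // fill spectrum[] with current band energies
  float w = width / float(bands);
  for (int i = 0; i < bands; i++) {
    // Louder bands draw taller, brighter bars; the hue stays inside
    // the mood palette so the visuals track the chosen affective state.
    float level = constrain(spectrum[i] * 40, 0, 1);
    fill(lerpColor(baseColor, color(255), level));
    float h = level * height;
    rect(i * w, height - h, w - 2, h);
  }
}

In the full tool, the palette and scaling would be selected from the user's pre-session emotional assessment rather than hard-coded as they are here.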

[Screenshot: KWAi AudioVisualizer, an audio visualizer for human emotions]
