
Body Groove

@May-Sep 2023
Physical Computing | Installation
DESCRIPTION
This project explores embodied music interaction and new ways of engaging with music through the body. It produced a 'touch-free' musical device that turns physical movements into sounds and music. The objective is to assess how different musical attributes influence participants' engagement and satisfaction during the interaction, opening up new possibilities for inclusive music-making.
Deliverables
Web App
Physical Installation

Tools
Figma
Illustrator
Arduino
Processing
C++
Hand modelling
Responsibility
Feasibility Assessment
Prototype
Wireframes
Usability Testing
EMG Tracking
Data Analysis

Embodied music cognition

A paradigm shift in music perception
The basic claim of embodiment theory is that all psychological processes are influenced by the body, including sensory systems, motor systems, and emotions (Glenberg, 2010).
As the field has evolved, embodiment has come to be recognised as a tangible attribute of human interaction with music.
Its sensory-motor basis and the role it plays in different musical contexts are among the main sources of our understanding of the fundamentals behind human-music interaction (Leman et al., 2017).

Related works

Challenge & Limitation

" Using the human body as an instrument is probably more of an experimental performance in which I can imagine that the tone, timbre and instrumental elements of a song are constantly changing under the influence of body movements."

" Human motion cannot be defined into several fixed patterns, compared with DJ keyboard"

" How to simplify and categorise the user's body movements when processing a song? "

Research Direction

Hypothesis

“By incorporating an intuitive mechanism, the project grants users the flexibility to interact spontaneously through body movement, while also ensuring that the produced music retains a sense of coherence. This balance of spontaneity and structured sound generation is expected to provide users with a more immersive, fulfilling and participatory interaction environment.”

Research Questions

How does the integration of specific technologies and mechanisms influence participants' engagement behaviour in the interaction?

Which musical attributes or genres amplify participants' engagement and facilitate their creative performance during the interaction?

How do participants perceive and evaluate the balance between sound randomness and structured sound generation?

How do participants rate the usability and intuitiveness of prototypes and system mechanisms?

Research Framework

Feasibility Assessment

Exploration of Sensor Types

● IR Break Beam Sensor

● Motion Sensor

● LDR Sensor

● TouchBoard

Rapid Prototype

● MVP (Minimum Viable Product)

● Physical Layout Study

Process-Driven Approach

Pre-set Sound Processing

● Task Identification

System Mechanism

● User Journey

● Information Architecture

● User Interfaces

Prototype Testing

Qualitative Methods

● In-depth Interview

● Contextual Inquiries

● Think Aloud

Quantitative Methods

● EMG (Electromyography) Tracking (see the sketch below)

● Observation
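As a rough illustration of how a raw EMG signal can be condensed into an engagement measure, the sketch below computes a moving RMS envelope, whose amplitude roughly tracks muscle activity. This is a common EMG pre-processing step offered here as an assumption; the window length, sampling rate and the project's actual analysis pipeline are not specified.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Moving RMS over a fixed window (e.g. 100 samples at 1 kHz = 100 ms).
std::vector<double> rmsEnvelope(const std::vector<double> &emg, size_t window) {
  std::vector<double> env(emg.size(), 0.0);
  double sumSq = 0.0;
  for (size_t i = 0; i < emg.size(); i++) {
    sumSq += emg[i] * emg[i];                        // add newest sample
    if (i >= window) sumSq -= emg[i - window] * emg[i - window];  // drop oldest
    size_t n = (i + 1 < window) ? i + 1 : window;    // samples currently in window
    env[i] = std::sqrt(sumSq / n);
  }
  return env;
}

int main() {
  // Placeholder samples; real input would be the recorded EMG trace.
  std::vector<double> sample = {0.02, -0.05, 0.40, -0.38, 0.05, 0.01};
  for (double v : rmsEnvelope(sample, 4)) std::printf("%.3f ", v);
  std::printf("\n");
  return 0;
}
```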

< Feasibility Assessment >

Explore Sensor Types
The LDR (Light Dependent Resistor) sensor

● Sensitive to light variations, which makes it well suited to capturing unpredictable motion patterns;

● Simple to operate, cost-effective, and requires no additional signal receiver for installation;

● Can be integrated with Processing to create visual effects based on light changes (see the sketch below).
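A minimal Arduino (C++) sketch of the feasibility test, under assumed wiring: one LDR in a voltage divider on analog pin A0 with an illustrative threshold. It reports shadow events over serial so a Processing sketch can respond with sound or visuals.

```cpp
const int LDR_PIN = A0;     // LDR voltage divider on analog pin 0 (assumed wiring)
const int THRESHOLD = 400;  // light-level cutoff; tune for the ambient light
bool covered = false;       // current shadow state

void setup() {
  Serial.begin(9600);       // Processing listens on this serial port
}

void loop() {
  int level = analogRead(LDR_PIN);      // 0-1023 light reading
  bool nowCovered = level < THRESHOLD;  // hand casting a shadow over the sensor?
  if (nowCovered != covered) {          // report only state changes
    covered = nowCovered;
    Serial.println(covered ? "TRIGGER" : "RELEASE");
  }
  delay(10);                            // simple rate limit / debounce
}
```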

Explore Sensor Layout

- "All melodies and harmony in Western music is typically built from just 12 notes."

Minimum Viable Product

Demonstrates the basic functions of a music simulation system as an initial prototype of a touch-based interactive music simulator.
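Extending the single-sensor test, the sketch below shows how the MVP layout could map twelve sensor channels to the twelve notes mentioned above. The pin assignments (a board with twelve analog inputs, such as a Mega) and the one-sensor-per-note layout are assumptions for illustration; Processing receives the note index and plays the matching sound.

```cpp
const int NUM_SENSORS = 12;                          // one channel per chromatic note
const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5,
                                      A6, A7, A8, A9, A10, A11};
const int THRESHOLD = 400;                           // illustrative light cutoff
bool state[NUM_SENSORS] = {false};                   // last shadow state per sensor

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int i = 0; i < NUM_SENSORS; i++) {
    bool covered = analogRead(SENSOR_PINS[i]) < THRESHOLD;
    if (covered && !state[i]) {
      Serial.print("NOTE ");   // Processing maps index 0-11 (C..B) to a sound
      Serial.println(i);
    }
    state[i] = covered;
  }
  delay(10);
}
```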

< Rapid Prototype >

< The Process-Driven Approach >

Pre-set Task Processing

Definition of Testing Approach

Cognitive Engagement Through Familiarity
/ Common Sense
Emotional Resonance and Storytelling
/ Evoke Feeling

Selection of Diverse Soundscapes

Instrumental Sound


Scenario Sound

Task Development

Sound Trimming to Create the Sound Library
Physical Sensor Matching through Processing (see the sketch below)
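The matching step itself ran in Processing against the trimmed sound library; the C++ sketch below only illustrates the kind of lookup involved, with hypothetical clip file names, so the structure stays consistent with the Arduino code above.

```cpp
#include <cstdio>
#include <string>

// Hypothetical entry: which sensor channel triggers which trimmed clip.
struct ClipMapping {
  int sensorIndex;       // sensor channel reported over serial
  std::string clipFile;  // sample from the trimmed sound library
};

// Illustrative library: instrumental and scenario (ambience) clips mixed.
const ClipMapping LIBRARY[] = {
  {0, "piano_C4.wav"},
  {1, "piano_D4.wav"},
  {2, "rain_ambience.wav"},
};

// Resolve a trigger message to the clip that should play (empty = ignore).
std::string clipForSensor(int sensorIndex) {
  for (const ClipMapping &m : LIBRARY) {
    if (m.sensorIndex == sensorIndex) return m.clipFile;
  }
  return "";
}

int main() {
  std::printf("sensor 2 -> %s\n", clipForSensor(2).c_str());
  return 0;
}
```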

< User Interaction Flow >

Basic Triggering

Gestural Exploration

Clip Production

< System Mechanism >

< Testing & Evaluation >

EMG Tracking

Observation

Recording

Think Aloud

8 Participants

Group: International Students
Age: 21-28
Gender: 2 male, 6 female
Musical Background: 4 enthusiasts, 1 professional, 3 former learners

Participants interact very differently with the content and information presented to them: the way they take it in, their emotional state and their physical state all alter that interaction.

Semi-structured Interview
Results - EMG Signal
Results - Example 1
Results - Example 2
Results - Qualitative Data analysis

Each recommendation was supported by specific user quotes and observations gathered during the qualitative research.

" Instrumental + Ambience "

Facets

Engagement

● Randomness
● Unpredictability
● Remix

◦ Multi-tracks
◦ Using a tool to strike

Learning Curve

● Clear layout and controls
● Easy to navigate
● Intuitive

◦ Wire contact
◦ Memory issue

Quality & Diversity

● Two-hand interaction
● Satisfying arrangement
● Involving other body parts

◦ Overlapping
◦ Chaotic combination

Application

- Gatherings and parties
- Children's education
- Art galleries
- Music therapy
- Mobility-limited users
- Music demos

Improvements

◁ Sound Extraction

Select high-quality audio sources and split the audio samples into standard lengths to minimise overlap.

◁ Sensor Performance

Enhance sensor sensitivity and range to encourage more extensive body engagement and user interaction.

◁ Physical Induction

Incorporate visual feedback when activating individual sensors.