May 20, 2022
A multi-modal meditative music patch in Max/MSP and Ableton Live.
Listen to the sound of your body and make sound happen by being.
Instructed by Professor Brad Garton.
Notion:
motion (gesture) + body signals + environmental signals = music
When we meditate, we often close our eyes, stop moving, and focus on our breath and any changes in our body. These changes are often too subtle to detect, especially for people caught up in a busy world. What if we re-activated the body through sound: close your senses and let your body and the environment around you do the work?
Tools
- Max/MSP for data processing and plug-in design.
- Fluid Corpus Manipulation project (FluCoMa) for the neural network module.
- Ableton Live for multi-modal live performance and music making.
- Holon.ist for streaming sensor data from an iPhone, AirPods, and Apple Watch (see the OSC sketch after this list).
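Holon.ist streams the device sensors as OSC messages over the local network, which the Max patch receives with [udpreceive]. As a rough illustration outside of Max, here is a minimal Python sketch of listening for such a stream; the port and the /holon/... address patterns are assumptions for illustration, not Holon.ist's documented defaults.

```python
# Minimal sketch of receiving Holon.ist sensor streams over OSC in Python,
# standing in for the [udpreceive] + [route] objects in the Max patch.
# The port and address patterns (/holon/...) are assumptions -- check the
# app's output settings for the real ones.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_heart_rate(address, bpm):
    print(f"{address}: {bpm} BPM")          # e.g. map to Live's tempo

def on_attitude(address, pitch, roll, yaw):
    print(f"{address}: pitch={pitch:.3f}")  # feed the breath estimator

dispatcher = Dispatcher()
dispatcher.map("/holon/heart_rate", on_heart_rate)  # hypothetical address
dispatcher.map("/holon/attitude", on_attitude)      # hypothetical address

server = BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher)  # assumed port
server.serve_forever()
```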
Data Source & Mapping
- motion (gestures)
- breath, measured from subtle changes in AirPods attitude and rotation (estimated as sketched after this list)
- body signals: heart rate from the Apple Watch as tempo (BPM)
- environmental signals:
- local sun/moon elevation, wind speed, temperature, humidity, and time of day from online data
- brightness from camera
- environmental sound from microphone
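Breathing barely moves the head, but it shows up as a slow oscillation in the AirPods' pitch angle. Here is a minimal sketch of the idea in Python, assuming an attitude stream sampled at about 25 Hz (the real rate from Holon.ist may differ): band-pass the pitch around typical breathing frequencies and count peaks.

```python
# Sketch: estimate breath rate from the AirPods' pitch angle. Assumes a
# ~25 Hz sample rate; the band limits (6-30 breaths/min) are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

FS = 25.0                                   # assumed sample rate (Hz)

def breaths_per_minute(pitch: np.ndarray) -> float:
    # Keep only the 0.1-0.5 Hz band, i.e. 6-30 breaths per minute
    b, a = butter(2, [0.1, 0.5], btype="band", fs=FS)
    slow = filtfilt(b, a, pitch)
    peaks, _ = find_peaks(slow, distance=FS * 2)  # >= 2 s between breaths
    duration_min = len(pitch) / FS / 60.0
    return len(peaks) / duration_min

# Synthetic test: 0.25 Hz breathing (15 breaths/min) buried in sensor noise
t = np.arange(0, 60, 1 / FS)
pitch = 0.05 * np.sin(2 * np.pi * 0.25 * t) + 0.01 * np.random.randn(len(t))
print(f"{breaths_per_minute(pitch):.1f} breaths/min")   # ~15
```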
Material generated with Jitter image processing and motion detection.
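The motion-detection side boils down to frame differencing, much as jit.op with @op absdiff would do in Jitter: subtract consecutive camera frames and take the mean absolute difference as a single "motion amount" control signal. A numpy sketch of the same computation, on synthetic frames:

```python
# Frame differencing as a one-number motion detector: the mean absolute
# pixel difference between consecutive grayscale frames, normalized to 0..1.
import numpy as np

def motion_amount(prev: np.ndarray, curr: np.ndarray) -> float:
    """Mean absolute pixel difference between two uint8 frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return float(diff.mean() / 255.0)

# Synthetic frames: a static scene vs. a bright block moving into view
h, w = 120, 160
frame_a = np.zeros((h, w), dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[40:80, 60:100] = 200                 # something moved
print(motion_amount(frame_a, frame_a))       # ~0.0 (no motion)
print(motion_amount(frame_a, frame_b))       # > 0  (motion detected)
```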
Build a live sound-event classifier based on mubu.gmm.scratch, then re-synthesize the sound.
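For readers outside the Max/MuBu ecosystem, the classifier's logic is: fit one Gaussian mixture model per sound class on audio descriptor frames, then label each incoming frame by which model scores it higher. Below is a rough Python analogue with scikit-learn, using synthetic stand-in features rather than real audio descriptors; it illustrates the idea, not the actual mubu.gmm.scratch implementation.

```python
# Rough analogue of a per-class GMM sound-event classifier: one Gaussian
# mixture per class, classification by highest average log-likelihood.
# The feature vectors are synthetic stand-ins for real descriptors (MFCCs).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
breath_feats = rng.normal(loc=-2.0, scale=0.5, size=(200, 4))   # class A
voice_feats = rng.normal(loc=1.5, scale=0.8, size=(200, 4))     # class B

models = {
    "breath": GaussianMixture(n_components=2, random_state=0).fit(breath_feats),
    "voice": GaussianMixture(n_components=2, random_state=0).fit(voice_feats),
}

def classify(frame: np.ndarray) -> str:
    # score() returns the average log-likelihood of the frame under each model
    scores = {name: m.score(frame.reshape(1, -1)) for name, m in models.items()}
    return max(scores, key=scores.get)

print(classify(rng.normal(-2.0, 0.5, size=4)))   # -> breath
print(classify(rng.normal(1.5, 0.8, size=4)))    # -> voice
```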
Max-for-Live Plugin Design
Mapping signals to sound-processing parameters.
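Most of this mapping is linear rescaling with clamping, like Max's [scale] object: take a signal in its natural range and project it onto a parameter's usable range. A tiny sketch follows; the specific signals and ranges are illustrative, not the plugin's actual values.

```python
# Linear rescale with clamping, the workhorse of signal-to-parameter mapping
# (the equivalent of Max's [scale] object). Ranges below are illustrative.
def scale(x, in_lo, in_hi, out_lo, out_hi):
    t = (x - in_lo) / (in_hi - in_lo)
    t = min(max(t, 0.0), 1.0)                 # clamp to the input range
    return out_lo + t * (out_hi - out_lo)

heart_rate = 72        # BPM from the watch
humidity = 0.65        # 0..1 from the weather feed
sun_elevation = 30.0   # degrees above the horizon

tempo = scale(heart_rate, 50, 120, 60, 140)              # -> Live tempo
reverb_wet = scale(humidity, 0.0, 1.0, 0.1, 0.9)         # -> reverb mix
filter_cutoff = scale(sun_elevation, -90.0, 90.0, 200.0, 8000.0)  # -> Hz
print(tempo, reverb_wet, filter_cutoff)
```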
A little demo of the Max for Live patch.
Code
SoniZen by sivannavis on GitHub (updated Nov 21, 2022).