Process — 03 of 04
The collective state maps to parameters within a generative sound system — not as metaphor, but as direct, continuous, real-time correspondence.
Overview
Translation is the layer where data becomes experience. The aggregated collective state — expressed as valence, arousal, and coherence — drives a generative sound engine through carefully designed transfer functions.
The result is an environment that participants feel responding as they change. Not programmed reactions, but a living system tuned to the group's inner landscape — always current, never repeated.
The sound system
The sound design is metaphorical in structure but physiological in source. Each sonic dimension is mapped to a biological parallel, so the relationship between body and environment is felt before it is understood.
Heart rate variability becomes the system's rhythmic heartbeat. Group-wide HRV coherence produces entraining pulses; divergent HRV creates polyrhythm and temporal texture.
Collective respiration shapes the harmonic progression. Slow, synchronised breath opens wide harmonic intervals; fast or fragmented breath compresses and destabilises the tonal field.
Coherence is rendered as spatialization — felt as much as heard. When the group converges, sound contracts toward the centre; when dispersed, it expands into the full spatial field.
Arousal level determines the density and grain of the sound texture. High arousal produces rich, layered timbres; low arousal allows sparse, clean tones to emerge.
EEG-derived valence lifts or lowers the spectral register — pleasant emotional states open higher, brighter sonorities while discomfort tends toward lower, denser frequencies.
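The five mappings above can be sketched as simple transfer functions. The following is a minimal illustration, assuming all inputs arrive normalised to [0, 1]; the function name, parameter ranges, and output values are hypothetical, not the Emphonic engine's actual calibration:

```python
def map_state_to_sound(hrv_coherence, breath_sync, coherence,
                       arousal, valence):
    """Sketch of the state-to-sound transfer functions.

    All inputs are assumed normalised to [0, 1]; the outputs are
    illustrative engine parameters, not the installation's real ones.
    """
    # HRV coherence: entrained pulse vs. polyrhythmic texture
    polyrhythm_amount = 1.0 - hrv_coherence

    # Breath synchrony: wide open intervals vs. compressed tonal field
    interval_spread_semitones = 3 + 9 * breath_sync   # 3..12 semitones

    # Group coherence: sound contracts toward the centre as it rises
    spatial_spread = 1.0 - coherence                  # 0 = centre, 1 = full field

    # Arousal: density and grain of the texture
    layer_count = 1 + round(7 * arousal)              # 1..8 layers

    # Valence: spectral register, brighter when pleasant
    base_frequency_hz = 110 * 2 ** (2 * valence)      # 110..440 Hz

    return {
        "polyrhythm_amount": polyrhythm_amount,
        "interval_spread_semitones": interval_spread_semitones,
        "spatial_spread": spatial_spread,
        "layer_count": layer_count,
        "base_frequency_hz": base_frequency_hz,
    }
```

Each mapping is monotonic and continuous, so small physiological shifts produce proportionally small sonic shifts rather than discrete scene changes.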
All dimensions update within 100ms — the environment responds as the group changes.
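A sub-100 ms update cycle implies a fixed-rate control loop with parameter smoothing, so values glide rather than jump between ticks. A minimal sketch follows; the exponential smoothing and all names here are assumptions, since the actual engine's interpolation scheme is not specified:

```python
import time


class SmoothedParam:
    """One engine parameter, exponentially smoothed each control tick
    so abrupt biometric changes become gradual sonic transitions."""

    def __init__(self, value=0.0, smoothing=0.2):
        self.value = value
        self.smoothing = smoothing  # 0 = frozen, 1 = instant jump

    def update(self, target):
        # Move a fixed fraction of the remaining distance each tick.
        self.value += self.smoothing * (target - self.value)
        return self.value


def control_loop(read_state, apply_params, tick_s=0.1, ticks=None):
    """Fixed-rate loop: read collective state, smooth, apply.

    tick_s=0.1 keeps every dimension within the 100 ms bound.
    `ticks` limits iterations for testing; None runs indefinitely.
    """
    params = {k: SmoothedParam() for k in ("valence", "arousal", "coherence")}
    n = 0
    while ticks is None or n < ticks:
        state = read_state()  # hypothetical aggregated-state source
        apply_params({k: p.update(state[k]) for k, p in params.items()})
        time.sleep(tick_s)
        n += 1
```

Smoothing trades a little responsiveness for continuity: the environment still tracks the group within a few ticks, but never clicks or snaps.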
"Heart rhythms become percussive pulses. Breath patterns shape harmonic textures. Moments of group coherence are translated into spatial movements — felt as much as heard."
— Emphonic System, Artistic Vision
Visual extension
The translation layer drives not only the audio environment but a coordinated visual field — projection mapping and responsive lighting guided by the same biometric data. Together, these modalities form a unified sensory environment where sound, light, and physiology move in synchrony.
Responsive projections shift in density, movement, and pattern in response to the collective emotional state — creating visual representations of physiological dynamics that are felt as environmental atmosphere.
Lighting behavior — color temperature, intensity, directionality — tracks the biometric streams with sub-100ms latency. The room itself breathes with the group.
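The lighting mapping can be sketched in the same way as the sound mappings: a small transfer function from the three state values to colour temperature, intensity, and beam focus. The ranges below are illustrative assumptions, not the installation's actual calibration:

```python
def map_state_to_light(valence, arousal, coherence):
    """Sketch: collective state -> lighting parameters.

    Inputs assumed normalised to [0, 1]; output ranges are illustrative.
    """
    # Pleasant states read warmer, discomfort cooler (2700 K .. 6500 K)
    color_temp_k = 6500 - 3800 * valence
    # Arousal drives brightness; a floor keeps the room never fully dark
    intensity = 0.1 + 0.9 * arousal
    # Coherence narrows the beams toward the room's centre
    beam_focus = coherence  # 0 = diffuse wash, 1 = tight centre
    return {"color_temp_k": color_temp_k,
            "intensity": intensity,
            "beam_focus": beam_focus}
```

Driving sound and light from the same three values is what keeps the modalities in synchrony: there is one state, rendered twice.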