Sonification. That’s the word. It means signal-to-sound transduction. Maybe sound of the appealing variety, or not. It’s subjective, since some of us like noise. I’ve done or planned several “high-concept/low-brow” art installations that turned movement, or data, or some signal into sound. Or light. I like light.
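For the curious, signal-to-sound transduction can be sketched in a few lines. This is a hypothetical minimal example, not anything from an actual installation: it maps a numeric series to pitches (low value, low pitch) and writes a mono WAV file using only the Python standard library. Every name in it is mine, for illustration.

```python
# A minimal sonification sketch: map a numeric series to pitches
# and write a mono 16-bit WAV file. Illustrative, not production code.
import math
import struct
import wave

def sonify(values, path="sonified.wav", rate=44100, note_sec=0.25):
    """Render each value as a short sine tone; pitch scales with the value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    frames = bytearray()
    for v in values:
        # Linear map: smallest value -> 220 Hz, largest -> 880 Hz.
        freq = 220.0 + 660.0 * (v - lo) / span
        for i in range(int(rate * note_sec)):
            sample = 0.5 * math.sin(2 * math.pi * freq * i / rate)
            frames += struct.pack("<h", int(sample * 32767))
    with wave.open(path, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))

sonify([3, 1, 4, 1, 5, 9, 2, 6])  # eight short tones, one per datum
```

Whether the result is appealing or noise is, as noted, subjective.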
Blind or deaf, you miss one, unless you can translate between them. I’ve called the thing that does this the “transduction reactor”, though I’m sure it has a formal name. My friend is synaesthetic. He is a human transduction reactor.
We do data visualization. Sometimes we do data sonification. Ambitious efforts to create boundary-spanning experiences might do both. But set aside translating between mediums for the moment. Nobody wants to listen to your data for practical use. What about just association?
We all have songs that take us to specific moments in time and/or space. We use music to reflect, set, or signal our state of mind. Associating audio with data or user experience can be both a mode of sensing and influence.
The most tepid aspect is curated pairings: in a sense, DJing your data assets. The idea would be to signal something implicit or ironic in the data that is difficult to communicate in plain language.
A slightly more alluring aspect is what people along the data pipeline were listening to during their custody. You could infer a lot from that associated process metadata. You couldn’t say “play country and things get more accurate”; music is just a signal of underlying behavior.
A significantly more interesting aspect is how listening to different tracks while interacting with data leads to differences in interpretation and, ultimately, decisions. This is not a new idea. Music has been a form of psychological warfare forever.
I will try a few experiments in this vein. But, more importantly, go support live music again!