
Saturday, 2 March 2019

The promise and peril of "sonification": giving feedback through sound

The majority of applications use "visualization" to give feedback and responses to users: think of graphs, alerts, and other visual cues about what is going on inside a computer, or what the computer has detected in the world.

Sonification is the aural equivalent of visualization: communication from computers to humans by means of sound. There's some of this already in the world: alert chimes, system beeps, and talking voice assistants. But in an age of constant earbud use, there is lots of potential for more.

Writing in Wired, Boing Boing contributor Clive Thompson (previously) discusses the growing use of sonification, from promising medical applications (using sound cues to help people with compromised movement and balance correct their gaits) to scientific analysis (transforming hurricane telemetry into sound, letting researchers hear when a storm is about to intensify) to UI for everyday applications (adapting incoming message chimes to communicate something about their content, like whether they're coming from known senders or appear to be urgent).
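
To make the idea easier to picture, here's a minimal sketch of what sonification boils down to: take a series of readings, map each one to a pitch, and render the result as audio. This is not the researchers' actual pipeline -- the linear value-to-frequency mapping and the made-up "wind speed" numbers are purely illustrative:

```python
# Hypothetical sketch: "sonify" a numeric series by mapping each reading to a
# pitch and writing the result to a WAV file (standard library only).
import math
import struct
import wave

SAMPLE_RATE = 44100   # audio samples per second
NOTE_SECONDS = 0.25   # how long each data point sounds

def value_to_frequency(value, lo, hi, f_min=220.0, f_max=880.0):
    """Linearly map a reading in [lo, hi] to a frequency in [f_min, f_max] Hz."""
    t = (value - lo) / (hi - lo) if hi > lo else 0.0
    return f_min + t * (f_max - f_min)

def sonify(series, path="sonified.wav"):
    lo, hi = min(series), max(series)
    frames = bytearray()
    for value in series:
        freq = value_to_frequency(value, lo, hi)
        for i in range(int(SAMPLE_RATE * NOTE_SECONDS)):
            sample = 0.4 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))   # 16-bit PCM
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)            # mono
        wav.setsampwidth(2)            # 2 bytes = 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(bytes(frames))

# Made-up "wind speed" readings: the pitch climbs as the storm intensifies.
sonify([40, 42, 45, 50, 58, 70, 85, 100])
```

Play the resulting file and you hear the trend without ever looking at a graph, which is the whole point.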

Thompson notes, in passing, a very important caveat: "done elegantly [emphasis added], sonification could help create a world where you’re still as informed as you want to be, but hopefully less frayed by nervous glances at your screens."

The reason screens are anxiety-provoking isn't solely a function of having to get your phone out of your pocket to see what's going on -- it's also the result of an arms race between app designers and our limbic and attentional systems. Think of how Google Fi, Lyft, and other apps use the fact that they've got permission to send you alerts (for useful things, like telling you when you've lost service or when your cab is arriving) to send you promotional messages inviting you to sign up friends or buy additional services. It's so bad that entire frameworks exist to help manage the desire of firms to hijack your attention -- think of how Google added the entirely useless "trending searches" to the Android search bar, which was just a way to divert you from your desire to find out a specific fact by nonconsensually eyeball-fucking you with Trump-related clickbait.

So long as firms can convert "engagement" into money, they will use any attentional mechanism to "engage" with you. And the equilibrium their products seek is to fold nonconsensual attention-hijacking into an alert system you can't switch off (because it sometimes carries essential data), to a degree that approaches, but never exceeds, the point at which you delete the product altogether.

From Thompson's piece:

So certainly sonification can be useful in science and medicine. But I think it could also be a boon in our everyday lives. We’re already walking around in our own sonic world, with smartphone-connected headphones plugged into our ears. And app notifications—the ding of the incoming text—are little more than simple forms of data turned into sound. Now imagine if those audio alerts were more sophisticated: What if they connoted something about the content of the text? That way, you could know whether to pull out your phone immediately or just read the message later. Or imagine if your phone chirped out a particular sequence or melodic pattern that informed you of the quality—the emotional timbre, as it were—of the email piling up in your inbox. (Routine stuff? A sudden burst of urgent activity from your team?) You could develop a sophisticated, but more ambient, sense of what was going on.

None of us need a cacophony of sonic alerts, of course, and there are limits to our auditory attention. But done elegantly, sonification could help create a world where you’re still as informed as you want to be, but hopefully less frayed by nervous glances at your screens. This could make our lives a bit safer too: Research at Georgia Tech’s sonification lab found that if car computer systems expressed more data audibly, we’d be less distracted while driving. Like Muratori’s patients, we could all benefit from having our ears a little closer to the ground.
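
The "smarter chime" idea in that passage is easy to sketch, too: pick a short melodic pattern from a message's metadata instead of playing one generic ding. The fields and note patterns below are invented for illustration -- a real notification system would wire this into the platform's messaging and audio APIs:

```python
# Hypothetical sketch: choose a tiny "melody" from message metadata so the
# alert itself says something about the content. Field names and note
# choices are invented for illustration.
from dataclasses import dataclass

@dataclass
class Message:
    sender_known: bool   # is this from someone in your contacts?
    urgent: bool         # did the sender (or a classifier) flag it urgent?

def chime_pattern(msg: Message) -> list:
    """Return note names describing the message, or nothing at all."""
    if msg.urgent and msg.sender_known:
        return ["C5", "E5", "G5"]   # bright ascending triad: look now
    if msg.urgent:
        return ["G4", "G4"]         # insistent repeat: urgent, unknown sender
    if msg.sender_known:
        return ["C5"]               # single soft tone: a friend, no rush
    return []                       # silence: routine noise, read it later

print(chime_pattern(Message(sender_known=True, urgent=False)))  # ['C5']
print(chime_pattern(Message(sender_known=False, urgent=True)))  # ['G4', 'G4']
```

The design choice that matters is the last branch: the default for low-value messages is silence, which is exactly the kind of restraint the attention-hijacking economics described above work against.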

Our Ears Are Unlocking an Era of Aural Data [Clive Thompson/Wired]
