Dissonant Imagery

Description (in English)

[From the Sónar website:] How do we visualise our thoughts? What do our neurons look like when we listen to music? These abstract questions, which seemed rhetorical until now, are taken up by “dissonant imaginary”, the new project from Daito Manabe created in collaboration with Dr. Yukiyasu Kamitani, which will take the form of an AV tech show at Sónar 2019. Using an MRI scanner and Dr. Kamitani’s groundbreaking research into the decoding and visualization of brain states, the resulting outputs will be reconfigured live in a new AV tech show that will redefine how we perceive the relationship between music and the human brain.

It’s no secret that sound elicits strong emotional reactions; whether we are listening to a film soundtrack or a song from our childhood, music creates an imaginative response, allowing us to (re)create images in our mind. In an attempt to understand this process, dissonant imaginary asks two fundamental questions: How does music influence the way we visualise? How can images change the way we experience music? To answer these questions, Manabe and Kamitani have developed a system that decodes signals from the visual cortex, processing them and projecting them as images in real time.
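
The decode-and-project system described above is not publicly specified; the sketch below is a minimal, purely illustrative Python loop of the kind the paragraph gestures at. Every element in it (the linear decoder `W`, the voxel count, the `decode_frame` helper, and the simulated scanner input) is a hypothetical placeholder, not the actual Manabe/Kamitani pipeline, which builds on Kamitani's deep-learning-based reconstruction of images from brain activity.

```python
"""Conceptual sketch only: map fMRI voxel activity to an image for projection.
All names and numbers here are assumptions for illustration."""
import numpy as np

N_VOXELS = 4096          # assumed number of visual-cortex voxels per scan volume
IMG_SHAPE = (64, 64)     # assumed resolution of the reconstructed image

rng = np.random.default_rng(0)

# Placeholder for a decoder that would be learned from (fMRI, image) training pairs.
W = rng.normal(scale=0.01, size=(IMG_SHAPE[0] * IMG_SHAPE[1], N_VOXELS))

def decode_frame(voxels: np.ndarray) -> np.ndarray:
    """Map one volume of voxel activity to a normalized grayscale image."""
    img = W @ voxels                 # linear read-out standing in for the real decoder
    img = img.reshape(IMG_SHAPE)
    img -= img.min()
    img /= img.max() + 1e-9          # scale to [0, 1] for projection
    return img

# Simulated real-time loop: each iteration stands in for one incoming fMRI volume.
for t in range(3):
    voxels = rng.normal(size=N_VOXELS)   # stand-in for a live scanner read-out
    frame = decode_frame(voxels)
    print(f"volume {t}: frame shape {frame.shape}, mean intensity {frame.mean():.3f}")
```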

Situation machine vision is used in
Notes
I changed the year from 2019 to 2018 because that is the publication year stated on Daito's page. I also removed the technology from the "referenced" field: the same technology was listed under both "used" and "referenced", and since this novel technology is being used, and partly developed, for this performance, I think it belongs under "used".

Authored by

UUID
1c65e0b0-ebd5-4857-9b14-648a698954ee