Senscapes was invited to the Horizon 2020-funded HackTheBrain event at the Science Gallery in Dublin, where we adapted our ideas to incorporate live brain recordings.

Inspired by synaesthesia, we wanted to turn live brain waves into sound and vision. 

Enriched by the skills of new members, we combined live brain recordings with machine learning, brain-computer interfaces, and granular synthesis to convert live neural patterns associated with the sensory experience of texture into audio-visual ‘Senscapes’.
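The granular-synthesis step of such a pipeline can be sketched in a few lines. This is a minimal, illustrative Python/NumPy example, not the project's actual code: it assumes the neural recording has already been reduced to a slowly varying control envelope in [0, 1] (for instance, an EEG band-power estimate), which then steers where short windowed grains are read from a source texture. All function and parameter names here are hypothetical.

```python
import numpy as np

def granular_synth(source, control, sr=16000, grain_ms=50, hop_ms=25, seed=0):
    """Sketch of control-driven granular synthesis.

    Scatters short Hann-windowed grains taken from `source` into an
    output buffer via overlap-add; each control value in [0, 1] picks
    the read position within the source, so a neural envelope can
    'scan' through a recorded texture.
    """
    rng = np.random.default_rng(seed)
    grain = int(sr * grain_ms / 1000)          # grain length in samples
    hop = int(sr * hop_ms / 1000)              # spacing between grain onsets
    window = np.hanning(grain)
    out = np.zeros(hop * len(control) + grain)
    max_start = len(source) - grain
    for i, c in enumerate(control):
        # control value selects the read position; a little random
        # jitter keeps the resulting texture from sounding static
        start = int(np.clip(c, 0.0, 1.0) * max_start)
        start = int(np.clip(start + rng.integers(-hop, hop + 1), 0, max_start))
        out[i * hop : i * hop + grain] += source[start : start + grain] * window
    return out

# Toy demo: a sine-tone "texture" scanned by a ramping control signal
# standing in for a neural envelope.
sr = 16000
t = np.arange(sr) / sr
source = np.sin(2 * np.pi * 220 * t)
control = np.linspace(0.0, 1.0, 100)
audio = granular_synth(source, control, sr=sr)
```

In a live setting, `control` would be refilled continuously from the incoming brain signal rather than precomputed, but the overlap-add core stays the same.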

This video shows the experimental stages of our project so far, accompanied by a soundtrack we created, driven by the neural activity associated with the sensation of touching different materials.
