This week I was able to make progress on my final! I took the simple patch I had made in Pure Data (the one I posted here last week) and connected it to two speakers to see how it would work spatially. When I put the speakers to work, I didn't get the situation I expected: the frequencies went everywhere, and because the speakers were not close enough to the user, they didn't cause the desired effect. Even though what I had in mind at the beginning was a physical installation occupying a big space, I decided to work at a more personal scale: the piece will work with headphones.
After doing that first version of the patch and developing the concept of disorientation further, I decided I wanted to push the Pd code so that the user has power over the experience, but in a disoriented/blind manner. With the help of Aaron, one of my ITP fellows, I was able to make the Pure Data patch much more controllable for the user: there's a slider to control the frequency, as well as a switch to change which frequency is emitted in each of the channels.
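To give an idea of what the patch does outside of Pd: each ear gets its own sine tone, the slider moves the base frequency, and the switch swaps which tone goes to which ear. Here's a rough Python sketch of that idea. The function name, the exact frequency offset, and the parameters are mine for illustration, not values from the actual patch:

```python
import math

def stereo_tone(f_left, f_right, swap=False, sr=44100, dur=1.0):
    """Generate a stereo signal with an independent sine per channel.

    Mirrors the idea of the Pd patch: a slider would set the
    frequencies, and the 'swap' switch exchanges which frequency
    goes to which ear. All names here are illustrative.
    """
    if swap:
        f_left, f_right = f_right, f_left
    n = int(sr * dur)
    return [
        (math.sin(2 * math.pi * f_left * t / sr),   # left channel sample
         math.sin(2 * math.pi * f_right * t / sr))  # right channel sample
        for t in range(n)
    ]

# e.g. 440 Hz in the left ear, 446 Hz in the right:
# the small difference between the ears is what feels destabilizing.
frames = stereo_tone(440, 446, dur=0.5)
```

Flipping `swap` is the equivalent of hitting the switch: the two ears trade frequencies instantly, which is part of what makes the control feel blind.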
When I had people listen to it, the experiences diverged: while some people could definitely feel out of balance, others couldn't feel anything at all (or feel the phase-shifting effect). I think it will be worth orienting people to wait a while until the disorientation sets in (I've realised that the ones who couldn't feel anything didn't stay long with the headphones). Next steps: connecting Pure Data with Arduino, and continuing to think about what can make the experience more vivid and comfortable.
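For the Arduino step, one common route is bridging sensor readings into Pd over the network: Pd's [netreceive] object accepts plain-text FUDI messages (ASCII atoms ended by a semicolon), so a small script can read the Arduino over serial and forward values to the patch. Since I haven't wired this up yet, here's a minimal sketch of just the forwarding side, assuming a [netreceive 3000] in the patch; the port number and the "freq 440" message shape are assumptions that would have to match whatever the patch routes on:

```python
import socket

def fudi(message):
    """Format a message in Pd's FUDI protocol: ASCII text ending in ';\\n'."""
    return (message.rstrip(";").strip() + ";\n").encode("ascii")

def send_to_pd(message, host="127.0.0.1", port=3000):
    """Send one FUDI message to a [netreceive] object in the patch.

    The port (3000) is an assumption and must match the [netreceive]
    argument. In the full version, the values would come from the
    Arduino over serial (e.g. via pyserial) before being forwarded.
    """
    with socket.create_connection((host, port)) as s:
        s.sendall(fudi(message))

# e.g. send_to_pd("freq 440") would push a new frequency into the patch,
# assuming the patch routes incoming "freq" messages to the slider.
```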