faces
On the beach the Sunday before I caught coronavirus on my birthday, I picked shells and sunburned so badly I wore aloe instead of shirts for four days. One was purple and I wonder whether it was born that way or if the ocean did that.
Midify is a program I wrote that creates music from visuals. Like a camera, it translates data into a new representation of itself. For this project, I let Midify translate the first draft of what you see in this video into the first draft of what you hear. The rest was guiding their relationship, taking cues from the sound to shape the visuals and vice versa.
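If you're curious what "music from visuals" can mean in practice, here's a minimal sketch of the idea, not Midify's actual code: read an image's pixel brightness and map it onto MIDI notes. The scale, note range, and timing below are my own assumptions for illustration.

```python
# A toy version of the visuals-to-music idea: one row of pixels becomes
# one melody, with brightness choosing the pitch. Not Midify itself.
from PIL import Image
import mido

PENTATONIC = [0, 2, 4, 7, 9]  # scale offsets that keep the output loosely musical

def image_to_midi(image_path, out_path="sketch.mid", steps=32):
    # Downscale to a single row: each remaining pixel becomes one note.
    img = Image.open(image_path).convert("L").resize((steps, 1))
    brightness = list(img.getdata())  # values 0-255, left to right

    mid = mido.MidiFile()
    track = mido.MidiTrack()
    mid.tracks.append(track)

    for value in brightness:
        # Brightness picks a scale degree spread across four octaves.
        degree = value * (len(PENTATONIC) * 4) // 256
        note = 48 + 12 * (degree // len(PENTATONIC)) + PENTATONIC[degree % len(PENTATONIC)]
        track.append(mido.Message("note_on", note=note, velocity=80, time=0))
        track.append(mido.Message("note_off", note=note, velocity=0, time=240))

    mid.save(out_path)

image_to_midi("frame.png")
```

The interesting part isn't the mapping itself but choosing one and then listening to what it does to footage you already care about.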
I rarely use my camera anymore, so I dug through my archive for visuals. One of the shots I picked was three years old, and for the first time I wonder whether my camera’s relationship to the images we capture is stronger than mine.
At first it was easy to make big changes to Midify’s computer-generated draft and my collection of old footage. But as their relationship began to take shape, new edits felt more like disturbances. So I just stopped, which is usually hard for me to do. I’m learning that connecting the dots in relationships feels organic to me and helps when I lose creative motivation. There’s a lot of room for me to grow through this process, so I think I’ll keep exploring it.