From IP1 Magazine
Words: Ellie Mills
Ryan Jordan talks about his recently developed Movement & Gesture Interface, which controls music through his own physical movements.
1. What is your commercial name, & why? It generally depends on what I'm playing or performing, but I mainly go under the name Zero Point Energy when I'm doing noise or gabber stuff. I came across it when I was searching the internet for brainwave frequencies & found certain frequencies where apparently magic windows appear which can be sources of energy. I followed some links & came to Zero Point Energy, which is something to do with physics & extracting energy out of a vacuum; just do a Google search & you'll get a better picture.
2. What exactly is the MGI? MGI is just an abbreviation of Movement & Gesture Interface, & basically it's a MIDI controller that maps bodily movements & gestures via two different types of sensors attached to the body: one to the head & one to the fingertips. It can control anything that uses MIDI, so, for example, you could control parameters in Reason, Cubase & soft synth samplers, etc.
3. How did you come up with the idea? I had a feeling, well more of a frustration really, with composing & performing with a computer. I haven't been trained to play a traditional musical instrument, & I like the idea of physically moving to generate sound because it's a more direct way of connecting to the sound world; your whole body becomes involved. Using a computer, you're kind of limited to the keyboard, & I wanted a greater physical freedom than that.
4. How does it work? There is an accelerometer attached to the head which measures tilting on an x-y axis (up-down/left-right), & light sensors attached to the fingertips which measure the level of light in the immediate environment. These sensors send their data to a computer chip (a Basic Stamp) which has been programmed to send out MIDI data to a computer.
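The signal path described above (sensor reading → microcontroller → MIDI message) can be sketched in outline. This is a hypothetical Python sketch, not the actual Basic Stamp firmware (which would be written in PBASIC); the value ranges & controller numbers are assumptions for illustration:

```python
# Hypothetical sketch of the sensor-to-MIDI mapping: clamp a raw reading,
# scale it into the 7-bit MIDI value range, & wrap it in a standard
# Control Change message, as a controller like the MGI might.

def scale_to_midi(value, lo, hi):
    """Clamp a sensor reading to [lo, hi] & map it linearly onto 0-127."""
    value = max(lo, min(hi, value))
    return round((value - lo) * 127 / (hi - lo))

def control_change(channel, controller, value):
    """Build a 3-byte MIDI Control Change message (status byte 0xB0)."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

# Example: a head tilt of +0.25 g on an assumed +/-1 g accelerometer axis,
# sent as CC 1 (mod wheel) on MIDI channel 1.
cc_value = scale_to_midi(0.25, -1.0, 1.0)    # -> 79
msg = control_change(0, 1, cc_value)
```

A second axis or a fingertip light sensor would just be further calls to the same pair of functions with different controller numbers, which is why the output can drive any MIDI-aware software.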
5. Who have you been influenced by, & how? I think I'm influenced by pretty much anything I come into contact with; it all affects me in some way. Lately it's mainly been my surroundings & friends, reading Jung & alchemy stuff, & doing a performance with KK.Null & Zev. I'm not sure in what ways that's come through in my work; it's probably easier for other people to answer that question.
6. What sort of music do you generally use to perform with the MGI? Currently it's noise & experimental stuff. I'm using samples that I've generated myself, & triggering & manipulating them live with the MGI. I might expand from noise into some other kind of music, but I'm not sure yet as I'm still developing the software & hardware.
7. Could you use any type of music with the MGI, for instance rock or jazz? Yeah, you can use it for anything you want because it works over MIDI; it's up to the user to do what they want with it. It's not designed for any specific musical style; it's just an attempt at creating a freer way of performing when using a computer.
8. Would you class it as a new musical instrument? Yes & no. It's not really a brand new thing; many artists have used body mapping to control sounds. It's definitely not a traditional musical instrument. It isn't really an instrument on its own; it's a controller, because it still needs software. With the software, whether it be Reason, Max/MSP or whatever, it could be termed an instrument.
9. Would it ever be possible to form a band with a different aspect of the music being controlled by each different member? Yes, it's very possible & it's happening everywhere! A traditional rock band has each member controlling a different aspect of the music; so does an orchestra, & a choir too. Just imagine a choir with accelerometers strapped to their heads, all spinning around!
10. Could you ever integrate visuals to match the sound differentiations which you create, to work in unison with the music? Yeah, I'm actually working on a Max patch at the moment that's doing exactly that & it should be ready for my next performance, so I'll be controlling sound & visuals.
11. Would you ever consider making a version of the MGI that integrated movements from the whole body? Yes, definitely. That's the plan for the next year. I'm intending to put bend sensors on my wrists & across my back, motion sensors on either side of a stage & some other sensors that I haven't decided on yet. There are already artists who have used pretty much the whole body, from measuring eye movements, to pulse, brainwaves, & even blood flow.
12. Would you call what you create art or music, or a mix of them both, e.g. audio-visual art? Or, simply, performance? I think it's somewhere in between all of them. It's art-sound-interactive-performance.
13. What are your objectives for the future of the MGI? Further development with more sensors! Also, the software needs to be developed further for greater control in performances. Oh, & more gigs! If anyone wants any more info, feel free to e-mail me at: firstname.lastname@example.org.