So first things first: excuse me for the poor video quality. I couldn't use screen capture software since my Core 2 Duo was too busy rendering balls and piano sounds, so I had to film my computer screen with my cheap Canon camera. I'll post a better video as soon as I find a way.



I've always had mixed feelings regarding sound visualization on computers. While the results usually look extremely cool, effective spectrum analysis and beat detection are hard to program, and the output never fully conveys the feeling of looking at music.


So I decided to try a totally different approach and work with General MIDI instead. While the public perception might be that MIDI sounds terrible (a legacy of the AdLib and Sound Blaster sound card days), sound synthesis has come a long way, and MIDI, on top of being extremely practical, can now sound amazing. Coding sound visualization with MIDI as the input gives me full information about which note is played, at which velocity, when, and how long it is held. Of course, the downside is that it only works with MIDI instruments (keyboards and electronic drums, and that's typically it...).
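
For the curious, receiving those events in Processing looks roughly like this. This is a minimal sketch based on the standard rwmidi callbacks, not the actual project code, and it assumes at least one MIDI input device is connected:

    import rwmidi.*;

    MidiInput input;

    void setup() {
      size(800, 600);
      // Open the first available MIDI input device (assumption:
      // device 0 is the keyboard we want to listen to)
      input = RWMidi.getInputDevices()[0].createInput(this);
    }

    // rwmidi invokes this callback when a key is pressed
    void noteOnReceived(Note note) {
      println("note on: pitch=" + note.getPitch()
          + " velocity=" + note.getVelocity()
          + " t=" + millis() + "ms");
    }

    // ...and this one when the key is released, which is how
    // you know how long the note was held
    void noteOffReceived(Note note) {
      println("note off: pitch=" + note.getPitch() + " t=" + millis() + "ms");
    }

    void draw() {
      background(0);
    }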


So here's how I approached the design, and how I represented the different dimensions:

  • Pitch: Notes are displayed left to right, as on a piano. Black keys appear slightly higher.
  • Velocity: Size; harder strokes are drawn bigger than lighter ones.
  • Time: The Y axis basically represents time. When a note is hit, a ball appears at the top of the screen and stays there until the key is released; the ball then drops, following a gravity-like force, and starts shrinking until it disappears. For trippiness purposes, the balls split into 3 when they hit the floor and bounce around.
  • Harmony: Colors show where the note sits on the scale. Consonant intervals are illustrated with similar colors, while dissonant intervals get opposing colors; basically, the circle of fifths is mapped onto the hue in HSB color space (see the sketches after this list).
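
To make those mappings concrete, here's a hedged sketch of the pitch, velocity, and harmony dimensions as Processing helper functions. The names and numeric ranges are mine, not taken from the actual source:

    // Pitch: notes laid out left to right, like a piano keyboard
    float noteX(int pitch) {
      // MIDI pitches 21..108 cover the 88 keys of a piano
      return map(pitch, 21, 108, 0, width);
    }

    // Velocity: harder strokes (up to 127) draw bigger balls
    float noteSize(int velocity) {
      return map(velocity, 0, 127, 10, 60);
    }

    // Harmony: walk the circle of fifths around the hue wheel.
    // Multiplying the pitch class by 7 (a fifth is 7 semitones, mod 12)
    // places fifth-related notes next to each other, so consonant
    // intervals get similar hues and dissonant ones end up far apart.
    float noteHue(int pitch) {
      int fifthsIndex = (pitch * 7) % 12;  // 0..11 around the circle of fifths
      return fifthsIndex * 30;             // 12 hue steps of 30 degrees each
    }

With colorMode(HSB, 360, 100, 100) in setup(), C and G land 30 degrees apart on the wheel while C and F# sit on opposite sides, which is exactly the consonant-similar / dissonant-opposed effect described above.

And for the time axis, the per-ball logic could look something like this (a simplified guess at the mechanic; the Ball class is hypothetical and the three-way split is left out):

    // A held note sits at the top of the screen; on release
    // it falls under a gravity-like force and shrinks away.
    class Ball {
      float x, y, vy, size;
      boolean held = true;

      Ball(float x, float size) {
        this.x = x;
        this.y = 20;  // appears at the top while the key is down
        this.size = size;
      }

      void release() {
        held = false;  // key released: gravity takes over
      }

      void update() {
        if (!held) {
          vy += 0.4;         // gravity-like acceleration
          y += vy;
          size *= 0.99;      // shrink until the ball disappears
          if (y > height) {  // bounce off the floor
            y = height;
            vy *= -0.8;
          }
        }
      }

      void display() {
        ellipse(x, y, size, size);
      }
    }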





On the technology side, I'm using processing.org (Java), a MIDI library, and some OpenGL. A sequencer generates the sounds and redirects the MIDI to the Processing application. Everything is rendered in real time.
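
In that setup, the real-time rendering boils down to Processing's draw() loop updating every live ball each frame, something like this (again assuming the hypothetical Ball class sketched above):

    ArrayList<Ball> balls = new ArrayList<Ball>();

    void draw() {
      background(0);
      // noteOnReceived would add a Ball here;
      // noteOffReceived would call release() on it
      for (Ball b : balls) {
        b.update();
        b.display();
      }
    }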

Thanks to these indirect contributors:
* Frédéric Chopin
* Ruin & Wesen, who wrote the MIDI library (rwmidi) for processing.org that I'm using
* All Processing & Java peeps! w00t.



Oh, and I am looking for a performer who would be interested in teaming up to expand on the idea and do live performances using related technologies. Think a concert hall, a great piano player, and a giant projector screen. I have different MIDI visualizations in the pipeline and other ideas. Let's innovate and bring digital art and virtuoso piano performance together.

Edit 11/23/2010: Source code!
https://github.com/mistercrunch/MidiVisualization

Edit 3/13/2011:

Robert Turenne has shown interest in this project and provided this screen capture. Thanks!

7 comments:

  1. awesome!! makes me want to leave work and go play!

    RD

  2. hey, I'm new at this programming language, I only understand the very basics. How do you import the MIDI library? How can I generate random music in Processing instead of using an input as you did?
    Thanks.

    CERO

  3. Hi there :) nice work, can you show the Processing code? I would be very grateful... please :)

    Anonymous

  4. Lovely work done here! I stumbled upon your video on Vimeo. Interestingly, I just did a particle animation in HTML5 to the music of Arabesque too @ http://jabtunes.com/labs/arabesque =)

    SHzZ

  5. couldn't find libraries for rwmidi or com.sun.opengl.util.texture.

    Could you please help me?

    virginiarigon.com

  6. Beautiful! I'll try and give this a good test run shortly, using some classic video game MIDIs. :)

    Ugotsta

  7. This is amazing.

    I have built a detailed MIDI generator that makes its own music using random algorithms within user-defined parameters.

    This was a while ago now, but are you still eager to collaborate on projects?

    Ideas like this will give the live show a wow factor. Also, since the audience can control aspects of the live show with their smartphones, they would also be able to control the visuals de facto.

    Get in touch. Riolaurenti@googlemail.com
    I'll send a video of what we're doing :)

    Unknown
