So first things first: excuse the poor video quality. I couldn't use screen capture software since my Core 2 Duo was too busy rendering balls and piano sounds, so I had to film my computer screen with my cheap Canon camera. I'll post a better video as soon as I find a way.



I've always had mixed feelings about sound visualization on computers. While the visuals usually look extremely cool, effective spectrum analysis and beat detection are hard to program, and the results never fully convey the feeling of looking at music.


So I decided to try a totally different approach and work with General MIDI instead. While the public perception may be that MIDI sounds terrible (a legacy of the AdLib & Sound Blaster sound card days), sound synthesis has come a long way, and MIDI, on top of being extremely practical, can now sound amazing. Using MIDI as the input for sound visualization gives me full information about which note is played, at what velocity, when, and how long it is held. Of course, the downside is that it only works with MIDI instruments (keyboards, electronic drums, and that's typically it...).
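To make "full information" concrete: a note-on message carries the pitch and velocity, and pairing it with the matching note-off tells you how long the key was held. Here's a bare-bones illustration of that using plain javax.sound.midi as a stand-in (the project itself uses the Ruin & Wesen Processing library; the class name and wiring here are my own):

```java
import javax.sound.midi.*;

// Stand-in receiver showing what a raw MIDI stream gives you per note:
// which key was hit, how hard, and (by pairing on/off) how long it was held.
class NotePrinter implements Receiver {
  long[] onTime = new long[128];  // note-on timestamps, indexed by pitch

  public void send(MidiMessage msg, long timeStamp) {
    if (!(msg instanceof ShortMessage)) return;
    ShortMessage sm = (ShortMessage) msg;
    int pitch = sm.getData1(), velocity = sm.getData2();
    if (sm.getCommand() == ShortMessage.NOTE_ON && velocity > 0) {
      onTime[pitch] = System.currentTimeMillis();
      System.out.println("note " + pitch + " on, velocity " + velocity);
    } else if (sm.getCommand() == ShortMessage.NOTE_OFF
               || (sm.getCommand() == ShortMessage.NOTE_ON && velocity == 0)) {
      System.out.println("note " + pitch + " held for "
          + (System.currentTimeMillis() - onTime[pitch]) + " ms");
    }
  }

  public void close() {}
}
```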


So here's how I approached the design, and how I represented the different dimensions:

  • Pitch: Notes are displayed left to right, as on a piano. Black keys appear slightly higher.
  • Velocity: Size; harder strokes are shown bigger than lighter ones.
  • Time: The Y axis represents time. When a note is hit, a ball appears at the top of the screen and stays there until the key is released; the ball then drops, pulled by a gravity-like force, and shrinks until it disappears. For trippiness purposes, the balls split into 3 when they hit the floor and bounce around.
  • Harmony: Colors show where the note sits on the scale. Consonant intervals get similar colors, while dissonant intervals get opposing colors: basically the circle of fifths mapped onto the hue of the HSB color space (a simplified sketch follows below).
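Here's a simplified Processing sketch of the note life cycle and the color mapping described above. It's my own reconstruction, not the original code: the constants and the bounce behavior are guesses, and the split-into-three effect is left out.

```java
// One ball per note: holds at the top while the key is down, then falls
// under a gravity-like force, bounces, and shrinks until it disappears.
class NoteBall {
  float x, y, vy, size;
  float hue;
  boolean held = true;

  NoteBall(int pitch, int velocity) {
    x = map(pitch, 21, 108, 0, width);    // 88-key piano range, left to right
    y = 20;                               // appears at the top of the screen
    size = map(velocity, 0, 127, 5, 60);  // harder strokes are bigger
    // Walk the 12 pitch classes in fifths, then spread them over the hue
    // wheel: C and G end up adjacent, C and F# end up opposite.
    hue = ((pitch % 12) * 7 % 12) * 30;
  }

  void update() {
    if (held) return;                     // stays put until the key is released
    vy += 0.4;                            // gravity-like acceleration
    y += vy;
    size *= 0.97;                         // shrink until it disappears
    if (y > height) { y = height; vy *= -0.6; }  // bounce off the floor
  }

  void display() {
    colorMode(HSB, 360, 100, 100);
    fill(hue, 80, 100);
    noStroke();
    ellipse(x, y, size, size);
  }
}
```

The `(pitch % 12) * 7 % 12` step is what maps the circle of fifths onto hue: multiplying by 7 (a fifth is 7 semitones) reorders the twelve pitch classes so that consonant intervals land on neighboring hues.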





About the technology: I'm using processing.org (Java), a MIDI library, and some OpenGL. A sequencer generates the sounds and redirects the MIDI stream to the Processing application. Everything is rendered in real time.

Thanks to these indirect contributors:
* Frédéric Chopin
* Ruin & Wesen, who wrote the MIDI library for processing.org that I'm using
* All processing & java peeps! w00t.



Oh, and I'm looking for a performer who would be interested in teaming up to expand on the idea and do live performances using related technologies. Think a concert hall, a great piano player, and a giant projector screen. I have different MIDI visualizations in the pipeline, along with other ideas. Let's innovate and bring digital art and virtuoso piano performance together.

Edit 11/23/2010: Source code!
https://github.com/mistercrunch/MidiVisualization

Edit 3/13/2011:

Robert Turenne has shown interest in this project and provided this screen capture. Thanks!


3D Vertices fed from webcam from mistercrunch on Vimeo.


So here's my latest little hack: I'm using the webcam feed in real time to draw 3D lines that form a grid of *vertices*, using processing.org (Java) and OpenGL. The Z axis is based on brightness, and a simple color mapping on top adds an interesting effect.
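A rough sketch of the idea in Processing (again my own reconstruction, not the original code: the grid size, renderer, and color mapping are assumptions):

```java
import processing.video.*;

Capture cam;
int step = 10;  // grid resolution: size of the squares, in pixels

void setup() {
  size(640, 480, P3D);
  colorMode(HSB, 255);
  cam = new Capture(this, 640, 480);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  background(0);
  cam.loadPixels();
  // Mouse position rotates the grid around the Y axis.
  translate(width / 2, height / 2);
  rotateY(map(mouseX, 0, width, -PI / 2, PI / 2));
  translate(-width / 2, -height / 2);
  noFill();
  for (int y = 0; y < cam.height - step; y += step) {
    beginShape(QUAD_STRIP);
    for (int x = 0; x < cam.width; x += step) {
      // Z comes from pixel brightness; reusing the same value as hue
      // gives the simple color mapping.
      float z1 = brightness(cam.pixels[y * cam.width + x]);
      float z2 = brightness(cam.pixels[(y + step) * cam.width + x]);
      stroke(z1, 200, 255);
      vertex(x, y, z1);
      vertex(x, y + step, z2);
    }
    endShape();
  }
}
```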

Moving the mouse cursor rotates the grid, and buttons let you change the resolution (the size of the squares). Another button engages a steady rotation along the Y axis.
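Those handlers could look something like this, extending the sketch above (the key bindings here are placeholders, not the ones in the video):

```java
boolean autoSpin = false;
float spinAngle = 0;

void keyPressed() {
  if (key == '+') step = max(2, step - 2);  // finer grid (smaller squares)
  if (key == '-') step += 2;                // coarser grid (bigger squares)
  if (key == 'r') autoSpin = !autoSpin;     // toggle steady Y-axis rotation
}
```

With `autoSpin` on, `draw()` would increment `spinAngle` each frame and use `rotateY(spinAngle)` in place of the mouse-based angle.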