Here's an unfinished track I worked on back in March 2010. I was exploring wobbling bass sounds.

The sound sample is from some guy back in the '50s who recorded his impressions while attending an atomic test somewhere in the Nevada desert.

AtomicTest.mp3


LED Button Pin from mistercrunch on Vimeo.

I designed and built these button pins as an art contribution to the BaconWood festival. BaconWood is a bacon and music festival that takes place in the Mendocino woodlands. 100 of these were distributed to the crowd, making this giant bacon party more colorful and blinky.

Specs:
* 6 SMD RGB LEDs
* 1 button (cycles through different patterns)
* 1 microcontroller (ATtiny48) (32 IO pins!)
* 1 coin cell battery holder and battery
* 1 six-pin flat-flex connector for programming
* 1 pin

The MCU controls each LED individually with 4-bit PWM. I built a pattern framework that allows for different types of patterns (blinking, twirling, random, ...). The framework can do various things like palette animation, color ranges, different speeds and so on.
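To give an idea of the structure, here's a minimal sketch of what such a pattern framework can look like. This is a conceptual illustration in Processing/Java, not the actual ATtiny firmware, and names like `Pattern` and `colorFor` are made up:

```java
// Conceptual sketch of a pattern framework (illustrative only, not the real firmware).
// Each pattern maps (LED index, time) to a 4-bit RGB value, mimicking the 4-bit PWM depth.
interface Pattern {
  int[] colorFor(int led, int t);  // returns {r, g, b}, each 0..15
}

class Blink implements Pattern {
  public int[] colorFor(int led, int t) {
    int on = (t / 30) % 2;                     // all LEDs toggle together
    return new int[] { 15 * on, 0, 15 * on };
  }
}

class Twirl implements Pattern {
  public int[] colorFor(int led, int t) {
    int active = (t / 10) % 6;                 // one LED lit at a time, rotating
    return led == active ? new int[] {15, 8, 0} : new int[] {0, 0, 0};
  }
}

Pattern[] patterns = { new Blink(), new Twirl() };
int current = 0;
int t = 0;

void setup() { size(400, 100); }

void draw() {
  background(0);
  for (int led = 0; led < 6; led++) {
    int[] c = patterns[current].colorFor(led, t);
    fill(c[0] * 17, c[1] * 17, c[2] * 17);     // scale 0..15 up to 0..255
    ellipse(40 + led * 60, 50, 40, 40);
  }
  t++;
}

void mousePressed() {                           // stands in for the physical button
  current = (current + 1) % patterns.length;
}
```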

The idea was to make a cool little pin in itself, but also to design a cheap modular node that can be networked with others for use in other projects. In other words, these pins can be meshed into a network. Each one can receive, emit or retransmit patterns.

The board is made so that there's a 4-pin connector on either side. The connector is (VCC, GND, IO, IO), where each IO is a bidirectional UART line. I wrote a software UART using a simple protocol and interrupts on these IO pins. A message is typically a lighting pattern, a validation check, and timing information about when to pass the pattern along to the next neighbor.
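The exact frame format isn't spelled out here, but a message like that could be framed roughly as follows. This is a hypothetical layout with made-up field names, sketched in Java/Processing for illustration; the actual protocol lives in the ATtiny firmware:

```java
// Hypothetical frame layout for passing a pattern between neighboring pins:
// [0xAA start byte][pattern id][speed][hop delay][checksum]
// The checksum lets a pin validate the frame before acting on it and relaying it.
int START = 0xAA;

int[] encodeFrame(int patternId, int speed, int hopDelayTicks) {
  int checksum = (patternId + speed + hopDelayTicks) & 0xFF;
  return new int[] { START, patternId, speed, hopDelayTicks, checksum };
}

// Returns {patternId, speed, hopDelayTicks}, or null if the frame is invalid.
int[] decodeFrame(int[] frame) {
  if (frame.length != 5 || frame[0] != START) return null;
  if (((frame[1] + frame[2] + frame[3]) & 0xFF) != frame[4]) return null;
  return new int[] { frame[1], frame[2], frame[3] };
}

void setup() {
  int[] frame = encodeFrame(3, 120, 25);  // pattern #3, speed 120, relay after 25 ticks
  int[] msg = decodeFrame(frame);
  println("pattern=" + msg[0] + " speed=" + msg[1] + " hopDelay=" + msg[2]);
}
```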

I designed the PCB using Eagle, the boards were printed by AP Circuits, the parts were ordered from DigiKey, the LEDs came from somewhere in China, and the whole thing was hand-soldered by yours truly (yes, it did take a million hours).

These pins can be assembled into strings or meshes. I have 100 left. Stay tuned to see what I'm going to do with them.


PongCyl3D: A 3D Pong game in a cylinder from mistercrunch on Vimeo.

I programmed this 3D version of the classic Pong game, played inside a cylinder, and wanted to open the source code under the GPL license. It is written in Java using the Processing.org framework/IDE.

Note that:
* you can put spin on the ball based on how the paddle is moving when it hits the ball
* a circle moves along the Z axis to help with depth perception
* a menu is accessible to tune some parameters

It was made as an experiment, and purely for fun. The gameplay is currently limited since your NPC opponent will never miss the ball (its paddle's XY coordinates are tied to the ball's). I shelved the project, but the plan was to eventually control the paddle with hand gestures using simple computer vision.
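That unbeatable opponent boils down to a couple of lines: the NPC paddle copies the ball's position every frame. Here's an illustrative Processing sketch (my own names, not the actual PongCyl3D code), with a comment showing how a beatable NPC would differ:

```java
// Why the NPC never misses: its paddle simply copies the ball's XY every frame.
float ballX, ballY;
float npcX, npcY;

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  // Fake ball motion just for this demo.
  ballX = width / 2 + 150 * cos(frameCount * 0.03);
  ballY = height / 2 + 150 * sin(frameCount * 0.05);

  npcX = ballX;   // perfect tracking: the paddle is tied to the ball
  npcY = ballY;
  // A beatable NPC would instead chase the ball at a limited speed, e.g.:
  // npcX += constrain(ballX - npcX, -maxSpeed, maxSpeed);

  fill(255, 0, 0);
  ellipse(ballX, ballY, 12, 12);          // ball
  noFill();
  stroke(255);
  rect(npcX - 30, npcY - 8, 60, 16);      // NPC paddle, always right under the ball
}
```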

I was envisioning a projection screen in the middle of a room with a player on either side. Each side has a projector, a webcam that reads the player's motion and, you guessed it, a player. The ball accelerates and the amount of spin on it increases as the exchange goes on, until someone misses. First to 11 points wins.

This was meant to be a prototype, so the code might not be perfectly clean or commented.

Code on GitHub:
https://github.com/mistercrunch/PongCyl-3D

Yes. I'm taking the leap: I'm open sourcing most of my projects.

Most of what I draw inspiration from, learn from and run on a daily basis is actually open. So it's time to give back. Plus, you get awesome free hosting for your open source projects.

I will go back through my older posts and add links to GitHub, the social coding site where I will host my projects.

Send me a request by email if there's a specific project you want me to share the code for, and I'll take the time to create the GitHub page and update the blog post with the link.

Now open:
PongCyl3D: A 3D Pong game in a cylinder
Open Source Interactive 3D Harmonograph
More to come!



If someone were to tell me "I know what you did last summer", it would mean they know about my involvement in the PixMob project, a crowd display technology developed by ESKI and used by Cirque du Soleil in a show they put together for the launch of Microsoft's Kinect.

I played an important creative and executive role on the project: from prototyping to design to soldering to hiring and managing the troops. Basically, doing whatever it took to make it happen.

From ESKI's site:

In Los Angeles, on June 13 2010, with a mise-en-scène by Cirque du Soleil, Microsoft launched Kinect, a revolutionary system for the Xbox 360 console where the human body is the controller. In order to find new ways to make the crowd participate in the event, Cirque du Soleil called upon ESKI’s PixMob technology.
With barely three months ahead of them, ESKI’s designers and engineers produced several thousands of PixMob LED pixels as well as the infrared spotlights to communicate with them. They also designed the visual effects, created the ‘ponchos’ that the Cirque du Soleil had imagined and embedded the pixels in more than 3000 ponchos.
PixMob’s technology stunned the members of the crowd as they each turned into a pixel of a giant screen glowing in a myriad of colors. Along with Cirque du Soleil’s dazzling performances, lots of ink has been spilled over the ‘magic ponchos’ born and raised in our studios!

http://eskistudio.com/en/projects/microsoft_kinect/

I can't wait to see more of those PixMobs!


Inspired by a book titled Harmonograph: A Visual Guide to the Mathematics of Music, I decided to program an interactive 3D harmonograph using Processing.org.

Here's what it looks like:


But just what is a Harmonograph? (from Wikipedia):

"A harmonograph is a mechanical apparatus that employs pendulums to create a geometric image. The drawings created typically are Lissajous curves, or related drawings of greater complexity. (...) A simple, so-called 'lateral' harmonograph uses two pendulums to control the movement of a pen relative to a drawing surface. One pendulum moves the pen back and forth along one axis and the other pendulum moves the drawing surface back and forth along a perpendicular axis. By varying the frequency of the pendulums relative to one another (and phase) different patterns are created. Even a simple harmonograph as described can create ellipses, spirals, figure eights and other Lissajous figures."




I decided to share the source code under the GPL license. The project is still a work in progress, so the code is shared as is. It should be easy enough for anyone interested in interacting with this software to download the source code and run it in the Processing IDE on any platform.

Using this software, many variables can be tweaked to change the harmony of the graph. The interface is simple and straightforward, and the menu and variable names should be self-explanatory. Here are the basics of how to interact with the program:

Mouse: changes the point of view around the drawing
Numbers (1-9), quick press: recalls a preset (a saved set of variable values)
Numbers (1-9), held: saves/overwrites the variable preset for that number
Space bar: shows/hides the 3D menu; the menu is best viewed when the camera is centered, so bring the mouse to the middle of the screen
Up/Down: moves from one variable to the next/previous
Left/Right: increments/decrements the current variable
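The quick-press vs. held behavior on the number keys comes down to timing how long the key stays down. Here's a sketch of that logic in Processing; the threshold and function names are made up, not the actual Harmonograph-3D code:

```java
// Distinguish a quick tap (recall a preset) from a long hold (save a preset)
// by measuring how long the key is held down.
int HOLD_THRESHOLD_MS = 600;   // illustrative threshold
int pressedAt;
char pressedKey;

void setup() { size(200, 200); }
void draw() { }

void keyPressed() {
  if (key >= '1' && key <= '9' && key != pressedKey) {  // ignore keyboard auto-repeat
    pressedKey = key;
    pressedAt = millis();
  }
}

void keyReleased() {
  if (key == pressedKey && key >= '1' && key <= '9') {
    int slot = key - '1';
    if (millis() - pressedAt >= HOLD_THRESHOLD_MS) {
      savePreset(slot);      // held: overwrite the preset for this slot
    } else {
      recallPreset(slot);    // quick tap: load the preset
    }
    pressedKey = 0;
  }
}

void savePreset(int slot)   { println("saving preset " + slot); }    // placeholder
void recallPreset(int slot) { println("recalling preset " + slot); } // placeholder
```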

Source on GitHub:
https://github.com/mistercrunch/Harmonograph-3D

Thanks to Reid Spice, the second edition of the BaconWood Festival is coming up soon, and I've started wondering what I'm going to showcase this year.

Thinking about that reminded me of what I did last year. I set up a video installation I now refer to as "the psychedelic mirror": basically a camera/projector setup, with the image processed in real time through trippy algorithmic filters I wrote in Processing. I set it up on the dance floor and people ended up dancing with an ever-changing psychedelic image of themselves.
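To give a rough idea of the kind of real-time filtering involved, here's a minimal Processing sketch that reads the webcam and applies a toy hue-cycling, posterizing filter. It assumes the processing.video library and is not one of the actual filters used at the festival:

```java
// Toy "psychedelic" filter: read the webcam, posterize the brightness,
// and map it to a cycling hue. Just the idea, not the festival filters.
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  colorMode(HSB, 255);
  cam = new Capture(this, 640, 480);
  cam.start();   // in older Processing 1.x versions the capture starts automatically
}

void draw() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  loadPixels();
  for (int i = 0; i < pixels.length && i < cam.pixels.length; i++) {
    float b = brightness(cam.pixels[i]);
    float level = int(b / 64) * 64;               // posterize brightness into 4 bands
    float hue = (level + frameCount * 2) % 255;   // cycle the hue over time
    pixels[i] = color(hue, 255, b);
  }
  updatePixels();
}
```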

Luckily, my friend Jim Wiggins had taken a couple of short videos that I dug up and put together recently. See what it looked like here:


Psychedelic Mirror - Real time video processing.org from mistercrunch on Vimeo.

The webcam was slightly modified to work in the near-infrared spectrum, and I had an infrared spotlight. A sheet hung in front of a window served as the projection screen. I had nothing to hold the projector in a safe spot, so I had to strap it to a garbage can sitting on a table stacked on top of two other tables (so it would clear the dancers' heads). Kinda ghetto, but it turned out pretty well!

The software was designed with two modes: one that cycles automatically through the different patterns, and another where events and patterns are triggered from a wireless keyboard.

I'm thinking about using this idea as a prototype for something much better. Stay tuned!

So, first things first: excuse the poor video quality. I couldn't use screen capture software since my Core 2 Duo was too busy rendering balls and piano sounds, so I had to film my computer screen with my cheap Canon camera. I'll post a better video as soon as I find a way.



I've always had mixed feelings about sound visualization on computers. While the results usually look extremely cool, effective spectrum analysis and beat detection are hard to program, and the results never quite give you the feeling of actually seeing the music.


So I decided to try a totally different approach and work with General MIDI instead. While the public perception may be that MIDI sounds terrible (a legacy of the AdLib and Sound Blaster sound card days), sound synthesis has come a long way, and MIDI, on top of being extremely practical, can now sound amazing. Coding sound visualization with MIDI as the input gives me full information about which note is played, at which velocity, when, and how long it is held. Of course, the downside is that it only works with MIDI instruments (keyboards, electronic drums, and that's typically it...).


So here's how I approached the design and how I represented the different dimensions (a rough code sketch follows the list):

• Pitch: notes are laid out left to right, as on a piano. Black keys appear slightly higher.
• Velocity: size; harder strokes are drawn bigger than lighter strokes.
• Time: the Y axis represents time. When a note is hit, a ball appears at the top of the screen and stays there until the key is released; the ball then drops under a gravity-like force and starts shrinking until it disappears. For trippiness purposes, the balls split into three when they hit the floor and bounce around.
• Harmony: colors show where the note sits on the scale. Consonant intervals are shown with similar colors, dissonant intervals with opposing colors; basically the circle of fifths mapped onto the hue of an HSB color.
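The pitch-to-X and circle-of-fifths-to-hue mappings boil down to a few lines. Here's a simplified Processing sketch of just those two mappings (my own stripped-down version; the real project takes live MIDI input and adds velocity, the gravity drop and the ball splitting):

```java
// Simplified note-to-visual mapping (illustrative only).
// A note here is just a MIDI pitch number; real input comes from a MIDI library.
void setup() {
  size(880, 200);
  colorMode(HSB, 255);
  background(0);
  noStroke();
  // Draw one ball per MIDI note over two octaves.
  for (int pitch = 48; pitch < 72; pitch++) {
    float x = map(pitch, 48, 72, 40, width - 40);      // pitch -> X, like a keyboard
    float y = isBlackKey(pitch) ? 70 : 100;            // black keys sit slightly higher
    float hue = ((pitch % 12) * 7 % 12) / 12.0 * 255;  // circle of fifths -> hue:
                                                       // notes a fifth apart get neighboring hues
    fill(hue, 255, 255);
    ellipse(x, y, 30, 30);
  }
}

boolean isBlackKey(int pitch) {
  int p = pitch % 12;
  return p == 1 || p == 3 || p == 6 || p == 8 || p == 10;
}
```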





On the technology side, I'm using Processing.org (Java), a MIDI library and some OpenGL. A sequencer generates the sounds and routes the MIDI to the Processing application. Everything is rendered in real time.

Thanks to these indirect contributors:
* Frédéric Chopin
* Ruin & Wesen, who wrote the MIDI library for Processing.org that I'm using
* All the Processing & Java peeps! w00t.



Oh, and I'm looking for a performer who would be interested in teaming up to expand on the idea and do live performances using related technologies. Think a concert hall, a great piano player, and a giant projection screen. I have more MIDI visualizations in the pipeline, and other ideas. Let's innovate and bring digital art and virtuoso piano concertos together.

Edit 11/23/2010: Source code!
https://github.com/mistercrunch/MidiVisualization

Edit 3/13/2011:

Robert Turenne has shown interest in this project and provided this screen capture. Thanks!


3D Vertices fed from webcam from mistercrunch on Vimeo.


So here's my latest little hack: I'm using the webcam feed in real time to draw 3D lines that form a grid of *vertices*. I'm using Processing.org (Java) and OpenGL. The Z axis is based on brightness, and a simple color mapping adds an interesting effect.

Moving the mouse cursor rotates the grid, and buttons let you change the resolution (the size of the squares). Another button engages a steady rotation around the Y axis.
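Here's a stripped-down Processing sketch of the core idea, assuming the processing.video library; the real sketch adds the color mapping, the resolution buttons and the auto-rotation mode:

```java
// Minimal version of the idea: sample the webcam on a grid and draw 3D line strips
// where each vertex's Z comes from the pixel brightness at that grid point.
import processing.video.*;

Capture cam;
int step = 16;   // grid resolution in pixels

void setup() {
  size(640, 480, P3D);   // the original used OpenGL; P3D works the same way here
  cam = new Capture(this, 640, 480);
  cam.start();
  stroke(255);
  noFill();
}

void draw() {
  if (cam.available()) cam.read();
  background(0);
  translate(width / 2, height / 2);
  rotateY(map(mouseX, 0, width, -PI, PI));    // mouse rotates the grid
  rotateX(map(mouseY, 0, height, -PI, PI));
  cam.loadPixels();
  for (int y = 0; y < cam.height; y += step) {
    beginShape();
    for (int x = 0; x < cam.width; x += step) {
      float z = map(brightness(cam.pixels[y * cam.width + x]), 0, 255, -100, 100);
      vertex(x - cam.width / 2, y - cam.height / 2, z);
    }
    endShape();
  }
}
```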