The Harmony Table at the Mobius Science Center in Spokane, Washington, is the centerpiece of the Sound Cube, a room in the museum that inspires visitors to explore music theory and sound through interactive exhibits. The multi-touch table allows multiple visitors to interact in a collaborative learning experience.
Each visitor is provided a movable felt puck. Placing a puck on the table creates a circular piano-like instrument that rotates around the puck. Each ring of the instrument can be played similarly to a piano key. When a key is struck, a corresponding note is played and a particle, representing that note, is displayed in the activated portion of the table. As subsequent notes are played, the particles form complex chains around the table.
Over time, regions of the table corresponding to the Circle of Fifths are activated, and all of the notes triggered by the instruments shift into a new key. When no visitors are present, the table enters an auto-play mode wherein each instrument takes turns playing familiar melodies.
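The key-shift behavior can be pictured as a walk around the Circle of Fifths, where each step raises the tonic by a perfect fifth (seven semitones, wrapping at the octave). The sketch below is illustrative Python, not the exhibit's actual Java code, and the function names are hypothetical:

```python
# Illustrative sketch of a Circle of Fifths key shift (hypothetical names,
# not the Harmony Table's actual code).

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def next_key(tonic: int) -> int:
    """Advance the tonic up a perfect fifth: 7 semitones, modulo the octave."""
    return (tonic + 7) % 12

def circle_of_fifths(start: int = 0):
    """Yield the 12 keys visited before the walk returns to its start."""
    key = start
    for _ in range(12):
        yield NOTES[key]
        key = next_key(key)

print(list(circle_of_fifths()))
# prints ['C', 'G', 'D', 'A', 'E', 'B', 'F#', 'C#', 'G#', 'D#', 'A#', 'F']
```

Stepping by fifths visits all twelve keys exactly once, which is why activating successive regions of the table can carry every instrument through the full circle.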
I worked as the programming lead in collaboration with Upswell and Arxi Creative. I wrote the front-end in Java with the Processing graphics framework, along with the application framework, multi-touch logic, and MIDI server integration.
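When a key is struck, the front-end ultimately has to tell a MIDI device to sound a note. As a point of reference for how that works at the wire level, here is a minimal Python sketch of packing standard MIDI Note On and Note Off messages; it illustrates the MIDI message format only, not the Harmony Table's actual Java integration, and the function names are hypothetical:

```python
# Sketch of standard MIDI channel voice messages as raw bytes (hypothetical
# helpers; the exhibit's actual MIDI server integration is not shown here).

def note_on(channel: int, note: int, velocity: int) -> bytes:
    """Note On: status byte 0x90 | channel, then note and velocity (0-127)."""
    assert 0 <= channel < 16 and 0 <= note < 128 and 0 <= velocity < 128
    return bytes([0x90 | channel, note, velocity])

def note_off(channel: int, note: int) -> bytes:
    """Note Off: status byte 0x80 | channel; velocity 0 is conventional."""
    assert 0 <= channel < 16 and 0 <= note < 128
    return bytes([0x80 | channel, note, 0])

# Middle C (note 60) on channel 0 at moderate velocity:
msg = note_on(0, 60, 100)  # three bytes: 0x90, 60, 100
```

Each struck key maps to one such three-byte message, which is what makes a thin network link between a graphics front-end and a MIDI server practical.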
This prototype for an exhibit at the Oregon Museum of Science & Industry, in Portland, Oregon, uses the Microsoft Kinect sensor to explore the possibilities of using technology to augment the human body.
I worked as programming lead with OMSI's research and development team and independent producer Michael Hoffman. The software was developed in C++, using the Microsoft Kinect SDK for user detection and the Cinder graphics framework.
The Infinity of Nations exhibit, in New York City, consists of ten artifacts of Native American heritage. Each object is accompanied by a touch-screen application that provides rich content exploring the object's origin.
Visitors pan horizontally through clusters of content containing high-resolution pan-and-zoom images and videos featuring historians who explain the origins of each object.
I worked as the programming lead on this project with Kieran Lynn and Potion Design. The applications were written in ActionScript 3.0 and run in the Adobe AIR runtime. Content is managed by the museum staff through a Django-powered admin tool.
Each of the ten applications in the Infinity of Nations exhibit features unique images, text and video. The layouts were complex, and we needed a way for the designers to set them without editing configuration files or writing code. The video above demonstrates a layout tool that I wrote alongside the main application to do just that.
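A tool like this typically saves each screen's layout as structured data that the application loads at startup. The snippet below is a hypothetical example of such a layout description, sketched in Python with JSON serialization; the actual tool's format and field names are not shown here:

```python
import json

# Hypothetical layout description of the kind a visual layout tool might save
# for an application to read at startup (not the actual tool's format).

layout = {
    "object": "example-artifact",  # hypothetical object identifier
    "elements": [
        {"type": "image", "x": 120, "y": 80, "scale": 1.5},
        {"type": "text", "x": 400, "y": 200, "body": "Origin notes"},
        {"type": "video", "x": 60, "y": 360, "autoplay": False},
    ],
}

serialized = json.dumps(layout, indent=2)  # what the tool would write to disk
restored = json.loads(serialized)          # what the application would load
assert restored == layout                  # the round trip is lossless
```

Keeping layout as data rather than code is what lets designers position elements visually without touching the application itself.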
The Kronos Research Tool is data acquisition software built for the University of Pennsylvania Museum of Archaeology and Anthropology. Named after the father of Zeus, the tool was built specifically for an archaeological expedition to Mt. Lykaion in Greece, the god's alleged birthplace.
Researchers use software running on Android-powered devices to collect data about finds in the field. The application features detailed menus and forms that allow for the collection of intricate data points. The researchers can also capture photos of finds and quickly document them with descriptive text.
At the end of each day, the devices are connected to an onsite server and, via Python, the collected data is transferred from the SQLite database on each device to the server. From there, the data can be accessed through a Django-based admin tool.
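The shape of that nightly transfer can be sketched as a one-way sync pass: read the rows that have not yet been uploaded, hand them to an uploader, and only then mark them as synced. The sketch below uses Python's standard sqlite3 module, but the schema, column names, and function names are hypothetical, not the Kronos project's actual code:

```python
import sqlite3

# Sketch of a one-way device-to-server sync pass (hypothetical schema and
# names; not the Kronos project's actual synchronization code).

def fetch_unsynced(conn: sqlite3.Connection):
    """Return the finds that have not yet been transferred to the server."""
    cur = conn.execute(
        "SELECT id, description, photo_path FROM finds WHERE synced = 0"
    )
    return cur.fetchall()

def mark_synced(conn: sqlite3.Connection, ids):
    """Flag the given rows so they are skipped on the next pass."""
    conn.executemany("UPDATE finds SET synced = 1 WHERE id = ?",
                     [(i,) for i in ids])
    conn.commit()

def sync(conn: sqlite3.Connection, upload) -> int:
    """Upload each unsynced row, then mark the batch synced; return the count."""
    rows = fetch_unsynced(conn)
    for row in rows:
        upload(row)  # e.g. a POST to the onsite server
    mark_synced(conn, [r[0] for r in rows])
    return len(rows)
```

Marking rows synced only after the upload step keeps the pass safe to re-run: anything that failed to transfer is simply picked up again the next day.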
I worked as the programming lead on the Android powered data collection app and the Python synchronization tools.
In The Studio is a large-scale interactive installation on the second floor of the Grammy Museum in L.A. It consists of eight pods, each featuring a hands-on walkthrough of recording studio techniques.
Each pod consists of a touchscreen and an overhead video screen. Well-known musicians, producers and engineers guide visitors, via the video screens, in recording, mixing, mastering and re-mixing music with an entire suite of audio tools made available on the touchscreens.
I worked as the programming lead on this project during my time at Second Story. Each of the eight pods features two independent applications with unique content, audio controls and visual themes. I designed a highly configurable software architecture and graphics framework to streamline the development of all sixteen applications. The video screen and touchscreen in each pod are synchronized over a network connection with the Red5 Media Server.
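Keeping two screens in step usually comes down to one screen publishing small cue messages that the other applies. The sketch below illustrates that idea with a hypothetical JSON cue format in Python; the installation itself used the Red5 Media Server, and none of these names come from the actual project:

```python
import json

# Sketch of a screen-sync cue: the touchscreen publishes state changes and
# the overhead video screen applies them. The message format and names are
# hypothetical; the exhibit's actual Red5-based protocol is not shown here.

def make_cue(video_id: str, position_s: float, playing: bool) -> str:
    """Serialize a playback cue for the other screen."""
    return json.dumps({"video": video_id, "pos": position_s, "playing": playing})

def apply_cue(message: str, player) -> None:
    """Drive a player object (load / seek / play / pause) from a cue."""
    cue = json.loads(message)
    player.load(cue["video"])
    player.seek(cue["pos"])
    (player.play if cue["playing"] else player.pause)()
```

Because each cue carries the full playback state rather than a delta, a screen that misses a message simply snaps back into sync on the next one.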