My Spring 2012 developer internship at Vidvox in Troy, NY was a highly rewarding introduction to the technical and practical aspects of developing commercial software for art and music production. We began the semester with a research presentation I gave to the team outlining the history of the modern music composing tool, the sequencer. During this session we discussed the features and capabilities of a chronological assortment of sequencer tools, beginning with the hardware sequencers of the 1970s and moving through today’s most popular tools for recording, editing, and playing back MIDI events. We discussed the ergonomics of various interfaces, and the commonalities and differences among them. After this initial session, we had a collection of ideas and use cases to contribute to the design of a new sequencer app for Mac OS X and the mobile platform iOS. To my surprise, the team at Vidvox graciously offered to use our time together to assist me in designing this application as my own personal project.
At each meeting we focused on the evolution of the design through various collections of features and use cases, while keeping an eye on the tools currently available for OS X and iOS so that we wouldn’t unknowingly reinvent an existing product. At home, I worked on implementing the set of Objective-C classes needed for the basic functionality (i.e., recording and playing back MIDI events). It took a surprising amount of work to get to that point, but once it was done, it became clear that developing a commercial application would definitely be possible. We now have a working basic sequencer application for Mac OS X that can record multiple tracks of MIDI events and play them back to both the inter-application MIDI bus and the built-in Apple DLS synthesizer. We included a way to change the synthesizer’s sounds with a sound-selection menu for each of the sixteen MIDI channels, and provided a simple way for the user to export standard MIDI files via a drag-and-drop action onto the desktop or into a compatible external application.
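The sequencer itself is written in Objective-C, but the essential idea behind the export feature, the Standard MIDI File layout (a header chunk, a track chunk, and events prefixed by variable-length delta times), can be sketched in a few lines of Python. Everything below (the function names, the sample events) is illustrative only, not the actual Vidvox code:

```python
import struct

def vlq(n):
    """Encode a delta time as a MIDI variable-length quantity."""
    out = [n & 0x7F]
    n >>= 7
    while n:
        out.append((n & 0x7F) | 0x80)
        n >>= 7
    return bytes(reversed(out))

def smf(events, division=480):
    """Build a minimal format-0 Standard MIDI File from (delta, status, data...) tuples."""
    track = b""
    for delta, *msg in events:
        track += vlq(delta) + bytes(msg)
    track += b"\x00\xff\x2f\x00"  # end-of-track meta event
    header = b"MThd" + struct.pack(">IHHH", 6, 0, 1, division)
    return header + b"MTrk" + struct.pack(">I", len(track)) + track

# A program change (sound selection) followed by one quarter note on channel 0:
events = [
    (0, 0xC0, 5),        # program change: instrument 5 on channel 0
    (0, 0x90, 60, 100),  # note on: middle C, velocity 100
    (480, 0x80, 60, 0),  # note off one quarter note (480 ticks) later
]
data = smf(events)
```

The per-channel sound-selection menu described above maps naturally onto program-change messages like the `0xC0` event in this sketch, one per MIDI channel.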
Both to extend the basic sequencer’s feature set and to test the practical use of a content-generator system, I interfaced the sequencer with some code I had previously written that evolves musical phrases using a genetic algorithm. The combination of a MIDI event recorder and a MIDI music generator proved to be a fertile one, and fueled the discussion for a subsequent round of decision-making and design.
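The phrase-evolving code mentioned above is my own earlier project; as a stand-in, here is a minimal Python sketch of the general technique: a genetic algorithm that evolves eight-note MIDI pitch sequences toward a toy fitness target (a C major scale). All names, parameters, and the fitness function are hypothetical, chosen only to illustrate selection, crossover, and mutation:

```python
import random

random.seed(7)
TARGET = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale as a toy fitness target

def fitness(phrase):
    # Fewer total semitones away from the target phrase means higher fitness.
    return -sum(abs(a - b) for a, b in zip(phrase, TARGET))

def mutate(phrase, rate=0.1):
    # Occasionally nudge a pitch up or down by a semitone.
    return [p + random.choice([-1, 1]) if random.random() < rate else p
            for p in phrase]

def crossover(a, b):
    # Single-point crossover: splice two parent phrases at a random cut.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(generations=200, pop_size=30):
    pop = [[random.randint(48, 84) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # keep the fitter half
        pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

best = evolve()
```

In the actual application, output like `best` would be fed into the sequencer as a recorded track of MIDI note events, which is what made the recorder-plus-generator pairing interesting to explore.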
We plan to continue our work over the summer, with the goal of releasing a commercial iOS application that can both generate and record music using MIDI. We want the interface to be simple and the application powerful for novice and expert musicians alike. With the knowledge and experience I’ve gained from working with the team at Vidvox, I’m confident and prepared to keep working toward this career milestone: the release of my first commercial application.
2012 Computer Science and Applied Mathematics B.S. candidate – UAlbany
2014 Music Technology M.S. candidate – Georgia Tech