A visual and sonic interpretation of the NYC subway system
This sonified visualization of the Manhattan subway system uses the maxlink library to communicate between Max/MSP and Processing. Through additive and granular synthesis, sine waves and a sample recorded in the NYC subway re-imagine what riding the subway looks and sounds like. Amplitude controls the width of the circles in the Processing sketch. Each subway line is traced out horizontally, and polyphony and granular-synthesis parameters are manipulated as the map is "played" manually, sequentially, or randomly at different speeds.
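The amplitude-to-width mapping could be sketched roughly as below. This is an illustrative stand-in, not the piece's actual code: the function name and pixel ranges are hypothetical, and in the real sketch the amplitude value would arrive from Max/MSP via maxlink rather than being passed in directly.

```java
public class AmpToWidth {
    // Linear rescale, in the spirit of Processing's map(): an amplitude
    // in [0, 1] (e.g. received from Max/MSP over maxlink) becomes a
    // circle width in pixels.
    static float ampToWidth(float amp, float minWidth, float maxWidth) {
        // Clamp so a stray out-of-range amplitude can't yield a negative width.
        float a = Math.max(0f, Math.min(1f, amp));
        return minWidth + a * (maxWidth - minWidth);
    }

    public static void main(String[] args) {
        // Silence draws the smallest circle; full amplitude the largest.
        System.out.println(ampToWidth(0.0f, 4f, 40f)); // 4.0
        System.out.println(ampToWidth(1.0f, 4f, 40f)); // 40.0
        System.out.println(ampToWidth(0.5f, 4f, 40f)); // 22.0
    }
}
```

In a Processing draw loop, the returned value would feed the size argument of an ellipse call so that louder grains draw wider circles along the traced line.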