Echo Nest API audio analysis data is now provided by Spotify. What if you used that data to reconstruct music by driving a sequencer in Max? The analysis is a series of time-based quanta called segments. Some of the following background information and video is from the original version.
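For a sense of how segment data can drive note events, here is a rough JavaScript sketch (plain Node, not the actual patch) that schedules one note per segment, using the strongest chroma bin as a pitch class. The two segments shown are placeholder values, not real analysis output.

```javascript
// Sketch: turning audio-analysis segments into note events.
// Each segment has a start time, duration, loudness, and a 12-bin chroma vector.
// The example segments below are made-up placeholder data.
const segments = [
  { start: 0.0,  duration: 0.35, loudness_max: -12.1,
    pitches: [0.1, 0.9, 0.2, 0.05, 0.3, 0.1, 0.0, 0.4, 0.1, 0.2, 0.1, 0.0] },
  { start: 0.35, duration: 0.42, loudness_max: -9.4,
    pitches: [0.7, 0.1, 0.1, 0.2, 0.1, 0.9, 0.1, 0.3, 0.2, 0.1, 0.0, 0.1] },
];

// Pick the strongest chroma bin (0 = C ... 11 = B) and place it in a middle octave.
function segmentToMidiNote(segment, baseNote = 60) {
  const pitchClass = segment.pitches.indexOf(Math.max(...segment.pitches));
  return baseNote + pitchClass;
}

// Schedule one event per segment at its start time, the way a sequencer would.
for (const seg of segments) {
  setTimeout(() => {
    const note = segmentToMidiNote(seg);
    // loudness_max is in dB (negative), so offset it into a rough MIDI velocity
    const velocity = Math.round(Math.min(127, Math.max(1, 127 + seg.loudness_max)));
    console.log(`note ${note} vel ${velocity} dur ${seg.duration.toFixed(2)}s`);
    // In the Max patch this event would instead drive a poly~ oscillator voice.
  }, seg.start * 1000);
}
```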
This project is part of the internet-sensors project and updates the 2013 Echo Nest project described here: The original analyzer document by Tristan Jehan can be found here (for the time being): This implementation uses node.js for Max instead of Ruby to access the API. You will need to set up a developer account with Spotify and request API credentials. Other than that, the synthesis code in Max has not changed.
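Here is a rough sketch of what the Node for Max side of the request could look like: exchange your client credentials for a token, then fetch the audio analysis for a track. The handler name, outlet format, and environment-variable names are illustrative rather than taken from the project, and it assumes a Node version with a global fetch (otherwise substitute node-fetch or the https module).

```javascript
// Sketch of the Node for Max script: client-credentials auth, then an
// audio-analysis request for a track id sent from the patch.
const maxApi = require('max-api');

const CLIENT_ID = process.env.SPOTIFY_CLIENT_ID;       // from your Spotify developer account
const CLIENT_SECRET = process.env.SPOTIFY_CLIENT_SECRET;

// Exchange the app credentials for a short-lived access token.
async function getToken() {
  const res = await fetch('https://accounts.spotify.com/api/token', {
    method: 'POST',
    headers: {
      Authorization: 'Basic ' + Buffer.from(`${CLIENT_ID}:${CLIENT_SECRET}`).toString('base64'),
      'Content-Type': 'application/x-www-form-urlencoded',
    },
    body: 'grant_type=client_credentials',
  });
  return (await res.json()).access_token;
}

// An "analyze <track_id>" message from the patch triggers the request;
// each segment's start, duration, and loudness are sent back out as a list,
// which the patch can then sequence.
maxApi.addHandler('analyze', async (trackId) => {
  const token = await getToken();
  const res = await fetch(`https://api.spotify.com/v1/audio-analysis/${trackId}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  const analysis = await res.json();
  for (const seg of analysis.segments) {
    await maxApi.outlet('segment', seg.start, seg.duration, seg.loudness_max);
  }
});
```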
A second patch is a sonification of Mass Ave buses, from Nubian to Harvard. It requests data from the MBTA API to get the current location of buses, using the Max js object.
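Here is a rough sketch of the request logic, written as plain Node rather than the patch's js object, and assuming the current MBTA v3 REST API and its field names (route "1" is the Nubian to Harvard bus).

```javascript
// Sketch: poll the MBTA v3 API for vehicle positions on a route.
const MBTA_API_KEY = process.env.MBTA_API_KEY; // optional for light use; a key raises the rate limit

async function getBusPositions(route = '1') {
  const url = `https://api-v3.mbta.com/vehicles?filter[route]=${route}`;
  const headers = MBTA_API_KEY ? { 'x-api-key': MBTA_API_KEY } : {};
  const res = await fetch(url, { headers });
  const json = await res.json();

  // Keep just what the sonification needs: one lat/lon pair per vehicle.
  return json.data.map((v) => ({
    id: v.id,
    lat: v.attributes.latitude,
    lon: v.attributes.longitude,
    direction: v.attributes.direction_id,
  }));
}

// Poll on an interval, like the patch's metro-driven request.
setInterval(async () => {
  const buses = await getBusPositions();
  for (const bus of buses) {
    console.log(`${bus.id} ${bus.lat} ${bus.lon} dir ${bus.direction}`);
  }
}, 10000); // 10-second polling, matching the patch's default
```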
Latitude and longitude data is mapped to oscillator pitch. Data is polled every 10 seconds, but the results might be more interesting at a slower polling rate, because the updates don't seem that frequent. Note: there will be more buses running during rush hours in Boston. Try experimenting with the polling rate and ramp length in the poly-oscillator patch. You can also experiment with the pitch range.
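As an illustration of the mapping idea, here is a small sketch that scales latitude and longitude into frequency ranges, similar in spirit to Max's scale object. The coordinate bounds and frequency range are guesses for illustration, not the values used in the patch.

```javascript
// Linear scaling with clamping, like [scale] followed by [clip].
function scale(x, inLow, inHigh, outLow, outHigh) {
  const t = (x - inLow) / (inHigh - inLow);
  return outLow + Math.min(Math.max(t, 0), 1) * (outHigh - outLow);
}

// Map a bus position to two oscillator frequencies.
// The bounds roughly cover the Nubian-to-Harvard corridor (illustrative values).
function busToFrequencies(lat, lon) {
  return {
    latHz: scale(lat, 42.32, 42.38, 200, 800),   // latitude drives one oscillator
    lonHz: scale(lon, -71.12, -71.06, 200, 800), // longitude drives another
  };
}

// Example: a bus near Central Square
console.log(busToFrequencies(42.3655, -71.1036));
```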
There are quality issues with the realtime data. For example, there are bus stops not associated with the route, and the direction_id and stop_sequence data from the buses is often wrong. Buses that are not in service are not removed from the vehicle list or indicated as such. The patch uses an object to graph the position of the buses along the route, but due to the data problems described above, the positions don't always reflect the current latitude/longitude coordinates or the bus stop name. You will need to replace the API key in the message object at the top of the patch with your own key; the key distributed with the patch is fake.