At an Unhackathon an artist, a DJ, and two Python coders created algorithmic music based on a graph of carbon emissions from 1751 to the present. We transformed an image (graph) back into the numbers it was made from (data), generated and then refined music from those numbers, and had so much fun.
Ben adds texture and depth to the notes generated by the MIDITime software from the carbon emissions data (1751 to the present).
Our two Python experts, John and Justin, tried an array of potential approaches to converting statistics into music. Then they stumbled on MIDITime, created by a radio producer as a way to produce an aural interpretation (sonification) of Oklahoma's increasing earthquakes.
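The core of this kind of sonification is simple: scale each data value into a range of MIDI pitches, one note per point in time. Below is a minimal sketch of that mapping in plain Python; the emissions figures are made up for illustration, and MIDITime itself supplies the scaling helpers plus the actual MIDI file writing.

```python
# Sketch of the core sonification step: map each data point to a MIDI pitch.
# The (year, emissions) pairs are hypothetical values read off a chart;
# MIDITime provides equivalent helpers (e.g. linear scaling) and saves
# the result as a .mid file.

def value_to_pitch(value, vmin, vmax, low_note=36, high_note=84):
    """Linearly scale a data value into a MIDI pitch range."""
    pct = (value - vmin) / (vmax - vmin)
    return round(low_note + pct * (high_note - low_note))

# Hypothetical data points for illustration only
data = [(1751, 3), (1850, 54), (1950, 1630), (2010, 9167)]
values = [v for _, v in data]
vmin, vmax = min(values), max(values)

# One note per data point, in MIDITime's [beat, pitch, velocity, duration] shape
notes = [[i, value_to_pitch(v, vmin, vmax), 100, 1]
         for i, (_, v) in enumerate(data)]
print(notes)
```

Rising emissions come out as rising pitch, which is what makes the resulting track an audible version of the chart rather than just random notes.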
They turned the data from these charts into music, and the DJ made it sound quite nice. A sample of what was produced: https://soundcloud.com/benarvi/amc-unhackathon-output/s-NmunY