Paul just blogged about it: The Echo Nest are demonstrating some of the stuff they have been working on. The one I like best is the "automatic song", which they say was composed by automatically combining about 50 songs.
I'm curious what impact their API for extracting features from audio will have on MIR research. It seems they are also targeting artists who use Processing to visualize music content. I'd like to see videos of their music visualizations.
Sunday, 27 January 2008
1 comment:
I did some visualizations based on the Echo Nest's audio analysis last year. I have some video and images available: Visualizing Music. Cheers!
- Anita Lillie