The next music tastemakers – the computer programmers
Posted by Paul in recommendation on July 29, 2010
There’s an interesting piece in the New Yorker about the future of listening. The article focuses on Pandora and MOG and the challenges of shaping the online listening experience. Author Sasha Frere-Jones concludes with this:
While using these services, I kept thinking about an early-eighties drum machine called the Roland TR-808, which has seduced generations of musicians with its heavy kick-drum sound and the oddly human swing of its clock. Whoever programmed this box had more impact on dance music than the hundreds of better-known musicians who used the device. Similarly, the anonymous programmers who write the algorithms that control the series of songs in these streaming services may end up having a huge effect on the way that people think of musical narrative—what follows what, and who sounds best with whom. Sometimes we will be the d.j.s, and sometimes the machines will be, and we may be surprised by which we prefer.
Read the article:
Help researchers understand earworms
Posted by Paul in Music, music information retrieval, research on July 29, 2010
Researchers at Goldsmiths, University of London, in collaboration with BBC 6 Music and the British Academy, are conducting research into the music in people’s heads, sometimes called ‘musical imagery’. They want to know which songs are the most common, whether people like them or not, what triggers them, whether some people have music in their heads all the time, and so on.
To help the researchers understand this phenomenon, take part in their questionnaire (you could win £150, too). I took the survey; it took about 10 minutes. They do ask some rather personal questions that seem related to one’s tendency toward compulsive behavior. (Yes, I do sometimes count the stairs that I’m walking up.)
It looks to be an interesting research project. More details about it are here: The Earwormery.com
Visual Music
Posted by Paul in code, data, events, fun, Music, The Echo Nest, visualization on July 28, 2010
The week-long Visual Music Collaborative Workshop held at Eyebeam just finished up. This was an invite-only event where participants did a deep dive into sound-analysis techniques, OpenGL programming, and interfacing with mobile control devices.
Here’s one project built during the week that uses The Echo Nest analysis output:
(Via Aaron Meyers)
Novelty playlist ordering
We’ve been building a new playlisting engine here at The Echo Nest. The engine is really neat – it lets you apply a whole range of very flexible constraints and orderings to make all sorts of playlists that would be a challenge for even the most savvy DJ. Playlists like: 15 songs with a tempo between 120 and 130 BPM, ordered by how danceable they are, by very popular female artists who sound similar to Lady Gaga and live near London, but never ever including tracks by The Spice Girls.
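To make the idea of constraints and orderings concrete, here’s a toy sketch in Python. This is not the actual Echo Nest engine or its API – the song data, rule style, and field names are all made up for illustration – but it shows the shape of the problem: filter by constraints, then order the survivors.

```python
# Toy constraint-based playlisting. NOT the Echo Nest engine or API --
# the songs, fields, and rule style here are invented for illustration.

songs = [
    {"title": "Song A", "artist": "Artist One",   "tempo": 125, "danceability": 0.9},
    {"title": "Song B", "artist": "Artist Two",   "tempo": 140, "danceability": 0.8},
    {"title": "Song C", "artist": "Artist Three", "tempo": 122, "danceability": 0.6},
    {"title": "Song D", "artist": "Spice Girls",  "tempo": 128, "danceability": 0.7},
]

def build_playlist(songs, filters, order_key, limit):
    """Keep songs that pass every filter, then apply the ordering."""
    picked = [s for s in songs if all(f(s) for f in filters)]
    picked.sort(key=order_key)
    return picked[:limit]

rules = [
    lambda s: 120 <= s["tempo"] <= 130,       # tempo constraint
    lambda s: s["artist"] != "Spice Girls",   # never-ever exclusion
]

# Most danceable first
playlist = build_playlist(songs, rules,
                          order_key=lambda s: -s["danceability"], limit=15)
```

The real engine handles far richer constraints (artist similarity, popularity, location, sequencing rules like "no two consecutive songs by the same artist"), but they compose in the same spirit.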
I was playing with the engine this weekend, writing some rules to make novelty playlists to test the limits of the engine. I started with rules typical for a similar-artist playlist: 15 songs long, filled with songs by artists similar to a seed artist (in this case Weezer), the first and last song must be by the seed artist, and no two consecutive songs can be by the same artist. Simple enough, but then I added two more rules to turn this into a novelty playlist that would be very hard for a human to make. See if you can guess what the two rules are. I think one of the rules is pretty obvious, but the second is a bit more subtle. Post your guesses in the comments.
0 Tripping Down the Freeway - Weezer
1 Yer All I've Got Ttonight - The Smashing Pumpkins
2 The Most Beautiful Things - Jimmy Eat World
3 Someday You Will Be Loved - Death Cab For Cutie
4 Don't Make Me Prove It - Veruca Salt
5 The Sacred And Profane - Smashing Pumpkins, The
6 Everything Is Alright - Motion City Soundtrack
7 The Ego's Last Stand - The Flaming Lips
8 Don't Believe A Word - Third Eye Blind
9 Don's Gone Columbia - Teenage Fanclub
10 Alone + Easy Target - Foo Fighters
11 The Houses Of Roofs - Biffy Clyro
12 Santa Has a Mullet - Nerf Herder
13 Turtleneck Coverup - Ozma
14 Perfect Situation - Weezer
Here’s another playlist – with a different set of two novelty rules, with a seed artist of Led Zeppelin. Again, if you can guess the rules, post a comment.
0 El Niño - Jethro Tull
1 Cheater - Uriah Heep
2 Hot Dog - Led Zeppelin
3 One Thing - Lynyrd Skynyrd
4 Nightmare - Black Sabbath
5 Ezy Ryder - The Jimi Hendrix Experience
6 Soulshine - Govt Mule
7 The Gypsy - Deep Purple
8 I'll Wait - Van Halen
9 Slow Down - Ozzy Osbourne
10 Civil War - Guns N' Roses
11 One Rainy Wish - Jimi Hendrix
12 Overture (Live) - Grand Funk Railroad
13 Larger Than Life - Gov'T Mule
The Music App Summit
Posted by Paul in events, Music, startup, The Echo Nest on July 23, 2010
Billboard has long been known for tracking the hottest artists, albums and songs. Now they are moving into new territory – music apps. In October, Billboard is hosting a Music App Summit – a day devoted to the world of mobile music apps. The summit will focus on the new companies and technologies building the next generation of music applications for mobile devices. The summit has some awesome speakers and panelists lined up from a cross-section of domains (technology, business and music), including Ge Wang, Ted Cohen, Dave Kusek, Brian Zisk and The Echo Nest’s CEO, Jim Lucchese.
At the core of the summit are Billboard’s first ever Music App Awards. Billboard is giving awards to the best apps in a number of categories:

- Best Artist-based App: apps created specifically for an individual artist
- Best Music Streaming App: apps that let users stream, download or otherwise enjoy music, such as Internet radio or on-demand services
- Best Music Engagement App: apps that let users engage with music in various ways, such as music games and music-ID services
- Best Music Creation App: apps that let users make their own music
- Best Branded App: apps that incorporate a sponsor with music capabilities, promoting the sponsor’s message while highlighting the music
- Best Touring App: apps created in conjunction with a specific tour or festival
Judges for the apps include Eliot Van Buskirk of Wired, Ian Rogers of Topspin and Grammy Award winner MC Hammer.
Winning developers receive some modest prizes, but the real award is getting to demo your app to the attendees of the summit: the movers and shakers of the music industry will be there looking for that killer music app, and the winner in each category will get to show their stuff. If you have a mobile music app, consider submitting it to the Music App Awards. The submission deadline is July 30.
Keith Moon meets Animal
Another vafromb.py masterpiece from joshmillard.
Echo Nest Remix at the Boston Python Meetup Group
Posted by Paul in events, remix, The Echo Nest on July 15, 2010
Next week I’ll be giving a talk about remixing music with Echo Nest remix at the Boston Python Meetup Group. If you are in the Boston / Cambridge area next week, be sure to come on by and say ‘hi’. Info and RSVP for the talk are here: The Boston Python Meetup Group on Meetup.com
Here’s the abstract for the talk:
Paul Lamere will tell us about Echo Nest remix. Remix is an open source Python library for remixing music. With remix you can use Python to rearrange a track, combine it with others, beat- or pitch-shift it, and so on – essentially it lets you treat a song like Silly Putty.
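The core idea behind "rearranging a track" can be sketched without any audio at all. In the sketch below a track is just an ordered list of beat labels; this is a conceptual illustration of beat-level editing, not the remix library’s actual API.

```python
# Conceptual beat-level rearrangement. A "track" here is just a list of
# beat labels -- this is NOT the remix library's actual API, which works
# on real analyzed audio segments.

def reverse_beats(beats):
    """Play the song's beats in reverse order."""
    return list(reversed(beats))

def every_other_beat(beats):
    """Drop every second beat -- the song at half its length."""
    return beats[::2]
```

With remix itself, the same manipulations operate on beat segments extracted from an analyzed audio file, and the rearranged segments are rendered back out as audio.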
The Swinger, which made the rounds of the blogosphere, is an interesting example of what remix can do: it morphs songs to give them a swing rhythm.
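The timing trick behind a swing feel is simple to state: stretch the first half of each beat and squeeze the second half. Here’s a minimal timing sketch (just the arithmetic, not The Swinger’s actual audio-stretching code):

```python
def swing_midpoint(beat_start, beat_duration, swing=2/3):
    """Where a beat's halfway point lands after swinging.

    A straight beat splits 50/50. With swing=2/3 the first half is
    stretched to two-thirds of the beat and the second half squeezed
    into the remaining third -- the triplet feel of swing. Timing
    sketch only, not The Swinger's actual implementation."""
    return beat_start + beat_duration * swing

# For a beat starting at t=0.0 lasting 0.6s, the midpoint moves
# from 0.3s (straight) to 0.4s (swung).
```

Applied to real audio, each beat’s two halves are time-stretched to these new durations, which is what produces the shuffle.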
For more details about the type of music remixing you can do with remix, feel free to read: http://musicmachinery…
Some NameDropper stats
Posted by Paul in code, fun, Music, The Echo Nest on July 11, 2010
The NameDropper has been live for less than a day and already I’ve collected some good data from the game play. Here are some stats:
Total games played: 1,841
Total unique players: 462
Total play time: 30 hours, 20 minutes, 36 seconds
The artists that were most frequently confused with fake artists were:
The Name Dropper
Posted by Paul in data, fun, Music, The Echo Nest, web services on July 10, 2010
TL;DR: I built a game called Name Dropper that tests your knowledge of music artists.
One bit of data that we provide via our web APIs is Artist Familiarity. This is a number between 0 and 1 that indicates how likely it is that someone has heard of that artist. There’s no absolute right answer, of course – who can really tell whether Lady Gaga is more well known than Barbra Streisand, or whether Elvis is more well known than Madonna? But we can certainly say that The Beatles are, in general, more well known than Justin Bieber.
To make sure our familiarity scores are good, we have a Q/A process in which a person knowledgeable about music evaluates our familiarity scores by scanning through a list of artists ordered in descending familiarity until they start finding artists that they don’t recognize. The further they get into the list, the better the list is. We can use this scoring technique to rank multiple different familiarity algorithms quickly and accurately.
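That scoring technique boils down to "how deep into the ranked list does the tester get before the first unknown artist?" A quick sketch (a hypothetical helper, not the actual Echo Nest Q/A tooling):

```python
def recognition_depth(ranked_artists, known):
    """How far a tester gets down a descending-familiarity list before
    the first artist they don't recognize. A higher depth means the
    ranking is better -- the familiar artists really are at the top.
    Hypothetical helper, not the actual Echo Nest Q/A tooling."""
    for i, artist in enumerate(ranked_artists):
        if artist not in known:
            return i
    return len(ranked_artists)

# Comparing two candidate rankings against the same tester's knowledge:
known = {"The Beatles", "Madonna", "Elvis"}
list_a = ["The Beatles", "Madonna", "Obscuro", "Elvis"]   # depth 2
list_b = ["Obscuro", "The Beatles", "Madonna", "Elvis"]   # depth 0
```

Running several testers over several candidate rankings and comparing depths is enough to pick the better familiarity algorithm.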
One thing I noticed is that this technique not only tells us how good our familiarity score is – it also gives a good indication of how well the tester knows music. The further a tester gets into a list before they stop recognizing artists, the more they tend to know about music. This insight led me to create a new game: The Name Dropper.
The Name Dropper is a simple game. You are presented with a list of a dozen artist names. One name is a fake; the rest are real.
If you find the fake, you go on to the next round, but if you get fooled, the game is over. At first it is pretty easy to spot the fakes, but each round gets a little harder, and sooner or later you’ll reach the point where you are not sure and will have to guess. I think a person’s score is fairly representative of how broad their knowledge of music artists is.
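The round mechanics are straightforward; here’s a sketch of one round (the round size and function names are illustrative, not the game’s actual code):

```python
import random

# One Name Dropper round: eleven real names plus one fake, shuffled.
# Illustrative sketch only -- not the game's actual implementation.

def make_round(real_artists, fake_artist, size=12, rng=random):
    """Build a round: (size - 1) real names plus one fake, shuffled."""
    names = rng.sample(real_artists, size - 1)
    names.append(fake_artist)
    rng.shuffle(names)
    return names

def play_turn(picked, fake_artist):
    """True = spotted the fake (next round); False = game over."""
    return picked == fake_artist
```

The game loop just repeats this with harder rounds: the real names are drawn from deeper (less familiar) regions of the familiarity ranking as play progresses.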
The biggest technical challenge in building the application was coming up with a credible fake-artist-name generator. I could have used Brian’s list of fake names, but it was more fun trying to build one myself. I think it works pretty well. I really can’t share how it works, since that could give folks a hint about what a fake name might look like and skew the scores (I’m sure it boosts my own scores by a few points). The really nifty thing about this game is that it is a game-with-a-purpose: as people play, I can collect all sorts of data about artist familiarity and use that data to help improve our algorithms.
So go ahead, give the Name Dropper a try and see if you can push me out of the top spot on the leaderboard:
Play the Name Dropper
Some preliminary Playlist Survey results
I’m conducting a somewhat informal survey on playlisting, comparing playlists created by an expert radio DJ with those generated by a playlisting algorithm and by a random number generator. So far, nearly 200 people have taken the survey (thanks!). Already I’m seeing some very interesting results. Here are a few tidbits (look for a more thorough analysis once the survey is complete).
People expect human DJs to make better playlists:
The survey asks people to try to identify the origin of a playlist (human expert, algorithm or random) and also rate each playlist. We can look at the ratings people give to playlists based on what they think the playlist origin is to get an idea of people’s attitudes toward human vs. algorithm creation.
Predicted Origin    Rating
----------------    ------
Human expert          3.4
Algorithm             2.7
Random                2.1
We see that people expect humans to create better playlists than algorithms and that algorithms should give better playlists than random numbers. Not a surprising result.
Human DJs don’t necessarily make better playlists:
Now lets look at how people rated playlists based on the actual origin of the playlists:
Actual Origin       Rating
-------------       ------
Human expert          2.5
Algorithm             2.7
Random                2.6
These results are rather surprising: algorithmic playlists are rated highest, while the human-expert-created playlists are rated lowest – even lower than those created by the random number generator. There are lots of caveats here: I haven’t done any significance tests yet to see if the differences really matter, the survey size is still rather small, and the survey doesn’t present real-world playlist-listening conditions. Nevertheless, the results are intriguing.
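The aggregation behind these per-origin averages is a one-liner’s worth of grouping. A sketch (the data shape is hypothetical, not the actual survey code):

```python
from collections import defaultdict

def mean_by_origin(responses):
    """Mean rating per origin label.

    responses: iterable of (origin_label, rating) pairs, e.g. one pair
    per playlist rating collected in the survey. Hypothetical data
    shape -- not the actual survey analysis code."""
    totals = defaultdict(lambda: [0.0, 0])
    for origin, rating in responses:
        totals[origin][0] += rating
        totals[origin][1] += 1
    return {o: round(t / n, 1) for o, (t, n) in totals.items()}
```

Run once keyed on the *predicted* origin and once keyed on the *actual* origin, and you get the two tables above.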
I’d like to collect more survey data to flesh out these results. So if you haven’t already, please take the survey:
The Playlist Survey
Thanks!


