Full Paper

Finding My Beat: Personalised Rhythmic Filtering for Mobile Music Interaction
Daniel Boland, University of Glasgow, UK
Roderick Murray-Smith, University of Glasgow, UK
Time: Wed 11:48 - 12:12 | Session: Tactile User Interfaces | Location: Große Aula

A novel interaction style is presented, allowing in-pocket music selection by tapping a song's rhythm on a device's touchscreen or body. We introduce the use of rhythmic queries for music retrieval, employing a trained generative model to improve query recognition. We identify rhythm as a fundamental feature of music that listeners can reproduce easily, making it an effective and simple basis for music retrieval. We observe that users vary in which instruments they entrain with, and our work is the first to model such variability. An experiment showed that, after training the generative model, retrieval performance improved two-fold: all rhythmic queries returned a highly ranked result with the trained model, compared with 47% using existing methods. We conclude that generative models of subjective user queries can yield significant performance gains for music retrieval and enable novel interaction techniques such as rhythmic filtering.
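To make the idea of rhythmic filtering concrete, the following is a minimal, illustrative sketch rather than the paper's actual model: it assumes each song is represented by its beat onset times, normalises inter-onset intervals so a tapped query is tempo-invariant, and ranks the library by a simple Gaussian likelihood of the query. The function names, the example library, and the timing-noise parameter sigma are all hypothetical; the paper's trained generative model of per-user entrainment is considerably richer.

    import numpy as np

    def inter_onset_intervals(onset_times):
        """Convert absolute tap/beat times (seconds) into normalised
        inter-onset intervals, making the rhythm tempo-invariant."""
        iois = np.diff(np.asarray(onset_times, dtype=float))
        return iois / iois.sum()

    def log_likelihood(query_iois, song_iois, sigma=0.05):
        """Score a rhythmic query against a song under a simple Gaussian
        model of tap-timing noise (sigma could be fitted per user from
        training taps)."""
        n = min(len(query_iois), len(song_iois))
        if n == 0:
            return -np.inf
        err = query_iois[:n] - song_iois[:n]
        return -0.5 * np.sum((err / sigma) ** 2)

    def rank_songs(query_taps, library):
        """Rank a music library (dict: title -> beat onset times) by how
        well each song's rhythm explains the user's tapped query."""
        q = inter_onset_intervals(query_taps)
        scores = {title: log_likelihood(q, inter_onset_intervals(beats))
                  for title, beats in library.items()}
        return sorted(scores, key=scores.get, reverse=True)

    # Hypothetical usage: a two-song library and a query tapped roughly
    # in time with the first song's rhythm.
    library = {
        "Song A": [0.0, 0.5, 1.0, 1.75, 2.0],
        "Song B": [0.0, 0.4, 1.2, 1.6, 2.4],
    }
    print(rank_songs([0.0, 0.52, 1.01, 1.73, 2.02], library))

In this sketch the ranking is driven purely by timing error between normalised intervals; training a generative model, as in the paper, would instead adapt the scoring to how each individual user taps (e.g. which instrument line they entrain with), which is what yields the reported retrieval gains.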
