Paper and Data

Demo of Audio Alignment

The synthesised, aligned MIDI is panned to the left channel, with the original audio on the right channel:

Introduction

Welcome to the companion site for the paper “FiloBass: A Dataset and Corpus Based Study of Jazz Basslines” presented at ISMIR 2023. This was work carried out by Xavier Riley, a PhD candidate on the AIM programme at QMUL.

Each piece is roughly 6 minutes long, so the dataset totals around 5 hours of audio.

This site contains links to Soundslice pages where you can preview all the data. Here’s an example:

Score previews

Feedback/Questions

We welcome feedback on the dataset - please direct it to Xavier Riley, whose email address can be found in the ISMIR paper.

License

The FiloBass dataset contains copyright material and is shared with researchers under the following conditions:

  • FiloBass may only be used by the individual signing below and by members of the research group or organisation of this individual. This permission is not transferable.
  • FiloBass may be used only for non-commercial research purposes.
  • FiloBass (or data enabling its reproduction) may not be sold, leased, published or distributed to any third party without written permission from the FiloBass administrator.

  • When research results obtained using FiloBass are publicly released (in the form of reports, publications, or derivative software), clear indication of the use of FiloBass shall be given, usually in the form of a citation of the following paper:
    X. Riley and S. Dixon (2023), FiloBass: A Dataset and Corpus Based Study of Jazz Basslines. Proceedings of the 24th International Society for Music Information Retrieval Conference (ISMIR).
  • Queen Mary University of London shall not be held liable for any errors in the content of FiloBass, nor for any damage arising from the use of FiloBass.
  • The FiloBass administrator may update these conditions of use at any time.

Acknowledgements

The author is a research student at the UKRI Centre for Doctoral Training in Artificial Intelligence and Music, supported by UK Research and Innovation [grant number EP/S022694/1].