November 26, 2018, by jicke

Music’s changing fast: FAST is changing music

The University of Nottingham’s Mixed Reality Lab took its latest digital music research to one of the most famous recording studios in the world, with a showcase event at Abbey Road.

The exclusive event showcased the culmination of five years of digital music research from the FAST IMPACt project (Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption), led by Queen Mary University of London in partnership with the University of Nottingham and the University of Oxford.

FAST is investigating how new technologies can positively disrupt the recorded music industry. The event showcased to artists, journalists and industry professionals the next-generation technologies that will shape the industry, from production to consumption.

Dr Alan Chamberlain, Senior Research Fellow at the Mixed Reality Lab in Nottingham, is part of the FAST project. He said: “The Industry Day was such an amazing event for the FAST project. It was great to be able to discuss the ground-breaking research that the project has developed over the past few years and to hear the positive comments coming from the music industry. There was a real feeling that people were seeing what could be possible in the future, based on the innovative technologies which the project presented and demonstrated. There’s some really cutting-edge music technology research at the University of Nottingham (Mixed Reality Lab) at the moment, and working with Queen Mary University of London and the University of Oxford has led to genuinely impactful applications from the research.”

In total, 120 attendees were treated to an afternoon and evening of talks, demonstrations, a Climb! performance, and an expert panel discussion with Jon Eaves (The Rattle), Peter Langley (Origin UK), Paul Sanders (state51), Tracy Redhead (award-winning musician, composer and interactive producer, University of Newcastle, Australia) and Maria Kallionpää (composer and pianist, Hong Kong Baptist University), chaired by Mark d’Inverno (Goldsmiths).

Rivka Gottlieb, harpist and music therapist, also performed some musical pieces in collaboration with the Oxford team throughout the day.

Highlights of the research showcased on the day included:

Carolan Guitar: Connecting Digital to the Physical – The Carolan Guitar tells its own story. Play the guitar, contribute to its history, scan its decorative patterns and discover its story. Carolan uses a unique visual marker technology that links the physical instrument to the places it has been, the people who have played it and the songs it has sung, alongside deep learning techniques to improve event detection. See: https://carolanguitar.com/

FAST DJ – FAST DJ is a web-based automatic DJ system and plugin that can be embedded into any website. It generates transitions between any pair of successive songs and uses machine learning to adapt to the user’s taste via simple interactive decisions.
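
To picture the two ingredients such a system needs, here is a minimal Python sketch: an equal-power crossfade between two tracks, and a toy online learner that nudges the preferred fade length toward transitions the user liked. All names and parameters here are illustrative assumptions, not FAST DJ’s actual implementation.

```python
import numpy as np

SR = 44100  # sample rate in Hz (assumed)

def crossfade(track_a: np.ndarray, track_b: np.ndarray, seconds: float) -> np.ndarray:
    """Join two tracks with an equal-power crossfade lasting `seconds`."""
    n = int(seconds * SR)
    theta = np.linspace(0, np.pi / 2, n)
    overlap = track_a[-n:] * np.cos(theta) + track_b[:n] * np.sin(theta)
    return np.concatenate([track_a[:-n], overlap, track_b[n:]])

class TransitionModel:
    """Toy stand-in for the learning component: adapt the fade length
    from simple like/dislike feedback on each transition."""

    def __init__(self, seconds: float = 8.0, rate: float = 0.2):
        self.seconds = seconds
        self.rate = rate

    def update(self, used_seconds: float, liked: bool) -> None:
        # Move toward fade lengths the user liked, away from disliked ones.
        direction = 1.0 if liked else -1.0
        self.seconds += direction * self.rate * (used_seconds - self.seconds)
```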

Grateful Dead Concert Explorer – A web service for the exploration of recordings of Grateful Dead concerts, drawing its information from various web sources. It demonstrates how Semantic Audio and Linked Data technologies can produce an improved user experience for browsing and exploring music collections. See Thomas Wilmering explaining more about the Grateful Dead Concert Explorer: https://vimeo.com/297974486

Jam with Jamendo – brings music learners and unsigned artists together by recommending suitable songs as new and varied practice material. In this web app, users are presented with a list of songs based on their selection of chords. They can then play along with the chord transcriptions or use the audio as backing tracks for solos and improvisations. Using AI-generated transcriptions means the underlying music catalogue can grow without manual transcription effort. See Johan Pauwels explaining more about Jam with Jamendo: https://vimeo.com/297981584
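
The core matching step can be pictured as a simple filter over machine-generated chord transcriptions, as in the sketch below. The catalogue and field names are invented for illustration; this shows the idea, not the app’s code.

```python
# Toy catalogue of songs with machine-transcribed chord vocabularies.
CATALOGUE = [
    {"title": "Song A", "chords": {"C", "G", "Am", "F"}},
    {"title": "Song B", "chords": {"E", "A", "B"}},
    {"title": "Song C", "chords": {"C", "G", "F"}},
]

def recommend(known_chords: set) -> list:
    """Return songs playable with the user's chords, fewest chords first."""
    playable = [s for s in CATALOGUE if s["chords"] <= known_chords]
    return [s["title"] for s in sorted(playable, key=lambda s: len(s["chords"]))]

print(recommend({"C", "G", "Am", "F"}))  # -> ['Song C', 'Song A']
```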

MusicLynx – a web platform for music discovery that collects information from a range of online sources and reveals connections between artists. The information is used to build a network that users can explore to discover new artists and how they are linked together.
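
One way to picture the underlying structure is a graph whose edges record where each artist-to-artist link came from. The link data below is invented, and networkx is just one plausible tool for the sketch, not necessarily what MusicLynx uses.

```python
import networkx as nx

# Invented artist links, each tagged with a hypothetical source relation.
links = [
    ("Nina Simone", "Billie Holiday", "influenced_by"),
    ("Nina Simone", "Aretha Franklin", "shared_genre"),
    ("Billie Holiday", "Ella Fitzgerald", "collaborated_with"),
]

G = nx.Graph()
for a, b, relation in links:
    G.add_edge(a, b, relation=relation)

# Explore one hop out from an artist, as a MusicLynx user might.
for neighbour in G.neighbors("Nina Simone"):
    print(neighbour, "via", G.edges["Nina Simone", neighbour]["relation"])
```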

The SOFA Ontological Fragment Assembler – enables the combination of musical fragments – Digital Music Objects, or DMOs – into compositions, using semantic annotations to suggest compatible choices.
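
As a toy illustration of the suggestion step, imagine each fragment annotated with a key and tempo, and candidates filtered for compatibility with the piece so far. The annotation scheme below is an assumption for the sketch, not SOFA’s actual ontology.

```python
from dataclasses import dataclass

@dataclass
class Fragment:
    name: str
    key: str
    tempo: int  # beats per minute

def compatible(current: Fragment, candidate: Fragment, tempo_tolerance: int = 5) -> bool:
    """Suggest fragments in the same key with a similar tempo."""
    return (candidate.key == current.key
            and abs(candidate.tempo - current.tempo) <= tempo_tolerance)

library = [
    Fragment("strings_loop", "D minor", 92),
    Fragment("piano_motif", "D minor", 90),
    Fragment("drum_break", "F major", 120),
]

current = Fragment("intro_pad", "D minor", 90)
print([f.name for f in library if compatible(current, f)])
# -> ['strings_loop', 'piano_motif']
```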

Numbers into Notes – experiments in algorithmic composition, exploring the relationship between humans, machines, algorithms and creativity. See David de Roure explaining more about the research: https://vimeo.com/297989936
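
The general idea of turning number sequences into music can be sketched in a few lines: generate a sequence, reduce it modulo 12, and offset it from a base pitch. The specific mapping below is an illustrative assumption, not the project’s exact rule.

```python
def fibonacci(n: int) -> list:
    """First n Fibonacci numbers."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def to_midi_notes(numbers: list, base: int = 60) -> list:
    """Map each number to a MIDI pitch: base (middle C) plus the value mod 12."""
    return [base + (x % 12) for x in numbers]

print(to_midi_notes(fibonacci(8)))  # -> [61, 61, 62, 63, 65, 68, 61, 69]
```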

rCALMA Environment for Live Music Data Science – a big data visualisation of the Internet Archive Live Music Archive, using Linked Data to combine concert programmes with audio feature analysis. See David Weigl talking about rCALMA: https://vimeo.com/297970119
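
The Linked Data side of such a pipeline typically starts with a SPARQL query for performances, whose results are then joined with computed audio features. The endpoint URL and vocabulary below are placeholders, not the actual CALMA service.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Placeholder endpoint and vocabulary -- not the real CALMA service.
sparql = SPARQLWrapper("https://example.org/calma/sparql")
sparql.setQuery("""
    PREFIX ex: <http://example.org/livemusic#>
    SELECT ?performance ?date WHERE {
        ?performance ex:date ?date .
    } LIMIT 10
""")
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

for row in results["results"]["bindings"]:
    print(row["performance"]["value"], row["date"]["value"])
```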

Climb! Performance Archive – Climb! is a non-linear composition for Disklavier piano and electronics. This web-based archive provides a richly indexed and navigable record of every performance of the work, allowing audiences and performers to engage with it in new ways.
