Washington, September 16 : Two University of Southern California researchers have written a piece of software that can help create a musical accompaniment in the style of any chosen artist, or even the particular style used in select pieces by the artist.
Elaine Chew, an accomplished pianist and professor at the USC Viterbi School Department of Industrial and Systems Engineering, and graduate student Ching-Hua Chuan have revealed that they started developing the system called ASSA (Automatic Style Specific Accompaniment) two years ago.
"We describe an automatic style specific accompaniment system that makes songwriting accessible to both experts and novices. (T)he system should be able to identify the features important to the style specified by the user, (enabling the user to) ask for harmonization similar to some particular songs," they said while making a presentation at the International Joint Workshop on Computational Creativity in London a year ago.
During the presentation, Chuan and Chew laid out the basics of the system, and tested it on Radiohead songs. They trained the system on three Radiohead songs, and generated chord progressions for the fourth.
The original accompaniment served as "ground truth", that is, the benchmark for judging the generated accompaniment. The measure of success was therefore not a subjective impression of whether the accompaniment worked, but an objective comparison showing that it came very close to the band's own accompaniment.
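The evaluation described above, comparing generated chords against the band's own, can be sketched as a simple chord-match score. This is a hypothetical illustration of the idea, not the researchers' actual metric or code; the function name, chord labels, and measure-by-measure alignment are all assumptions.

```python
# Hypothetical sketch of a "ground truth" evaluation: score the
# generated accompaniment by how many measures match the original.
# Not the actual ASSA metric.

def chord_accuracy(generated, ground_truth):
    """Fraction of measures whose generated chord matches the original."""
    if len(generated) != len(ground_truth):
        raise ValueError("sequences must align measure for measure")
    matches = sum(g == t for g, t in zip(generated, ground_truth))
    return matches / len(generated)

# Toy example: chords for four measures of an imaginary song.
original = ["Am", "F", "C", "G"]
generated = ["Am", "F", "C", "Em"]
print(chord_accuracy(generated, original))  # 0.75
```

A real system would need a more forgiving comparison (for instance, counting shared chord tones), since different chord labels can harmonize a melody equally well.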
The researchers have since developed a more sophisticated version of the system, which they plan to present in September at the International Conference on Music Information Retrieval.
The ASSA system can be run on any ordinary computer.
"A PC is definitely sufficient for this program," Chuan said.
The researchers plan to continue the work.
Among the possibilities is building an end-to-end interactive prototype that will take user humming input, create the accompaniment, and render it with the appropriate instruments.