Joint melody and chord inpainting model using BERT
MIT License
This project uses a custom BERT model that masks both the melody and the chords in the same piece of music.
The model takes as input beat-quantized chord labels and beat-quantized melodic patterns.
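A minimal sketch of what BERT-style joint masking over these aligned per-beat sequences could look like. The token ids, the `MASK_ID` sentinel, the `-100` ignore label, and the 15% masking rate are illustrative assumptions, not values taken from this project:

```python
import random

MASK_ID = 0  # hypothetical id reserved for the [MASK] token

def mask_tokens(tokens, mask_prob=0.15, rng=random):
    """Randomly replace tokens with MASK_ID; return the masked
    sequence and per-position labels (-100 = ignored by the loss)."""
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            masked.append(MASK_ID)
            labels.append(tok)    # model must predict the original token here
        else:
            masked.append(tok)
            labels.append(-100)   # position excluded from the loss
    return masked, labels

rng = random.Random(0)
chords = [5, 5, 7, 7]      # one chord token per beat (illustrative ids)
melody = [12, 13, 14, 15]  # one melody-pattern token per beat
masked_chords, chord_labels = mask_tokens(chords, rng=rng)
masked_melody, melody_labels = mask_tokens(melody, rng=rng)
```

Masking both streams of the same piece is what lets the model inpaint a missing melody from the surviving chords, or vice versa.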
Melodies are split into beat-sized chunks, each quantized to 16th notes; the distinct chunks are stored in a look-up table.
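A sketch of how such a chunk look-up table might be built, assuming four 16th-note slots per beat and hypothetical `"H"` (hold) and `"R"` (rest) markers; the actual encoding used by the project may differ:

```python
HOLD, REST = "H", "R"  # assumed markers for sustained notes and rests

class ChunkVocab:
    """Maps each distinct beat-sized melodic pattern to an integer id."""
    def __init__(self):
        self.chunk_to_id = {}
        self.id_to_chunk = []

    def encode(self, chunk):
        chunk = tuple(chunk)
        if chunk not in self.chunk_to_id:
            self.chunk_to_id[chunk] = len(self.id_to_chunk)
            self.id_to_chunk.append(chunk)
        return self.chunk_to_id[chunk]

    def decode(self, idx):
        return self.id_to_chunk[idx]

def melody_to_chunks(sixteenth_grid, slots_per_beat=4):
    """Split a 16th-note-quantized melody into beat-sized chunks."""
    return [tuple(sixteenth_grid[i:i + slots_per_beat])
            for i in range(0, len(sixteenth_grid), slots_per_beat)]

vocab = ChunkVocab()
melody = [60, HOLD, 62, HOLD, 64, HOLD, HOLD, HOLD]  # two beats of MIDI pitches
chunk_ids = [vocab.encode(c) for c in melody_to_chunks(melody)]
```

Because every distinct beat pattern gets its own id, the model can treat each beat of melody as a single token, mirroring the one-chord-label-per-beat input.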
The model is trained on the Wikifonia dataset.