Jazz Improvisation - How Are Chords Represented?

Hi,

I just completed this assignment, but I'm wondering how chords are represented in the input and output vectors.

The lab mentions that musical values are represented as pitch and duration, but there was no information about how chords are represented.

I do have some musical background as I play instruments.

Thanks!

Hey @svillasica,
If you check out the assignment, you will find that the line of code below loads the raw music data and pre-processes it into values.

X, Y, n_values, indices_values, chords = load_music_utils('data/original_metheny.mid')

So, if you are interested in knowing how the pre-processing is done, feel free to follow the trail of functions, starting from load_music_utils, which can be found in the data_utils.py file. It will lead you to the get_musical_data function in the preprocess.py file, and so on. All the pre-processing used can be found in these additional helper files. Let me know if this helps.
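
If you just want a quick look at what that call returns before diving into the helper files, here is a rough sketch (it assumes you run it inside the assignment environment where data_utils.py is importable, and that the returned X and Y are NumPy arrays; exact shapes may differ on your end):

# Rough sanity check of what load_music_utils returns
from data_utils import load_music_utils

X, Y, n_values, indices_values, chords = load_music_utils('data/original_metheny.mid')

print('X:', X.shape)          # one-hot encoded input values, one slice per time step
print('Y:', Y.shape)          # the corresponding targets
print('n_values:', n_values)  # size of the vocabulary of musical "values"
print('indices_values:', type(indices_values), len(indices_values))  # index -> value mapping
print('chords:', type(chords), len(chords))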

Cheers,
Elemento

Thanks for the reply! To add, the original MIDI input contains multiple instruments, but the generated output file contains only a single instrument (it sounded like an electric piano). So is it safe to assume that during pre-processing only the piano part is extracted and used in the model?

To answer my own question: I checked the code, and it turns out the parsed MIDI file behaves like an array, and its 5th element contains only the melody part. Basically, the RNN is trained on the melody part only.
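
In case anyone else wants to verify this locally, here is a rough sketch using music21, which is what the helper files appear to use for parsing. Only the file path comes from the assignment; the inspection loop is just my own code, not part of the helpers:

from music21 import converter

# Parse the original MIDI file; music21 returns a Score that can be indexed like an array
score = converter.parse('data/original_metheny.mid')

# Walk through the top-level elements to see which index holds the melody-only part
for i, el in enumerate(score):
    print(i, type(el).__name__, getattr(el, 'partName', None))

Printing the elements this way is how I confirmed where the melody part sits for this particular piece.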

To add to this discussion, I made a local copy of the project and attempted to run it in my own Jupyter notebook. When running the project, I encountered this issue in preprocess.py.

This question has been answered in your other thread.