Crista Falk edited this page Dec 2, 2021 · 2 revisions

Project Overview

MVP:

  1. A scrolling UI that visualizes the voices of a music21 corpus
  2. Trailing lines behind each voice, where Y position on screen corresponds to pitch height
  3. Display of voice-leading information across parts and pitch direction within parts
  4. Animation matched to musical timing, based on note length and piece duration
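The pitch-to-Y mapping in item 2 can be sketched as a small helper. This is a minimal sketch, not the project's code: the function name, window height, and MIDI pitch range are all assumptions for illustration.

```python
# Hypothetical helper: map a note's MIDI pitch to a screen Y coordinate,
# assuming a fixed window height and a clamped MIDI pitch range.

def pitch_to_y(midi_pitch: float, height: float = 600.0,
               lo: float = 36.0, hi: float = 84.0) -> float:
    """Linearly map a MIDI pitch number to a Y position.

    Higher pitches land higher on screen (larger Y, matching Kivy's
    bottom-left origin).
    """
    clamped = max(lo, min(hi, midi_pitch))
    return (clamped - lo) / (hi - lo) * height

print(pitch_to_y(60))  # middle C -> 300.0 (mid-screen)
```

Clamping keeps out-of-range pitches on screen instead of drawing outside the window.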

Tools:

  • music21 for feature extraction and as a corpus source
  • Kivy for interactive visualization

Steps for implementation:

  1. Use music21 to extract, for each note:
    • part
    • offset (the note's timestamp, in quarter notes)
    • pitch height
  2. Send the data to Kivy, which then:
    • visualizes the linear movement of the notes within each part
    • adds hardcoded or randomized features for visual interest
    • provides a slider to change tempo, and therefore playback speed
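Connecting step 1 to the tempo slider in step 2 needs a conversion from score time to wall-clock time. In music21, a note's `offset` and `quarterLength` are expressed in quarter notes, so a sketch of that conversion (the helper name here is hypothetical) could be:

```python
# Hypothetical helper: convert music21-style offsets (in quarter notes)
# to wall-clock seconds at a given tempo, so a tempo slider can rescale
# playback speed without re-extracting the notes.

def offset_to_seconds(offset_ql: float, bpm: float) -> float:
    """One quarter note lasts 60/bpm seconds."""
    return offset_ql * 60.0 / bpm

# A note starting at offset 8.0 (quarter notes):
print(offset_to_seconds(8.0, 120.0))  # 4.0 s at 120 BPM
print(offset_to_seconds(8.0, 60.0))   # 8.0 s when the slider halves the tempo
```

Because the conversion is a pure function of the slider's BPM value, moving the slider only rescales the time axis; the extracted note data stays fixed.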

Project showcase: proof of concept using one song (BWV66.6?)
