Summary
Music is a part of every culture around the world and throughout history. Musicology is the study of that music from a structural and sociological perspective. Traditionally this research has been done in a manual and painstaking manner, but the advent of the computer age has enabled an increase of many orders of magnitude in the scope and scale of analysis that we can perform. The music21 project is a Python library for computer aided musicology that is written and used by MIT professor Michael Scott Cuthbert. In this episode he explains how the project was started, how he is using it personally, professionally, and in his lectures, as well as how you can use it for your own exploration of musical analysis.
Announcements
- Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
- When you’re ready to launch your next app or want to try a project you hear about on the show, you’ll need somewhere to deploy it, so take a look at our friends over at Linode. With 200 Gbit/s private networking, scalable shared block storage, node balancers, and a 40 Gbit/s public network, all controlled by a brand new API you’ve got everything you need to scale up. Go to pythonpodcast.com/linode to get a $20 credit and launch a new server in under a minute. And don’t forget to thank them for their continued support of this show!
- And to keep track of how your team is progressing on building new features and squashing bugs, you need a project management system designed by software engineers, for software engineers. Clubhouse lets you craft a workflow that fits your style, including per-team tasks, cross-project epics, a large suite of pre-built integrations, and a simple API for crafting your own. Podcast.__init__ listeners get 2 months free on any plan by going to pythonpodcast.com/clubhouse today and signing up for a trial.
- Visit the site to subscribe to the show, sign up for the newsletter, and read the show notes. And if you have any questions, comments, or suggestions, I would love to hear them. You can reach me on Twitter at @Podcast__init__ or email hosts@podcastinit.com
- To help other people find the show please leave a review on iTunes, or Google Play Music, tell your friends and co-workers, and share it on social media.
- Join the community in the new Zulip chat workspace at pythonpodcast.com/chat
- Your host as usual is Tobias Macey and today I’m interviewing Michael Cuthbert about music21, a toolkit for computer aided musicology
Interview
- Introductions
- How did you get introduced to Python?
- Can you start by explaining what computational musicology is?
- What is music21 and what motivated you to create it?
- What are some of the use cases that music21 supports, and what are some common requests that you purposefully don’t support?
- How much knowledge of musical notation, structure, and theory is necessary to be able to work with music21?
- Can you talk through a typical workflow for doing analysis of one or more pieces of existing music?
- What are some of the common challenges that users encounter when working with it (either on the side of Python or musicology/musical theory)?
- What about for doing exploration of new musical works?
- As a professor at MIT, what are some of the ways that music21 has been incorporated into your classroom?
- What have they enjoyed most about it?
- How is music21 implemented, and how has its structure evolved since you first started it?
- What have been the most challenging aspects of building and maintaining the music21 project and community?
- What are some of the most interesting, unusual, or unexpected ways that you have seen music21 used?
- What are some analyses that you have performed which yielded unexpected results?
- What do you have planned for the future of music21?
- Beyond computational analysis of musical theory, what are some of the other ways that you are using Python in your academic and professional pursuits?
Keep In Touch
- mscuthbert on GitHub
- @mscuthbert on Twitter
Picks
- Tobias
- Mozart’s Requiem performed by the Berlin Philharmonic and conducted by Claudio Abbado
- Michael
- von Karajan Institute – Herbert von Karajan was a major conductor of the 20th century; his Institute now sponsors research into new projects in music technology and is a big advocate of using Python for its data analysis.
- Ruth Crawford Seeger, String Quartet (1931) performed by The Playground Ensemble
Links
- music21
- Studies in Western Music History: Quantitative and Computational Approaches to Music History on MIT OpenCourseWare
- MIT
- Perl
- National Bureau of Economic Research
- Zen of Python
- Musicology
- Matplotlib
- Orange
- scikit-learn
- Abjad Python Package
- SciPy
- numpy
- Pandas
- PyLevenshtein
- Levenshtein Distance
- PyGame
- AVL Tree
- Subversion (SVN)
- Bach Chorales
- Artusi.xyz Interactive Music Theory
- VexFlow
- MIT Digital Humanities
- NLTK
- Flask
- Fortran
- Django
- Humdrum
The intro and outro music is from Requiem for a Fish The Freak Fandango Orchestra / CC BY-SA
Your host as usual is Tobias Macey, and today I'm interviewing Michael Cuthbert about music21, a toolkit for computer aided musicology. Hello, and welcome to Podcast.__init__, the podcast about Python and the people who make it great. When you're ready to launch your next app or want to try a project you hear about on the show, you'll need somewhere to deploy it. So say hi to our friends over at Linode. With 200 gigabit private networking, scalable shared block storage, node balancers, and a 40 gigabit public network, all controlled by a brand new API, you've got everything you need to scale up. Go to pythonpodcast.com/linode, that's L-I-N-O-D-E, to get a $20 credit and launch a new server in under a minute. And don't forget to thank them for their continued support of this show.
And if you're like me, then you need a simple and easy to use tool to keep track of all of your projects. Some project management platforms are too flexible, leading to confusion of workflows and days' worth of setup, and others are so minimal that they aren't worth the cost of admission. After using Clubhouse for a few days, I was impressed by the intuitive flow. Going from adding the various projects that I work on to defining the high level epics that I need to stay on top of and creating the various tasks that need to happen only took a few minutes. I was also pleased by the presence of subtasks, seamless navigation, and the ability to create issue and bug templates to ensure that you never miss capturing essential details.
Listeners of this show will get a full 2 months for free on any plan when you sign up at pythonpodcast.com/clubhouse. So help support the show and help yourself get organized today. And don't forget to visit the site at pythonpodcast.com to subscribe to the show, sign up for the mailing list, and read the show notes. And don't forget to keep the conversation going at pythonpodcast.com/chat. Registration for PyCon US, the largest annual gathering across the community, is open now. So don't forget to get your ticket, and I'll see you there. Your host as usual is Tobias Macey. And today, I'm interviewing Michael Cuthbert about music21, a toolkit for computer aided musicology. So Michael, could you start by introducing yourself?
[00:02:12] Unknown:
Hi. My name is Michael Cuthbert, or Mike if you'd like. I'm an associate professor of music at MIT, and I'm the faculty director of the digital humanities program sponsored by the Mellon Foundation.
[00:02:25] Unknown:
And do you remember how you first got introduced to Python?
[00:02:27] Unknown:
Absolutely. So I had been working as a Perl hacker for years at the National Bureau of Economic Research, where I was doing a lot of their web programming, and then I came over to MIT. My position also involves looking at medieval music. So I was hired there, and then students figured out that, you know, hey, I was a programming person, and they wanted to do some things combining programming with musical analysis. At MIT, the language of instruction was Python. It was a pretty new switch that had just recently happened, and I kind of thought, well, I've been wanting to learn about this great and wacky new language with semantically significant indentation and all that stuff. But I never had a chance, and I thought, well, I can either spend the next 10, 15 years training every student I work with to speak my language, or I can try it out on my own. And I think it was the first time that I created, I don't know, 5 abstract classes in 10 minutes that I just fell in love and knew that this was the language I wanted to be working with.
[00:03:38] Unknown:
And coming from Perl, I know that there is a lot of denigration of that language because it has fallen out of favor with a lot of people. But at the same time, it has helped to support and evolve computing infrastructure for a long time and was actually one of the languages that helped power the web. So I'm curious what your experience was juxtaposing your Perl programming with what you have been doing with Python in terms of the flow of the language and the capabilities?
[00:04:06] Unknown:
It took me a long time before I was really comfortable using Python, not for big projects, which came about really fast, but using Python for that 5 minute hack I needed before class to demonstrate something. With Perl, you had regular expressions as sort of a built in part of the language, nothing to be imported. With Python, you had a library. And especially when you're working with data that you're just sort of throwing together, I missed at first the idea that I could just jump into a dictionary or jump into a list, and if that element wasn't defined, I could just get an undef back. So there was a little bit of a challenge, but I really do feel that moving to Python helped organize my thinking, that I'm a lot more structured in my programming. And without that change of mindset that came from learning Python, I don't think I could have made something as big as music21.
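As a small aside for readers following along: the Perl behavior he misses, where reading a missing key simply yields undef, maps roughly onto dict.get and collections.defaultdict in Python. This is only an illustrative sketch of that point, not anything from music21 itself.

```python
from collections import defaultdict

# A plain dict raises KeyError on a missing key, so you reach for .get()...
counts = {}
print(counts.get('c#4'))      # prints None instead of raising an exception

# ...or a defaultdict, where missing keys spring into existence with a default.
counts = defaultdict(int)
counts['c#4'] += 1            # no KeyError; the count starts from 0
print(counts['c#4'])          # 1
```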
At the same time that I've adopted so much of the Zen of Python, there's one thing that came from Perl that really has stuck with me. I really do still believe there's more than one way to do it. Or at least, the Python idea that there's one and only one obvious way of doing it might be true, but that one way might be different if you're an amateur programmer or a professional, if you're younger or older, if you're a woman or a man, and depending on what your experiences are. So I have not 100% jumped into the Python culture, but I think about 95%.
[00:05:42] Unknown:
Well, it's always valuable to have people who push back against some of the idioms or ideologies that have become so ingrained into a given language and culture, because it helps us evolve and incorporate new ideas and new ways of programming, and it helps us all grow as developers. I agree with that. Definitely. And so as I mentioned at the beginning, music21 is a tool for exploring computer aided musicology. So before we go into the specifics of the project that you've built, can you give an explanation about what you mean by computational musicology?
[00:06:13] Unknown:
Musicology is the study of music history, music theory, music and culture, and it's really, you know, often been done by hand as part of the humanities, in libraries and archives and at the concert hall. But there are really big questions about how styles change, or how composers were influenced by other composers, or how a piece of music works. The intense study of one piece of music, even for years, or of a small collection of pieces of music, will never give somebody that kind of broad understanding. So computational musicology really builds on traditional musicology in the archives or in libraries or in the concert hall.
But there are a lot of things that can't be done even with the most intense study of a single work or a small collection of works, and that's where a big data approach to analyzing music or understanding music's history really comes into play. So with a computer, we are able to look at larger corpora. Now, they don't tend to be of the big data size of, say, you know, genetics or what a social network is looking at, but they're very big for us in the arts and humanities. And computational musicology involves not only creating the tools that enable us to make these kinds of analyses, but also the understanding of what types of questions
[00:07:46] Unknown:
are best answered with a computer, and what types of questions might, at least for now, be better done in an old fashioned offline way. And to your point about the fact that you're not dealing with data on the petabyte scale that you often hear of in the big data sense, you are also much more likely to be dealing with analysis at a much more granular level, because you are doing these detailed comparisons, in the musicology case down to even individual note structures or chord progressions, rather than doing
[00:08:21] Unknown:
aggregate analysis across thousands or millions of records. That's correct. We're still trying to understand the implications of what this new methodology enables. And I think it's most helpful to have some people who are trying to scale up, and maybe to scale up a little bit more slowly than we might be able to, so that we can sit back and think, well, do these analyses on 100 works look about right? Before we end up injecting systematic errors into studies of thousands or tens of thousands of works. From a practical perspective, we have a problem that there simply aren't petabytes worth of music encoded in high quality formats as symbolic scores. So we don't really have to solve that scale problem until our data creation problem is solved.
[00:09:17] Unknown:
And so can you give a bit more background about what the music21 project is and your motivation for creating it and some of the history behind it?
[00:09:27] Unknown:
Sure. So music21 is a Python library, and that's one of the things that's very surprising to a lot of humanists: when you download music21, as many of your listeners will know, there's nothing to run. There's no graphical user interface. There's no program that asks you what you want to do. Instead, it's just another library, but one that happens to encode both low level representations of music, such as here is a note object, and a note has a particular pitch, it has a duration, it might also have an instrument associated with it, and it has a location in a score, and also higher level analysis tools. So for instance, there's a well known key detection algorithm that tells you what key a particular piece might be in, named after Carol Krumhansl. And this algorithm is built into music21 so that you can run it over any score that's in music21.
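To make that concrete, here is a minimal sketch of those two levels, assuming music21 is installed and using a chorale from its bundled corpus purely as an example:

```python
from music21 import corpus

# Parse a score from the corpus that ships with music21 (any score works).
score = corpus.parse('bach/bwv66.6')

# Low level: a note object knows its pitch, duration, and place in the score.
first_note = score.recurse().notes[0]
print(first_note.pitch, first_note.duration.quarterLength, first_note.measureNumber)

# High level: run the built-in key detection analysis over the whole score.
detected_key = score.analyze('key')
print(detected_key)
```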
We talk about musical scores, not songs or tracks, in music21 because it's a symbolic music analysis toolkit. That is to say, it works on scores or things like scores, like MIDI files, things where the individual notes have already been made discrete. There are a lot of people who are working on audio analysis, and that's a really great thing to be doing, and it's also a part of musicology and music theory. But because many people in computer science were already working on audio analysis, and, depending on what you're analyzing, if it's a popular song there's real money to be made in that, commercial sound analysis packages had also been going forward. But there really was not much done with symbolic music analysis, and especially nothing that had been done with an object oriented framework, until music21 came along. I switched over to Python because the students were all interested in combining programming with their music and music theory classes, and it was kind of a fun thing I was doing on the side. I had always thought that maybe an object oriented successor to Humdrum, which was a shell script based computational musicology toolkit from the eighties and nineties, would be kind of fun, but I never really thought it would be possible to have it represent so much music.
But then sometimes miracles happen. There's a very generous institute called the Seaver Institute that mostly funds science and engineering projects, and quite often they'll do a call for proposals at MIT. And I became the great beneficiary of a mistake: the call for proposals was accidentally sent to all faculty at MIT, not just the science and engineering faculty. I didn't know this was a mistake, so I put in my proposal. And, thankfully, the head of the institute said, well, as long as people have already taken all the trouble to apply, we might as well consider them also. And that's how music21 was able to get the kind of funding that generally only goes to the sciences. And I'm incredibly grateful that by sort of empowering computational humanities, computational musicology, this institute has really been able to transform a whole field of musicology.
[00:12:56] Unknown:
And as you mentioned earlier, there is the problem of having access to enough corpora of data that is encoded in such a way that you can perform these analyses on it, which, when reading through the documentation, looks to be largely in MusicXML or some other somewhat esoteric formats for people who are used to general programming and dealing with things like JSON or CSV. So what are some of the use cases that music21 supports in terms of being able to do these analyses, and what are some of the common requests that you receive that you are purposefully not incorporating into this toolkit? So music21 does
[00:13:43] Unknown:
include a very rudimentary audio to symbolic music converter. There are much better ones out there now, but we wanted people who did want to work with their audio files to be able to convert them over into the symbolic format. But once they're in there, there's a lot of things that can be asked: well, what chords tend to move to other chords? Or this motion that's really spectacular in this one piece that I know, is that just because I know the performance of it, or is it really unusual? So we can look at statistical comparisons across pieces of music. If we have the pieces, say, geotagged in a certain way, we can understand how the location where a piece was originally composed or originally published can influence what types of musical gestures happen.
One of the articles that I just published with one of my students, a Wellesley student named Sophia Sun, was looking at, I think, a little bit under 2,000 lead sheets. These are pieces of music, mostly popular music and folk music, where we had the melodies and we had the chords and we had the texts, and we were able to ask, well, do certain emotions carry certain musical gestures more often? For instance, are words that are associated with sadness more often set to minor chords or minor pieces? That's something I'd always been taught. It turned out not to be true. And it's one of these things where you can find any one piece or any one example that really can help to reinforce or to destroy a stereotype. But, really, when you look at 2,000 pieces, you get very statistically significant results about what types of things go together. In that particular paper, I noticed that words talking about disgust or emotions of horror tend to be set to much shorter note values, and they do tend to be more dissonant, that is, less harmonically pleasing, we could say, than words talking about happiness. And what was really great about doing this project is that music21 could handle all the note tools, but other open source programmers had made the sentiment analysis and the emotion detection on words. So really mashing up various things that were already out there gave us great results.
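As a rough illustration of the kind of data that work pulls out of a lead sheet (not the actual study code; 'leadsheet.xml' is a hypothetical file, and the sentiment and emotion step lives in whatever external library you pair it with):

```python
from music21 import converter, harmony, text

song = converter.parse('leadsheet.xml')   # hypothetical lead sheet file

# The lyrics, as a single string, are what you would hand to an external
# sentiment or emotion library; that pairing is outside music21 itself.
lyrics = text.assembleLyrics(song)

# Tally the qualities (major, minor, ...) of the chord symbols on the sheet.
qualities = {}
for cs in song.recurse().getElementsByClass(harmony.ChordSymbol):
    qualities[cs.quality] = qualities.get(cs.quality, 0) + 1

print(lyrics[:80], qualities)
```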
[00:16:12] Unknown:
When working with music21, how much knowledge of musical notation and musical structure is necessary for somebody who just wants to do some exploration of musical works that they enjoy or that they're familiar with, and to play around with it and see what they can figure out? The amount of music theory or music notation knowledge is
[00:16:36] Unknown:
really exactly the amount needed to answer the particular question that you're using music21 for. So if you're going to be using music21 to, say, randomly generate a few scales so you can practice singing, then you don't need to know very much except what the components of a scale are. If you're gonna be using music21 to analyze a very large repertory of works, but what you wanna analyze is what's the lowest note in each work, then the limitation really comes from your programming or Python knowledge and much less from the music notation and music theory side. On the other hand, there are certain problems, like phrase detection, where does one phrase of music end and where does another one begin, where it can be extremely helpful to have studied some music theory and to understand what the markers underlying phrases might be. And so it all depends on what you want to do with it.
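A hedged sketch of that "lowest note in each work" kind of question, where the Monteverdi madrigals are just an example repertory and corpus.getComposer is one way of listing works bundled with music21:

```python
from music21 import corpus

# Report the lowest sounding pitch in each work of a small repertory.
for path in corpus.getComposer('monteverdi'):
    work = corpus.parse(path)
    pitches = [p for n in work.recurse().notes for p in n.pitches]
    if not pitches:
        continue
    lowest = min(pitches, key=lambda p: p.midi)
    print(path, lowest.nameWithOctave)
```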
[00:17:37] Unknown:
And for somebody who is using music21 and is either familiar with music but not necessarily with Python, or familiar with Python but not necessarily with musical theory and structure, can you talk through a typical workflow for performing an analysis of one or more pieces of music, and some of the common stumbling blocks that you hear about from people coming from either side of that equation?
[00:18:03] Unknown:
Absolutely. So the first thing that you usually do with music21 is load in an existing score. We have the converter package, so converter.parse, and you give it a file name or give it a, you know, file like object or something like that. And, usually, music21 can detect which of the formats we support the file is in. Then it's loaded up into an object structure we call a stream, usually a subclass of stream like a score or a part or a measure. And these are containers that tend to hold either notes and rests and chords and other musical objects, or hold other streams. So a score might hold multiple parts, and a part might hold multiple measures. Then usually the next thing that I would recommend somebody do is figure out how your analytical method is gonna work on the smallest piece of musical data that is relevant. So if you're looking at, well, how often is a particular rhythmic pattern used, then you're gonna be figuring out, well, how do I iterate over a window of note objects or a window of duration objects? And once that's all done, and the analysis is working pretty well, you can kind of imagine, well, what should the output look like? Should it be another score? Should it be a CSV file? Should it be a plot?
A big shout out to Matplotlib and our friends who help power our graph and plot libraries in music21. And once you've done all that, then usually, you know, at that point the frustration level of any kind of programming project, you've debugged, everything is finally working really well, and then comes the great moment of any kind of computational research. You take everything that you've done so far and you wrap it in a for loop, or you wrap everything inside your stream in a for loop. And then you're looking at, well, let's iterate over each part, because every part matters. And now let's iterate our converter over a directory, and we can aggregate all our results in. And that's where the real power of music21 comes in.
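Here is a minimal sketch of that "wrap it in a for loop" moment. The scores/ directory is hypothetical, and the per-note question is deliberately simple, a histogram of durations, standing in for whatever analysis you worked out on a single piece:

```python
import pathlib
from collections import Counter

import matplotlib.pyplot as plt
from music21 import converter

durations = Counter()
for path in pathlib.Path('scores').glob('*.xml'):    # hypothetical directory
    score = converter.parse(path)
    for part in score.parts:                          # every part matters
        for n in part.recurse().notes:
            durations[n.duration.quarterLength] += 1

# Aggregated results from the whole directory end up in one plot.
plt.bar(range(len(durations)), list(durations.values()),
        tick_label=[str(k) for k in durations])
plt.xlabel('duration (in quarter notes)')
plt.ylabel('count')
plt.show()
```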
For a short piece of music, if you're musically literate, you can usually do the analysis faster by hand. But it's once all those for loops, once those iterators come out, that we can really start looking at trends over time, or trends over a composer's work, or in a particular nation or style and not in another. And that's really, the first time when I'm giving talks on music21 to people who have just a little bit of programming background, that's usually the first time where their eyebrows go up and their smile lines appear, when they see that, oh, all that time that's spent in that first part will be something that I'll be able to reuse very quickly, almost instantly, often over every piece of music I encounter for
[00:21:10] Unknown:
the rest of my programming life. And the project that you mentioned earlier, where you were using music21 to do the analysis of the musical notation objects and then using other tools for doing the semantic analysis of the lyrical structures, and then combining that with what you were evaluating about the note structures, led me to wonder: what are some of the other typical pairings that you see in terms of tools or libraries that are used in conjunction with music21 for being able to perform more detailed or involved analyses of more complex musical pieces, or being able to incorporate things like machine learning or any sort of other more advanced analytical capabilities that are accessible to the Python ecosystem?
[00:21:56] Unknown:
Well, music21 is a standalone library, it doesn't require anything else to go forward with it, but there are so many great other toolkits that can be sort of merged together with it. So I use Orange a lot for my machine learning and trying to figure out how I can cluster my results afterwards. I know a lot of my users use scikit-learn, and I'm really excited to try using that. We use SciPy and NumPy quite a bit in all the work that we do with audio and with optical character recognition or optical music recognition tools in music21. A lot of people use music21 for composing, which I hope we can get back to in a bit. And for doing that, a lot of times people want, right at the end, to be able to have a beautiful musical score. So there's a software package called Abjad that interfaces very well with music21 for generating musical scores. Or people have used music21 in conjunction with Pygame to make algorithmically generated soundtracks for video games that change every time, or change in response to what somebody is doing in the game, and I love that kind of work. I feel like I love the n squared type of problem solving: there are n modules out there on PyPI, and you have n squared possibilities if you take 2 of them, n cubed if you take 3 of them together, and you just try to see what they can do. Oh, I also wanna give a shout out to the PyLevenshtein developers, who created a great similarity measure that works pretty fast, and so we use that quite a bit in trying to figure out how similar various musical scores are to each other. And as long as we can serialize a score in some way, as a textual document or as a sequence, then we're able to use a lot of toolkits that are already out there.
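As one concrete version of that serialize-and-compare idea (a sketch only: the two chorales are arbitrary examples, the melody is flattened into a string of pitch names, and Levenshtein here is the python-Levenshtein package mentioned above):

```python
import Levenshtein                     # from the python-Levenshtein package
from music21 import corpus


def melody_string(part):
    # Serialize a part as a space-separated sequence of pitch names.
    return ' '.join(n.pitch.nameWithOctave
                    for n in part.recurse().getElementsByClass('Note'))


a = melody_string(corpus.parse('bach/bwv66.6').parts[0])
b = melody_string(corpus.parse('bach/bwv269').parts[0])

# ratio() gives a 0.0 to 1.0 similarity; distance() gives raw edit distance.
print(Levenshtein.ratio(a, b))
```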
[00:23:50] Unknown:
And in terms of your use of music21 in the classroom, what are some of the types of assignments that you are giving to your students that leverage music21? And what are some of the common reactions, difficulties, or general enthusiasm that you hear back from your students as part of working through those assignments? I think at MIT, there's really
[00:24:21] Unknown:
two views of the humanities and the arts requirements, which are pretty substantial at MIT. And I think both of these views are valid. One is, hey, this is MIT, let's take what's happening in technology and in our science and engineering classes and combine it with what we're doing in our arts and humanities classes, and let's really try to do something that's never been done before. And then students sometimes also think, you know, I've been doing programming and problem sets for the last 60 hours; when I take my medieval and renaissance music class, I'd really like to not have that happening.
So I don't make music21 usage a requirement in any class except in a particular class on computational musicology. But students have tended to use music21 in their various assignments. So I've had students who are trying to classify the evolution of rhythm from 1300 to 1600 in that medieval and renaissance music class. Or one student was really fascinated with, well, how is it that sometimes when you're listening to a piece of classical music which has multiple movements, the opening, the slow middle, the dance movement, the grand finale, that sometimes you can tell even in 2 or 3 seconds, oh, we're probably in the finale, without any context. And so somebody used music21 to try to figure out what's the minimum amount of information that the computer was going to need in order to be able to get a pretty accurate guess about where we are in a piece of music. I found that really fascinating. So, basically, when my students find out that outside of my music history classes I'm also doing this particular programming toolkit, a lot of them do become interested and find ways to integrate programming with music21 into a little bit of their analysis.
[00:26:28] Unknown:
And can you dig a bit deeper now into how music21 is implemented, and how the overall structure and architecture of the library has evolved since you first began working on it? So music21 is
[00:26:42] Unknown:
an ever growing collection of individual modules that are all imported, if you like wildcard imports, with from music21 import star, or that can be imported individually. They're all designed so that you can basically read just the music21.base documentation, or the user's guide for base (B-A-S-E, not B-A-S-S), and for stream and the musical note. Between those three, sort of, modules and the classes in there, inside note is also rest, and inside stream are, as I said, the score and the part, you can generally get started going there.
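Those three modules in miniature, as a small sketch: a note and a rest placed inside nested streams.

```python
from music21 import note, stream

n = note.Note('C#4', quarterLength=1.5)   # a pitch plus a duration
r = note.Rest(quarterLength=0.5)          # rest also lives in the note module

m = stream.Measure()
m.append([n, r])

part = stream.Part()
part.append(m)

part.show('text')   # prints the Part -> Measure -> Note/Rest hierarchy
```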
The latest versions of music21, beginning about a year, year and a half ago, are all Python 3 only. And I, you know, apologize to those who are still loving Python 2.7, but with a very small development team, as university projects tend to have, it felt like it was time to take advantage of some of the new features of Python 3 and to be able to really work with Unicode and work with binary versus string data separately. The music21 stream has been one of the most successful parts of the music21 toolkit, but it can also be one of the most frustrating, because it's essentially a list of lists, but it's also something that likes to keep track of where it's come from. So a lot of music21 operations will return a new score. If you have a score and you transpose it, that is, you raise it up by a certain key or a certain interval, you'll end up with a new score. But quite often you want to use that new score. So let's say we take every single piece, no matter what it is, and we transpose it into the key of C so that we can look at them all together. And then we want to make some annotations on the score. We want to label every time a particularly interesting progression happens, or a melodic gesture, or every time a composer breaks one of the rules we teach in music theory that we mark people down for, and say, here's where Bach did it. But you want those annotations to be able to propagate back to your original score. Sometimes, of course, you don't. So looking at how streams derive from other streams, and knowing that in the course of an analysis you might be creating a hundred new scores along the way, and how they get lost and garbage collected and how some of them are preserved, is one of the things that makes music21 very powerful, but it can also make some of the deepest analytical processes a little bit difficult.
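A sketch of that transpose-everything-to-C step, and of the derivation chain that lets a new score remember the score it came from (the chorale is just an example):

```python
from music21 import corpus, interval, pitch

original = corpus.parse('bach/bwv66.6')
detected = original.analyze('key')

# Transpose from the detected tonic to C; transpose() returns a new score
# rather than modifying the original in place.
to_c = interval.Interval(detected.tonic, pitch.Pitch('C'))
in_c = original.transpose(to_c)

# The new stream keeps track of where it came from.
print(in_c.derivation.origin is original)   # True
```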
Based on the implementation of music21, if I had one request that could go to the powers that be at Python, I would love to have a C based binary AVL tree implemented for Python, because I'm trying not to have any C based dependencies, so that we don't have to worry about people needing to compile things. But that's probably one of the slowest parts of music21: having to decide between keeping everything as a flat list versus having a kind of self balancing binary tree that has to be represented in pure Python. So, trade offs along the way. As far as how the structure of music21 has evolved since I first started, we really went back and forth between trying to see, did people want different types of musical data to be stored separately or together? Do we want all the notes in a particular track, we could say? And do we want all the key signatures and all the meters in a different track, and all the clefs and all the lyrics in another track? In the earliest versions of music21, and I'm sure you can still find them somewhere in SVN on Google Code (by the way, we're on GitHub right now), we tried out a lot of different things, and we really had to see what was most useful to the community and what wasn't. And so we're still evolving.
We try to keep it so that there's nothing backwards incompatible except on a major version change. And then, some of the things that we've done: a lot of things that used to generate new streams, new copies, now return generators, and being able to iterate over generators has had a big speed up and also lowered the complexity of the overall library, once you understand that a generator doesn't have every property that a list
[00:31:18] Unknown:
has. And in terms of building and implementing music21 and growing the community around it, what have you found to be some of the most challenging aspects, and also any particularly
[00:31:31] Unknown:
interesting or useful lessons learned along the way? The community has been great for music21. I can't believe how robust, how supportive, how encouraging the Python community and the music communities have been for music21. Probably the biggest challenge was that I really wanted music21 to be something that musicologists would be willing to learn programming in order to use. And really, in the early stages, we thought about, well, how can we make this as simple as possible? So there were some parts of music21 that, you know, probably violated the explicit is better than implicit policy, because we really tried to be like, well, if somebody is learning to program and they put in the string "3" instead of the number 3, can we still support that?
Over time, I've really found that people from the music communities are either willing to jump straight into the deep end and say, I'm gonna learn a lot of programming, and I'm gonna learn how to use Python properly and how to use all these other libraries, in which case some of these affordances did not need to be built into music21. Or, for a lot of people, no matter how easy we tried to make music21, it's just still too intimidating. So I think that's something that the whole programming community, not just music21, not just Python, needs to continue to work on: how can we bring in people with different backgrounds, people who have different life experiences, people who really want to program but haven't had the joy and haven't had the advantage of learning it, how can we bring them in earlier? On the other hand, the amateur music community within the programming community has been astounding in how quickly they have picked up music21. I really would not have known that there were so many people who had been doing a life of programming, but whose music lessons as a teenager, or whose class on music history and music theory in college, was so influential on them that they wanted to come back and combine the two skills. So that's been a place where I've been really happy and just overjoyed with how the community has responded.
[00:33:51] Unknown:
And in terms of any projects that you have seen built with music21, both by people coming from the programming side who have that affinity for music, and by people coming from the side of musical theory and doing deep study of musical structures and then coming into programming as a result of music21, what are some of the most interesting or unusual or unexpected ways that you have seen them leverage this toolkit that you've built? One of the things I saw music21 being used for is algorithmic composition. And I kind of thought that this was gonna be happening, but I saw a usage
[00:34:28] Unknown:
by an MIT student, Prachek Igbushalam, who was using music21 to create algorithmically generated real time scores for video games, so that, you know, as your character is not doing so well, well, let's change the mode, major to minor maybe. Or, you know, if you've been hearing the same repetitive soundtrack for some time, well, let's just spice it up. Let's change the instrumentation. Let's change the rhythmic overlay on this passage. And so that was something that I'd been kind of hoping would happen, but it was still surprising to me.
I've seen people using music21 as part of synchronizing musical theater productions and for detecting voice ranges. I've seen it used for flashcard apps for people trying to remember musical instruments and their sounds. So these are some of the things that I never really imagined it would be used for. But we did think at the very beginning, my collaborators Christopher Ariza, Josiah Oberholtzer, and Ben Ho and I, we did think about, well, how can we make this generalizable enough that anybody will want to use it?
[00:35:43] Unknown:
And in terms of your own projects that you've used it for, where you were performing a particular set of analyses or trying to answer a particular question, are there any cases where the results that came out of it were unexpected or particularly
[00:35:58] Unknown:
interesting? So I came to computational musicology really from one question that I had as somebody who works on medieval music manuscripts. And one of the things that's almost always the case with something that's 700 years old is that most of what survives is really fragmentary. You might just get a little slip of, you know, hey, here's half of one staff of a piece of music, or here's a quarter of a page, but it's only the last quarter of every measure of every line of music. So one of the things that I really came to this to be able to find out is: hey, if we entered in every piece of music that we know has survived from the 14th century in Europe, and fortunately it's only about 3,000 pieces of music, and then we put in all these little fragments, can we find them? Can we figure out where they came from? And it turns out that the vast majority of these little fragments are pieces that we already knew. And so it seems like we've already found all the greatest hits of the 14th century, and that was a really important use of music21 for my own work in music history.
There was one time I just wanted to make a quick demo of music21 for an audience, and we wanted to search through all the pieces by Bach, all the chorales, these things that he ended his weekly services with. And we'll just see: well, you know, Bach is somebody who, you know, he's kind of a big sad guy, but he really tends to like to end upliftingly. He really likes to end with a major chord even if he's been in minor. So let's find every time he doesn't do that. And so that was kind of a great project with Chris Ariza that we had done. And I'd been using this demonstration of sort of simple analysis for a couple of years, until finally, one day, I was looking at the results and I said, wait, what are all these cases?
We've got the data, but I haven't done what I was trained to do as a music historian, to interpret it. And it turned out every one of these pieces was a piece that somebody had doubted, for other reasons, that Bach was the composer of. Now, I'm not a Bach expert. I haven't gone the next step and tried to say that these are definitely not by Bach, but at least music21 pointed me in that direction.
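A sketch of that chorale query (not the original demo code): walk the chorales in the corpus, keep the minor-mode ones, and flag any that do not end on a major chord. The corpus.chorales.Iterator call is how recent music21 versions expose the chorales; older versions may differ.

```python
from music21 import corpus

for chorale in corpus.chorales.Iterator():
    detected = chorale.analyze('key')
    if detected.mode != 'minor':
        continue
    # chordify() collapses the parts into a single chordal reduction.
    chords = list(chorale.chordify().recurse().getElementsByClass('Chord'))
    if chords and chords[-1].quality != 'major':
        print(chorale.metadata.title, chords[-1].pitchedCommonName)
```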
[00:38:35] Unknown:
That's funny. You've both potentially debunked some misattributed pieces of music, and you've also constructed the Billboard top 10 of the 14th century. Thanks. I like that way of putting it. And so, looking forward in terms of music21 and your usage of it, do you have anything in particular planned as far as any features or projects that you want to use it for? At this point, the core of music21 feels
[00:39:04] Unknown:
very stable. There are some things that we still need to be moving forward on in various analysis types. I want to absolutely have full representation of the evolving MusicXML standard, and to be able to really represent every single thing that can be encoded in that standard in Python, in a way that makes sense, so that you can retrieve it and make musical knowledge from it. But, really, I've been turning my own work toward trying to make more applications that build on top of music21. So one of the things that I've been working on in my spare time outside of MIT is a startup called artusi.xyz, which uses music21's sort of knowledge of music theory and grammar to be able to create assignments and automatically graded workbooks, and not just graded, but also with feedback, to help people learn music theory. And that's been a great collaboration that wouldn't have been possible without having an open source project like music21 behind it, and connecting with other open source projects within and outside the Python infrastructure, such as VexFlow, a music notation rendering engine. So all these things being put together has been really exciting for me, to be able to use my own toolkit to do things that I didn't think I'd be able to do
[00:40:36] Unknown:
10 years ago. And outside of computational analysis,
[00:40:40] Unknown:
are there any other areas where you have been leveraging Python in your academic and professional pursuits that you'd like to talk about? One of the reasons why I'm gonna have to be going a little bit more slowly in the development of music21 is that I've been appointed the faculty director of MIT's digital humanities program. And one of the things that we're doing there is we have a large startup lab, mostly of undergraduates and postdocs, and hopefully graduate students soon. And we're trying to solve problems, not just in music, but across all of the humanities, that Python and programming and algorithmic thinking can help with, that we didn't really know about before. So our first project, which we've just put up, is at digitalhumanities.mit.edu.
Our first project that we've just finished is on understanding how gender is represented across about 4,200 or so novels from the 19th and late 18th centuries: how did men and women write differently about gender, or how do certain characters, depending on their gender, have different adjectives associated with them? One of the things that is not too surprising is that female authors have tended to write more about women than male authors have. But one thing that came out of this work that was surprising was that both female and male authors tended to put women and men in the subject and the object position in about the same ratio. So, unfortunately, a man appears in the subject position, he as opposed to him, about twice as often as a woman. So this group, we've been working along in Python. We've been using the Natural Language Toolkit, another great project, and other things that are out there. And, in that case, we deployed everything using Flask.
Now we're working on a project to understand how the history of computing took place at MIT. So, you know, we're building emulators in Python of things like Fortran on 1950s computers, so that people can understand what kinds of constraints existed then that we don't have today. And this has been a great and interesting project in Django, and a number of other projects have been absolutely instrumental in doing this. Also, I guess after I got tenure they gave me a little bit of time off, and so I kind of indulged my other real love and tried to work on an object oriented representation of baseball statistics, called dissecting, that I never really finished, but it was a great way to clear my mind from music for a little bit. And are there any other aspects
[00:43:42] Unknown:
of the music21 project or computational musicology
[00:43:46] Unknown:
or your use of Python in academic contexts that you think we should discuss further before we close out the show? One of the great things about using Python in the digital humanities program, and in music21, and in classes, is that I'm really able to work with undergraduates, quite often freshmen even, and have them work on problems and get answers to things that nobody in the world knows. So types of research that might otherwise only be done as a graduate student, or maybe a little bit in a senior thesis, we can really bring forward. And one of the great things about Python and the Python community is the huge range of tests and linters and things that the students don't necessarily need in intro computer science classes, because there the answer will be given to them somewhere, but that let us really impress upon student programmers how important it is that they know that their code can be trusted, that their code can be reused, and that the things that they're contributing will last.
So some of the things that the students wrote were automatic braille music converters and counterpoint generators. And because their code was well tested and, you know, well documented, the things that they wrote for maybe a January project are still valuable and still being used 10, 12 years later now. I think that's something really powerful that we can give to all students by using Python.
[00:45:21] Unknown:
And for anybody who wants to get in touch with you or follow along with the work that you're doing, I'll have you add your preferred contact information to the show notes. And so, with that, I'll move us into the picks. And this week, I'm going to pick something that I may have picked before, but it's worth reiterating, and that's Mozart's Requiem, particularly the recording performed by the Berlin Philharmonic and conducted by Claudio Abbado. It's one of my favorite classical pieces. It's just very powerful in terms of its presence and the movements throughout it, and it has a very interesting backstory behind it. So it's definitely one worth listening to when you have a little bit of time to just relax and enjoy the music. And so, with that, I'll pass it to you, Mike. Do you have any picks this week? Well, Tobias, since you started us with a piece of music, I'll also
[00:46:10] Unknown:
put one out there. I'm gonna pick something a little bit later: the Ruth Crawford Seeger string quartet from 1931, which was just this mind blowing piece, one of the first pieces whose last movement was really something that could be kinda generated algorithmically. And it's one of the things that I've used music21 to analyze and dig deep into this sort of miasma and cloud of all these notes that are going on at the same time, which I find so beautiful. There's one wrong note in there, and music21 helped me find it.
[00:46:49] Unknown:
Alright. Well, I want to thank you very much for taking the time today to discuss your work with music21 and using it to explore musicology from a computational perspective. It's definitely a very interesting area of research, and one that I think I might have to start digging into to play around with and do some exploratory analysis with, and hopefully the listeners will do that as well. So thank you for that, and I hope you enjoy the rest of your day.
[00:47:17] Unknown:
It's been great talking to you, and thanks to all the listeners for listening.
Introduction and Guest Introduction
Michael Cuthbert's Background and Introduction to Python
Understanding Computational Musicology
Overview of Music21 Project
Typical Workflow and Challenges in Music21
Implementation and Evolution of Music21
Community and Challenges in Growing Music21
Interesting Projects and Use Cases of Music21
Future Plans and Applications of Music21
Python in Digital Humanities and Other Projects
Closing Remarks and Picks