Okay, the title doesn't really do this justice. I'm processing what I'm trying to convey as I'm typing, so forgive some confusion here.
First there was music notation: people shared sheets of paper with written notation describing how to play music, much as we use writing to convey speech. This of course assumes that one CAN play an instrument. When recorded music came along, many people dropped the practice of learning an instrument in favor of just getting recordings, thus centralizing the art of actually producing music.

But what if we created an app focused on music and sound synthesis? Yes, there are tons of those around (I particularly like the ChucK programming language created by Ge Wang at Stanford, which largely facilitates exactly this), but what if we covered all sound and integrated SAFE to transmit notations? Think MIDI files: the bulk of the program is stored client side, and the file is mostly just the notation for how to play the song.

So? Let's build a BETTER MIDI player. One that utilizes the immense data storage capacity of SAFE to store things like instrument generation and catalogues, interfaces, preconfigured settings to create various sounds, and most importantly an ability to easily share and create such things with one another. As I've looked into music synthesis, I've found one of the most annoying drawbacks can be a lack of import/export features. You can import but not export, or vice versa. (The same problem occurs in the visual arts.)

So yeah: use ChucK and other such languages to create instruments, make instruments and configurations easy to share and modify, store them on SAFE, and then use SAFE again to share notations the same way we do mp3s. This would have the added benefit of perhaps making said notations easier to convert into standard music notation for live artists to play.
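To make the "file is mostly just notation" idea concrete, here is a minimal sketch in Python of what such a song file might look like. Everything here is hypothetical: the field names, the `safe://` address style, and the event layout are all assumptions for illustration, not a real spec. The point is that the song itself is tiny, because the heavy instrument definitions live elsewhere (on SAFE, cached client side) and are only referenced.

```python
import json

# Hypothetical notation format: the song carries only timed events plus
# references to shared instrument definitions. The "safe://" address is
# an invented placeholder for wherever the instrument would live on SAFE.
song = {
    "title": "Example",
    "instruments": {
        "lead": "safe://instruments/simple-sine-v1",  # illustrative address
    },
    "events": [
        # t = time in beats, note = MIDI-style note number, dur = beats held
        {"t": 0.0, "inst": "lead", "note": 60, "dur": 1.0},
        {"t": 1.0, "inst": "lead", "note": 64, "dur": 1.0},
        {"t": 2.0, "inst": "lead", "note": 67, "dur": 2.0},
    ],
}

# The whole song serializes to a few hundred bytes of notation:
encoded = json.dumps(song)
print(len(encoded), "bytes")

# A client-side player would resolve each instrument reference once,
# then synthesize the events locally -- the heavy data never travels.
def notes_for(song, inst):
    """Return the note numbers played by one instrument, in event order."""
    return [e["note"] for e in song["events"] if e["inst"] == inst]

print(notes_for(song, "lead"))
```

A synthesis engine (ChucK, or anything else) would map each instrument reference to a locally stored patch and render the events, which is essentially what a MIDI file plus a soundfont already does, just without the sharing layer.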