"Music became visual as well as aural. It could occupy space as well as time."
These two sentences, from the second paragraph of the first volume of Richard Taruskin's multi-volume Oxford History of Western Music (I now own the whole set thanks to Michael's champion gift giving), have been haunting me for the past few days.
Before musical notation, the only way to convey a musical idea was in real time and within earshot of the person receiving it. Obviously, the only way to recall a musical idea was to repeat it and stow it in memory. Writing words and drawing symbols, as far as we know, pre-dated musical notation, but there was a point before writing when the sense of hearing did not work in concert with the sense of seeing in the way we now understand it. And then there was a point when they did.
This boggles my mind.
At first music only occupied time. Then it was written down, and it occupied large amounts of space because manuscripts were large, and they were filled with pictures as well as words and music. They were also made of materials that didn't deteriorate with time (under the normal circumstances of life before electricity). They filled libraries, which took up space. They stayed in one place, and people came to them.
It was useful. It was playful. People learned it by ear and by eye. They made copies by hand.
When music became mass produced through printing, it took up far more physical space. There was far more of it, and the amount of physical music kept increasing for a good 2,000 years before we were able to store manuscripts electronically, where they take up almost no physical space at all.
People used to have to experience musical performances by being within earshot of the person playing. Once we had recordings, music was everywhere. Anyone could (and can) hear music at any time, not just people who go out of their way to hear performances or people who are musically literate themselves. Now, with our computer technology and our various devices, anyone can hear music anywhere. The space an MP3 file occupies is not real space. The proportion (if you take the whole human race into account) of people who experience music only aurally is far greater than the proportion of people who can also experience it visually.
Most literate people, musical or otherwise, experience "eyes that hear" (as in the words that sound in your head when you read). People who read music also experience the conflation of the ears and the eyes in the other direction. You could say that we have "ears that see." All my students understand this. People who listen to music without being able to read it can experience other kinds of visual stimulation, but they won't develop eyes that hear and ears that see music unless they learn to read music and play an instrument or sing.
Musicians also conflate their sense of touch with their senses of hearing and seeing, whether the instrument is outside the body or inside it. It is even possible to say that the experience of playing music involves periods of synesthesia. Some people have visual and aural synesthesia constantly, and some have it with great intensity, like Amy Beach, Alexander Scriabin, Vladimir Nabokov, and others. Some people have it only mildly, when engaged in activities that involve multiple senses, musical or otherwise. Many other arts, like acting, dancing, painting, drawing, sculpting, and even writing, involve some kind of synesthesia.
But imagine, if you will, if the senses of smell, taste, and touch could be transmitted to another person without the receiving person being physically present at the time. That would be akin to the "quantum leap" that happened when people became literate.
(Those senses are active when we dream.)
(I imagine people already hold patents for these technologies, even though they don't exist.)