Wednesday, January 25, 2012

Computer Aided Composition & Hallucinations

The term "algorithmic composition" is loaded with baggage. The phrase pops up in our conversations fairly often, but because of those connotations Alexis Kirke prefers "Computer Aided Composition" to describe the process he uses to compose music.

Although perfectly capable of coding, and having done so in the past, Alexis now prefers to work collaboratively with a professional programmer so that he can focus on the overall composition. "Programming is a creative act in itself," he explains, and it is all too easy to get sucked into it. Listening to Alexis describe how he is developing Insight, I couldn't help but think of a Vulcan Mind Meld. Alexis wants to "share the unsharable": his consciousness, his internal feelings, his Palinopsia hallucinations. (See the previous post for more on Alexis and how Palinopsia inspired this particular body of work.)

According to the current plan, if you are in the audience at the February 10th performance, you will see Alexis standing on stage behind a music stand, holding an iPad. Perhaps he will be holding a pen. A Macintosh laptop is within arm's reach. The programmer sits in the front row of seats, and there is a flute player somewhere on stage with his own computer monitor. Alexis sees you; he also sees lighting designed to trigger his hallucinations.

It is difficult to find words to describe what this application will probably look like, but I'm going to give it a whirl. The iPad will show Alexis a camera image of what he is seeing at any moment, an "augmented reality." The software on the iPad will incorporate parameters that correspond to common elements of Alexis' hallucinations. As Alexis experiences a hallucination, he will activate the afterimage functionality and trigger an afterimage on the iPad. (At this point, a poor video connection on our Skype call produced a well-timed "trail" as Alexis moved his hand across my line of sight.) When this happens he is presented with visual options, filters, and parameters, produced in part through an iterative feedback loop between himself and the programmer: via multi-touch gestures (taps, double taps, three-fingered presses) he selects which parameters are appropriate and which filters to apply. For example, he can adjust the iPad screen brightness to correspond to his perception of brightness. He can likewise adjust screen size, the specific pattern of an afterimage, the rate of visual decay, single or multiple images, random patterns, and many other aspects of the visual echo. Alexis is simultaneously saying "yes, that" and "no, not that" to presented options, and creating what he sees from scratch, on the fly. He has to make the iPad "see" what he sees.
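The article doesn't show any of the Insight software itself, but the parameter-and-gesture loop described above might be modeled roughly like the sketch below. Every name, value, and gesture mapping here is my own illustration, not the actual code:

```python
from dataclasses import dataclass

@dataclass
class AfterimageParams:
    """One configurable visual 'echo', mirroring the elements Alexis adjusts."""
    brightness: float = 0.5   # perceived brightness, 0.0-1.0
    decay_rate: float = 0.2   # how fast the afterimage fades
    scale: float = 1.0        # on-screen size of the echo
    pattern: str = "trail"    # e.g. "trail", "random", "static"
    image_count: int = 1      # single or multiple images

def apply_gesture(params: AfterimageParams, gesture: str) -> AfterimageParams:
    """Map a touch gesture to a parameter tweak (illustrative mapping only)."""
    if gesture == "double_tap":        # toggle single vs. multiple images
        params.image_count = 1 if params.image_count > 1 else 3
    elif gesture == "three_finger":    # cycle through afterimage patterns
        patterns = ["trail", "random", "static"]
        params.pattern = patterns[(patterns.index(params.pattern) + 1) % 3]
    # a plain tap would accept the current settings: "yes, that"
    return params
```

The point of the structure is only that each gesture nudges one aspect of the echo, so Alexis can converge on a match in a handful of touches.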

All this happens extremely rapidly as his hallucinations develop, exist, and are replaced (sometimes within milliseconds) by the next hallucination. And we must not forget that this is not a completely free-form activity: there will be at least three sections to the composition, each with different parameters. Alexis will reach over to the laptop to change sections. All of the visuals are handled by the iPad. When a hallucination has been captured accurately, it is packaged up and sent wirelessly to the laptop, where the music resides. One reason for this division of labor is that the visual software is processor intensive; memory optimization has been critical, and Alexis brought in someone who focused exclusively on iPad memory optimization.
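None of this wiring is documented in the article, but as a rough sketch, "packaged up and sent wirelessly" could mean something as simple as serializing the confirmed parameters with the current section number and pushing them over the local network to the laptop. The message shape, host, and port below are all invented for illustration:

```python
import json
import socket

def package_hallucination(params: dict, section: int) -> bytes:
    """Serialize a confirmed hallucination plus the current section number."""
    return json.dumps({"section": section, "params": params}).encode("utf-8")

def send_to_laptop(payload: bytes, host: str = "192.168.0.10", port: int = 9000) -> None:
    """Fire the payload over the local wireless network to the music laptop."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

A lightweight one-way message like this would keep the processor-hungry visual work on the iPad while the laptop only has to listen for incoming parameters and map them to the score.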

Somehow, it all gets put together and coordinated: visual hallucinations from the iPad, musical scores from the laptop, contribution from a live flute player watching the iPad output.

You, sitting in the audience, would see Alexis' hallucinations projected on a large screen and hear his musical score to go along with them. All in 12 minutes.

The performance of Insight, at the Peninsula Arts Contemporary Music Festival, is going to be filmed. Let's hope it becomes available online!
