[project page for 220C course with Chris Chafe, Spring 2017]

This page has moved: https://ccrma.stanford.edu/wiki/Jouska
== Overview ==

Thesis: To create a narrative of the events that a person with sporadic fatal insomnia would experience, from the condition's onset until eventual death.

Title: Unslept

Duration: 6'–8'

Instrumentation: Violin and live electronics (including the µgic sensor)
== Timeline and Formal Structure ==

The events represented by the various sound objects trace a slow transition from sleep- and insomnia-related sounds toward a chaotic texture of intense auditory hallucination and dementia. The piece is structured as a long-form crescendo with occasional moments of relief; Shepard tones guide the frequency regions in which each sound takes place. The Shepard tones themselves will not be heard; they act only as a structural guide. The rate of these glissandi increases gradually, as in the sketch below.
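A minimal sketch of how such never-sounded guide curves might be computed, in Python/NumPy. This is illustrative only: the function name, the number of octave bands, and the linear ramp of the cycling rate are my assumptions, not values taken from the piece.

<syntaxhighlight lang="python">
import numpy as np

def shepard_guide(f_low=40.0, octaves=8, r0=0.5, r1=2.0, steps=1000):
    """Return Shepard-style guide frequencies, shape (steps, octaves).

    Each column is one octave-spaced component gliding upward in
    log-frequency and wrapping around, so the set of frequency regions
    cycles endlessly. The cycling rate ramps from r0 to r1 (octaves per
    piece), so the glissandi gradually speed up.
    """
    t = np.linspace(0.0, 1.0, steps)           # normalized piece time
    phase = r0 * t + 0.5 * (r1 - r0) * t**2    # integral of the ramping rate
    k = np.arange(octaves)
    pos = (k[None, :] + phase[:, None]) % octaves
    return f_low * 2.0 ** pos                  # f_low .. f_low * 2**octaves
</syntaxhighlight>

These curves would only be read off to decide where in the spectrum each sound object sits at a given moment, never rendered as audio.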
== Notable Compositional Techniques ==

Spectrum-based panning: splitting a sound into its spectral components (either parametrically or automatically, by peak detection) and panning each component to the same location at a different rate. For longer sounds this can act as a sort of "Shepard tone" of panning: the movements form an ongoing cycle whose individual onset times are hidden (see the sketch below). I want to employ auditory illusions wherever possible, to emulate the "hallucinations" of the protagonist. These may include the wagon-wheel effect, the Huggins binaural pitch, the Bilsen band-edge pitch, the Richard Warren speech illusion, Gabor grains, or pseudo-illusions such as turning rhythms into pitches by increasing their speed.
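A rough STFT-based sketch of the idea, in Python/NumPy. The fixed band split, the equal-power pan law, and the cycling scheme are illustrative assumptions on my part; an actual implementation might pan detected spectral peaks instead of fixed bands.

<syntaxhighlight lang="python">
import numpy as np

def spectral_pan(mono, sr, n_bands=8, min_glide=2.0, max_glide=6.0,
                 frame=2048, hop=512):
    """Pan each frequency band to the same target at its own rate.

    Every band cycles from left (pan 0) toward right (pan 1), but with
    a different glide duration, so the component movements overlap and
    their individual onsets are masked.
    """
    window = np.hanning(frame)
    out = np.zeros((2, len(mono)))
    glide = np.linspace(min_glide, max_glide, n_bands)   # seconds per cycle
    edges = np.linspace(0, frame // 2 + 1, n_bands + 1, dtype=int)
    n_frames = max(0, 1 + (len(mono) - frame) // hop)
    for i in range(n_frames):
        seg = mono[i * hop : i * hop + frame] * window
        spec = np.fft.rfft(seg)
        left, right = np.zeros_like(spec), np.zeros_like(spec)
        t = i * hop / sr
        for b in range(n_bands):
            pan = (t / glide[b]) % 1.0           # this band's own cycle
            theta = pan * np.pi / 2              # equal-power pan law
            band = spec[edges[b]:edges[b + 1]]
            left[edges[b]:edges[b + 1]] = band * np.cos(theta)
            right[edges[b]:edges[b + 1]] = band * np.sin(theta)
        for ch, s in ((0, left), (1, right)):
            out[ch, i * hop : i * hop + frame] += np.fft.irfft(s, frame) * window
    return out
</syntaxhighlight>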
== µgic sensor ==

Mari's µgic sensor will be a major component of the piece; it will primarily drive parameter values in the live processing of the violin. The µgic sensor can track the following performative motions:

1) bow stroke duration - a 'long note' or long bow (with multiple slurred notes) that is held can be tracked, as a controller or as an element

2) pizzicato - a pizzicato sforzando can be reasonably traced

3) tremolo, or 'energy amount' - a rapid détaché can be reasonably traced, either as a trigger or as a continuous 'energy' quality

Additionally, pitch tracking and amplitude/note-onset tracking are available as parameters to track. A sketch of one possible mapping follows below.
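Assuming the sensor's data reaches the computer as OSC (a common transport for such controllers, though I have not confirmed the µgic's actual addresses or value ranges), a receiving skeleton in Python with the python-osc package might look like this. Every OSC address below is a hypothetical placeholder.

<syntaxhighlight lang="python">
from pythonosc import dispatcher, osc_server

# Latest tracked values, to be polled by the audio-processing layer.
state = {"bow_duration": 0.0, "energy": 0.0}

def on_bow(addr, seconds):
    # a long held bow could slowly open a freeze/reverb layer
    state["bow_duration"] = seconds

def on_energy(addr, value):
    # rapid detache as a continuous 'energy' amount, e.g. grain density
    state["energy"] = value

def on_pizz(addr, *args):
    # pizzicato-sforzando as a discrete trigger
    print("pizz trigger", args)

d = dispatcher.Dispatcher()
d.map("/mugic/bow/duration", on_bow)   # hypothetical address
d.map("/mugic/energy", on_energy)      # hypothetical address
d.map("/mugic/pizz", on_pizz)          # hypothetical address

server = osc_server.BlockingOSCUDPServer(("127.0.0.1", 9000), d)
server.serve_forever()
</syntaxhighlight>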
I am looking to see what sorts of interactive relations might fall under the criteria of:

1) things that would be difficult to align if there were only fixed-media playback

2) ways in which I can combine multiple elements (such as pitch detection and bow speed) to create more interesting processing that does not become stale over time (see the sketch below)

3) evolutions of effects that would seem "inorganic" if they were triggered only with a foot pedal