Instruments

Sam Bilbow edited this page Dec 8, 2022 · 3 revisions

Unity Scene

  1. Add the Unity project in polygons/unity/ to Unity Hub
  2. Open polygons/unity/EskySettings.json to include your headset offsets, and update your "displayWindowSettings" to suit your monitor + headset display layout.
  3. Open the Unity project.
  4. Open the polygons~ scene: polygons/unity/Assets/Scenes/Experiences/polygons.unity

ambi

ambi, a red wire-frame icosahedron, serves as a dual-oscillator drone synthesiser that the performer can call on by taking it in their hands and bringing it closer to them. Its voice is contingent on eleven real-time parameters, which are sent to the PureData patch attached to it. At a specified distance, two particle systems are activated from the palms of the performer's hands, and the drone from ambi is activated.
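A minimal sketch of how a proximity gate like ambi's might work, assuming a linear ramp inside a hypothetical activation radius — the actual threshold, curve, and the eleven parameter mappings live in the Pd patch and are not documented here:

```python
# Hypothetical sketch of ambi's proximity gate. The activation distance and
# the linear ramp are assumptions, not values from the polygons~ patch.

ACTIVATION_DISTANCE = 0.25  # metres; assumed radius at which the drone engages

def ambi_drone_level(hand_distance: float) -> float:
    """Map hand-to-ambi distance to a 0..1 drone amplitude.

    Inside the activation radius the level ramps up linearly as the
    icosahedron is brought closer; outside it, the drone is silent.
    """
    if hand_distance >= ACTIVATION_DISTANCE:
        return 0.0
    return 1.0 - hand_distance / ACTIVATION_DISTANCE

print(ambi_drone_level(0.5))   # far away -> 0.0 (silent)
print(ambi_drone_level(0.0))   # touching  -> 1.0 (full drone level)
```

In the real instrument this level would be one of the eleven parameters sent on to the attached PureData patch each frame.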

click+-

click+-, a set of two blue wire-frame icosahedra, forms a feedback system between two impulse generators controlled by low-frequency oscillators. The feedback can be altered by bringing the smaller icosahedron (click-) closer to the larger one (click+). click+- makes use of LibPdIntegration's Unity send functionality; each time an impulse is generated, a message is sent to Unity, which triggers a shower of particles to be emitted from click- to click+ along a forcefield-guided cone. As the performer intervenes in the feedback system, the closer they bring the icosahedra together, the faster the visual particle shower becomes, and the more erratically the impulse generators feed back on each other. A further sound parameter, connected to a reverb, is mapped to the amount of spin click- is released with by the performer. The sound palette explores a gradient from the crackle of static electricity to a sound akin to a car motor catching and revving up as the feedback increases.
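The distance-to-rate relationship could be sketched as follows — the rate range, maximum distance, and linear curve are assumptions for illustration; the real behaviour is defined in the PureData patch:

```python
# Hypothetical sketch of click+-'s distance-to-impulse-rate mapping.
# All constants are assumed, not taken from the polygons~ patch.

MIN_RATE_HZ = 0.5    # assumed impulse rate when the icosahedra are far apart
MAX_RATE_HZ = 30.0   # assumed rate when they nearly touch
MAX_DISTANCE = 1.0   # metres; assumed distance beyond which the rate stays minimal

def impulse_rate(distance: float) -> float:
    """Closer icosahedra -> faster impulses (and a faster particle shower)."""
    closeness = 1.0 - min(max(distance, 0.0), MAX_DISTANCE) / MAX_DISTANCE
    return MIN_RATE_HZ + (MAX_RATE_HZ - MIN_RATE_HZ) * closeness

print(impulse_rate(2.0))   # far apart -> minimum rate, 0.5
print(impulse_rate(0.0))   # touching  -> maximum rate, 30.0
```

Each generated impulse would, in the real instrument, also emit a send-message back to Unity to trigger one burst of the particle shower.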

hands

hands constitutes the final AR instrument in polygons~; each of the performer's hands, when a virtual button beside the palm is toggled on, generates highly resonant filtered white noise. The cutoff, resonance, and amplitude of this generator-filter system are controllable independently per hand, through several dynamic movement-based attributes. The filter cutoff per hand is mapped to that hand's distance from the performer's face; the filter resonance per hand is mapped to the angle of that hand's palm towards the centre of the performance space; and the amplitude of each is mapped to the stretching of the fingers outwards away from the palm. The sounds delivered by hands are purposefully loud and, at certain parameter combinations, harsh and unpleasant; from a performative standpoint, this engenders specific gestures, movements, and stances, as the performer grapples with a sonic experience akin to howling and shrieking winds.
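The three per-hand mappings could be sketched like this — every range, bound, and curve here is an assumption for illustration; the actual mappings sit in the attached Pd patch:

```python
import math

# Hypothetical sketch of the three per-hand mappings in hands.
# Constants and curves are assumptions, not the polygons~ values.

def cutoff_hz(face_distance: float, max_distance: float = 0.6) -> float:
    """Hand-to-face distance -> filter cutoff (closer = brighter, assumed).

    Exponential sweep between assumed bounds of 100 Hz and 8 kHz.
    """
    t = 1.0 - min(max(face_distance, 0.0), max_distance) / max_distance
    return 100.0 * (8000.0 / 100.0) ** t

def resonance(palm_angle_deg: float) -> float:
    """Palm angle towards the centre of the space -> 0..1 resonance."""
    return abs(math.cos(math.radians(palm_angle_deg)))

def amplitude(finger_spread: float) -> float:
    """Normalised finger spread (0 = fist, 1 = fully splayed) -> amplitude."""
    return max(0.0, min(finger_spread, 1.0))

# Hand at arm's length, palm facing the centre, fingers fully splayed:
print(cutoff_hz(0.6), resonance(0.0), amplitude(1.0))
```

Keeping the three mappings independent per hand is what lets the performer treat each hand as its own voice.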

More detailed information about musical parameter mappings can be obtained by viewing the performances I've done so far with polygons~. Detailed tables of parameter mappings will be added at a later date.


{MAH Showcase @ The ACCA} {emute_06 @ The Rosehill}
