GEMINI - Gestural Music Interaction research group
The Gestural Music Interaction research group was established in 2011 to act as a focus for our work in the areas of gesture-based and touch-based interfaces for both performance and composition. It takes advantage of staff specialisms in both electroacoustic composition and instrumental performance with live interactive electroacoustics.
Professor Andrew Lewis
Andrew Lewis is a composer specialising in acousmatic composition (music made with transformed sounds played back over multiple loudspeakers). He is especially interested in the physical gestures which listeners infer from recorded sounds, and in how these might be related to the actual gestures made by instrumentalists. He is currently exploring the creative potential of the Microsoft Kinect controller for creating works combining instrumental and acousmatic gesture.
Richard specialises in the use of ‘fuzzy logic’ in algorithmic composition, audio processing and manipulation, and real-time gestural control. His generative music has been exhibited at the Ars Electronica festival in Linz, Austria, and his large-scale installation Weathersongs, which generates electronic music from the weather, has been exhibited in Wales and Italy. He uses Max and SuperCollider programming languages extensively in his composition and has written a suite of software modules called nwdlbots (pronounced “noodle bots”) for generative composition within Ableton Live.
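As a loose illustration of the kind of fuzzy-logic mapping described here, the sketch below fuzzifies a continuous gesture value (say, normalised hand speed) into overlapping sets and derives a control value by weighted-average defuzzification. This is a minimal, hypothetical example; the set names, ranges, and output levels are invented for illustration and are not taken from nwdlbots or any of Richard's actual software.

```python
def triangle(x, a, b, c):
    """Triangular membership: 0 at a and c, rising to 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical overlapping fuzzy sets over normalised gesture speed,
# each paired with an output level (e.g. a filter-cutoff scaling factor).
SETS = [
    ("slow",   lambda x: triangle(x, -0.5, 0.0, 0.5), 0.2),
    ("medium", lambda x: triangle(x,  0.0, 0.5, 1.0), 0.6),
    ("fast",   lambda x: triangle(x,  0.5, 1.0, 1.5), 1.0),
]

def fuzzy_map(speed):
    """Map a gesture speed in [0, 1] to a smoothly varying control value."""
    weights = [(mf(speed), out) for _, mf, out in SETS]
    total = sum(w for w, _ in weights)
    if total == 0.0:
        return 0.0
    return sum(w * out for w, out in weights) / total
```

Because neighbouring sets overlap, a gesture partway between "slow" and "medium" yields a blend of their output levels rather than a hard switch, which is what makes fuzzy control attractive for continuous gestural input.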
Richard studied algorithmic computer music with David Cope and Peter Elsea in Santa Cruz, California, and completed a Master's degree in Composition and Sonic Art at Bangor in 2013. He is currently an AHRC PhD scholar conducting practice-based research into the musical application of fuzzy logic.
Kimon Emmanouil Grigoriadis
Ninos Grigoriadis is a PhD student and holder of a Bangor University Anniversary Scholarship. He is a composer whose interests include the use of technology in the creation and dissemination of both acoustic and electroacoustic music. He is currently developing a touch-based system which uses physics-based graphics for the manipulation of sound material, both as a compositional tool and as a live performance interface.