I'm a Sydney-based saxophonist, composer and technologist with interests in contemporary classical, interactive, improvised and electro-acoustic music. My recent research and practice incorporate equal parts performance, composition and the development of musical software systems.
I completed my PhD at the University of Technology, Sydney (2016), and I currently lecture in music technology and contemporary music at the same institution. My research and creative work have been featured at the International Conference on Computational Creativity (2015), the Conference on New Interfaces for Musical Expression (2012, 2014), the International Computer Music Conference (2013), the Sound and Music Computing Conference (2012), the dBâle festival of electronic music (2012), IRCAM Live @ La Gaité Lyrique (2012) and the Australasian Computer Music Conference (2011, 2012).
Just finished the final concert of Zubin Kanga's Cyborg Pianist program, which I ran the electronics for. This is the second tour I've worked on with Zubin, who continues to push the boundaries with new works for piano and multimedia. Some great reviews of the program can be found here and here. The above video is of Patrick Nunn's 'Morphosis' performed at NIME16, which featured on the program.
NIME 2016 at the Queensland Conservatorium of Music in Brisbane is all wrapped up. A great few days of talks, performances, workshops and installations. This has been my busiest and most collaborative NIME yet, with a jointly run workshop on creating Musebots with Arne Eigenfeldt and Ollie Bown, an installation of the same project, a paper on practice-based research with Andrew Johnston of CCS, and two performances with pianist Zubin Kanga. My latest system/piece 'taking the auspices', a new work for prepared piano, live electronics and live visuals, was premiered last night. It's a new audio-visual concept that I look forward to evolving further.
After quite a long slog, I have finally finished my PhD at UTS! My thesis, '_derivations and the Performer-Developer: Co-Evolving Digital Artefacts and Human-Machine Performance Practices', has been accepted, and you can access a digital copy here or here.
Interview with Tobias Fischer about _derivations, human-machine creativity and coding aesthetics published over at tokafi.com
Review of _derivations release by Matthew Lorenzon published over at Partial Durations
NIME 2014 | Amersham Arms (New Cross), London UK, July 2nd, 9pm
Performance with _derivations at 'Improv NIME' - during the 2014 Conference on New Interfaces for Musical Expression
Hyde/Bierstone Duo | Studio, Royal Northern College of Music, Manchester UK, October 7th, 6.30pm
Newly formed electro-acoustic duo Covalent (Zane Banks - electric guitar, Ben Carey - saxophone and electronics) will perform a concert of semi-improvised and composed works for electric guitar, saxophone and live electronics.
The Sydney algorithmic improviser hack-together will take place over three days in April, bringing together scientists, artists and other enthusiasts in pursuit of music-playing software that exhibits modest but noticeable forms of autonomy and musical capability when paired with human improvising musicians. The hack-together will provide an opportunity to collaborate and explore technologies, techniques and critical issues of musical autonomy. The gathering will culminate in a performance on April 21st showcasing systems produced prior to and during the workshop. The performance will include the following musicians: Peter Hollo (cello), Adrian Lim-Klumpes (Rhodes), Evan Dorian (drumkit), Ben Carey (saxophone) and Roger Dean (keyboard).
In the late 1970s, Robert Plutchik adapted his concept of the eight primary emotions into a striking graph known as "Plutchik's Flower." Though this represents a simplified version of the emotional experience, it conveys the elegance of our evolution from natural origins. Using sounds and images taken from the natural world, and a new system for live spectral composition based on Plutchik's Flower, we will return to the wild, unfurled landscape of the mind. Experience a new approach to improvisation and a unique perspective on the forms of nature in an evening of musical and visual impressions.
Benedict Carey (composer and electronics), Daniel Mayne (visual artist), Rhia Parker (recorders), Benjamin Carey (saxophones), Megan Clune (clarinet), and Ben Goodger (electric guitars).
diffuse 6 @ UTS | Bon Marche Theatre, (University of Technology, Sydney) November 18th, 6.30pm
My research is practice-based and centres upon the development and use of interactive musical systems. The ‘_derivations’ system, developed throughout my PhD, was designed to facilitate improvised encounters between performers and quasi-autonomous music software. Throughout this research I explored themes of human and computer agency in interactive musical performance, and the relationship between improvisation and interpretation in human-machine improvisation contexts. _derivations and my related work in musical interactivity have been presented in performance and published in conference papers at numerous international conferences, including the International Computer Music Conference (2013), the Conference on New Interfaces for Musical Expression (2012, 2014, 2016), the Sound and Music Computing Conference (2012), the International Conference on Computational Creativity (2015) and the Australasian Computer Music Conference (2011, 2012).
_derivations is an interactive performance system designed for use in improvisatory musical performance. Acting outside the direct control of any human operator, _derivations listens to the performance of its collaborators and uses this information to make decisions about its own contribution to the unfolding musical dialogue. The resulting interactions are often abstract, intricate and playful, and showcase the unique possibilities afforded by placing both human and machine on an equal footing in performance.
For more information on the project, you will find up-to-date downloads, videos, audio and texts related to the software at derivations.net
-Carey, B. 2016, "_derivations and the Performer-Developer: Co-Evolving Digital Artefacts and Human-Machine Performance Practices," PhD thesis, University of Technology, Sydney.
-Carey, B. 2016, "Artefact Scripts and the Performer-Developer," Leonardo, vol. 49, no. 1, MIT Press.
-Carey, B. and Johnston, A. 2016, "Reflection On Action in NIME Research: Two Complementary Perspectives," International Conference on New Interfaces for Musical Expression, Brisbane, Australia.
-Bown, O., Carey, B. and Eigenfeldt, A. 2015, "Manifesto for a Musebot Ensemble: A platform for live interactive performance between multiple autonomous musical agents," Proceedings of the 21st International Symposium on Electronic Art (ISEA), Vancouver, Canada.
-Eigenfeldt, A., Bown, O. and Carey, B. 2015, "Collaborative Composition with Creative Systems: Reflections on the First Musebot Ensemble," International Conference on Computational Creativity (ICCC), Park City, UT.
-Carey, B. 2014, "Artefact Scripts and the Performer-Developer," Workshop on Practice-Based Research, International Conference on New Interfaces for Musical Expression, London, UK.
-Carey, B. 2013, "_derivations: Improvisation for Tenor Saxophone and Interactive Performance System," Proceedings of the 2013 ACM Conference on Creativity and Cognition, Sydney, Australia.
-Bown, O., Eigenfeldt, A., Pasquier, P., Carey, B. and Martin, A. 2013, "The Musical Metacreation Weekend: Challenges arising from the live presentation of musically metacreative systems," 2nd International Workshop on Musical Metacreation, Boston, MA.
-Carey, B. 2012, "Designing for Cumulative Interactivity: The _derivations System," Proceedings of the International Conference on New Interfaces for Musical Expression, Ann Arbor, USA.
-Martin, A., Jin, C.T., Carey, B. and Bown, O. 2012, "Creative Experiments Using a System for Learning High-Level Performance Structure in Ableton Live," Proceedings of the Sound and Music Computing Conference, Copenhagen, Denmark.
This page contains download links to some MaxMSP patches and abstractions of mine. If you find them useful, or have any questions about their functionality, drop me a line via the contact page.
Microsonic FM is a polyphonic granular synthesiser for Max4Live, based on quasi-synchronous granular synthesis techniques. It's capable of creating anything from sparse and bubbling textures to lush, warm pads:
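The Max for Live device itself is a patch rather than text code, but the underlying idea of quasi-synchronous granular synthesis (short, windowed grains overlap-added at a roughly regular period, with a little timing jitter) can be sketched in a few lines. The following is a minimal Python illustration of the technique, not the Microsonic FM implementation; the function names and parameter defaults are my own:

```python
import numpy as np

def hann(n):
    # Hann window smooths each grain's edges to avoid clicks
    return 0.5 - 0.5 * np.cos(2 * np.pi * np.arange(n) / n)

def quasi_sync_grains(source, sr=44100, grain_ms=40, period_ms=25,
                      jitter_ms=5, duration_s=1.0, seed=0):
    """Overlap-add windowed grains read from `source` at a roughly regular period."""
    rng = np.random.default_rng(seed)
    grain_len = int(sr * grain_ms / 1000)
    period = int(sr * period_ms / 1000)
    jitter = int(sr * jitter_ms / 1000)
    win = hann(grain_len)
    out = np.zeros(int(sr * duration_s) + grain_len)
    t = 0
    while t + grain_len < len(out):
        # pick a random read position in the source for each grain
        start = rng.integers(0, len(source) - grain_len)
        out[t:t + grain_len] += source[start:start + grain_len] * win
        # "quasi-synchronous": a fixed period plus a small random jitter
        t += period + int(rng.integers(-jitter, jitter + 1))
    return out

# A plain 440 Hz sine stands in for a sampled source signal
sr = 44100
src = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
texture = quasi_sync_grains(src, sr=sr)
```

Sparse textures come from lengthening the period relative to the grain size; lush pads come from heavy overlap (period much shorter than the grain).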
An interactive/generative computer music system based around first-order Markov chains. A quirky way of creating a coherent yet surprising dialogue with your computer.
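The first-order Markov idea is simple to state: the system remembers which note followed which in what it has heard, then answers by walking those transition probabilities. The Max patch is not reproduced here, but a minimal Python sketch of the technique (hypothetical function names, MIDI note numbers as states) looks like this:

```python
import random
from collections import defaultdict

def train_markov(sequence):
    """Build a first-order transition table: state -> list of observed successors."""
    table = defaultdict(list)
    for a, b in zip(sequence, sequence[1:]):
        table[a].append(b)
    return table

def generate(table, start, length, seed=None):
    """Walk the chain: each note is drawn from the successors of the previous one."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = table.get(out[-1])
        if not successors:               # dead end: restart from a random known state
            successors = list(table.keys())
        out.append(rng.choice(successors))
    return out

# Train on a short phrase of MIDI note numbers, then let the machine "answer"
phrase = [60, 62, 64, 62, 60, 64, 67, 64, 62, 60]
table = train_markov(phrase)
reply = generate(table, start=60, length=16, seed=1)
```

Because successors are drawn with their observed frequencies, the reply stays in the vocabulary of the input (coherent) while the random walk keeps reordering it (surprising).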
stereo and 5-channel surround soundfile granulators - download here
XY Markov Player
Markov chain patch for creating probabilistic automations based on XY pad movements - download here
Here is a YouTube playlist containing some Max tutorials for students. This series was developed for the Visualisation and Sonification studio taught in the Bachelor of Sound and Music Design at the University of Technology, Sydney.
Below you will find a selection of small MaxMSP patches developed for students.
An additive synthesiser for mixing 3-voice tone clusters. The user navigates through coloured nodes to mix the amplitudes of the chosen tone clusters. Three performance modes: droning, rhythms and one-shot triggering.
An additive synthesis patch used as an ear training and sound design tool. The patch was programmed for students to construct timbres by manipulating the amplitudes of the first 16 partials of the harmonic series.
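The principle the patch teaches (a timbre built by summing the first 16 partials of the harmonic series, each at its own amplitude) can be sketched outside Max as well. This is an illustrative Python rendering of the technique, not the patch itself; the normalisation choice and the sawtooth-like 1/k amplitude recipe are my own assumptions:

```python
import numpy as np

def additive(f0, amps, sr=44100, dur=1.0):
    """Sum harmonic partials: partial k has frequency k*f0 and amplitude amps[k-1]."""
    t = np.arange(int(sr * dur)) / sr
    out = np.zeros_like(t)
    for k, a in enumerate(amps, start=1):
        out += a * np.sin(2 * np.pi * k * f0 * t)
    # normalise by the worst-case peak so the result never clips
    return out / max(np.sum(np.abs(amps)), 1e-9)

# A sawtooth-like recipe: partial k at amplitude 1/k, using the first 16 partials
amps = [1.0 / k for k in range(1, 17)]
tone = additive(220.0, amps)
```

As an ear-training exercise, zeroing the even-numbered entries of `amps` leaves only odd partials, shifting the timbre towards a hollow, clarinet-like quality.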