David Rokeby

Artengine's ARTIFICIAL IMAGINATION

David Rokeby, part of the vanguard of electronic, video, and installation art since the early 1980s, delivers an artist talk that delves into the evolving interplay between humans, technology, and artificial intelligence. Based in Toronto, where he teaches at Ryerson University’s RTA School of Media, Rokeby has consistently explored themes of digital surveillance and the perception of time through his art, including a concept he refers to as ‘process time.’

During his presentation, Rokeby recounts his artistic journey, starting with his 1980s work exploring the interaction between human bodies and machines. This early work laid the foundation for his 1990s projects, which began to question what these interactions imply for artificial intelligence. He reflects on the relatively primitive state of AI during his initial explorations compared to today’s advanced deep learning technologies.

A significant part of his talk focuses on his installation “Very Nervous System,” which uses cameras to track movement and create music in real time, fostering a dynamic interaction between the body and technology. This piece, Rokeby explains, not only reflects the movements it detects but also generates a responsive, immersive environment that explores the blurred boundaries between the physical self and digital responses.
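The talk does not detail how the camera analysis works. As a purely illustrative sketch (not Rokeby’s actual implementation), one common way to turn camera input into a control signal for sound is frame differencing: compare successive grayscale frames and treat the fraction of changed pixels as the amount of motion. The frame size, threshold, and pitch mapping below are all invented for the example.

```python
import numpy as np

def motion_amount(prev_frame: np.ndarray, frame: np.ndarray, threshold: int = 25) -> float:
    """Fraction of pixels whose brightness changed by more than `threshold`.

    Frames are 2-D uint8 grayscale arrays. Plain frame differencing: an
    illustrative stand-in for camera-based movement tracking, not the
    technique actually used in Very Nervous System.
    """
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(np.mean(diff > threshold))

def motion_to_pitch(amount: float, base: float = 220.0, span: float = 660.0) -> float:
    """Map a motion amount in [0, 1] to a pitch in Hz (a hypothetical sound mapping)."""
    return base + span * amount

# Two synthetic 64x64 frames: a bright square moves 8 pixels to the right.
prev = np.zeros((64, 64), dtype=np.uint8)
prev[20:40, 20:40] = 255
curr = np.zeros((64, 64), dtype=np.uint8)
curr[20:40, 28:48] = 255

amount = motion_amount(prev, curr)  # only the non-overlapping strips count as change
pitch = motion_to_pitch(amount)
```

A real-time version would loop over live camera frames and feed the result into a synthesizer; the point is only that a very simple measure of change between frames can drive an immediate, audible response to the body.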

Rokeby discusses the philosophical and existential questions that arise from his engagements with technology—particularly how consciousness and bodily experience influence and are influenced by digital systems. He is intrigued by the immediate, often surprising interactions between human intent and technological feedback, which challenge traditional notions of control and agency.

Rokeby questions the future of artificial intelligence in relation to the human body. He speculates on the cultural implications of AI systems that lack a bodily perspective, suggesting that a comprehensive understanding of intelligence might remain elusive without integrating the profound human experience of being embodied. His reflections prompt a broader contemplation on the role of the body in digital and cultural contexts, emphasizing the persistent relevance of physical presence in an increasingly virtual world.

This presentation was part of the symposium ARTIFICIAL IMAGINATION, which united innovative artists engaged with emerging technologies. The symposium focused on exploring and sharing their individual practices, experiences, and insights related to algorithms, artificial intelligence, and machine learning. It served as a platform for an enriching exchange of ideas between the artists and the audience, aiming to contribute a distinctive artistic viewpoint to the ongoing discussions about our evolving relationships with machine collaborators. Each session, including this one, highlighted how these technologies are being integrated and reflected in contemporary artistic processes, encouraging a broader understanding and appreciation of the creative potential of new digital tools.

Born in Tillsonburg, Ontario in 1960, David Rokeby has been creating interactive sound and video installations with computers since 1982. His early work Very Nervous System (1982-1991) is acknowledged as a pioneering work of interactive art, translating physical gestures into real-time interactive sound environments. Very Nervous System was presented at the Venice Biennale in 1986, and was awarded the first Petro-Canada Award for Media Arts in 1988 and Austria’s Prix Ars Electronica Award of Distinction for Interactive Art in 1991.

Several of his works have addressed issues of digital surveillance, including Watch (1995), Taken (2002), and Sorting Daemon (2003). Taken was exhibited at the Whitney Museum of American Art in New York in 2007. Another of his surveillance works, Watched and Measured (2000), was awarded the British Academy of Film and Television Arts’ first BAFTA for interactive art in 2000.

Other works engage in a critical examination of the differences between human and artificial intelligence. The Giver of Names (1991-) and n-cha(n)t (2001) are artificial subjective entities, provoked by objects or spoken words in their immediate environment to formulate sentences and speak them aloud.

David Rokeby’s installations have been exhibited extensively in the Americas, Europe and Asia. He has been featured in retrospectives at Oakville Galleries (2004), FACT in Liverpool (2007), the CCA in Glasgow (2007) and the Art Gallery of Windsor (2008). He has been an invited speaker at events around the world, and has published two papers that are required reading in the new media arts faculties of many universities.

In 2002, Rokeby was awarded a Governor General’s Award in Visual and Media Arts and the Prix Ars Electronica Golden Nica for Interactive Art (for n-cha(n)t), and represented Canada at the Venice Biennale of Architecture with Seen (2002). In 2004 he represented Canada at the São Paulo Bienal in Brazil. In 2007 he completed major art commissions for the Ontario Science Centre and the Daniel Langlois Foundation in Montréal. His 400-foot-long, 72-foot-high sculpture long wave was one of the hits of the Luminato Festival in Toronto (2009).

Recent projects include a series of video works which explore the patterns traced by movements across time, an installation evoking the presence of Marshall McLuhan in the coach house where he worked for the 2010 Contact Festival, and a new interactive sound installation entitled “Dark Matter” commissioned by Wood Street Galleries in Pittsburgh. He is currently preparing a new work for the opening of the Ryerson Gallery and Research Centre in Toronto in 2011.

David Rokeby is represented by Pari Nadimi Gallery.

Can I use the computer in a manner that allows me to reinforce my sense that I have a body, to reward me for having a body, and to allow me to explore that sense of having a body through the technology itself?

It seemed the system already knew I was going to move before I moved

Reclaiming Embodiment

David Rokeby

I was spending time programming computers. I was sitting at the computer in bad posture, my body complaining, feeling awkward, trying to forget that I had a body, living in the space of logical programming. And then I wanted to turn this upside down. Can I use the computer in a manner that allows me to reinforce my sense that I have a body, to reward me for having a body, and to allow me to explore that sense of having a body through the technology itself? That’s one of the many reasons this piece came into being. And what can I learn about my relationship with my body through engaging in this way?

The first thing I learned (it’s a known fact, but I learned it experientially) is that there is a time delay inherent in consciousness. I discovered that it seemed the system already knew I was going to move before I moved. This was very confusing. I set up a system especially to test this, where it would respond with a very noticeable, very loud sound as soon as it saw a hint of movement from my body. I would stand there like a gunslinger waiting to do the quick draw, trying to be as still as possible, and then suddenly move. Every single time, it made the sound at the moment I decided I was about to move, which was very, very disturbing. It was only by doing some research that I discovered the reason: consciousness tends to be about a tenth of a second behind our motor movements. We make a decision to move, we commit to the movement, and we make the first initial movements before our consciousness is fully aware that we’ve made that decision. So the body is ahead of the mind, or the “mind” in quotation marks. There’s a time delay, and this creates a fascinating confusion.

So it’s no longer clear whether the sound precedes the movement or the movement precedes the sound, and the sense of which is in control of the other dissolves. It creates a really interesting, very tight feedback loop. I was very inspired by Suzanne’s use of the spiral in her work; we were talking about it last night. There’s a sense that the feedback loop spirals inside itself and gets tighter and tighter and tighter, so that the sense of what is controlling what is lost.

Navigating the Unexpected Pathways of Code and Perception

David Rokeby

The code itself was not terribly intelligent or terribly rich. It was enriched by the fact that it took me to places I hadn’t expected, and my going to those new places added new things into the input to the code, which came out again. Any errors that happened became part of this: if the machine did not successfully translate my movements into an appropriate sound, I would collaborate to make it more convincing by extending the gesture a little longer until the sound happened. So there were many layers of complex engagement with the system, including perhaps an over-willingness to be credulous towards the interaction that was happening.

Awareness distributed through the body is different than focused thought. Okay, that’s fairly obvious. I spent a lot of time during that period doing experiments with diffuse perception. I walked down the streets of Toronto for two weeks, locking my fovea on a point as close to the vanishing point as I could find, deadening that, and paying attention only to things that were disappearing out of my field of view. So really paying attention to the periphery, and experiencing the very different sense of space, the relationship to the space, that that afforded. And that was very connected to the state that was appropriate for being in Very Nervous System, which was not to be willfully focusing on controlling something, but to be maximally engaged on as many different bandwidths and temporal wavelengths as possible.

Consciousness and AI: Bridging the Gap

David Rokeby

Consciousness is perhaps our awareness of this incompleteness, and culture flowers out of it. This awareness of incompleteness forces us to be masters at imagining beyond the limits of our immediate content or context. There’s also a sense that the whole notion of philosophy, et cetera, is generated by the fact that we are aware of that problematic, or incompleteness. So consciousness arises out of the tension between what we are and what we can imagine. Consciousness flowers from the body. What would we be without this generative frustration?

To sum up where this leads in my thinking about contemporary artificial intelligence and machine learning: the problem is that these systems rely on massive datasets to become effective. There have been remarkable advances in building good datasets for images and good datasets for sounds, but it will be very difficult to build really good datasets for the body. So the body is being left out again. Maybe that’s a good thing, but the body is being left out of the equation. And I do find myself wondering what an intelligence is, or what culture an artificial intelligence generates, if it does not come with a sense of the experience of the body, which seems to be something that helps define what our human culture is based upon. So the final question, and this is for the data science nerds out there, is: can this be modeled as an objective function? Can it be made the goal for a machine learning task? Or are all machine learning models insufficient for this task?

Body Mind Machine

An engaging panel with Kristin Anne Carlson, David Rokeby and Chris Salter, moderated by Nell Tenhaaf, that delves into the different relationships artists are cultivating with machines.
