Jake Elwes

In “Closed Loop”, two artificial intelligence models converse with each other – one with words, the other with images – in a never-ending feedback loop. The words of one describe the images of the other, which then seeks to describe the words with a fresh image… Two neural networks getting lost in their own nuances, sparking and branching off each other as they converse in a perpetual game of AI Chinese whispers.

Collaborative project with Roland Arnoldt. Special thanks to Anh Nguyen et al. at Evolving-AI for their work on GANs.


Two neural networks: a recurrent neural network that writes captions for the images it sees, and a generative neural network that produces images in response to those captions. Two datasets train the algorithms: a dataset of 4.1 million captioned images for the language network, and the ImageNet dataset of 14.2 million photographs for the image generator network.
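The feedback structure described above – caption to image to caption, forever – can be sketched in a few lines. This is a minimal illustration only, with toy stand-in functions (`caption_image`, `generate_image` are hypothetical names, not the artwork's actual networks):

```python
def caption_image(image):
    # Stand-in for the recurrent captioning network:
    # maps an image to a sentence describing it.
    return f"a picture of {image}"

def generate_image(caption):
    # Stand-in for the generative image network:
    # maps a sentence back to a fresh image.
    return f"image({caption!r})"

def closed_loop(seed_image, steps):
    """Alternate caption -> image -> caption, collecting the exchange."""
    transcript = []
    image = seed_image
    for _ in range(steps):
        caption = caption_image(image)   # words describing the current image
        image = generate_image(caption)  # a new image made from those words
        transcript.append((caption, image))
    return transcript

for caption, image in closed_loop("a cloud", 3):
    print(caption, "->", image)
```

Each pass feeds one model's output straight into the other, so small quirks in either network compound over time – the “Chinese whispers” drift the piece is built around.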

Images and Video Courtesy of Jake Elwes.