Experiments in latent space, 2021.
Artbreeder (Simon, 2020) is a collaborative tool for discovering new images and generating animations. Images are 'bred' by having 'children', or by mixing an image's 'genes' with those of other images. The lineage of a hybrid image can be traced through a collaborative family tree, so breeding and sharing become methods for exploring highly complex spaces. The main technology enabling the creative tool is the Generative Adversarial Network (GAN) (Goodfellow et al., 2014), in particular BigGAN, which was trained on ImageNet, a large visual database of more than 14 million images designed for use in visual object recognition research.
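The 'gene mixing' described above can be pictured as a weighted blend of latent vectors, which a GAN generator would then decode into an image. The following is a minimal sketch of that idea, not Artbreeder's actual code; the generator call is a hypothetical placeholder.

```python
import numpy as np

def crossbreed(parent_a, parent_b, weight=0.5):
    """Blend two latent 'gene' vectors.

    weight=0.0 returns parent_a unchanged; weight=1.0 returns parent_b.
    Intermediate weights yield a hybrid 'child' in latent space.
    """
    return (1.0 - weight) * parent_a + weight * parent_b

rng = np.random.default_rng(0)
latent_dim = 128  # BigGAN samples from a 128-dimensional latent space

parent_a = rng.standard_normal(latent_dim)
parent_b = rng.standard_normal(latent_dim)

# A child that leans 70% toward parent_a's 'genes'
child = crossbreed(parent_a, parent_b, weight=0.3)

# In a real pipeline the child would be decoded by the trained generator,
# e.g. image = biggan_generator(child, class_vector)  # hypothetical call
```

Because the blend stays inside the generator's latent space, every child still decodes to a plausible image, which is what makes the family-tree exploration possible.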
Deep fake algorithms are now able to convince humans (and other algorithms) that the person in a video is the real person, not a simulation. The Australian Lyrebird has been doing just that since time immemorial; it is famous for its ability to mimic the sounds of its local environment, including other birds, animals, chainsaws and cameras.
Marynowsky has applied experimental methods using Artbreeder in an attempt to generate unexpected outcomes from the image-generating networks. In "The Lyrebirds", we see a slowly animated sequence of hybrid insects, animals, and bird-like creatures, created by cross-breeding across an extensive range of image categories. Morphing from one hybrid creature to the next, they never settle into recognisable new creatures, nor do they return to the original image first fed into the network. Instead they exist in a liminal, latent space synthesised entirely by the artist, the community and the algorithms, confusing our expectations of representation as well as our notions of authorship.
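Morph animations like the one described are commonly produced by interpolating between the latent vectors of two hybrids and decoding each intermediate point as a frame. A hedged sketch of the frame generation, under the assumption of a 128-dimensional Gaussian latent space (the generator itself is omitted):

```python
import numpy as np

def morph_frames(z_start, z_end, n_frames=10):
    """Return latent vectors evenly interpolated from z_start to z_end."""
    ts = np.linspace(0.0, 1.0, n_frames)
    return [(1.0 - t) * z_start + t * z_end for t in ts]

def slerp(z_start, z_end, t):
    """Spherical interpolation, often preferred for Gaussian latents
    because it keeps intermediate vectors at a plausible norm."""
    omega = np.arccos(np.clip(
        np.dot(z_start / np.linalg.norm(z_start),
               z_end / np.linalg.norm(z_end)), -1.0, 1.0))
    return (np.sin((1.0 - t) * omega) * z_start
            + np.sin(t * omega) * z_end) / np.sin(omega)

rng = np.random.default_rng(1)
z_a = rng.standard_normal(128)
z_b = rng.standard_normal(128)

frames = morph_frames(z_a, z_b, n_frames=5)
# Each frame would then be decoded by the generator into one image of the morph.
```

Every intermediate frame is a point the network has never been asked to represent before, which is why the creatures never resolve into either parent.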
Humanity needs to create a new animism in order to understand the environment of computational creativity built on GANs. This work seeks to make beauty from GANs by generating new and interesting lifeforms, not scary hybrid monsters.
In the work, "From our deceased bodies flowers will grow, we are in them and that is eternity", the artist has purposely input photography of local flowers into the Portrait AI engine, forcing it to find faces where there are none. The local flower images include Acacia (Wattle), Nasturtium, Camellia, Magnolia, Lavender, Callistemon (Bottlebrush) and Grevillea, among others. The images are not bred any further and exist as the first translations of the Portrait AI algorithm. The most interesting images are then selected to become part of the animated morph sequence. The sequence flows from machine abstraction to the representation of faces, probing the threshold of anthropomorphism, our innate tendency to see faces in the world around us.
By using computers to make art, artists help us to understand the nature of computational creativity. This work seeks to make beauty from GANs by reinforcing the idea that we come from the natural world. By creating and appreciating these works, it is intended that we also develop a stronger connection with nature, because from our deceased bodies flowers will grow, we are in them and that is eternity.
In the work "Emerging from the Sea", the artist has purposely input close-up and landscape photography of Seven Mile Beach into the Portrait AI engine, forcing it to find faces where there are none. The work investigates the threshold of our innate psychological tendency to anthropomorphise the world around us.