How do we feel? 2023
Data can be subjective. Data we feed from one ‘being’ to another can be ambiguous, confusing, fluctuating, sensitive. Who decides what an emotion looks and feels like?
This tactile collection uses a kind of ‘object language’ to convey a 3D experience of a dataset. The sculpted, drawn and worded data shows conversational pathways between humans and AI systems.
Nine feelings have been fed to the language model ChatGPT and to DALL-E, an AI system that generates images from text. Clay and drawing are used to visualise each feeling's shape, texture, and form, in response to conversations between AI and artist. In a live exhibition, humans are asked to guess the emotions they see.
Through human responses to these physical 'object feelings', we see the journey of an algorithm. We see a kind of empathy shared between human and machine, in a world where the communication of feelings is forever evolving.
glazed ceramics, engraved card