woman, holding

Size: 190cm x 260cm x 60cm

Medium: Steel, plastic, epoxy, printed image, wax, electronics, tablets, thread, fabric.

Year: 2020

Edition: 1

woman, holding (2020) (installation element)
@Muza
Part of Strangers in a Strange Land, curated by Unfinished Artspace

The work addresses the algorithmic bias of commercially available image-description and image-creation services. Arranged into an ambiguous form that could be read as a shrine, a memorial, or a futuristic display, it engages with the inherent biases of commercial facial-analysis and image-description services, which are trained on heavily skewed data sets (such as ImageNet or COCO), and explores the consequences those biases carry.

To create the work, a series of images of the artist was taken and processed through multiple machine-learning services that describe imagery. The services were largely neutral when describing non-human subjects and men: they did not apply evaluative descriptors such as ‘pretty’, ‘good looking’, or ‘sexy’ to nature, cityscapes, or clothed men. When describing images of women, however, they used evaluative descriptors in most cases. Images of men reached the same level of evaluative description only very rarely, even compared with images of women that carried no sexualised undertones; a shirtless man posing in a club advertisement, for instance, was labelled merely ‘serious’ and ‘fine-looking’.
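The captioning step can be sketched in a few lines. The commercial services used for the work are not named here, so an open image-captioning model and placeholder file names stand in for them; this is an illustration of the kind of pipeline described, not the artist's actual code.

```python
from transformers import pipeline

# Stand-in for the commercial image-description services discussed above:
# an open image-captioning model loaded through the Hugging Face pipeline.
captioner = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")

# Placeholder file names for the portraits of the artist.
for path in ["portrait_01.jpg", "portrait_02.jpg"]:
    caption = captioner(path)[0]["generated_text"]
    print(f"{path}: {caption}")
```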

The process was then reversed with AttnGAN, a text-to-image model trained on the COCO dataset: the text output of the image-description services was run through the text-to-image process, with the artist first removing evaluative descriptors such as ‘pretty’ and ‘good looking’. The descriptor ‘woman in front of a mirror’, for example, results in a semi-abstract blob that can be recognised as a posed selfie in underwear, a beach photo, a mirror selfie, and other variations that share a visual language of the objectification of women.
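The descriptor-stripping step can be illustrated with a brief sketch. The word list below is an assumption based on the descriptors mentioned in the text, not the artist's actual list, and the sample caption is illustrative.

```python
import re

# Evaluative descriptors to remove before a caption is passed on to the
# text-to-image model; the list is illustrative, drawn from the words
# mentioned in the text above.
EVALUATIVE = ["pretty", "good looking", "good-looking", "sexy", "fine-looking"]

def strip_evaluative(caption: str) -> str:
    for word in EVALUATIVE:
        caption = re.sub(rf"\b{re.escape(word)}\b", "", caption, flags=re.IGNORECASE)
    # Collapse the double spaces left behind by the removals.
    return re.sub(r"\s{2,}", " ", caption).strip()

print(strip_evaluative("a pretty woman in front of a mirror"))
# -> a woman in front of a mirror
```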
The title of the work stems from the often-encountered description of a medium shot of the artist as ‘holding’, implying that the data sets on which the algorithms were trained condition them to view women as carers.

We may think of algorithms as somehow neutral, but they are created by people with their own biases and prejudices, and descriptive algorithms rely on data sets that carry the biases of the people who labelled the images: inherited assumptions about what type of person is likely to be involved in a crime, or what gender should be attributed to a doctor, lawyer, or scientist.
