Letta Shtohryn


woman, holding (2020)



This work takes as its starting point commercial facial analysis and image-description services trained on data sets drawn from social media, and explores the bias those data sets carry. Women are often described as 'holding', suggesting that the data sets on which the algorithms were trained see women as carers. A woman looking straight into the camera is labelled "sexy", while a shirtless man posing seductively is labelled "serious" and "fine-looking". Conversely, testing a text-to-image algorithm with the prompt "woman in front of a mirror" produces a semi-abstract blob recognisable as a posed selfie in underwear.


Steel, plastic, epoxy, printed image, wax, electronics, tablets, thread, fabric.
190 cm x 260 cm x 60 cm







We may think of algorithms as somehow neutral, but ultimately they have been created by people who carry their own biases and prejudices. By default, algorithms have learnt what kind of person "looks criminal" and what gender to attribute to the default doctor, lawyer or scientist. Seen from the future, the 2020s are a turning point for algorithmic inequality: the decade in which the untamed gender data gap led to women's algorithmic invisibility.





woman, holding (2020)
Photo: Elisa von Brockdorff