Behind Shirley

Ibiye Camp (GB)

Behind Shirley deconstructs and rethinks the colonial narratives embedded in the development of facial-recognition systems, exploring how darker skin was not accounted for in film chemistry and is now overlooked by facial-recognition software.

In photography, “Shirley cards” were used as a standardized reference for color-balancing skin tones. These cards typically showed a single Caucasian woman in bright clothing alongside square color patches of blue, green, and red. The film stock’s chemistry distorted tones of red, yellow, and brown, which led to faults when photographing darker skin. Film was not improved until furniture makers and chocolate manufacturers complained that it could not capture the differences between wood grains and chocolate varieties. That default towards lighter skin persists in imaging technology today, with facial-recognition systems at times failing to register people of color.

The algorithmic bias in digital-imaging technology stems from human biases. In trying to build artificial intelligence, we inevitably recreate human intelligence. AI finds patterns within pools of data, reflecting our own behavior and often exacerbating its worst aspects. Empathy is of growing importance in artificial intelligence, datasets, and algorithms, fields whose inherent perspectives require further interrogation.
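
The mechanism described above can be sketched in a few lines of code. The toy example below (all data, group names, and numbers are synthetic and hypothetical, not drawn from the work or any real system) fits a trivial one-threshold classifier to a data pool dominated by one group, then shows the underrepresented group inheriting a far higher error rate:

```python
# Toy illustration: a model fit mostly to one group performs worse on another.
# Everything here is synthetic; the point is the pattern, not the numbers.
import numpy as np

rng = np.random.default_rng(0)

def sample(n, mean):
    """Two-class 1-D data for one group: class 0 near `mean`, class 1 near `mean + 2`."""
    x0 = rng.normal(mean, 1.0, n)        # class 0 samples
    x1 = rng.normal(mean + 2.0, 1.0, n)  # class 1 samples
    return np.concatenate([x0, x1]), np.concatenate([np.zeros(n), np.ones(n)])

# Group A dominates the training pool; group B is underrepresented and its
# feature distribution is shifted, standing in for inputs the pipeline was
# never calibrated on.
xa, ya = sample(900, mean=0.0)
xb, yb = sample(100, mean=3.0)
x_train = np.concatenate([xa, xb])
y_train = np.concatenate([ya, yb])

# "Training": choose the single decision threshold that minimizes error on
# the pooled data. The pool is 90% group A, so the threshold fits group A.
candidates = np.sort(x_train)
pool_error = [np.mean((x_train > t) != y_train) for t in candidates]
threshold = candidates[int(np.argmin(pool_error))]

# Evaluate on fresh samples from each group: group B gets far more errors.
for name, mean in [("group A", 0.0), ("group B", 3.0)]:
    x, y = sample(5000, mean)
    print(f"{name}: error rate = {np.mean((x > threshold) != y):.2%}")
```

The model looks “accurate” on average while failing the minority group, which is how aggregate benchmarks can hide exactly the kind of harm the film addresses.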

cargocollective.com/ibiyecamp/Behind-Shirley

Behind Shirley, film, 5'58", by Ibiye Camp

With support from: 5th Istanbul Design Biennial 

Ibiye Camp (GB) is an artist whose work engages with technology, trade, and material within the African Diaspora. Ibiye’s work uses architectural tools to create sound and video, accompanied by augmented reality and 3D objects, and highlights the biases and conflicts inherent in technology and postcolonial subjects. Ibiye tutors at the RCA in architectural design studio ADS2, “Black Horizons: Worlding within the ruins of Racial Capitalism.” Ibiye co-founded Xcessive Aesthetics, an interdisciplinary design collective exploring data through immersive technologies and public installations.
