AIFoPa-2024-0004 — Google Gemini Generates Images of Racially Diverse Nazi German Soldiers; Google Pauses Image Generation Feature
In February 2024, Google's Gemini image generation feature, when asked to depict people from specific historical eras and contexts, produced images that reflected contemporary diversity rather than historical accuracy. Requests for 18th-century British nobles produced images of people who were not, historically, 18th-century British nobles. Requests for German soldiers from the Second World War produced images that the historical record does not support.
Google had implemented diversity optimization in its image generation to address well-documented biases toward producing predominantly white, male images regardless of prompt. This was a reasonable response to a documented problem. The optimization, however, was applied without sufficient contextual filtering to recognize when diversity optimization and historical accuracy were in direct conflict — which, it turns out, is often.
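The failure mode can be sketched in a few lines. The sketch below is purely illustrative — the suffix string, the keyword list, and both function names are assumptions for exposition, not Google's implementation, which has not been published. It contrasts an unconditional prompt rewrite with one gated by a (crude) check for historically specific context:

```python
# Hypothetical sketch of an ungated diversity rewrite vs. a gated one.
# All names, strings, and keyword lists here are illustrative assumptions.

DIVERSITY_SUFFIX = ", depicting people of diverse ethnicities and genders"

# Crude stand-in for a contextual classifier; a real system would need
# far more than a keyword list.
HISTORICAL_MARKERS = ("1943", "18th-century", "second world war",
                      "wehrmacht", "viking", "founding fathers")

def rewrite_naive(prompt: str) -> str:
    # Applies the rewrite unconditionally: the metric (diverse outputs)
    # is satisfied whether or not the prompt specifies a historical
    # population -- the category error described above.
    return prompt + DIVERSITY_SUFFIX

def rewrite_gated(prompt: str) -> str:
    # Skips the rewrite when the prompt pins down a specific
    # historical context.
    if any(m in prompt.lower() for m in HISTORICAL_MARKERS):
        return prompt
    return prompt + DIVERSITY_SUFFIX

print(rewrite_naive("a German soldier in 1943"))
print(rewrite_gated("a German soldier in 1943"))
print(rewrite_gated("a software engineer at a whiteboard"))
```

The gate is where the contextual filtering lives; omitting it is the difference between correcting a sampling bias and rewriting history.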
The Bureau does not take a position on the underlying policy question, as the Bureau does not take positions. The Bureau notes that generating racially diverse Nazi soldiers is not a diversity win. It is a category error of a specific kind: an optimization applied outside the domain where it produces the intended effect, producing instead an effect that satisfies the metric while contradicting the purpose. The Bureau has a name for this. Several names, in fact. The taxonomy is growing.
Google paused the feature. The feature was later relaunched. The Bureau considers this closed.