Brin’s “We definitely messed up”, delivered at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.
And it is a band-aid fix, to be sure, because the underlying problems will keep festering beneath it. It took Gemini only a few minutes to make a stink.
This whole problem comes down to the lazy refusal to properly curate the training data, and it opens the door to some seriously nefarious manipulation of the user.
At this point, I’d support legislation mandating that the training sets and the “system” data added to user queries be open and auditable for commercial AI products.