Google’s mistake with Gemini
You have probably heard that Google had to suspend its Gemini image-generation feature after it produced pictures of Black Nazis and female popes:
Well, I have a simple explanation for what happened here. Namely, the folks at Google wanted to avoid an embarrassment they had run into multiple times, and seen others run into as well: the “pale male” dataset problem. That problem happens especially at tech companies dominated by white men, and ironically, especially especially at tech companies dominated by white men who are careful about privacy, because then they only collect pictures of people who give consent, which is typically the people who work there! See for example this webpage, or Safiya Noble’s entire book.
So, in order to avoid this embarrassment, they scaled up a notion of diversity. Which is to say, they stuck diversity into everything, at scale, even if it made no historical sense.
In other words, the real mistake they made is to think that equity or fairness is something that can be done at scale.
In reality, equity and fairness are narrowly defined, contextual notions. When we decide it’s fair to use a FICO score to determine the interest rate on a loan, that’s very different from using a FICO score to decide how many weeks of unemployment insurance you should receive after breaking your leg. You cannot declare “FICO scores are legitimate discriminators” as a universal rule, just as “diverse skin tones and genders” is not a universal good, especially when “diverse” is not even a well-defined notion unless you specify a geographic area or culture.
This mistake of Google’s was not a coincidence, by the way. It’s the result of a combination of laziness (as in, they just didn’t think very hard about this) and capitalism (as in, it would be expensive to do this right).