Comment by tivert

3 months ago

> I think Dan performs a mild sleight-of-hand trick here: he asks why we don't consider this a bug when any other software would consider it a bug. But in fairness, the question was not, "Did this prompt have a bugged output," it's "Did it have a racially biased output," and that's a more emotionally charged question.

In the initial examples, it's not a sleight of hand. The thing has racial biases so bad that they bugged the output. Its job was to convert a casual photo of a particular person into a "professional" photo, and instead of just changing the clothes and setting, it changed the person too.

Then all kinds of apologists tried to gaslight the bug away instead of acknowledging the system is faulty and not fit for purpose.

> If I wrote software that choked on non-ASCII inputs for say a name, and then someone said, "Hey, that's a bug," cool, yes, fair. If someone said, "This is evidence of racial bias," I mean... I'd probably object.

And what if they just said it was an instance of "bias," the way the OP describes similar bugs? I don't think you'd have grounds to object.