What a surprise! A traditional outfit appears statistically significant to a large statistical model and shows up more frequently. What a novel finding. I’m flabbergasted! What will be next? CEOs in jacket and tie? Dogs with fur? Why doesn’t my 512x512 picture of an Inuit in a snowfield portray the subject wearing a bikini? Why can’t Meta read my mind? WHY, MARK? WHHHHY?
How traditional? How statistically relevant is it? Most Indians I know do not wear turbans at all.
If these stats are trustworthy (and I think they are), the only Indians that wear turbans are Sikhs (1.7%) and Muslims (14.2%). I’d say 15.9% is not statistically significant.
I think you’re looking at it wrong. The prompt is to make an image of someone who is recognizable as Indian. The turban is indicative clothing of that heritage and therefore will cause the subject to be more recognizable as Indian to someone else. The current rate at which Indian people wear turbans isn’t necessarily the correct statistic to look at.
What do you picture when you think of a guy from Texas? Are they wearing a hat? What kind? What percentage of Texans actually wear that specific hat you might be thinking of?
A surprising number of Texans wear cowboy and trucker hats (both stereotypical). A surprising number of Indians don’t wear turbans, since turban wearers are by far a minority.
Woosh
I think the idea is that it’s what makes a person “an Indian” and not something else.
Only a minority of Indians wear turbans, but more Indians than other people wear turbans. So if someone’s wearing a turban, then that person is probably Indian.
I’m not saying that’s true necessarily (though it may be), but that’s how the AI interprets it…or how pictures get tagged.
It’s like with Canadians and maple leaves. Most Canadians aren’t wearing maple leaf stuff, but I wouldn’t be surprised if an AI added maple leaves to an image of “a Canadian”.
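For what it’s worth, the Bayes-rule arithmetic behind that intuition looks roughly like this. The numbers are completely made up, just to show the shape of the calculation:

```python
# Made-up illustrative numbers only; the point is the shape of the calculation.
p_indian = 0.18                 # assumed share of "Indian" images in the training pool
p_turban_given_indian = 0.16    # assumed turban rate among those images
p_turban_given_other = 0.01     # assumed turban rate everywhere else

p_turban = (p_turban_given_indian * p_indian
            + p_turban_given_other * (1 - p_indian))

# Bayes' rule: P(Indian | turban)
p_indian_given_turban = p_turban_given_indian * p_indian / p_turban
print(round(p_indian_given_turban, 2))  # ~0.78: a turban points strongly at "Indian"
```

Even if only a minority of one group wears the item, it can still be strong evidence for that group as long as everyone else wears it even less.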
Imagine a German man from Bavaria… You just thought of a man wearing Lederhosen and holding a beer, didn’t you? Would you be surprised if I told you that they usually don’t look like that outside of a festival?
I don’t picture real life people as though they were caricatures.
But AI does, because we feed it caricatures.
Are you literally the second coming of Jesus? Hey everybody! I found a guy who doesn’t see race! I can’t believe it but he doesn’t think anyone is changed in any way by the place that they grew up in or their culture! Everyone is a blank slate to this guy! It’s amazing!
No, I just don’t lump groups of people together. It’s not that hard to do; everyone’s a different person.
He was imaginary though.
A traditional dress is not a religious dress; it’s a dress used for a long time for its usefulness or fashion.
The historical use of the turban is fascinating, spanning millennia and a lot of regions and ethnic groups of the world.
I suggest the wiki page for further info; it’s more precise than what I have on hand.
An excerpt about the Pagri:
In the Rajasthan state of India these turbans, known as Pagri or Safa, are a traditional headwear that is an integral part of the state’s cultural identity.
My point was (though it might have been lost in the sarcasm) that, the turban having been the “hat” of Indian kings, nobles and emperors for millennia, we have a lot of drawings and also photos of Indian people wearing turbans, which these generative models have most probably been trained on.
As a footnote: why should the concept of a traditional dress be offensive? A lot of human groups have one.
Edit: these are the words most associated with “Pagri” in English. It’s a matter of data.
Point taken.
It ain’t to me; I couldn’t care less. I was just trying to point out that most Indians do not seem to wear turbans (and I based my reasoning on the religious dress alone).
They probably don’t because it’s not context-appropriate, just as we all adjust our own dress. More so if you and they live in a state or city with a different dress code. These things strongly depend on context.
Generative models, though, usually produce the most stereotyped answers possible, with a pinch of randomness, so we shouldn’t be surprised by this phenomenon. They are rewarded for these things.
You don’t think nearly 1/6th is statistically significant? What’s the lower bound on significance as you see things?
To be clear, it’s obviously dumb for their generative system to be overrepresenting turbans like this, although it’s likely a bias in the inputs rather than something the system came up with itself. I just think that 5% is generally enough to be considered significant, and calling three times that not significant confuses me.
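If anyone wants to see what “statistically significant” would actually mean here, a rough sketch with made-up counts (assumes scipy is available):

```python
# Made-up counts: suppose 100 generated "Indian man" images, 70 show a turban,
# tested against a real-world base rate of ~16%.
from scipy.stats import binomtest

result = binomtest(k=70, n=100, p=0.16, alternative="greater")
print(result.pvalue)  # tiny p-value: the generator's rate differs significantly from the base rate
```

Significance is about whether an observed rate plausibly matches a baseline, not about whether the rate itself is large or small.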
For statistics’ sake? Yes.
For the LLM bias? No.
It’s not an LLM, it’s a GAN, and its inner workings are very different.
If that 1/6th gets the most positive feedback on recognizability, it becomes a highly weighted part of the standard for the GAN. These models’ categorizing flow favors unique features of images.
5/6 not wearing them seems more statistically significant
The fact that fewer people of that group wear it than don’t is significant when you want an average sample. When categorizing a collection of images, though, the traditional garments of a group are naturally associated more with that group than with any other: a 1/6 rate is still higher than in any other group.
So if there was a country where 1 in 6 people had blue skin, you would consider that insignificant because 5 out of 6 didn’t?
For a caricature of the population? Yes, that’s not what the algorithm should be optimising for.
The algorithm is optimizing for results that are rated as precise, not just frequent.
What data is the model being fed? What percentage of images featuring Indian men are tagged as such? What percentage of images featuring men wearing turbans are tagged as Indian men? Are there any images featuring Pakistani men wearing turbans? Even if only a minority of Indian men feature turbans, if that’s the only distinction between Indian and Pakistani men in the model data, the model will favor turbans for Indian men. That’s just a hypothetical explanation.
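A quick toy illustration of that hypothetical, with entirely fabricated data (assumes scikit-learn): if the turban column is the only thing separating the two labels, a classifier dumps nearly all its weight onto it.

```python
# Fabricated toy data: "wears_turban" is the only feature that differs between the two label groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

# columns: [wears_turban, has_beard, wears_glasses]
X = np.array([
    [1, 1, 0],  # tagged "Indian man"
    [1, 0, 1],  # tagged "Indian man"
    [1, 1, 1],  # tagged "Indian man"
    [0, 1, 0],  # tagged "Pakistani man"
    [0, 0, 1],  # tagged "Pakistani man"
    [0, 1, 1],  # tagged "Pakistani man"
])
y = np.array([1, 1, 1, 0, 0, 0])  # 1 = tagged Indian, 0 = tagged Pakistani

clf = LogisticRegression().fit(X, y)
print(clf.coef_)  # the turban column dominates; the other features carry ~zero weight
```

Same idea scaled up: whatever feature most cheaply separates the tags in the training data is what the model leans on, regardless of how common that feature is in the real population.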
Except if they trained it on something that has a large proportion of turban wearers. It is only as good as the data fed to it, so if there’s a bias in the data, it’ll show the bias. Yet another reason this really isn’t “AI”.
By that logic, Americans should always be depicted in cowboy hats.
I see you’ve watched anime featuring Americans.
Put in “western” or “Texas” and that’s what you get. The West is a huge area even just within America, but the word is linked to a lot of movie tropes and stuff, so that’s what you get.
This also only happens when the language is English; ask in Urdu or Bengali and you get totally different results. In fact, just use “Urdu” instead of “Indian” and you get fewer turbans, or put in “Punjabi” and you’ll get more turbans.
Or just put “turban” in the negative prompt if you want.
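For example, in a local Stable Diffusion setup via Hugging Face diffusers it would look roughly like this (a minimal sketch; the model name and prompts are just placeholders):

```python
# Minimal sketch of using a negative prompt with diffusers.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    prompt="portrait photo of an Indian man",
    negative_prompt="turban",  # steer the sampler away from this concept
).images[0]
image.save("portrait.png")
```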
Ask an AI for pictures of Texans and see how many cowboy hats it gives back to you.
Does it help the model to produce images that are undoubtedly “American” for its raters or for its automated rating system? If yes, they are statistically significant. Low frequency and systematic rarity can both be significant in a statistical analysis.
It’s a traditional outfit of Sikhs, not Indians. Pick up a book.
Are you sure?
Maybe you are confusing “traditional” with “religious”.
Pick up a dictionary, or even better, an encyclopedia.
Pre-independence, most Indian males had some sort of headgear. E.g., look at any old photos of Bombay.
Are we still in “pre-independence”?
Oh… didn’t know that traditional meant what you wore yesterday
You’re wrong.