cyu@sh.itjust.works to Technology@lemmy.world · English · 1 year ago
All women pictured are A.I. generated (files.catbox.moe) · 79 comments · cross-posted to: technology@lemmy.ml
versionist@lemmy.world · 1 year ago:
That's a lot of white chicks.

cyberpunk2350@lemmy.world · 1 year ago:
Really it's like 6, copy-pasted over and over.

spammedevito@kbin.social · 1 year ago:
Yeah, a lot of the faces look very, very similar!

So?

setVeryLoud(true);@lemmy.ca · 1 year ago:
Points to a limited dataset including mostly white people.

kromem@lemmy.world · 1 year ago:
Without knowing the prompt vs. the model, it's impossible to say which side is responsible for the lack of variety. Many modern models are actually very good at reversing sample biases in the training set.

Good point.
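On kromem's point, one way to separate prompt bias from model bias is to hold the sampling seeds fixed and vary only the prompt. Below is a minimal sketch assuming a Stable-Diffusion-style model served through Hugging Face's diffusers library; the checkpoint id and prompts are illustrative, not what the original poster actually used.

```python
# Minimal sketch: compare output variety for an underspecified prompt vs. one
# that explicitly pushes against dataset skew, using matched seeds.
import torch
from diffusers import StableDiffusionPipeline

# Illustrative checkpoint; we don't know which model produced the linked images.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

prompts = [
    "portrait photo of a woman",                      # underspecified: model defaults fill the gap
    "portrait photo of a woman, varied ethnicities",  # prompt counteracts any training-set skew
]

for prompt in prompts:
    for seed in range(4):
        # Matching seeds across prompts keeps the initial noise identical,
        # so differences between the two sets are attributable to the prompt.
        generator = torch.Generator("cuda").manual_seed(seed)
        image = pipe(prompt, generator=generator).images[0]
        image.save(f"{prompt.replace(' ', '_').replace(',', '')}_{seed}.png")
```

If the second prompt yields varied faces where the first does not, the homogeneity is largely a prompting artifact; if both sets collapse to similar faces, the bias is baked into the model's training data.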