MET MARX,
SHE’S BLACK
This is the cultural Marxism chuds warn everyone about all the time.
“i am a commurist”
– kal marex, 2015
sounds like you used both “marvel” and “captain communism” in your prompt so the AI leaned heavily on Captain Marvel, that’s literally just her suit
Is it just me or does Karl Marx look like the lovechild of Marx and Taika Waititi?
Just got done watching Our Flag Means Death so I can see it.
commurism
Commurism
Christy Marx is Karl Marx in Marx 2
stunning
Starring Kal Marix and Slvel Aaraal
a M̷̮͂Á̷̼O̶͉͂V̷̭̚E̴͕̓L̴͈̕ ̴͇̕S̸̫̾Ť̵̫U̸͍͆I̸̹̅D̶́ͅL̷͓̎Ś̶̤ ̵̗̂ production
This one is interesting, because it was noted that this particular Captain Marvel pose shows up many times in at least one key AI dataset. The images aren't technically duplicates (they're different posters and promo images), but because the central figure is identical across so many of them, overfitting/memorization is pretty likely.
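The "not technically duplicates, but the same central figure" situation is exactly what perceptual hashing catches in dataset dedup. Here's a minimal sketch of an average hash in pure Python (real pipelines use libraries like `imagehash`; the 8×8 grayscale "posters" below are made-up toy data, not anything from an actual dataset):

```python
# Minimal average-hash sketch for spotting near-duplicate images.
# Images are assumed to be pre-resized to 8x8 grayscale nested lists
# (an assumption for illustration only).

def average_hash(pixels):
    """64-bit hash: one bit per pixel, set if the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two "posters" sharing the same central figure but with a slightly
# different background hash to within a small Hamming distance,
# while exact byte comparison would call them completely different files.
poster_a = [[(x * y) % 256 for x in range(8)] for y in range(8)]
poster_b = [row[:] for row in poster_a]
poster_b[0][0] = 255  # different corner/background, same figure

assert hamming(average_hash(poster_a), average_hash(poster_b)) <= 8
```

A dedup pass would bucket images whose hashes fall within a small Hamming distance; the point is that "different file, same figure" images still cluster together, which is why the figure can be over-represented even in a "deduplicated" dataset.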
We don't know much about DALL-E 3 architecture-wise (it has an LLM text encoder and is almost certainly a latent diffusion model), but presumably it's a pretty big model, and that can also increase the likelihood of overfitting.
Interesting. One clarification: overfitting and memorization are not quite the same thing, to my understanding. Overfitting is when a model memorizes rather than generalizing, but very large models can and will do both.

If you ask an image generator for "a reproduction of Starry Night by van Gogh hanging on the wall", or an LLM to complete "to be or not to be, that is _", you're referring to something very specific that you'd like reproduced exactly. If the model outputs what you wanted, you'd call that memorization but not overfitting. Still, you may want to suppress memorization, and you certainly don't want overfitting.

Side note: massively overparameterized models are better at both memorization and generalization, and are naturally resistant to overfitting as I define it. That last part would have surprised early ML researchers, since they'd observed the opposite trend, but the trend reverses once models get large enough. They'll also sometimes memorize on a single pass through the data, even with no duplication, which is quite remarkable.
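The "to be or not to be" case can be made concrete with a toy next-word lookup table: after one pass over the text it reproduces the famous line verbatim, and that reproduction is memorization working as intended, not overfitting. (A hypothetical toy, obviously nothing like how a real LLM is trained.)

```python
from collections import defaultdict

# Toy "language model": a next-word lookup table built in one pass.
training = "to be or not to be that is the question".split()

next_word = defaultdict(list)
for w, nxt in zip(training, training[1:]):
    next_word[w].append(nxt)

def complete(prompt, n):
    """Extend the prompt n words using the most recently seen continuation."""
    words = prompt.split()
    for _ in range(n):
        options = next_word.get(words[-1])
        if not options:
            break
        words.append(options[-1])
    return " ".join(words)

print(complete("that is", 2))  # → "that is the question"
```

Here the exact reproduction is the desired behavior; calling it "overfitting" would be wrong because there's no generalization being sacrificed, which is the distinction being drawn above.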
That’s a fair interpretation, although I still consider it a failure state. These models shouldn’t be used as storage/retrieval tools.
is this dalle
Via bing image creator, yes