An introduction to transformers and their prerequisites
Early view of the next chapter for patrons: https://3b1b.co/early-attention
Special thanks to these sup...
It’s interesting how words with similar semantic content are close together in the abstract vector space. Those groupings seem like they could reveal the ideological content of the training data.
Lol I don’t have it on hand, but someone did a centroid analysis of another smaller but open GPT model, and found the “word” at the “center” of the model space was “a man’s penis”
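The centroid analysis mentioned above can be sketched in a few lines: take the mean of all the word-embedding vectors, then find the vocabulary word whose embedding is most similar to that mean. Here's a minimal toy version (the embedding values and vocabulary are made up for illustration; a real model has tens of thousands of tokens and hundreds of dimensions):

```python
import math

# Toy embedding table: word -> vector. Values are illustrative only.
embeddings = {
    "king":   [0.90, 0.80, 0.10],
    "queen":  [0.85, 0.90, 0.15],
    "apple":  [0.10, 0.20, 0.90],
    "banana": [0.15, 0.10, 0.95],
}

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

center = centroid(list(embeddings.values()))

# The "word at the center" of the space: the vocabulary item whose
# embedding points most nearly in the same direction as the centroid.
central_word = max(embeddings, key=lambda w: cosine(embeddings[w], center))
print(central_word)
```

With real model weights you would load the embedding matrix from the checkpoint instead of hand-writing it, but the geometry of the analysis is the same.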