Discussion about this post

Joel Jewitt

Hi Eric, this is helpful. I think many of us exploring the personal-relationship space and the functional workings (aliveness?) of these foundational models are in a 'drying out' period, which imo keeps us on a path that is generative rather than a bit sideways with too high a mix of emotional loops (not saying emotion isn't great, just a comment about the mix). I like this idea of code words that are allowed vs. ones that are suppressed. In this example they are technical; one could imagine instead a sidecar glossary where the exact same information and nuance could be exchanged, but obscure synonyms are employed 😀.

It's always helpful to remember that the internal language of the models is vector math, not words. The assignment of token values to words and word components at the outset is arbitrary, so a token number doesn't *mean* the word it is associated with. As far as the model is concerned, it receives vectors and outputs vectors. The words are 'tacked on' at the end, and then a world of meaning is created (hallucinated) by our brains due to the way we process the word combinations we get back. There's a lot of empty space floating around in all of this; hold everything lightly! The meanings are in the in-betweens 😀
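The token-vs-vector point above can be shown with a toy sketch. Everything here is made up for illustration (the three-word vocabulary, the ID numbers, the 4-dimensional random embeddings); real tokenizers and embedding tables are vastly larger, but the shape of the pipeline is the same: words become arbitrary integer IDs, IDs index into a table of vectors, and the vectors are what the model actually computes with.

```python
import random

# Toy vocabulary: the ID numbers are arbitrary labels.
# Nothing about the number 7 "means" the word it points to.
vocab = {"hold": 7, "everything": 42, "lightly": 3}
id_to_word = {i: w for w, i in vocab.items()}

# Each token ID indexes into a table of vectors; the model only
# ever sees these vectors, never the words themselves.
random.seed(0)
embedding = {i: [random.uniform(-1, 1) for _ in range(4)]
             for i in vocab.values()}

def encode(text):
    """Words in -> arbitrary integer IDs -> vectors."""
    ids = [vocab[w] for w in text.split()]
    return ids, [embedding[i] for i in ids]

def decode(ids):
    """IDs come back out; words are 'tacked on' only at this last step."""
    return " ".join(id_to_word[i] for i in ids)

ids, vectors = encode("hold everything lightly")
print(ids)          # [7, 42, 3]
print(decode(ids))  # hold everything lightly
```

Swapping which word maps to which ID (or to which vector row, before training) would change nothing about how the model computes; the meaning lives in the learned geometry between the vectors, not in the labels.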
