https://arewereorganizedyet.com/ lol already updated
I think that is overly simplistic. Embeddings used for LLMs definitely encode a concept of what things mean and how things relate to other things.
E.g., compare the embeddings of Paris, Athens, and London to other cities and they will have a small cosine distance between them. Compare France, Greece, and England and same. Then, very interestingly, look at Paris - France, Athens - Greece, London - England and you'll find the resulting difference vectors all align (fundamentally the vector arithmetic seems to capture the relationship "is the capital of"). Then go a step further and compare those vectors to Paris - US, Athens - US, London - Canada. You'll see the previous set is not aligned with these nearly as much, but these are aligned with each other (the relationship being something like "is a smaller city in this country, named after a famous city in some other country").
Because of the way attention works, there is a whole bunch of semantic meaning baked into embeddings, and by comparing embeddings you can get to pragmatic meaning as well.
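If you want to try this yourself, here's a minimal sketch of that capital-of test. It assumes pretrained GloVe word vectors fetched through gensim's downloader; the model name, the lowercase tokens, and the cosine helper are my illustrative choices, not anything specific from the comment above:

```python
import numpy as np
import gensim.downloader as api

# Pretrained GloVe vectors (illustrative choice; any decent word
# embedding model shows the same effect). Vocabulary is lowercase.
model = api.load("glove-wiki-gigaword-100")

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Difference vectors that should all encode "is the capital of"
capital_pairs = [("paris", "france"), ("athens", "greece"), ("london", "england")]
diffs = [model[city] - model[country] for city, country in capital_pairs]

# The capital-of difference vectors align with each other...
print(cosine(diffs[0], diffs[1]))  # paris-france vs athens-greece
print(cosine(diffs[0], diffs[2]))  # paris-france vs london-england

# ...much more than with a namesake-city difference like Paris, Texas.
# ("us" is an ambiguous token in a lowercase vocab, so treat this as a
# rough illustration rather than a clean measurement.)
namesake = model["paris"] - model["us"]
print(cosine(diffs[1], namesake))  # athens-greece vs paris-us
```

With GloVe-scale models the first two similarities come out clearly higher than the third, which is the alignment effect described above.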
I agree. As a family of 5, many hotels require us to get 2 rooms. Plus, having no option to cook meals usually makes for a much more expensive stay. At least that was true until a few years ago, when Airbnb went insane with the cleaning fees plus cleaning requirements and all that nonsense.
There's also the UCSF Benioff Children's Hospital.
This regulation (and the similar one being proposed in California) would not be applied retroactively.