Immediately after switching pages, it will work with CSR.
Please reload your browser to see how it works.
So that when trained again on another part of the same book, the model can learn they came from the same context.
Doesn’t the angle encode semantic information? Cosine similarity works for embeddings after all.
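Right, cosine similarity only looks at the angle between two vectors and ignores their magnitude, so two embeddings pointing in the same direction score as maximally similar regardless of length. A minimal sketch (the vectors here are made up, not real embeddings):

```python
import numpy as np

def cosine_similarity(a, b):
    # Angle-based similarity: dot product normalized by both magnitudes.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 2.0, 3.0])
print(cosine_similarity(a, 2 * a))  # 1.0: same direction, magnitude ignored
print(cosine_similarity(a, -a))     # -1.0: opposite direction
```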
For example, exact position doesn’t matter too much when tokens are spaced out. Say you use token position 100 for your query: you can shift all the keys around position 100, and the further back they are in the context, the more freedom you have to play with the position value.
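The shift-invariance part is easy to demonstrate if the positions are encoded with rotations, RoPE-style (an assumption here; the thread doesn’t name the scheme). The query–key dot product depends only on the relative offset, so translating every position by the same amount leaves the attention score unchanged:

```python
import numpy as np

def rope(x, pos, base=10000.0):
    # RoPE-style encoding: rotate each pair of dimensions by pos * theta_i.
    d = x.shape[-1]
    theta = base ** (-np.arange(0, d, 2) / d)
    ang = pos * theta
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(0)
q, k = rng.normal(size=8), rng.normal(size=8)

# Query at position 100, key at position 97 (relative offset 3)...
s1 = rope(q, 100) @ rope(k, 97)
# ...scores the same after shifting both positions by +50 (offset still 3).
s2 = rope(q, 150) @ rope(k, 147)
print(np.allclose(s1, s2))  # True
```

Changing the offset (e.g. moving the key to position 95) does change the score, which is exactly the "relative position is what matters" property the comment is pointing at.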