jebarker 3 hours ago
Definitely captures the feel of a ski resort. I really like the feel of controlling the skier too. Performance on my iPad was great in Safari. Nice job so far.

jebarker 6 hours ago
Colorado is bad, especially around Denver, but nowhere near the depths of California or several East Coast cities in my experience.

jebarker 1 day ago
Thanks for the clarification. I understand much better what you mean by "scientific generalization". I can't tell whether you're suggesting that LLMs are a dead end for modeling meaning, or just that viewing LLMs as estimators of probability distributions is the wrong way to think about them.

jebarker 1 day ago
I don't really follow what you're saying here. I understand that the use of language in the real world is not sampled from a stationary distribution, but it also seems plausible that you could relax that assumption in an LLM, e.g. by conditioning the distribution on time, and then intra-distribution generalization would still be a sensible way to study how well the LLM performs on held-out test samples.
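
To make that concrete, here's a toy sketch of the kind of thing I have in mind (entirely invented: the corpus, the time buckets, and the vocab size are all made up, and it's a bigram counter rather than a real LLM): condition the next-token distribution on a coarse time bucket, then measure intra-distribution generalization as held-out log-likelihood on samples from those same buckets.

```python
# Toy sketch (made up for illustration): a bigram "language model" whose
# distribution is conditioned on a time bucket, p(next | prev, year).
# Held-out samples from the same buckets give an intra-distribution
# generalization estimate.
import math
import random
from collections import defaultdict

random.seed(0)

# Fake corpus: (time_bucket, sentence) pairs where word usage drifts over time.
corpus = [("2000s", "the cat sat on the mat".split()),
          ("2000s", "the dog sat on the rug".split()),
          ("2020s", "the model sat on the gpu".split()),
          ("2020s", "the model ran on the gpu".split())] * 50

random.shuffle(corpus)
split = int(0.8 * len(corpus))
train, held_out = corpus[:split], corpus[split:]

# Time-conditioned bigram counts: counts[(bucket, prev)][next]
counts = defaultdict(lambda: defaultdict(int))
for bucket, words in train:
    for prev, nxt in zip(words, words[1:]):
        counts[(bucket, prev)][nxt] += 1

def prob(bucket, prev, nxt, alpha=0.1, vocab_size=20):
    # Additive smoothing so unseen bigrams still get nonzero probability.
    c = counts.get((bucket, prev), {})
    return (c.get(nxt, 0) + alpha) / (sum(c.values()) + alpha * vocab_size)

# Intra-distribution generalization: average log-likelihood on held-out
# sentences drawn from the same time-conditioned distribution.
ll, n = 0.0, 0
for bucket, words in held_out:
    for prev, nxt in zip(words, words[1:]):
        ll += math.log(prob(bucket, prev, nxt))
        n += 1
print("held-out log-likelihood per bigram:", ll / n)
```

The point is just that once the distribution is explicitly conditioned on time, "held-out samples from the same conditional distribution" remains a well-defined thing to test against.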

Intra-distribution generalization seems like the only rigorously defined kind of generalization we have. Can you provide any references that describe this other kind of generalization? I'd love to learn more.


jebarker 2 days ago
> Do they merely memorize training data and reread it out loud, or are they picking up the rules of English grammar and the syntax of C language?

This is a false dichotomy. Functionally, the reality is somewhere in the middle. They "memorize" training data in the sense that the model is fit to minimize loss on those points, but at test time they are asked to interpolate (and extrapolate) to new points. How well they generalize depends on how well interpolation between training points works. If it reliably works, then you could say that the interpolation is a good approximation of some grammar rule, say. It's all about the data.
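
As a toy illustration of that (my own made-up curve fit, nothing LLM-specific): fit a flexible model to noisy samples of an underlying rule, then compare held-out error at points between training samples with error at points outside the training range.

```python
# Toy sketch of the interpolate-vs-extrapolate point (illustrative only):
# fit a flexible model to noisy samples of a "rule" (here sin x), then
# compare error on held-out points inside the training range against
# points outside anything seen during training.
import numpy as np

rng = np.random.default_rng(0)

# Training data: noisy samples of the underlying rule on [0, 6].
x_train = np.sort(rng.uniform(0, 6, 40))
y_train = np.sin(x_train) + rng.normal(0, 0.05, x_train.shape)

# "Memorize" the training data with a flexible fit (degree-9 polynomial).
model = np.poly1d(np.polyfit(x_train, y_train, deg=9))

# Interpolation: held-out points inside the training range.
x_interp = rng.uniform(0, 6, 200)
# Extrapolation: points outside the training range.
x_extrap = rng.uniform(7, 9, 200)

err_interp = np.mean((model(x_interp) - np.sin(x_interp)) ** 2)
err_extrap = np.mean((model(x_extrap) - np.sin(x_extrap)) ** 2)

print(f"interpolation MSE: {err_interp:.4f}")  # small: the fit tracks the rule
print(f"extrapolation MSE: {err_extrap:.4f}")  # large: memorization doesn't extend
```

Inside the training range the interpolation tracks the rule closely; outside it the fit falls apart, which is roughly the memorize-vs-generalize distinction made quantitative.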