weinzierl 18 hours ago
Inferring the Phylogeny of Large Language Models

weinzierl 22 hours ago
Not a native speaker here. Do you mean "proved" is preferred in a mathematical context?

weinzierl 2 days ago
"And even other than that, machine code isn't dynamically typed because it's not typed at all."

Well, we do have bytes, words, doublewords, and quadwords at the machine level. They are usually referred to as data types in the processor manuals.
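To make that concrete, here is a minimal C sketch (mine, not part of the original comment): the size categories that x86 manuals call byte, word, doubleword, and quadword, written as fixed-width C types. The compiler lowers each one to a differently sized load/store, which is the sense in which the hardware "knows" these data types.

    /* Illustrative only: mapping x86 manual terminology to C fixed-width types. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        uint8_t  byte_val  = 0xFF;                 /* byte:       8 bits  */
        uint16_t word_val  = 0xFFFF;               /* word:       16 bits */
        uint32_t dword_val = 0xFFFFFFFFu;          /* doubleword: 32 bits */
        uint64_t qword_val = 0xFFFFFFFFFFFFFFFFu;  /* quadword:   64 bits */

        printf("byte=%zu word=%zu dword=%zu qword=%zu (bytes)\n",
               sizeof byte_val, sizeof word_val,
               sizeof dword_val, sizeof qword_val);
        return 0;
    }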


weinzierl 2 days ago
I once sent a beer coaster from a holiday to a beer-loving friend, without an envelope, just with an address scribbled on it and a stamp. We were both surprised it worked.

Also, in the late 90s I remember my favourite computer mag printing a picture of a 5 1/4 inch floppy that had been sent to them, complete with a postmarked stamp. Allegedly it survived the procedure.


weinzierl 2 days ago
Ask HN: How many LLMs have been trained from scratch?
Building an LLM from scratch is a huge effort, which is why I assume it hasn't happened very often. If we exclude toy and research models and restrict the scope to language models and multimodal models with a language component, do we know how many models have been trained from scratch so far?