Immediately after navigating to another page, it works via CSR. Reload your browser to see the difference.
How can a large egg (50 g) contain 147 g choline?
https://www.opennutrition.app/search/eggs-eeG7JQCQipwf
Additionally, on https://www.opennutrition.app/search/brown-lentils-VwKWF7CQq... it says:
> Unlike larger legumes, they require no pre-soaking and cook in 20-30 minutes, making them ideal for soups, stews, and salads
That is not necessarily true. In my experience, brown lentils do require pre-soaking; otherwise you have to cook them for a long time. Red lentils, by contrast, are done in under 15 minutes with no pre-soaking, although they taste more like yellow peas.
In any case, I think this could be really useful, once accurate enough. One could even implement other features on top, such as a calorie tracker and so forth, but that is a huge project on its own.
I wish you luck!
The first item I manually looked up has about double the calories listed in the "dataset" versus reality: Honey Bunches of Oats, Honey Roasted.
>> Foods discovered via these searches are fed back into the database,
Aren’t LLMs also unreliable? How do you ensure the new content is from an authoritative, accurate source? How do you ensure the numbers that make it into the database are actually what the source provided?
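At minimum, a purely mechanical plausibility check would catch entries like the egg example above, independent of any LLM. A minimal sketch of what I mean (the entry schema, field names, and units here are my own assumptions, not the site's actual data model):

```python
# Hypothetical sanity check for a nutrition entry: no single nutrient,
# and no sum of nutrients, can weigh more than the serving itself.
# The dict layout below is an assumption for illustration only.

UNIT_TO_GRAMS = {"g": 1.0, "mg": 1e-3, "ug": 1e-6}

def validate_entry(entry: dict) -> list[str]:
    """Return a list of problems found in one food entry."""
    problems = []
    serving_g = entry["serving_g"]
    total_nutrient_g = 0.0
    for name, (amount, unit) in entry["nutrients"].items():
        grams = amount * UNIT_TO_GRAMS[unit]
        if grams > serving_g:
            problems.append(
                f"{name}: {amount} {unit} exceeds the {serving_g} g serving"
            )
        total_nutrient_g += grams
    if total_nutrient_g > serving_g:
        problems.append("nutrient masses sum to more than the serving mass")
    return problems

# The impossible egg: 147 g of choline in a 50 g egg (should be ~147 mg).
egg = {
    "serving_g": 50,
    "nutrients": {"choline": (147, "g"), "protein": (6.3, "g")},
}
print(validate_entry(egg))
```

This catches only one class of error, of course; it says nothing about values that are wrong but physically possible, which is exactly where comparison against an authoritative source would be needed.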
According to the Methodology/About page
>> The LLM is tasked with creating complete nutritional values, explicitly explaining the rationale behind each value it generates. Outputs undergo rigorous validation steps,
Those rigorous validation steps were also created with LLMs, correct?
>> whose core innovations leveraged AI but didn’t explicitly market themselves as “AI products.”
Odd choice for an entirely AI-based service. The first thought I had after reading that was: it must be because people don’t trust AI-generated information. It seems disingenuous to minimize the AI aspect in marketing when this product only exists because of AI.
Great idea though, thanks for giving it a shot!
> I wanted to investigate the possibility of using LLMs
ah, yeah, I guess it makes sense then...
This is not a dataset. This is an insult to the very idea of data. This is the most anti-scientific post I have ever seen voted to the top of HN. Truth about the world is not derived from three LLMs stacked on top of each other in a trenchcoat.