
Beyond Quacking: Deep Integration of Language Models and RAG into DuckDB
bob1029 9 days ago
You could quickly wire up one of the LLM APIs as an application-defined function using SQLite if you wanted to play around with the idea of very slow and expensive queries:

https://sqlite.org/appfunc.html

https://learn.microsoft.com/en-us/dotnet/standard/data/sqlit...

Maybe stick with the aggregate variety of function at first if you don't want any billing explosions. I'd probably begin with something like LLM_Summary() and LLM_Classify(). The summary could be an aggregate, and the classify could be a scalar. Being able to write a query like:

  SELECT LLM_Summary(Comment)
  FROM Users
  WHERE datetime(Updated_At) >= datetime('now', '-1 day');
is more expedient than wiring up the equivalent code pile each time. The aggregation method's internals could handle hierarchical summarization, chunking, etc. Or it could throw an error back to the user so they are forced to devise a more rational query.
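A minimal sketch of the idea above, using Python's sqlite3 module (which wraps the same application-defined-function API linked earlier): `LLM_Classify` is registered as a scalar function and `LLM_Summary` as an aggregate. The "LLM" here is a toy offline stand-in so the example runs; a real version would call an actual model API in `llm_classify` and in `finalize()`.

```python
import sqlite3

def llm_classify(text):
    # Stand-in for a real LLM call; a toy heuristic so this runs offline.
    return "positive" if "great" in (text or "").lower() else "neutral"

class LLMSummary:
    """Aggregate: collect rows in step(), 'summarize' once in finalize().
    A real implementation would chunk self.rows and do hierarchical
    summarization via the LLM here, keeping the billing to one pass."""
    def __init__(self):
        self.rows = []

    def step(self, value):
        if value is not None:
            self.rows.append(value)

    def finalize(self):
        # Placeholder summary; replace with the actual model call.
        return f"{len(self.rows)} comments collected"

conn = sqlite3.connect(":memory:")
conn.create_function("LLM_Classify", 1, llm_classify)
conn.create_aggregate("LLM_Summary", 1, LLMSummary)

conn.execute("CREATE TABLE Users (Comment TEXT, Updated_At TEXT)")
conn.executemany("INSERT INTO Users VALUES (?, datetime('now'))",
                 [("great product",), ("meh",)])

classified = [r[0] for r in
              conn.execute("SELECT LLM_Classify(Comment) FROM Users")]
summary = conn.execute(
    "SELECT LLM_Summary(Comment) FROM Users "
    "WHERE datetime(Updated_At) >= datetime('now', '-1 day')"
).fetchone()[0]
```

The aggregate shape is what keeps costs bounded: however many rows the `WHERE` clause matches, `finalize()` fires once per group, whereas the scalar `LLM_Classify` fires once per row.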

ofrzeta 9 days ago
Here is the implementation https://github.com/dsg-polymtl/flockmtl

geekodour 8 days ago
This paper, and solutions such as sqlcoder and https://doris.apache.org/zh-CN/blog/Tencent-LLM, let you query a DB via natural language.

But recently there has been a surge in MCPs being able to query databases, given the number of MCP servers popping up. An example: https://www.reddit.com/r/ChatGPTCoding/comments/1jd9lfa/lear...

So I was wondering if things like the Doris blog post, this paper, and sqlcoder are still relevant, and what extra this approach offers vs. trying to build on top of MCP?


simlevesque 9 days ago
I'll try this at work tomorrow!