Immediately after a client-side page transition, it works with CSR.
Reload your browser to see how it behaves.
It's meant for embedding, tinkering, and learning.
To that end, 4GB of RAM on an AI accelerator board is fine - the expected workloads will not consume much RAM. For the same reason, the lack of NVMe is not a problem either.
For more "horsepower," there is also the BeagleBone AI-64[1], which claims up to 8 TOPS.
I think most of us here are familiar with the fact that amd64 machines are made entirely from commodity parts: ATX cases, ATX power supplies, and so on. I wonder whether there's a similar commodification in the offing for the Pi form factor?
So, reach out & it'll be there?