There are some use cases for quantum computers, and we are pretty sure QCs can do things classical ones can't, but the set of known applications is somewhat limited. In fact, there is an XPRIZE for coming up with applications [1].
You can separate use cases by whether they require fault tolerance. Fault tolerance is still very far away: even a single fault-tolerant qubit (error rate of, say, 10^(-14)) is a long way off, let alone thousands or millions of them. We only got working error correction (i.e., an error-corrected qubit with a lower error rate than a physical one) this August [2], and it's still not a net win once you account for the error-correction overhead. Challenges include scaling up manufacturing, control, cooling, etc.
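To give a feel for why a 10^(-14) logical qubit is so far away, here's a back-of-envelope sketch using the standard surface-code scaling heuristic. The threshold, prefactor, and physical error rate below are illustrative assumptions, not measured numbers:

    # Back-of-envelope: surface-code distance needed for a ~1e-14 logical
    # error rate, using the standard heuristic
    #     p_logical ~ A * (p_physical / p_threshold)^((d + 1) / 2)
    # The threshold (~1e-2) and prefactor are illustrative assumptions.

    def logical_error_rate(p_physical, d, p_threshold=1e-2, prefactor=0.1):
        """Heuristic surface-code logical error rate per cycle at code distance d."""
        return prefactor * (p_physical / p_threshold) ** ((d + 1) // 2)

    p_phys = 1e-3  # optimistic physical two-qubit error rate
    for d in (3, 7, 11, 15, 19, 23, 27):
        n_phys = 2 * d * d  # roughly 2*d^2 physical qubits per logical qubit
        print(f"d={d:2d}  ~{n_phys:4d} physical qubits  p_L ~ {logical_error_rate(p_phys, d):.0e}")

Under these assumptions, even with optimistic 10^(-3) physical gates you need code distance ~25+, i.e., over a thousand physical qubits per single logical qubit.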
Without fault tolerance you're in what's called NISQ (noisy intermediate-scale quantum), where you just live with errors instead of correcting them. Right now I'd guess we're at a few thousand total gates we can apply before noise kills us, maybe 10,000 in the very best case (a sketch of where that budget comes from follows below). Moreover, the currently leading approach, superconducting qubits (Google, IBM), has fixed connectivity, and applying a gate to two disconnected qubits incurs extra overhead. Alternative approaches (Quantinuum's trapped ions, QuEra's cold atoms) don't have these limitations but have others.
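A minimal sketch of the gate budget, assuming independent gate errors (the rates are illustrative):

    # Why the budget is "a few thousand gates": with per-gate error p, the
    # chance a circuit of G gates runs error-free is roughly (1 - p)^G,
    # i.e. it decays like exp(-p * G). Gate error rates here are assumed.

    def circuit_fidelity(p_gate, n_gates):
        """Crude estimate: probability that no gate in the circuit fails."""
        return (1.0 - p_gate) ** n_gates

    for p_gate in (1e-2, 1e-3, 1e-4):
        budget = int(1 / p_gate)  # gate count where fidelity falls to ~1/e
        print(f"p_gate={p_gate:.0e}: ~{budget:6d} gates -> fidelity ~ "
              f"{circuit_fidelity(p_gate, budget):.2f}")

So today's roughly 10^(-3) two-qubit error rates put you in the low thousands of gates, and connectivity-induced SWAP overhead eats into that budget further.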
The killer application for NISQ, in my opinion, is quantum simulation, i.e., simulating quantum systems with quantum computers. Our ability to simulate quantum systems classically is fundamentally limited by the exponential growth of the state space; smarter algorithms have their own limitations, and you always have to simplify the model significantly. There may be real insight to gain from simulating more complicated models. One obvious use is chemistry, but classical chemistry algorithms are very well developed, and classical computers are huge compared to quantum ones. Chemistry simulations also require relatively deep circuits (N^3 scaling, I believe), which runs straight into the main limitation of NISQ. Simulations of physics models (say, the Hubbard model) are probably easier, but using those to justify the billions spent would be hard. In my opinion these applications are pretty close (a matter of years) to a real advantage (i.e., solving a problem in this domain that nobody can solve classically), though that is the optimistic view.
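To make the exponential growth concrete, here is the brute-force state-vector cost (this is what limits exact classical simulation; clever methods like tensor networks do better on some systems, at the price of the model simplifications mentioned above):

    # The exponential wall for classical simulation: a full state vector
    # over n qubits stores 2^n complex amplitudes (16 bytes each at
    # double precision). Brute force only; no algorithmic tricks assumed.

    def statevector_gib(n_qubits):
        return (2 ** n_qubits) * 16 / 2 ** 30  # complex128 amplitudes, in GiB

    for n in (30, 40, 50):
        print(f"{n} qubits: {statevector_gib(n):,.0f} GiB")

Thirty qubits fit in a workstation's RAM; fifty need ~16 PiB, beyond any single machine.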
There are also applications where an advantage may exist but we aren't sure. Optimization is one example, though I think optimization on classical data will never close the vast resource gap (trillions of transistors in classical clusters vs. hundreds of qubits), even if there is an asymptotic advantage. Optimization on quantum data is probably better done on quantum computers, but again, there aren't many pressing problems there.
For the fully fault-tolerant regime, the best-known example is factoring. Shor's algorithm has strictly better complexity than any classical algorithm we have, good enough to expect that a realistically sized QC will one day beat any classical computer (a rough comparison below). That said, while this has clear security consequences, breaking some cryptographic schemes is not exactly the most exciting application IMHO, and people will switch to post-quantum crypto well before then (there was a recent work claiming to break an existing PQC scheme, but it had a bug [3]). There is also Grover's algorithm for search, but I doubt it yields practical advantages.
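The gap is easy to see from the textbook asymptotics. This is a sketch only: constants are ignored, and real resource estimates for Shor include error-correction overheads that add many orders of magnitude.

    # Rough asymptotics for factoring an n-bit number N. Classical best
    # (GNFS): exp(c * (ln N)^(1/3) * (ln ln N)^(2/3)) with c = (64/9)^(1/3);
    # Shor on a fault-tolerant QC: ~n^3 gates. Illustrative only.

    import math

    def gnfs_ops(n_bits):
        ln_n = n_bits * math.log(2)  # ln of the number being factored
        c = (64 / 9) ** (1 / 3)
        return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

    def shor_gates(n_bits):
        return n_bits ** 3

    for bits in (1024, 2048, 4096):
        print(f"{bits}-bit: GNFS ~ {gnfs_ops(bits):.0e} ops, Shor ~ {shor_gates(bits):.0e} gates")

For 2048-bit RSA this is roughly 10^35 classical operations against ~10^10 logical gate operations, which is why factoring is the canonical fault-tolerant application.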
All in all, this may sound a bit pessimistic, but I think that if the technology arrives, people will figure out how to use it.
[1] https://www.xprize.org/prizes/qc-apps
[2] https://arxiv.org/abs/2408.13687
[3] https://eprint.iacr.org/2024/555