Software has been free-riding on hardware improvements for a few decades, especially web and desktop apps.
Moore's law has been both a blessing and a curse.
The software you use today was written by people who learned their craft while that free ride was still fully underway.
Developers certainly like having a completely integrated, connected, and universal computing platform (the web). Users, meanwhile, do not seem to care much about performance as long as it is good enough, and that is exactly the standard that gets set: software is allowed to be just bad enough not to seriously annoy anyone. Management doesn't care either; building good software isn't a priority when good-enough software has already shipped.
Sure, I would like things to be different, but until some group decides a drastic departure is necessary, nothing will change. There is no real incentive for change from any side.
The reality is that these are all business decisions:
1) Move to the cloud, because the business likes the steady revenue of subscriptions. Business customers love not having to hire IT teams, and they can demand six 9s of uptime because it is now someone else's responsibility. Performance only needs to be acceptable to end users.
2) Customers refusing to upgrade on-premises software, which led to long maintenance cycles and endless patches.
3) Developing once for the web vs. multiple times for different platforms, each needing its own developers and testers.
No amount of expertise on the part of developers is going to address these fundamental forces.