That seems naive, in a status-quo-bias way, to me. Why, and at what point, do you expect AI progress to stop? It sounds like somewhere very close to where we are now, in your view. Why do you think there won't be many further improvements?
$3k is already close to the price of a 5090 alone, so a reasonably fast machine with far more (effective) VRAM (128 GB) sounds great for a hobbyist, assuming you can easily run and train Llama, Stable Diffusion, and similar VRAM-hungry projects.