AMD already has 7nm chips, both CPU and GPU. On the GPU side alone, the RX 6000 series roughly halved power consumption generation over generation.
ARM chips have been viable on the desktop for over half a decade. The only reason they didn’t go mainstream is that Apple wasn’t going to use anything but its own chips, and the Windows ecosystem is still too dependent on Intel (so anything there was doomed to fail).
ARM performing well is not really a headline. Bringing full macOS to ARM was the real headline. Apple’s developer ecosystem is pretty good at following Apple wherever they go. The Windows developer ecosystem isn’t as enthusiastic (in that way). Practically no one bought into Windows on ARM. It’s going to take Microsoft half a decade to get the kind of developer uptake Apple achieved within a month of the M1’s release.
Windows has a huge legacy of backward and forward compatibility. Users on that platform like being able to take software they paid for and run it without forced upgrade fees.
Users on macOS are - at this point - pretty used to Apple breaking things on a yearly cycle, and developers releasing “compatible” versions as paid upgrades more than occasionally.
This is the thing that made me move off macOS and back to Windows. On macOS, I was getting nickel-and-dimed on a yearly basis to keep software up to date and compatible. The licenses weren’t super expensive: $20 here, $49 there… But it adds up, so the long-term cost of owning a Mac was too high to justify.
On Windows, I just keep using the same version that works well enough for me, without worry.
There are software packages I own for both macOS and Windows, and the macOS version requires a forced upgrade just to run on Big Sur, while the Windows version just keeps on trucking - even on Windows 11.
So, I see the savings as more than just the difference in hardware cost at time of purchase. They grow the longer I own the machine (or, more specifically, the longer I use Windows instead of macOS).