I want to add that this is a rough time for building or buying a PC for almost anything, including gaming. We're seeing the slowest standards transition in the last 30 years, largely because of the bus bottleneck surrounding all of these new standards. You have new-gen DDR4, SATA 3.2, USB 3.1, PCIe 4, DirectX 12/OpenGL 4.5, and the slow transition to 4K and 8K displays (which means DisplayPort vs. HDMI)... all on early implementations. Where the bus bottleneck comes in is that the CPU has to actively re-address all of its interrupt lanes, which is both a power issue and a performance issue. So everything on your board will be bottlenecked either by the immature standards or by add-on chips/cards that aren't taking advantage of them.
We saw something like this in '93 and 2002. And as we know, by the time the big bus standards matured, we'd had a few hardware renaissances that stirred up the PC world. By 1995, 32-bit made a major swing when the Pentium added true floating point to an affordable CPU and made it attractive; then in 2003, 64-bit made a major swing when AMD pushed the first successful 64-bit consumer x86. Both events happened because of other events, not because companies simply figured it out on their own. The P5 architecture had to drop most of its RISC coprocessing to make it work, after a 4-year design battle with the entire standards community (the community won, Intel listened, and this is where RISC vs. CISC started to die as an argument for mainstream CPU futures). And when the DEC Alpha team disbanded, they regrouped at AMD, because they did not like Intel's shady business practices, to produce the 64-bit solution. They were the best 64-bit CPU designers in the world at the time, which is why David beat Goliath for a few years there.

Right now, nobody, and I mean NOBODY, exists to push a new idea through. Intel is back to its old practices that piss off the open hardware community, and AMD isn't sure whether it's focusing on mobile or desktop.
A 128-bit CPU is not the solution this time; it's a different beast. Although a 128-bit CPU could help the hardware side in some areas, it would hurt just as much, and it would also create a huge software problem. So it's possible that the asynchronous multi-architecture CPU will be the realistic solution, despite being one that chip-makers have tried to avoid for the last 10 years (well, except NVIDIA and the Tegra). And TBH, I doubt we'll see a true async CPU hit the mainstream either. So we're waiting on more expensive, but known, solutions to become cheaper.
TL;DR, or too much nerd...
Sorry to Monk, or anyone who has been wanting to get into a max-settings PC gaming rig. You're basically going to buy high right now for hardware that can become severely outdated within a year. The simple "drop in a new GPU" argument will not future-proof you very well this go-around (unless you want to convince yourself it's all you need and be stubbornly happy with what you have, as most people will need to do in order to justify a premature big purchase). /end of TL;DR moment
But a budget gaming rig, accepting that you won't be getting max settings (still better graphics than a console), is a smarter build. Half the budget now, half the budget later, that type of thinking. While this is always a good method, it's especially true until all of those standards have settled. The good news is that nearly every one of these standards changes will bring component pricing down, especially where open hardware standards exist, like USB 3.1, SATA 3.2, and DisplayPort 1.3 (which should become the industry standard as well, because HDMI is pretty limited and comes with pricey licensing fees for manufacturers).
Then we have the substrate issue and the lithography issue... but that will be the next major breakdown of the computing industry, around 2022 or so. Even the possibility of quantum computing as a consumer-level purchase will run into the exact same issues.