For laughs I tried it on my workstation -- a Celeron J1900 with a GTX 560 -- and it ran BETTER than my media center / gaming machine, an i7 4770K with a GTX 1070... so I started playing with hardware configs from my scrap pile and found something odd.
The crappier the CPU, the LOWER the GPU load and the better the game performs... AHA, Jason said, since I've seen this behavior before -- but only in single-threaded games. Lo and behold, ANOTHER telltale is present: even the MENUS max out the CPU and GPU, and the more powerful the CPU, the more overloaded the GPU becomes.
Physics, animation, and input are all tied to the frame rate instead of running off a separate event loop or timer. Solution?
Turn vSync on.
BOOM: the media center is using only 20-25% CPU, evenly distributed across all cores, GPU use at ultra settings in 4x DSR (3840x2160) is only 45%, and framerates are rock solid with no animation dropouts.
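For anyone wondering why vSync alone tames the load: with everything chained to the frame rate, an uncapped loop free-runs as fast as the hardware allows, so a stronger CPU just shovels more frames at the GPU. vSync effectively caps the loop and idles out the rest of each frame slot. A toy Python sketch of the difference (all names are mine, purely illustrative, not the game's code):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # one 60 Hz frame slot, ~16.7 ms

def run_frames(n_frames, capped=True):
    """Run n_frames of a toy game loop.

    capped=True sleeps out the remainder of each frame slot
    (roughly what vSync does); capped=False free-runs, which on
    real hardware pegs the CPU and GPU at 100%.
    """
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        # ... simulate/render work would go here ...
        if capped:
            remaining = FRAME_TIME - (time.perf_counter() - frame_start)
            if remaining > 0:
                time.sleep(remaining)  # hardware idles here instead of spinning
    return time.perf_counter() - start
```

With no real work in the loop, the uncapped version finishes almost instantly (burning cycles the whole way on real hardware), while the capped one takes n_frames / 60 seconds and sleeps for nearly all of it.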
Normally I turn vSync on as a matter of habit, and I assumed it was on since I have it enabled as the default in the nVidia control panel, but for some reason the GeForce Experience default profile for this game forces it OFF. Turn it on in game, and boom: it runs fine. No more strange pauses and massive dropped frames (to the tune of a second or more) in the animations, no more crashes, no more bringing a machine five times more powerful than the game should need to its knees by drawing max power on every piece of hardware.
Seriously, chained to the frame rate when it's multithreaded?!? What is this, 2003? OH wait... Unity... and that's where I saw this behavior before -- Firewatch.
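For the record, the textbook fix -- which Unity itself supports via FixedUpdate and a fixed delta time -- is to run the simulation on a fixed timestep and let rendering happen at whatever rate it can. A rough Python sketch of the accumulator pattern (all names are mine, purely illustrative):

```python
DT = 1.0 / 60.0  # fixed simulation step, independent of render rate

def simulate_step(state, dt):
    # Toy physics: constant-velocity integration.
    state["x"] += state["vx"] * dt
    return state

def run(render_frame_times, state):
    """Feed in variable per-frame durations (fast or slow renderer);
    the simulation still advances in fixed DT steps via an accumulator,
    so game speed no longer depends on frame rate."""
    acc = 0.0
    for frame_dt in render_frame_times:
        acc += frame_dt
        while acc >= DT:
            simulate_step(state, DT)
            acc -= DT
        # A renderer would draw here, interpolating by acc / DT.
    return state
```

Run it with ten 100 ms frames or twenty 50 ms frames -- one wall-clock second either way -- and the object ends up in the same place, which is exactly the property this game is missing.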