I think I was running at native resolution, which is much higher than what you're running at.
Edit: OK, I actually tried running it again at 1080p, and it's faster than I remember; maybe there was some optimization along the way, or something else changed. I still get really bad frame drops when the temperature drops: it runs at 45 fps, then falls to around 10 fps for 10-15 seconds while the temperature is dropping.
I actually checked CPU and GPU usage: the CPU isn't maxed out at all, while the GPU sits at 100% the whole time. Even so, maybe a faster CPU would indeed help; perhaps at around 45 fps it starts to become CPU bound?
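In case anyone wants to check the same thing on their machine: Activity Monitor's GPU History window shows GPU load over time, or you can sample it from a terminal with the built-in powermetrics tool. A rough sketch (needs sudo; the one-second interval and single sample are just example values):

    # Rough sketch: print one GPU utilization sample on macOS via powermetrics.
    import subprocess

    subprocess.run(
        ["sudo", "powermetrics", "--samplers", "gpu_power", "-i", "1000", "-n", "1"],
        check=False,  # just show whatever powermetrics reports
    )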
Edit 2: At native resolution it runs at 30 fps and drops to 4 fps when the temperature drops.
Edit 3: Running at 960x600 keeps it at 50 fps, even when the temperature drops. GPU usage is still at maximum.
Edit 4: Found the culprit: global illumination is the setting that makes the fps tank when the temperature drops. Can you check whether it happens for you as well on the M2 Ultra?
I do notice improved performance with that off, but only because the Metal FPS overlay shows it dropping from 100+ to ~60 fps otherwise. It's not really noticeable on a 60 Hz monitor either way.
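For anyone who wants the same overlay: it's Apple's Metal performance HUD, which I believe can be turned on with the MTL_HUD_ENABLED environment variable before launching the game. A rough sketch, with the app path as a placeholder for wherever your copy lives:

    # Rough sketch: launch the game with Apple's Metal performance HUD (FPS overlay) enabled.
    import os
    import subprocess

    env = dict(os.environ, MTL_HUD_ENABLED="1")  # turns on the Metal HUD overlay
    # Placeholder path; point this at the actual game binary inside the .app bundle.
    subprocess.run(["/Applications/Frostpunk.app/Contents/MacOS/Frostpunk"], env=env, check=False)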
So an M2 Ultra gets 100+ fps at 1080p? Nice! Yeah, makes sense that it's not noticeable then. I tried the game through CrossOver, and that setting doesn't cause slowdowns there, so it looks like a bug in Frostpunk's Metal code.