I'll believe they actually optimized their PC port when their PC ports start showing some signs of effort at being actual PC ports
No FOV slider after FO76 had one is, to me, a great sign of how little Bethesda actually cares about the platform that keeps them popular (thanks to mods)
They don't want to put the work in for the biggest benefit of PC gaming.
I don't think any PC should be able to run a game well at max settings on release. Leave room for it to still look good five years from now.
And have the bare-bones settings work on shitty systems that do need upgrading.
Bethesda just wants to make games that run on a very small slice of PCs, and who can blame them when they can't even release a stable game on a single console? They're just not good at it.
How would you expect them to develop a game targeted at hardware they can't test on because it doesn't exist yet? The latest PC hardware should be able to run max or near max at release.
Yeah. But at least you can use the console command (~ tilde, as usual) to change the FOV.
The default values are 85 for first person and 70 for third person.
The range is 70-120.
The game, without DLSS mods, runs at 30fps with some stutters on my system using the optimized settings from Hardware Unboxed (linked) on a 4060 Ti. If I install the DLSS-FG mods I immediately get 60-90 fps in all areas; that alone should tell you everything you need to know.
Here's the rub: I'm not an FPS whore. It's generally a good experience for this game at 30 FPS, assuming you use a gamepad/Xbox controller. With KB+M it gets really jittery and there's input lag. The game was clearly playtested only with a gamepad. The reactivity of a mouse for looking is much different, and the lower FPS the game is optimized for becomes harder to digest.
I have also tested on my 1650 Ti Max-Q and a 1070. Both the 4060 Ti and the 1070 were on an eGPU.
My system has an 11th-gen i7-1165G7 and 16 GB of DDR4 RAM. I play at 1080p in all cases.
For the 1650 Ti and the 1070, the game runs fine IF I do the following:
set the preset to Low (expected) and THEN turn scaling back up to maximum/100% and/or disable FSR entirely
set indirect shadows to something above medium (which allows the textures to render normally, otherwise they are blurry)
Even on the 4060 Ti, it says it's using 100% of the GPU, but it's only pulling like 75-100 W, when it would normally pull 150-200 W under load, easy.
TL;DR: this game isn't optimized, at least for NVIDIA cards. They should acknowledge that.
I can at least change the FOV with an ini file edit, but there's no way to adjust the horrible 2.4 gamma lighting curve they have... It's so washed out on my display, it's crazy!
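For anyone else who wants the same workaround: the commonly shared ini tweak is to drop something like this into StarfieldCustom.ini under Documents\My Games\Starfield. I haven't verified the key names against anything official, so treat them as community-reported values rather than documented settings.

```ini
; Community-reported FOV override (key names/location as shared on forums, not official docs)
[Camera]
fFPWorldFOV=100   ; first-person FOV (default reportedly 85)
fTPWorldFOV=100   ; third-person FOV (default reportedly 70)
```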
I would agree. They should acknowledge it's not well optimized and say they're working on fixing it, especially with Nvidia cards. It rubs me the wrong way that they're in denial here, especially given their rocky release history.
Heck, I think that's 50% of the reason they didn't even want co-op or any netcode. FO76 was a nightmare on release largely because of that.
While I'm probably in a small demographic here, I'm sitting with a perfectly capable PC, not a gaming supercomputer or anything but a healthy upgrade from standard, and when I started hearing about Starfield I got really excited.
...then I saw all this stuff about compatibility and performance, and when I tried to see where my equipment stood, I was left with the impression that if I wasn't a PC building enthusiast, it was going to be hard to figure it out, and if my computer wasn't pretty high end and new, it was probably going to struggle.
And now hearing more about the performance issues, I've basically given up even considering the game based on possible performance and compatibility issues.
This seems to be the new normal though unfortunately.
It's normal that a 5-10 percent generational improvement on a product (6800 XT -> 7800 XT) is celebrated and seen as a good product, when in reality it's just a little less audacious than what we've had to put up with before, and than what Nvidia is doing.
It's normal that publishers and game studios don't care about how their game runs; even on console, things aren't 100 percent smooth, and the internal rendering resolution is often laughably bad.
The entire state of gaming at the moment is in a really bad place, and it's just title after title after title that runs like garbage.
I hope it'll eventually stabilize, but I'm not willing to spend thousands every year just to be able to barely play a stuttery game at 60 fps at 1440p. Money doesn't grow on trees, though AAA game studios certainly seem to think it does.
Yes, GI and baked-in RT/PT are expensive, but there need to be alternatives for people with less powerful hardware, at least until accelerators are powerful and common enough in lower-end cards to make it a non-issue.
The Nvidia issue might be more Nvidia's fault. All three major GPU companies (including Intel now) work with major publishers on making games work well with their products. This time around, though, Nvidia has been focused on AI workloads in the datacenter--this is why their stock has been skyrocketing while their gaming cards lower than a 4080 (i.e., the cards people can actually afford) have been a joke. Nvidia didn't work much with Bethesda on this one, and now we see the result.
Nvidia is destroying what's left of their reputation with gamers to get on the AI hype train.
They didn't optimize it for consoles either. The Series X has roughly the graphical grunt of an RTX 3060, yet it's capped to 30fps and looks worse than most other AAA games that have variable framerates up to 120fps. Todd says they went for fidelity. Has he played any recent titles? The game looks like crap compared to many games from the past few years, and requires more power.
The real reason behind everything is the shit they call Creation Engine. An outdated hot mess of an engine that's technically behind pretty much everything the competition is using. It's beyond me why they've not scrapped it - it should have been done after FO4 already.
Look, I agree with everything in the first paragraph, and the CE does seem to have a lot of technical debt that particularly shows in Starfield, which is trying to do something different than the other games. The engine being "old" is a bad argument though (I know you didn't make it, but it's often said), and it being "technically behind" other engines isn't really true in all ways.
The Creation Engine has been adapted by Bethesda to be very good at making Bethesda games. They know the tools and the quirks, and they can modify it to make it do what they want. It has been continuously added onto, just as Unreal Engine has been continuously added onto since 1995. The number after the engine name doesn't mean anything besides where they decided to mark a major version change, which may or may not include refactoring and things like that. My guess is that CE2 (Starfield's engine) is only called CE2 because people on the internet keep saying the engine is old while telling them to use UE (which is approximately the same age as Gamebryo), an engine that just adds a number to the end of its name.
Correct me if I'm wrong, but don't they limit frametimes so they can reduce TV stuttering? The NTSC standard for TVs is 29.97 or 59.94 fps. I assume they chose 30fps so it can be used more widely, and if it's scaled to 60 it would just increase frametime lag. Again, I'm not sure.
Also, comparing CE2 to CE1 is like comparing UE5 to UE4. Also, I don't remember, but doesn't Starfield use the Havok engine for animations?
Edit: rather than downvote, just tell me where I'm wrong.
Not to put too fine a point on it, but you're wrong because your understanding of frame generation and displays is slightly flawed.
Firstly, most people's displays, whether a TV or a monitor, are capable of at least 60Hz, which it seems you correctly assumed. That said, most TVs and monitors aren't capable of what's called variable refresh rate (VRR). VRR allows the display to match however many frames your graphics card is able to put out instead of the graphics card having to match your display's refresh rate. This eliminates screen tearing and gives you the best frame times at your disposal, as each frame is generally created and then immediately displayed.
The part you might be mistaken about, from my understanding, is the frame time lag. Frame time is the inverse of FPS: the more frames generated per second, the less time between frames. Now, in circumstances where there is no VRR and the frame rate does not align with a display's native rate, there can be frame misalignment. This occurs when the monitor is expecting a frame that is not yet ready. It'll keep showing the previous frame, or part of it, until a new frame becomes available to be displayed. This can result in screen tearing or stuttering, and yes, in some cases this can add additional delay between frames. In general, though, a >30 FPS framerate will feel smoother on a 60Hz display than a locked 30 FPS, where you're guaranteed to have every frame displayed twice.
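If it helps to see the numbers, here's a toy sketch I put together (my own illustration, assuming a fixed 60 Hz panel and a triple-buffering-style model where a finished frame goes on screen at the next refresh tick) of how frame time relates to FPS, and why an uncapped 45 fps paces unevenly while a locked 30 fps holds every frame for exactly two refreshes.

```python
import math

REFRESH_HZ = 60.0                 # fixed-refresh display, no VRR (assumed for illustration)
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between refreshes

def frame_time_ms(fps: float) -> float:
    """Frame time is just the inverse of the frame rate."""
    return 1000.0 / fps

def hold_pattern(fps: float, frames: int = 9) -> list:
    """How many refresh cycles each frame stays on screen on a fixed 60 Hz panel,
    assuming rendering is never blocked by the swap (triple-buffering style):
    a finished frame goes up at the first refresh tick at or after it is ready."""
    shown_at = [
        math.ceil(i * frame_time_ms(fps) / REFRESH_MS - 1e-9) * REFRESH_MS
        for i in range(1, frames + 1)
    ]
    return [round((b - a) / REFRESH_MS) for a, b in zip(shown_at, shown_at[1:])]

for fps in (30, 45, 60):
    print(f"{fps:>2} fps = {frame_time_ms(fps):5.2f} ms/frame, hold pattern: {hold_pattern(fps)}")
# 30 fps = 33.33 ms/frame, hold pattern: [2, 2, 2, 2, 2, 2, 2, 2]  <- perfectly even pacing
# 45 fps = 22.22 ms/frame, hold pattern: [1, 1, 2, 1, 1, 2, 1, 1]  <- uneven (judder), but lower latency
# 60 fps = 16.67 ms/frame, hold pattern: [1, 1, 1, 1, 1, 1, 1, 1]
```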
Todd said they capped it at 30 for fidelity (= high quality settings). The Series X supports variable refresh rate if your TV can utilize it (no tearing). The Series X chooses an applicable refresh rate, which you can also override. All TVs support 60Hz, many support 120Hz, and VRR is gaining traction, too.
Take Remnant II: it has settings for quality (30), balanced (60), and uncapped - pick what you like.
CE is still CE; the same floaty NPCs, hitting through walls, and bad utilisation of hardware have been there for ages. They can't fix it, so it's likely tech debt. They need to start fresh or jump to an already working modern engine.
That's for movies - I don't remember why - but films can be fine at 30fps. Games are kinda horrible at 30fps, and all TVs I know of have a 60Hz or higher refresh rate for PC signals.
What... are you doing in the background? I've got a 3070 and a 4K monitor, and I get between 50 and 60 FPS with all the settings I can fiddle with enabled. I use RivaTuner to pipe statistics to a little application that drives my Logitech G19 with a real-time frame graph, CPU usage, memory load, and GPU load, and the game uses multiple cores pretty well and generally makes use of the hardware.
-- edit --
Thanks for pointing out I made totally the wrong comment above, changing the meaning of my comment 180°...
Well I get that framerate with everything maxed at 3440x1440. I have turned things down to get a higher framerate, but it shouldn't be struggling. I don't have anything else running other than the usual programs that stay open.
Look man, I'm not trying to defend Howard here or imply you're tech illiterate, or that all your issues will clear up 100%, but have you by chance updated your drivers? Mine were out of date enough that Starfield threw up a warning, which I ignored, and I was not having a good experience either, same as you (I dunno your RAM or storage config, but I was running on an average NVMe with 32 gigs of RAM, a 5600X, and a 6700 XT). But after I updated, a lot of the issues smoothed out - not all, but most. I'm at 60 fps average with a mix of medium/high at 1080p, though. Maybe worth a try?
I'm on a 5700 XT and the game runs around 60 fps at 1080p with everything cranked and no FSR or resolution scale applied, so I'd say either your drivers are out of date or something else is wrong there, imo.
I'll have to double check; nothing immediately stands out as wrong. The game is on an NVMe, I'm running 3600 MHz CL14 memory, and I just redid the thermal paste on my CPU. With all that being said, most other games I play get 100+ fps, including Forza Horizon 5 with its known memory leak issue and Battlefield 5, so I don't think anything is wrong with the system.
While the frame rate I've been getting is not at all consistent, I do get 45-90 fps, which is quite playable with Freesync. Running 3840x1600 w/5800X3D and 6700XT. Not too crazy of a system. From my understanding, it's the 2000 and 3000 series Nvidia cards mainly having issues.
Good to hear. From what I've been reading, there are a lot of users (maybe just a vocal minority?) with 2000 and 3000 series cards that are performing poorly, with the same GPU getting similar results at 1440p high and at 1080p low. The graphical settings don't seem to affect performance for those users.
I had Freesync set to ON on my monitor and it caused a ton of flashing like a strobe light was on. When I turned it off it went away. Any idea what that could have been? I'd like to be able to use it.
My 2070 super has been running everything smoothly on High at 3440x1440 (CPU is a Ryzen 5600X, game installed on an M.2 SSD). I haven't been closely monitoring frames, but I cannot think of a single instance where it's noticeably dropped, which for me usually means 50+ fps for games like this. I may even test running on some higher settings, but I've done very little tweaking/testing.
I did, however, install a mod from the very beginning that lets me use DLSS, which likely helps a ton.
3080 here with zero issues. Running everything on ultra I get 50-60 fps consistently, with dips down to 30 once or twice briefly in really busy areas. That's at 2160p resolution, too.
Because everyone can run out and get a 4x because of Starfield. What a chode lol
When did I ever suggest anyone "run out and get a 4x"? Don't upgrade your GPU (assuming it's within the specified requirements); wait for patches, drivers, and the inevitable community patch.
They did lol, and that's a really dumb question by a tech illiterate. Optimization isn't a boolean state, it's a constant ongoing process that still needs more work.
They only meant to say that Bethesda did optimize throughout the development process. You can't do gamedev without continually optimizing.
That does not mean Bethesda pushed those optimizations particularly far. And you can definitely say that Bethesda should have done more.
Point is, if you ask "Why didn't you optimize?", Todd Howard will give you a shit-eating-grin response that they did, and technically he'd be right.
Also, optimization happens between a minimum and a maximum. If Bethesda defines that the game should have a certain minimum visual quality (texture and model resolution, render distance, etc.), that determines the minimum hardware required to handle it.
What modders have done so far is lower that minimum further (for example, by replacing textures with lower-resolution ones, as in the sketch below). That's a good way to make it run on older hardware, but it's not an optimization Bethesda would or should do, because that's not what they envisioned for their game.
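To make that concrete: a texture-reduction pack is conceptually just a batch downscale. This is purely an illustrative sketch - as far as I know, real Starfield textures are BC-compressed DDS files packed into BA2 archives and need dedicated tools, so the PNG format and paths here are hypothetical.

```python
# Conceptual sketch of a texture-downscale pass (paths and PNG format are hypothetical;
# actual Starfield assets are compressed DDS inside archives, handled by other tooling).
from pathlib import Path
from PIL import Image

def downscale_textures(src_dir: str, dst_dir: str, factor: int = 2) -> None:
    """Halve (by default) the resolution of every PNG texture in src_dir."""
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    for path in Path(src_dir).glob("*.png"):
        img = Image.open(path)
        smaller = img.resize((img.width // factor, img.height // factor), Image.LANCZOS)
        smaller.save(out / path.name)

# e.g. downscale_textures("textures_4k", "textures_2k")  # quarters the pixel count
```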
My entire comment is about why your response doesn't make sense: they do optimize, and it's not a process that's ever "done". The question is how optimized it is and whether it runs well on the targeted specs.
Games are somehow too CPU-heavy these days even though they aren't simulating the entire world like Kenshi does, just the stuff around you, so even though I upgraded my GPU I can barely get to 30fps.
I also had this problem with Wo Long, Hogwarts, and Wild Hearts.
This is what happens when consoles improve their CPUs.
Suddenly they've got more cycles than they know what to do with, so they waste them on frivolous unnoticeable shit. Now you don't have that extra headroom to get you from console 30fps to PC 60fps+. You're on a much more even footing than PCs ever were with the underpowered (even at release) PS4 and Xbox One.
You'll struggle to get a CPU that does double what a PS5 can, and if it's being held back by single-thread performance (likely), there's nothing you'll be able to do to get double that.
I agree. I have an i7-8700K and a 2080 Super, which I'd say are like mid- to high-level specs, and I have a terrible time running Wild Hearts and Starfield. Such a damn shame too; as a big MHW and MHR fan I was really looking forward to Wild Hearts and just couldn't run the game well at all. At this point I'm just not surprised when a triple-A game runs like dog water on my system; usually these games are free on Game Pass, I try them out, and 5 minutes later I uninstall.
I wouldn't consider an 8700K or a 2080 Super high-level specs, or even mid-level, right now.
Consider that an 8700K is slower than a 13400F today, which is considered the absolute lower end of the mid range; realistically a 13600K or 13700 is the mid range on the Intel side.
To be blunt, the 8700K is 5 years old.
As for the 2080S - well, look at this chart and make your own decision.
I think a lot of people are just not appreciative of how out of date their hardware is relative to consoles atm
I mean, it's Bethesda we're talking about. They've probably done enough optimization for it to run on consoles, but went completely unhinged on pc, expecting modders to fix their game. And they will. I'd just like a polished experience for 70€
Same - it's pretty much unplayable on an HDD, but buttery smooth on my SSD. I feel like there needs to be a rather loud PSA/notice about it on store pages.
I run Starfield on a 7700X with a 6750 XT, 32 GB, a Samsung SSD, and a FreeSync OLED, and it is not flawless at all. It's alright, but framerates are all over the place. In gunfights with many enemies the fps drops and input lag occurs, making it hard to aim, for instance. In denser environments it starts to drop frames à la Fallout 4, and in 2023 that is still a big problem, even on recent systems.
My friend runs it on a 7900 XTX, and although he has an awesome rig, Starfield is all over the place with fps. No steady frame rates. He can run it 'flawlessly' though, but that requires more than just an SSD - it takes a very expensive rig as well.
I find all the jokes in these comments about having 4090s particularly hilarious, considering that the problems seem to be with NVIDIA drivers and performance is better on AMD.
Part of the reason is that Starfield is CPU-intensive, and it's not often that CPUs get taxed like they do in Starfield. Nvidia GPUs have higher CPU overhead, especially the 4090 at any resolution under 4K.
I've played the game with the same CPU across three different Nvidia cards. My CPU is rarely above 50%... My GPU is constantly pegged; my best and most recent GPU (a new 4060 Ti 16GB) is pegged at 100% usage and its top clock speed but not pulling full wattage, which is quite odd. On day 1 I was using a Strix 1070 8GB and it was... rough. In fact, the dGPU on the laptop, a 1650 Ti Max-Q, largely outperformed it, mainly because it can do VRS.
The CPU, for reference, is an i7-1165G7 capped at 35 watts. It generally runs at 4.1 GHz at 60-65°C. So it's not a primo CPU compared to a desktop by any means, even for the 11th-gen series.
"fine" is subjective, I guess.
To me fine means above 60fps at all times.
I have an "UFO rated" computer on userbenchmark and it's rarely above 40fps. And the game is not even pretty. Not fine by my standards.
I'm playing with an R7 5700x and a 3060ti. NVMe SSD storage, 32gb RAM. Ultra settings at 1440p on an ultra wide monitor. I don't understand the fuss, it is completely playable for me. I guess my standards are just lower? I don't know. It's seriously fine. I try not to get wrapped up in actual numbers, I'm more concerned about how it looks and feels to my actual eyes. Not the prettiest game I've ever seen but it looks alright to me and it feels smooth to me.
Haha yeah I'm on the same and its great! I did have issues when I didn't realize I installed on HDD, but once I moved the install to my SSD, no problems at all on 2070.
Instead of cracking jokes he should improve the piss poor optimization.
Can't even render 50fps consistently on a Strix 3090 OC at 1620p (accounting for resolution scale). What a joke.
Edit: Scratch that, it's even worse - averaging around 40 fps with HUB Quality settings, so not even on Ultra, and my 12900K is nowhere near bottlenecking.
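For anyone wondering where the 1620p figure comes from: with a 4K output and a 75% render scale (the exact scale is my assumption for illustration), the effective render height works out to 1620.

```python
# Quick sketch: effective render resolution from output resolution and render scale.
# The 4K output and 75% scale are assumptions used to show where "1620p" would come from.
def render_resolution(width: int, height: int, scale: float) -> tuple:
    return round(width * scale), round(height * scale)

print(render_resolution(3840, 2160, 0.75))  # (2880, 1620) -> "1620p"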
I have an old FX-8350 processor, 16GB of RAM, and a 3060 RTX video card. I locked the FPS to 30 through Nvidia Control Panel and it runs pretty well, better than I expected on an old 8 (sorta 4) core processor from 2012. Before I locked it the FPS kept varying between 30 and 60 and that made me feel queasy.
The main thing I don't like is how faded/hazy many things look, needs some contrast or sharpening. I installed a mod but it didn't go far enough for my taste.
Sounds like you may want to look into reshade. There's a ton of reshade presets up on the Starfield nexus, many of which may make the improvements you're looking for.
Yep that will be my next step, too busy playing for now lol. Hopefully Bethesda will add some things like brightness/contrast, FOV, post-process level, texture quality, and sharpening.
I can get it to run just fine; it just looks like a game from the 1990s where the only colors used were brown, more brown, and poop brown. I'm sure it's a great game, but since the graphics make me throw up 20 minutes in and there are no accessibility options, I will never get to play it.
Lack of native HDR is pretty disappointing. I need to figure out how to get that working and maybe try a Reshade mod too. I can't stand wandering around New Atlantis in the day, the contrast is so bad it's very unpleasant to look at. It should be a beautiful awe-inspiring future city but the area right outside the lodge with the trees is just not great.
Saw that myself for the first time last night. Surprisingly atrocious. And WTH happened to the shadows around it? Did I just visit it on an overcast day?
The trees themselves would be right at home in a game from the early 2000s. Frickin' Planetside 2, the game infamous for its indestructible trees and graphics from 10 years ago, has better looking flora assets.
I will also go as far as to say it looks as if the game was designed for HDR, but due to lack of time, they just compressed the range, capped it about 10% below the normal maximum to leave some breathing room, and called it a day. Even the flashlight looks washed out at times.
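Purely to illustrate what I mean - this is a toy sketch of my guess, not Starfield's actual tonemapping: squeeze the whole range into a narrower band (the raised black floor is also just my illustrative assumption) and cap it about 10% below peak, and everything ends up sitting in a flat, washed-out band.

```python
# Illustrative only: a toy version of "compress the range and cap ~10% below max".
# Not Starfield's real tonemapper; just shows why blacks and highlights both look flat.
def compress_and_cap(value: float, cap: float = 0.9, floor: float = 0.1) -> float:
    """Map a 0..1 scene value into [floor, cap] instead of the full 0..1 range."""
    return floor + value * (cap - floor)

for v in (0.0, 0.5, 1.0):
    print(f"scene {v:.1f} -> display {compress_and_cap(v):.2f}")
# scene 0.0 -> display 0.10   (blacks never reach true black)
# scene 0.5 -> display 0.50
# scene 1.0 -> display 0.90   (highlights capped ~10% below peak)
```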
After some tweaking of settings, the game runs fine. I shouldn't HAVE to do those things, but it's doing fine now.
For reference: 3700X, RTX 2070S. The game is heavily GPU-bound. 50-60fps, no lag, no stutter, no whatever; honestly I never notice losing a couple of fps, and that's only in crowded cities/areas - elsewhere it goes higher.
What I did:
update drivers
installed the free DLSS mod from Nexus
most settings on medium, render scale 50%, shadows on low
After doing this, everything has been 100% fine and I haven't had any other issues. Before, I sometimes had a slideshow in New Atlantis, heavy stutter, and occasional massive fps drops. But these 3 things have 100% resolved my issues entirely.
Honestly it runs fine on my 5700 XT / R5 3600 combo. Not max settings - I set it to "high" from memory, as the game defaulted to the minimum for me, but I could bump it up no worries. No real frame rate or stuttering issues. I'd love to run it higher, but I'm a realist, and a new PC is on the cards anyway over the next year.
I actually turn the frame counter off. I know I'm not getting 60, but what I am getting is sufficient that it doesn't ruin my fun, and on my older hardware I'm OK with it. If I looked at the counter I would probably be more disappointed.
Ran just fine on an i5-6600K paired with a 6600 XT. Had it on max settings with FSR on at 60fps and it was quite playable, even with a Plex server running in the background.
Occasional crashes, but the 6600K is below the minimum spec, so I was surprised it even ran at all.
Consoles are essentially PCs locked down to gaming but they still have their own APIs and have very few hardware variations. Games can be optimised for the handful of different consoles in ways that just aren't possible with the thousands of combinations of PC components.
This would be insane. The majority of Steam users are running outdated hardware. Devs aren't going to cut their PC games down just to focus on the majority.
Nah, there's tons of things you can optimize independent of the hardware. The whole industry runs on smoke and mirrors, because even a 2D game can bring the strongest hardware to its knees if it's badly coded/unoptimized.
(Yes, I have experience with that. 🙃)
And there's always more smoke and mirrors you could be integrating to squeeze out more performance.
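A generic example of what I mean by hardware-independent smoke and mirrors - nothing Creation Engine specific, just the classic "don't do work for stuff the player can't notice", e.g. simple distance culling. The names here are made up for the sketch.

```python
# Generic sketch of a hardware-independent optimization: distance culling.
# Nothing engine-specific; just "skip expensive per-frame work for far-away entities".
from dataclasses import dataclass

@dataclass
class Entity:
    x: float
    y: float
    z: float

def visible_entities(entities, camera: Entity, max_dist: float):
    """Only hand entities within max_dist of the camera to the renderer/AI tick."""
    limit_sq = max_dist * max_dist
    return [
        e for e in entities
        if (e.x - camera.x) ** 2 + (e.y - camera.y) ** 2 + (e.z - camera.z) ** 2 <= limit_sq
    ]

# The full scene might hold thousands of entities, but the expensive per-frame work
# only touches the handful that pass this cheap check.
```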
I think their strategy is actually the reverse. They're trying to strengthen Xbox by making this game a quasi-exclusive for it.
I don't think they'll gain many new Xbox customers with Starfield, as it's neither so exceptionally good, nor does it do things you can't find in other games, that anyone would buy a new console for it.
But since their other big IP this year, Redfall, was a complete dud, they're probably even somewhat worried about losing long-time customers.
AMD might have had a surge in component sales due to Starfield bundles. But I can't see it selling a lot of Xbox consoles when it's a game that kind of makes the console look bad with a 30fps cap even on the top-of-the-range variant.