
Starfield on PC is the best way to play - but the game still requires a lot of work

Poor Nvidia and Intel GPU support - and it shouldn't be down to modders to fix basic features.

Starfield shipped on PC simultaneously with Xbox consoles, and let's kick off by saying that the enhancements available, the scope for superior performance and the range of user-generated mods combine to make this the best way to play it, hardware permitting. That's not to say this is a great PC version, however - rather, all the critiques laid out on this page should be viewed through that lens. Even so, it's clear that there are many issues that need to be addressed to get the game into shape.

The good news is that Starfield ships without some of the bigger issues we've seen in a range of PC games this year. Shader compilation stutter? As far as I can tell, there is none - a short pre-compilation step when you first boot the game puts paid to that. There is still some stutter, but it's clearly not related to the shader compilation pipeline. After experiencing so many problems on this front in so many games this year, that's great news.

In terms of graphics options and the user experience, there are issues. Let's start with the settings page, which does offer a decent range of options to tweak in pursuit of an optimised experience. The thing is, the game does nothing to show you what changing those settings actually does, nor does it even attempt to inform you of the performance wins and quality losses involved. Fear not: we can help you out with optimised settings, but arriving at them required many hours of testing with a video capture device and reference to the Xbox Series X version of the game. Users won't have this insight and will be blindly making tweaks without any clear understanding of how things may or may not be improving on their particular hardware.

Alex Battaglia's video review of the PC version of Starfield. Watch on YouTube.

A bigger blemish comes with the lack of basic options, kicking off with field of view. The default FOV seems optimised for consoles and is clearly too claustrophobic for many PC users - and the idea that a first-person title ships without such an option is simply staggering, bearing in mind it's such a basic accessibility issue. Yes, users can do a little Googling to find and set up the required .ini tweaks, but the point is that they shouldn't have to. Similarly, there is no anisotropic filtering option - the game has weirdly poor texture filtering at times, even at the highest settings. Why can't you change it? Yes, you can force 16x AF through your GPU control panel, but this causes big problems with shadows in-game.
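To illustrate just how trivial the missing FOV option is to work around, here's a minimal sketch (in Python, purely for illustration) of applying a custom FOV via StarfieldCustom.ini. The file location and the fFPWorldFOV/fTPWorldFOV keys under [Camera] are community-documented rather than officially specified by Bethesda, so treat them as assumptions and verify against your own install.

```python
# Minimal sketch: apply a custom FOV via StarfieldCustom.ini.
# The path and the key names (fFPWorldFOV / fTPWorldFOV under [Camera]) come from
# community documentation, not from Bethesda, and may change in future patches.
from configparser import ConfigParser
from pathlib import Path

# Assumes the default Documents location; adjust if yours is redirected (e.g. OneDrive).
INI_PATH = Path.home() / "Documents" / "My Games" / "Starfield" / "StarfieldCustom.ini"
DESIRED_FOV = "100"  # first- and third-person FOV, in degrees

parser = ConfigParser()
parser.optionxform = str  # preserve the case of the ini key names
if INI_PATH.exists():
    parser.read(INI_PATH)

if not parser.has_section("Camera"):
    parser.add_section("Camera")
parser["Camera"]["fFPWorldFOV"] = DESIRED_FOV
parser["Camera"]["fTPWorldFOV"] = DESIRED_FOV

INI_PATH.parent.mkdir(parents=True, exist_ok=True)
with INI_PATH.open("w") as f:
    parser.write(f)
print(f"Wrote FOV {DESIRED_FOV} to {INI_PATH}")
```

A handful of lines of config is all it takes - which is exactly why leaving it to users (or scripts like this) rather than the in-game menu grates.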

Of course, Bethesda games are rightly lauded for their moddability and there's clearly a committed community of modders out there already doing the Lord's work in improving the PC version of the game. Texture filtering issues apart, many of my complaints can be fixed, but I don't think it's too much to ask for the 'out of the box' experience to cover basic user experience essentials. That extends to HDR support too: there's no native functionality here, even though there is on Xbox. Meanwhile, the SDR grading can look quite bizarre as well - grey-looking and lacking in contrast. Modders are coming up with alternative LUTs and actual HDR support but, again, this is developer legwork being left to the community and it doesn't sit right.

The same goes for the DLSS support that should have been in the game from day one. Well, to be fair, it was - but only because the modding community stepped in to make it happen. On a triple-A title with a mega-budget, all good vendor upscalers should be supported: FSR 2, DLSS and XeSS. Support one and the others are relatively trivial to implement, as the modders have demonstrated. For Starfield, this is essential as the game is GPU heavy, and remarkably, the modded DLSS is qualitatively leagues apart from an FSR 2 integration said to have been added to the game by AMD engineers.

PC testing of a different kind kicks off our DF Direct Special on Starfield, where Rich Leadbetter demonstrates console-like performance using the PC version on a unique build featuring the Xbox Series X CPU paired with a very PS5-like RDNA 2 GPU. Watch on YouTube.

FSR 2 ghosts on all particle effect types and shows multi-frame ghosting on any opaque object that moves rapidly, while characters or objects moving across the screen exhibit visible ghosting and disocclusion fizzle - none of which is seen with DLSS at the same quality level. In general, any sort of neon effect or thin geometry tends to flicker with FSR 2, and this is quite obvious in game areas like Neon, where most of the signage has full-on flickering with FSR 2 that just does not happen with DLSS.

We've said it before and we'll say it again: supporting all vendor image reconstruction techniques is a big boon and fosters goodwill - and I say all of this without going into depth on the DLSS 3 frame generation mods, or on how the game lacks proper native support for 32:9 monitors. As I see it, Starfield's graphics menu is subpar and too much is left to the community to make up for obvious shortcomings. It's great that the community does these things, and I highly recommend using those options, which are easily found on Nexus Mods, but thus far we're just talking about the basics - basics that Bethesda should have included at launch.

With that said, Starfield launched with accomplished Xbox console versions, but PC has clear advantages, like much faster loading depending on the nature of your hardware. In my tests, loading the same save on a Core i9 12900K with a PCIe Gen 4 NVMe SSD takes less than a third of the time it does on Xbox Series X. So if your kit is up to snuff, all of that long waiting seen on Xbox Series X is drastically reduced on PC, leading to a much snappier experience.

Setting | Optimised Setting | Xbox Series X Equivalent
Shadow Quality | Medium | Low/Ultra Features
Indirect Lighting | Medium-Ultra | Medium or High
Reflections | Medium | Medium
Particle Quality | Medium-High | Medium or High
Volumetric Lighting | Medium | Medium
Crowd Density | Low | Unknown
Motion Blur | Low | Low
GTAO Quality | Medium | Lower Than Low
Grass Quality | Medium | Medium
Contact Shadows | Medium | Medium But Worse
Variable Rate Shading | On | On But Worse

Graphically, Starfield is an extremely challenging game - and we'll talk about the hot topic that is optimisation in due course. However, I highly recommend watching the video above to see how each and every setting scales and where Xbox Series X fits. Console-equivalent settings are important to nail down because they show how the developer itself chooses optimised settings for mainstream hardware that shouldn't be expected to run best-of-the-best presets. I've made some alterations based on my own experience running Starfield on a mainstream PC using a Ryzen 5 3600 paired with an RTX 2070 Super, and you can see the results above - but discount the dynamic resolution option. It only works with v-sync active and only if you're under 30fps, and even then it does not deliver a correctly frame-paced 30fps, making it mostly useless in our opinion.

Let's talk about general GPU performance, because this game is heavy - and it's clearly disproportionately taxing on users of Nvidia and Intel hardware, a state of affairs that reflects poorly on the AMD sponsorship element of the title. Across the entire stack, AMD graphics hardware massively outperforms Nvidia equivalents in a way that hardly reflects the standard performance profiles of the respective cards. In my GPU test area, AMD's Radeon RX 6800 XT outperforms Nvidia's GeForce RTX 3080 by a mammoth 40 percent at ultra settings.

Let's be clear: the 6800 XT is a good card, but it's generally in the same ballpark as the 3080. Using optimised settings improves RTX 3080 frame health and the divide drops to 35 percent, but this is hardly normal behaviour and it's not down to the 16GB vs 10GB VRAM differential. In fact, Starfield's VRAM management is generally excellent to the point where even 8GB GPUs can run the game maxed at 4K ultra.

The RX 6800 XT and RTX 3080 typically produce similar rasterised performance. However, Starfield has a vast AMD advantage. Intel GPUs are similarly afflicted by sub-par performance.

Day one GPU drivers don't magically emerge from nowhere - they require Nvidia, AMD and Intel to have early access to the code, both to work with the developers to address specific issues and for the driver teams to produce their own bespoke optimisations for their hardware. The fact that Starfield didn't work at all on Intel GPUs at launch (with the software teams delivering two driver updates since then) suggests something has gone seriously amiss here and, once again, raises questions about sponsorships and bespoke integrations. Even after those two driver updates, Intel's Arc A770 performance lags behind an RTX 2070 Super and even a base AMD RX 5700, which doesn't really make much sense.

Starfield is also taxing on the CPU side of the equation. The frontier town on Akila is a pretty good stress test for your processor and, even on optimised settings (which do reduce the CPU burden), the Ryzen 5 3600 drops beneath 60 frames per second. The CPU tests are also interesting in that clear traversal stutter can be observed; switching to my Core i9 12900K, the stutter remains but persists for a shorter duration. A surface look at core utilisation suggests the game scales well across cores, which is good news.

However, a deeper look at performance on the 12900K shows that the optimal configuration is to use the processor's eight p-cores, with hyper-threading disabled and the e-cores also turned off. On the flip side, on my Ryzen 5 3600, the game saturates all cores and threads, and disabling SMT (AMD's equivalent of hyper-threading) produces visibly worse consistency.
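If you want to probe this on your own 12900K-class system without rebooting into the BIOS, one rough way is to restrict the game's CPU affinity to one thread per p-core. The sketch below (Python with psutil, purely my own illustration rather than a method Bethesda or anyone else prescribes) assumes the typical Alder Lake enumeration where logical CPUs 0-15 map to the eight p-cores and 16-23 to the e-cores - verify that layout on your own machine first.

```python
# Rough sketch: approximate the "eight p-cores, no HT, no e-cores" configuration
# by pinning Starfield to one logical CPU per p-core instead of toggling BIOS options.
# Assumes logical CPUs 0-15 are the eight p-cores (two threads each) and 16-23 are
# the e-cores, which is the usual 12900K layout; may need to run as administrator.
import psutil

P_CORE_PRIMARY_THREADS = list(range(0, 16, 2))  # one thread per p-core: 0, 2, 4 ... 14

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Starfield.exe":
        proc.cpu_affinity(P_CORE_PRIMARY_THREADS)
        print(f"Pinned PID {proc.pid} to logical CPUs {P_CORE_PRIMARY_THREADS}")
        break
else:
    print("Starfield.exe not found - launch the game first.")
```

This only approximates the BIOS-level test (the threads left idle on each p-core aren't truly disabled), but it's a quick way to see whether affinity changes move frame-times on your own hardware.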

This is just a snapshot of two processors, but I've also seen reporting showing that higher-end Intel chips can beat AMD's market-leading Ryzen 7 7800X3D without disabling HT or e-cores, so I'm not sure what conclusions we can draw here, other than that CPU utilisation could benefit from further work on specific architectures.

In summary, I'd say there's good news and bad news with Starfield on PC. The quality of the game is clear and, unlike, say, Star Wars Jedi: Survivor, we're not seeing disruptive problems that ruin the experience. However, there's clearly work to do. The options menu isn't descriptive enough and doesn't help the user in any way to tailor the game to their hardware. Basic features like field of view control, HDR, gamma and contrast controls need to be added, as well as official DLSS and XeSS support.

The disproportionately poor Nvidia and Intel performance also needs to be addressed, while there's the sense that the game isn't properly tuned for the major CPU architectures used in today's PCs. Optimised settings clearly yield large performance dividends, suggesting some degree of scalability, while the DLSS mod is a must for RTX users and can help both performance and image quality for Nvidia owners - but let's hope to see some genuine improvements from Bethesda in Starfield's first major update.
