Last Updated: Oct 28, 2018
Whether you're planning a new gaming PC build and wondering what graphics settings are all about and which settings (low, medium, high, or ultra) you should be choosing hardware for when studying performance benchmarks, or you've already built a PC and are looking to choose the best graphics settings to increase performance, this beginner's guide to game settings will explain what you need to know.
As well as the basics of how PC graphics settings and presets work, how much better higher settings actually look, and whether or not ultra settings are worth it - we'll also cover some of the more advanced, specific graphics settings that you can manually tweak, including what they actually do and which are most taxing on your graphics card or CPU.
The Witcher 3: Low vs Ultra Settings Side by Side
If you're new to PC gaming, you may hear all this talk about PC graphics/game settings and "ultra" settings and whatever else, but wonder what exactly PC graphics settings are, why they matter, and what you should know when planning and building a PC. When you game on consoles, everybody has the same hardware, so everyone sees the same quality of graphics on-screen. With PC games, by contrast, the spectrum of possible hardware a player may be using is vast.
One PC gamer may be playing a certain game on a high-end PC that'll run the game and all its special graphics effects without a hiccup, whilst another player may be sporting a Potato PC of sorts that struggles to keep up and run the game smoothly. That's where game settings come in: unlike console games, which have fixed graphics options for the most part, PC game developers open up a whole host of different graphics options that players can adjust and tweak in order to improve performance.
I'm all about the classics. If only WC4 was a thing...maybe one day if we get extremely lucky
The catch? When you lower settings, the graphics quality and effects won't be as good, but this is the sacrifice you have to make if you want better performance in a certain game that is struggling on your PC. If your computer can't run a certain game at its highest settings, you'll want to strike a nice balance of graphics quality and performance until you're happy with the frame rate and performance you get.
This balance varies from gamer to gamer, as some will want flawlessly smooth gameplay in all games, whereas others may not mind the occasional slowdown here or there in the name of the highest-quality graphics possible. It all comes down to you, what frame rate you think is best, and the type of game you're playing (frame-rate slowdowns are less fun in fast-paced games where every millisecond counts; think Counter-Strike or Overwatch).
So, if you really are new to the wonderful world of PC gaming, you may be wondering how you actually lower or increase graphics settings. Simply put, to meddle with a game's settings, look for a section in the main menu called something like Graphics Options or Video Options.
Many games will have various presets of settings such as Low, Medium, High, or Ultra, and as well as selecting a preset (which selects a default range of settings for you; more on this below), you should also have the option to go in and change individual graphics settings if you wish to customize further (you may have to select the "Custom" preset to be able to do this, as shown below).
Selecting a preset will automatically set all the graphics settings to a certain level
What are presets? As mentioned above, most PC games will have a list of available settings presets, which are basically default overall settings for various levels of graphics quality that are pre-set by the developers. They make it easier to increase or lower the graphics settings of a game without having to go in and tweak individual settings, which can be confusing and won't make sense to some gamers (we'll get to them later in this guide if you're interested, though). In most games there are usually 4 (sometimes 3) default presets, which are usually named the following: Low, Medium, High, and Ultra.
Differences in preset settings for Witcher 3
Are ultra settings a must-have for PC gaming? How much better are they than high, or even medium, settings/presets? Is it a waste of a game if you're NOT playing on ultra settings? Of course the answer to that last one is no - you don't NEED to be running maxed-out ultra settings to fully enjoy a game. But whether or not ultra settings are worth it, and how much of a difference they'll make to the graphics and the overall experience, will vary from game to game and from gamer to gamer. In some games maximum settings will be noticeably better than high or medium graphics settings, whereas in other titles you may not even notice the difference.
Keep in mind, though, that generally speaking, the higher the settings, the less of a difference you'll see. That is, a jump from low to medium, or medium to high, is likely going to make more of a change than going from, say, high to ultra/maxed. So I'd say that high vs ultra isn't too big a difference in most games, and most gamers won't really notice it.
Another factor in whether ultra settings are worth aiming for, besides whether or not your PC can produce a good enough frame rate on ultra settings, is how much of a graphics nut you are and how much attention you...well...pay to detail. The type of game you're playing also matters. In a fast-paced shooter or action game, for instance, you're probably not taking much time to stop and smell the roses and just stare at the pretty graphics.
I'd say having higher settings is more important, and more noticeable, when playing immersive, atmospheric games like The Witcher 3 where there's plenty of time to soak in views like this
Stare at something for a millisecond too long in CSGO and you're dead. But in slower paced games that put more of a focus on amazing visuals, effects, and attention to detail - think Witcher 3 or the Crysis series as cliche examples - running the highest graphics settings you can is more important. Running Witcher 3 on PC on low settings is a bit of a waste of an epic experience IMO, and you know the developers are hurting inside when you do that.
If you want to spend more time changing graphics settings around to find the optimal balance of performance and graphics quality for your particular system and game, you can go beyond merely changing the default presets to manually tweaking the individual, specific graphics settings. Sometimes a certain preset may not be optimally set for maximum performance and you'll want to use a custom preset instead.
For example, if you have a game set to a High preset, which probably sets all of the game's graphics settings to high (though not necessarily, as some developers tweak presets more precisely), there may be certain settings you'll want to raise to ultra instead (settings that don't affect FPS too much) and other specific settings you may wish to lower: the ones that are most taxing on your frame rate, or perhaps the most CPU-intensive settings in the case that your setup's weaker link is the processor.
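To make this concrete, many PC games also expose their graphics options in a plain-text config file (often an .ini) alongside the in-game menu. The snippet below is a purely hypothetical example (the setting names and values vary from game to game and engine to engine) of what such a custom mix might look like:

```ini
; hypothetical settings.ini — setting names are illustrative, not from any real game
[Graphics]
Preset=Custom
TextureQuality=Ultra     ; mostly VRAM-bound; little FPS cost if you have the memory
ShadowQuality=Medium     ; often the biggest FPS hog, and frequently CPU-heavy too
AntiAliasing=FXAA        ; cheaper than MSAA/SSAA at a small quality cost
AmbientOcclusion=High
FoliageDensity=High      ; CPU-heavy in some open-world titles
```

The idea is the same whether you edit a file like this or use the in-game menu: keep the cheap, high-impact settings maxed and pull back the expensive ones.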
When it comes to manually changing PC graphics settings for optimal performance, every game will be slightly different, and it's going to take a bit of trial and error to find the best settings (or do some research online). We won't cover all the settings here as that's beyond the scope of this beginner's introduction to game settings, but let's quickly go over some of the more common ones, such as Anti-Aliasing, Anisotropic Filtering, and Field of View.
Good questions. On your mission to find the optimal balance of quality and performance, not all settings are created equal, and some are going to be more fruitful to play around with than others. In other words, some graphics settings affect your frame rate more than others without having a huge effect on overall graphics quality (though it depends on the game), and these are the settings you should look to lower first.
First of all though, if you haven't already considered this, using a lower gaming resolution is the biggest way to improve performance and significantly up that fragile frame rate of yours. However, this is more of a last resort, as lowering the resolution will have a noticeable effect on graphics quality, so only do so if your performance really is snail-like to begin with. So, if you're set on the resolution you're currently rocking, below are the biggest frame-rate eaters to consider lowering and tweaking first. Note that these are general guidelines and how taxing settings are will vary from game to game, but the following game settings are a good place to start in most AAA games:
PC Graphics Settings to Lower First
Shadows (often the single biggest frame-rate eater, and taxing on both GPU and CPU)
Anti-aliasing (especially heavier modes like MSAA or SSAA; lighter modes like FXAA cost less)
Ambient occlusion
Reflections
Post-processing effects (motion blur, depth of field, bloom)
If you're wondering more specifically which game settings affect the CPU the most, because you know your performance issue is a weak CPU that may be bottlenecking your graphics card, I would still try lowering shadows as mentioned above, as that setting is usually both GPU and CPU intensive. It does depend on the specific game's implementation of shadows, though. For example, in Fallout 4 the shadows are much more CPU-dependent than the shadows rendered in other titles, so if you were tweaking that game for a CPU bottleneck then definitely start with shadows.
Some games like Witcher 3, GTA V, and Skyrim have settings for Number of NPCs, Draw Distance, and Foliage Density (not in GTA V AFAIK) which all take pressure off your CPU when lowered, so look into those settings for a CPU bottleneck as well. AI (Artificial Intelligence) is also CPU-dependent. Furthermore, tweak and/or turn off any special effects such as PhysX, cloth effects, and anything physics-related, as these types of dynamic features can really eat up CPU resources. Last but not least, particle effects can be quite CPU intensive, so tweak that if your game has it.