Step 1. Turn on ray tracing
Step 2. Check some forum or protondb and discover that the ray tracing/DX12 is garbage and gets like 10 frames
Step 3. Switch back to DX11, disable ray tracing
Step 4. Play the game
Best use of ray tracing I’ve seen is to make old games look good, like Quake II or Portal or Minecraft. Newer games are “I see the reflection in the puddle just under the car when I put them side by side” and I just can’t bring myself to care.
True, I’ve found very few games where it’s worth the fps hit
If I know a game I’m about to play runs on Unreal Engine, I’m passing a -dx11 flag immediately. It removes a lot of useless Unreal features like Nanite.
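(For anyone who hasn’t done this: in Steam it’s right-click the game → Properties → Launch Options, then add `-dx11`; some builds only respond to the equivalent `-d3d11`. That asks the engine to use its DX11 renderer instead of DX12, though whether a given game honours it at all varies by title and UE version.)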
Then you get to enjoy the worst LODs known to man, because they were only made as a fallback
Nanite doesn’t affect any of the post-processing stuff or the smeary look. I don’t like that games rely on it, but modern UE5 games author their assets for Nanite. All it affects is model quality and LODs.
Lumen and other real-time GI stuff is what forces them to use temporal anti-aliasing and other blurring effects; that’s where the slop is.
what’s wrong with nanite?
Nanite + Lumen run like garbage on anything other than super high end hardware.
It is also very difficult to tweak and optimize.
Nanite isn’t as unperformant as Lumen, but it’s basically just a time saver for game devs, and it’s very easy for a less skilled game dev to think they are using it correctly… and actually not be.
But, Nanite + Lumen have also become basically the default for AAA games down to shitty asset flips… because they’re easier to use from a dev standpoint.
I don’t even check anymore lol.
The slideshow Control experience does look stellar for a bit
Shadows: Off
Polygons: Low
Idle Animation: Off
Draw distance: Low
Does your PC even have a dedicated GPU? At this point you might as well give up on PC gaming and buy a console.
Alt: F4
Launch: Balatro
I think my PC can run the C64 demake of Balatro in an emulator
Out of all of these, motion blur is the worst, but second to that is Temporal Anti Aliasing. No, I don’t need my game to look blurry with every trailing edge leaving a smear.
TAA is kind of the foundation that almost all real-time ray tracing and frame generation are built on.
This is why it is increasingly difficult to find a newer, high fidelity game that even allows you to actually turn it off.
If you could, all the subsequent magic bullshit stops working, and all the hardware in your GPU designed to do that stuff is now basically useless.
What? All ray tracing games already offer DLSS or FSR, which override TAA and handle motion much better. Yes, they are based on similar principles, but they aren’t the mess TAA is.
Almost all implementations of DLSS and FSR literally are evolutions of TAA.
TAA 2.0, 3.0, 4.0, whatever.
If you are running DLSS or FSR, see if your game will let you turn TAA off.
They often won’t, because they often require TAA to be enabled before DLSS or FSR can then hook into them and extrapolate from there.
Think of TAA as a base game and DLSS/FSR as a DLC. You very often cannot just play the DLC without the base game, and if you actually dig into game engines, you’ll often find you can’t run FSR/DLSS without running TAA.
There are a few exceptions to this, but they are rare.
TAA just means temporal anti aliasing. Temporal as in relying on data from the previous frames.
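If anyone wants to see what “relying on data from the previous frames” actually means, here’s a deliberately tiny toy sketch of the core idea (reproject last frame’s result with motion vectors, then blend a bit of the new frame in). This is illustrative Python, not any engine’s real implementation:

```python
import numpy as np

def taa_resolve(current, history, motion_px, blend=0.1):
    """Toy temporal accumulation for a greyscale frame.

    current:   this frame's raw (aliased/noisy) render, shape (H, W)
    history:   last frame's accumulated output, shape (H, W)
    motion_px: per-pixel motion in pixels, shape (H, W, 2) as (dy, dx)
    blend:     how much of the new frame to take each step (~0.05-0.2 is typical)
    """
    h, w = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: look up where each pixel was last frame (nearest-neighbour for simplicity).
    prev_y = np.clip(ys - motion_px[..., 0], 0, h - 1).astype(int)
    prev_x = np.clip(xs - motion_px[..., 1], 0, w - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    # Exponentially blend the new sample into the reprojected history.
    return blend * current + (1.0 - blend) * reprojected

# Static scene, noisy renders: the accumulated result converges toward the clean image.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
motion = np.zeros((64, 64, 2))                       # nothing moves in this toy example
out = clean + rng.normal(0, 0.3, clean.shape)
for _ in range(60):
    out = taa_resolve(clean + rng.normal(0, 0.3, clean.shape), out, motion)
print(abs(out - clean).mean())                       # far less error than any single noisy frame
```

The motion-vector reprojection is the same plumbing the DLSS/FSR-style upscalers lean on, which is why the two end up so entangled in practice.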
The implementations of DLSS and FSR are wholly separate from the old TAA. Yes, they work on the same principles, but do their own thing.
TAA as a setting gets disabled because the newer methods fully override it. Some games hide the old setting, others gray it out; it depends.
This is very often false.
DLSS/FSR need per-pixel motion vectors, or at least frame-to-frame comparisons, to work.
TAA very often is the thing they get those motion vectors from… i.e., they are dependent on it, not separate from it.
Indeed, in many games, other significant portions/features of the graphical engine bug out massively when TAA is manually disabled, which means those features are also dependent on TAA.
Sorry to link to the bad site, but:
https://www.reddit.com/r/FuckTAA/comments/motdjd/list_of_known_workarounds_for_games_with_forced/
And here’s all the games that force TAA which no one has yet figured out how to disable:
https://www.reddit.com/r/FuckTAA/comments/rgxy44/list_of_games_with_forced_taa/
Please go through all of these and notice how many modern games:
- Do not allow the user to turn off TAA easily, forcing them to basically mod the game by manually editing config files or using more extensive workarounds.
- Don’t even tell the user that TAA is being used, requiring them to dig through the game to discover that it is.
- Break DLSS/FSR, or throw up other massive graphical issues, when TAA is manually disabled.
TAA is the foundational layer that many modern games are built on… because DLSS/FSR/XeSS and/or other significant parts of the game’s graphical engine hook into the pixel motion per frame comparisons that are done by TAA.
The newer methods very often do not overwrite TAA, they are instead dependent on it.
It’s like trying to run or compile code that depends on a library you don’t actually have present… it will either fail entirely, or kind of work, but in a broken way.
Sure, there are some instances where DLSS/FSR is implemented in a game as its own, wholly self-contained pipeline… but very often this is not the case: TAA is a dependency for DLSS/FSR or for other graphical features of the game engine.
TAA is massively different from older MSAA or FXAA or SMAA kinds of AA… because those don’t compare frames to previous frames, they just apply an effect to a single frame.
TAA provides ways of comparing differences across sequences of frames, and many, many games feed those comparisons into other graphical features that are built on top of them and require them.
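To make that concrete, here’s roughly what each technique needs as input (toy signatures with made-up names, not any real engine API):

```python
# Signatures only; the point is which inputs each technique consumes.

def fxaa(frame):
    """Post-process AA: works on a single finished frame, no history needed."""
    ...

def msaa_resolve(subpixel_samples):
    """Hardware AA: averages extra geometry samples, still entirely within one frame."""
    ...

def taa(frame, history, motion_vectors):
    """Temporal AA: needs the previous accumulated output plus per-pixel motion to reuse it."""
    ...

def temporal_upscaler(low_res_frame, history, motion_vectors, depth, jitter_offset):
    """DLSS/FSR/XeSS-style upscaling: consumes the same temporal inputs as TAA, plus more."""
    ...
```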
To use your own words: TAA is indeed a mess, and you apparently have no idea how foundational this mess is to basically the whole new progression of heavily marketed, ‘revolutionary’ graphical rendering techniques of the past five-ish years.
But why did you buy an 1800€ video card then?
farming bitcoin. Duh.
So you can use a more demanding form of anti-aliasing that doesn’t suck ass?
Like rip maps instead of mip maps? Or is it some AI bs nowadays?
Unreal doesn’t even have other forms of AA iirc. It’s up to the devs to implement
Honestly motion blur done well works really well. Cyberpunk for example does it really well on the low setting.
Most games just don’t do it well tho 💀
motion blur is essential for a proper feeling of speed.
most games don’t need a proper feeling of speed.
… What?
I mean… the alternative is to get hardware (including a monitor) capable of just running the game at an fps/hz above roughly 120 (ymmv), such that your actual eyes and brain do real motion blur.
Motion blur is a crutch to be able to simulate that from back when hardware was much less powerful and max resolutions and frame rates were much lower.
At higher resolutions, most motion blur algorithms are quite inefficient and eat your overall fps… so it would make more sense to just remove it, have higher fps, and experience actual motion blur from your eyes + brain at that higher frame rate.
My basis for the statement is BeamNG. At 100 Hz, the feeling of speed is markedly different depending on whether motion blur is on. 120 may make a difference.
I think you still see doubled images instead of a smooth blur in your peripheral vision when you’re focused on the car, for example in a racing game.
Maybe I’m misunderstanding you but isn’t that screen tearing?
I mean, just from persistence of vision you’ll see multiple copies of a moving object if your eyes aren’t moving. I have realized, though, that in the main racing game I use motion blur in (BeamNG) I’m not actually reaching above 80 fps very often.
here, I copied someone’s shader to make a quick comparison:
with blur: https://www.shadertoy.com/view/wcjSzV
without blur: https://www.shadertoy.com/view/wf2XRV
Even at 144hz, one looks smooth while the other has sharp edges along the path.
Keep in mind that this technically only works if your eye doesn’t follow any of the circles, as that would require a different motion blur computation. That’s obviously not something that can be accounted for on a flatscreen, maybe in VR at some point though if we ever get to that level of sophistication. VR motion blur without taking eye movement into account is obviously terrible and makes everyone sick.
Someone else made a comparison for that, where you’re supposed to follow the red dot with your eye. (keep in mind that this demo uses motion blur lengths longer than a frame, which you would not have if aiming for a human eye-like realistic look)
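If you’d rather poke at the idea locally than on Shadertoy, the brute-force version of what those demos do is just averaging several sub-frame samples across a shutter interval. A toy sketch with made-up numbers:

```python
import numpy as np

def render_circle(t, size=128, radius=6):
    """One sharp frame: a circle sweeping left to right at time t (seconds)."""
    y, x = np.mgrid[0:size, 0:size]
    cx = (t * 200.0) % size                 # horizontal position, 200 px/s
    cy = size / 2
    return ((x - cx) ** 2 + (y - cy) ** 2 < radius ** 2).astype(np.float32)

def motion_blurred_frame(t, shutter=1 / 144, samples=16, **kw):
    """Brute-force motion blur: average sub-frame renders over the shutter interval.

    Real-time games approximate the same integral with per-pixel velocity buffers,
    but the visual idea is identical.
    """
    offsets = np.linspace(0.0, shutter, samples)
    return np.mean([render_circle(t + dt, **kw) for dt in offsets], axis=0)

sharp = render_circle(0.5)
blurred = motion_blurred_frame(0.5)
# The sharp frame has only fully-on/fully-off pixels; the blurred one grows a smeared rim.
print(int(((sharp > 0) & (sharp < 1)).sum()), int(((blurred > 0) & (blurred < 1)).sum()))
```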
Okay now I got it from your explanation, thanks!
By the way, the first two Shadertoys aren’t working for me, I just get “:-( We either didn’t find the page you were looking for, or there was an internal error.”. The third one works, though…
Motion blur is guaranteed to give me motion sickness every time. Sometimes I forget to turn it off on a new game… About 30 minutes in I’ll break into cold sweats and feel like I’m going to puke. I fucking hate that it’s on by default in so many games.
It really should be a prompt at first start. Like, ask a few questions like:
- do you experience motion sickness?
- do you have epilepsy?
The answers to those would automatically disable certain settings and features, or drop you into the settings.
It would be extra nice for a platform like PlayStation or Steam to remember those preferences and the game could read them (and display a message so you know it’s doing it).
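A minimal sketch of what that first-run flow could look like (the questions and setting names here are invented for illustration, not from any platform’s actual API):

```python
def first_run_accessibility_prompt(ask=input):
    """Ask a couple of questions on first launch and pre-disable the risky effects."""
    overrides = {}
    if ask("Do you experience motion sickness? (y/n) ").strip().lower().startswith("y"):
        overrides.update(motion_blur=False, camera_shake=False, depth_of_field=False)
    if ask("Are you sensitive to flashing lights? (y/n) ").strip().lower().startswith("y"):
        overrides.update(flashing_effects=False, lens_flare=False)
    return overrides  # the game (or the platform, ideally) would persist and apply these

if __name__ == "__main__":
    print(first_run_accessibility_prompt())
```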
Depends, my girlfriend has issues playing Minecraft unless motion blur is on. Though I have to say whoever made the shader we’re using did a pretty good job implementing a not terrible looking blur imo.
Motion blur + low FOV is an instant headache.
There is always motion blur if your monitor is shitty enough.
Or your brain slow enough
Or the drugs good enough
Just turn on TAA and get free motion blur in any game!
yeah, the only time I liked it was in Need for Speed when they added nitro boost. the rest of the options have their uses imo, I don’t hate them.
Has the person who invented the depth of field effect for a video game ever even PLAYED a game before?
it works great for games that have little to no combat, or combat that’s mostly melee and up to like 3v1. or if it’s a very slight DOF that just gently blurs things far away
idk what deranged individual plays FPS games with heavy DOF though
Yeah, especially games with any amount of sniping. Instantly crippling yourself.
the problem with dilf is that you need to put the subject of your life in the middle
What is the depth of field option? When it’s on what happens vs when it’s off?
Side question, why the fuck does everything in IT reuse fucking names? Depth of field means how far from character it’ll render the environment, right? So if the above option only has an on or off option then it is affecting something other than the actual depth of field, right? So why the fuck would the name of it be depth of fucking field??? I see this shit all the time as I learn more and more about software related shit.
No.
Depth of field is when background/foreground objects get blurred depending on where you’re looking, to simulate eyes focusing on something.
You’re thinking of draw distance, which is where objects far away aren’t rendered. Or possibly level of detail (LoD) where distant objects will be changed to a lower detailed model as they get further away.
Gotcha. Thanks🍻
When it’s on, whatever the playable character looks at will be in focus and everything else that is at different distances will be blurry, as it would be the case in real life if your eyes were the playable character’s eyes. The problem is that the player’s eyes are NOT the playable character’s eyes. Players have the ability to look around elsewhere on the screen and the vast majority of them use it all the time in order to play the game. But with that stupid feature on everything is blurry and the only way to get them in focus is to move the playable character’s view around along with it to get the game to focus on it. It just constantly feels like something is wrong with your eyes and you can’t see shit.
It’s like motion blur. Your eyes already do that, you don’t need it to be simulated…
For depth of field, our eyes don’t automatically do that for a rendered image. It’s a 2d image when we look at it and all pixels are the same distance and all are in focus at the same time. It’s the effect you get when you look at something in the distance and put your finger near your eye; it’s blurry (unless you focus on it, in which case the distant objects become blurry).
Even VR doesn’t get it automatically.
It can feel unnatural because we normally control it unconsciously (or consciously if we want to and know how to control those eye muscles at will).
to be fair you need it for 24fps movies. however, on 144Hz monitors it’s entirely pointless indeed
My Dad showed me the Avatar game on PS4. The default settings have EXTREME motion blur, just by turning the camera; the world becomes a mess of indecipherable colors, it’s sickening.
Turning it off changed the game completely.
No, your eyes can’t do it on a screen. The effect is physically caused by the different distances of two objects, but the screen is always the same distance from you.
You don’t know what focusing on things is?
Yes, but you still get the blurry effect outside of the spot on the screen you’re focused on.
Not in the same way. Our eyes have lower resolution away from the center, but that’s not what’s causing DoF effects. You’re still missing the actual DoF.
If the effect was only caused by your eye, the depth wouldn’t matter, but it clearly does.
Yeah I get it, I’m just saying it’s unnecessary. If I need to see what’s going on in the background, then my eyes should be able to focus on it.
There are very few scenarios where DoF would be appropriate (like playing a character who lost their glasses).
Like chromatic aberration, which feels appropriate for Cyberpunk, since the main character gets eye implants and fits the cyberpunk theme.
https://en.wikipedia.org/wiki/Depth_of_field
It’s not “IT” naming. It’s physics. Probably a century or few old. That’s what they’re trying to emulate to make things like more photographic/cinematic.
Same with almost all the other options listed.
In this context it just refers to a post processing effect that blurs certain objects based on their distance to the camera. Honestly it is one of the less bad ones imo, as it can be well done and is sometimes necessary to pull off a certain look.
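For anyone wondering where the blur amount comes from: it’s usually some version of the thin-lens circle-of-confusion formula, driven per pixel by the depth buffer. Rough sketch with made-up camera numbers:

```python
def circle_of_confusion(subject_dist_m, focus_dist_m, focal_len_mm=50.0, f_number=1.8):
    """Thin-lens circle-of-confusion diameter (mm on the sensor) for an object at
    subject_dist_m when focused at focus_dist_m. Game DoF passes typically compute
    something equivalent per pixel from the depth buffer and use it as a blur radius.
    """
    f = focal_len_mm / 1000.0                      # focal length in metres
    aperture = f / f_number                        # aperture diameter in metres
    s1, s2 = focus_dist_m, subject_dist_m
    coc_m = aperture * abs(s2 - s1) / s2 * f / (s1 - f)
    return coc_m * 1000.0

# Focused on something 2 m away with a fast lens: a wall 20 m away lands far
# outside the sharp zone and gets a big blur circle (~0.6 mm on the sensor).
print(circle_of_confusion(subject_dist_m=20.0, focus_dist_m=2.0))
```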
Depth of field is basically how your character’s eyes are unfocused on everything they aren’t directly looking at.
If there are two boxes, 20 meters apart, one of them will be blurry, while aiming at the other.
Your example is great at illustrating how DoF is often wildly exaggerated in implementation, giving the player the experience of having very severe astigmatism, far beyond the real-world DoF experienced by the average… eyeball haver.
Put your finger in front of your face. Focus on it. Background blurry? That’s depth of field. Now look at the background and notice your finger get blurry.
Well, not exactly, but they were described to him once by an elderly man with severe cataracts and that was deemed more than sufficient by corporate.
I mean, it works in… hmmm… RPGs, maybe?
When I was a kid there was an effect in FF8 where the background blurred out in Balamb Garden and it made the place feel bigger. A 2D painted background blur, haha.
Then someone was like, let’s do that in the twenty-first century and ruined everything. When you’ve got draw distance, why blur?
Yes, it makes sense in a game where the designer already knows where the important action is and controls the camera to focus on it. It however does not work in a game where the action could be anywhere and camera doesn’t necessarily focus on it.
Yup, or if they’re covering up hardware deficiency, like Nintendo sometimes does. And even then, they generally prefer to just make everything a little fuzzy, like BotW.
It works for the WiiU games where Nintendo used it for tilt shifts. That’s pretty much it
I always turn that shit off. Especially bad when it’s a first-person game, as if your eyes were a camera.
If only I could just turn off the chromatic aberration in my eyeglasses.
What Anti Aliasing do your glasses use?
I’m on the -4.25 setting but I may be due for a new prescription as newer reality is getting blurry again.
You can get ones with less chromatic aberration, but it’ll cost you.
But what about Bloom?
I feel like bloom depends on how intense it is, and whether you can still reasonably play the game with it on.
Like, if it’s the sun, yeah, bloom is OK.
If it’s anything else? Pass.
i like lens flare its pretty
I like lens flare for a bit if I’m just enjoying the scenery or whatever. If I’m actually playing the game though, turn that shit off so I can actually see
You are supposed to not see
*Taps temple* Auto-disable ray tracing if your GPU is too old to support it ( ͡° ͜ʖ ͡°)
Disable it with new GPUs as well.
Why is that? If it has no significant impact on FPS, and you enjoy the high fidelity light simulation, then why turn it off?
The impact is significant enough
Always? I’m currently playing Control on a 4070 Super, and I honestly can’t tell the difference between ray tracing on and off on FPS.
I’m on a 4090 and ray tracing increases the temperature in my room by a degree or three
Well you do you then. I’ll leave it off.
The biggest thing in control was the reflections iirc
i need some motion blur on otherwise i get motion sickness.
Wait, I’ve been turning it off to prevent motion sickness. 🤔
My friend is the same way as you haha.
The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?
It is unnatural. In real life, focus follows wherever you’re looking; having it fixed to the mouse/center of the screen instead of what my eyes are doing feels so wrong to me.
I bet with good eye tracking it would feel different.
That makes sense, if you can’t dynamically control what is in focus then it’s taking a lot of control away from the player.
I can also see why a dev would want to use it for a fixed angle cutscene to create subject separation and pull attention in the scene though.
Different mediums. Different perception. Games are a different kind of immersion.
Don’t forget TAA!
Worst fucking AA ever created and it blows my mind when it’s the default in a game.
Bad effects are bad.
I used to hate film grain, and then I did the research to implement it myself, digging up old research papers on how it works at a scientific level. I ended up implementing a custom film grain in Starfield Luma and RenoDX. I actually like it now; it has a level of “je ne sais quoi” that clicks in my brain and feels like film.
The gist is that everyone just does additive random noise, which raises the black floor and dirties the image. Film grain is perceptual; it acts like cracks in the “dots” that compose an image. It’s not something to be “scanned” or overlaid (which gives a dirty-screen effect).
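Purely as a toy contrast of those two ideas (this is not the Starfield Luma/RenoDX code, just “additive overlay” versus “signal-dependent” noise in a few lines):

```python
import numpy as np

rng = np.random.default_rng(0)

def additive_grain(img, strength=0.04):
    """The common approach: add zero-mean noise on top.
    Lifts pure black toward grey and reads like a dirty overlay."""
    return np.clip(img + rng.normal(0.0, strength, img.shape), 0.0, 1.0)

def signal_dependent_grain(img, strength=0.25):
    """Very rough sketch of luminance-dependent grain: modulate the signal itself,
    so black stays black and the grain scales with how much image is there.
    (Illustrative only; real perceptual film-grain models are far more involved.)"""
    noise = rng.normal(0.0, 1.0, img.shape)
    return np.clip(img * (1.0 + strength * noise * np.sqrt(np.maximum(img, 1e-6))), 0.0, 1.0)

ramp = np.linspace(0.0, 1.0, 256)[None, :].repeat(64, axis=0)   # black-to-white gradient
print(additive_grain(ramp)[:, 0].mean(),          # black edge gets lifted above zero
      signal_dependent_grain(ramp)[:, 0].mean())  # black edge stays at zero
```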
Related, motion blur is how we see things in real life. Our eyes have a certain level of blur/shutter speed and games can have a soap opera effect. I’ve only seen per-object motion blur look decent, but fullscreen is just weird, IMO.
On motion blur: our eyes’ motion blur and a camera’s shutter-speed motion blur are not the same. Eyes don’t have a shutter speed. Whatever smearing we see is the result of relaxed processing on the brain side. Under adrenaline, with heavy focus, our motion blur disappears as our brain goes full power trying to keep us alive. If you are sleep deprived and physically tired, then everything is blurred, even with little motion of the head or eyes.
Over 99% of eye movement (e.g. saccadic eye movement) is ignored by the brain and won’t produce a blurred impression. It’s more common to notice vehicular fast movement, like when sitting in a car, as having some blur. But it can be easily overcome by focused attention and compensatory eye tracking or ocular stabilization. In the end, most of these graphical effects emulate camera behavior rather than natural experience, and thus are perceived as more artificial than the same games without the effects. When our brain sees motion blur it thinks movie theater, not natural everyday vision.
Yeah, if you see motion blur in real life, that usually means something bad, yet game devs are not using it for those purposes.
Eyes do have a “shutter speed”, but the effect is usually filtered out by the brain and you need very specific circumstances to notice motion blur induced by this.
No, they don’t. As there is no shutter in a continuous parallel neural stream. But, if you have any research paper that says so, go ahead and share.
It has nothing to do with a neural stream, it’s basic physics.
Explain, don’t just antagonize. I bet you don’t understand the basic physics either. I’m open to learning new things. What is the eye’s shutter speed? Sustain your claim with sources.
I put “shutter speed” in quotes for a reason. To gather the required amount of light, the sensor must be exposed to it for a specific amount of time. When it’s dark, the time increases. It doesn’t matter if it’s a camera or your eye.
That’s sensitivity, not shutter speed. Eyes do not require time for exposure, just a quantum or intensity of light. This sensitivity is variable, but not in a time-dilated way. Notice that you don’t see blurrier in darker conditions, unlike a camera. You do see in duller colors, as a result of higher engagement of rods instead of cones; the former are more sensitive but less dense in the fovea, and not sensitive to color, while a camera stays just as colorful but becomes more prone to motion blur. This is because the brain does not take individual frames of time to process a single still image. The brain analyses the signals from the eye continuously, dynamically, and in parallel from each individual sensor, cone or rod.
In other words, eyes still don’t have even a figurative shutter speed, because eyes don’t work exactly like a camera.
These settings can be good, but are often overdone. See bloom in the late 2000s/early 2010s.
Also the ubiquitous “realistic” brown filter a la Far Cry 2 and GTA IV. Which was often combined with excessive bloom to absolutely destroy the player’s eyes.
At least in Far Cry 2 you are suffering from malaria.
All those features sucked when they first came out, not just bloom.
I always hated bloom, probably because it was overused. As a light touch it can work, but that is rarely how devs used it.
It’s usually better in modern games. In the 2005-2015 era it was often extremely overdone, actually often reducing the perceived dynamic range instead of increasing it IMO.
Yeah, chromatic aberration when done properly is great for emulating certain cameras and art styles. Bloom is designed to make things look even brighter and it’s great if you don’t go nuts with it. Lens flares are mid but can also be used for some camera stuff. Motion blur is generally not great but that’s mainly because almost every implementation of it for games is bad.