Read more about Cyberpunk 2077➜ https://cyberpunk2077.mgn.tv
Cyberpunk 2077: a direct comparison of DLSS-upscaled image quality against the source resolution without DLSS. The left side displays the resolution from which the right side is upscaled. This experiment clearly shows that you should always use DLSS if you want the best image quality possible.
#cyberpunk2077 #nvidiadlss
I think the FPS counter is inverted
4 sets of eyes
and I'm not looking at any of them
What do you mean 'DLSS upscaled'? DLSS never upscales above native in the final image??? For upscaling beyond native there's DLDSR.
Wouldn't it make more sense to compare against whatever non-DLSS resolution gets the same FPS as the DLSS side?
Something is wrong. Why is the DLSS FPS that low?
boi ain’t no way. thems got jiggle physics now? cyberpunk updates you are different
look at them jiggle tho
This experiment's conclusion is flawed because REDengine, like many other modern engines, has a forced TAA implementation.
DLSS does NOT look better than true native resolution. It looks better than 'native' with TAA applied, which inherently blurs the image.
This is implemented at the engine level and cannot be disabled in-game without mods. Even with mods, the flickering from the engine's eight-point sample jitter, which its design assumes TAA will resolve, creates aliasing and other visual issues when TAA is completely removed.
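For intuition, here is a minimal, hypothetical sketch of the temporal accumulation at the heart of TAA (not REDengine's actual code): each output pixel blends the current frame with an accumulated history buffer, and that blend is exactly what suppresses flicker while smearing fine detail.

```python
import numpy as np

def taa_accumulate(current: np.ndarray, history: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    """One TAA step: exponentially blend the new frame into the history.

    alpha is the weight of the *current* frame. A small alpha means the
    output is dominated by past frames, which suppresses aliasing and
    shimmer but also softens fine detail: the 'TAA blur' discussed above.
    """
    return alpha * current + (1.0 - alpha) * history

# Toy example: a single flickering pixel (aliasing) settles to a soft average.
history = np.array([0.0])
for frame in ([1.0], [0.0], [1.0], [0.0]):  # alternating samples = shimmer
    history = taa_accumulate(np.array(frame), history)
print(history)  # somewhere between 0 and 1: the flicker is gone, but so is contrast
```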
Modern games are primarily designed with TAA at their core: it reduces development costs by replacing traditional optimization work with a TAA-centered rendering approach. DLSS or FSR is then used to sharpen the image and make performance viable on current hardware.
However, this doesn't entirely negate the experiment's results. The poster is correct in concluding that DLSS provides the best image quality and performance for games built with this design philosophy. But that applies only to games relying on a TAA-centered approach: most modern titles, though not all.
For older games developed before the rise of DLSS and FSR, which later added them through an update, native resolution without TAA will always look better than any DLSS preset. This isn't due to a bad DLSS implementation, but because in modern titles you are unknowingly comparing DLSS against 'native' with heavy TAA applied (even when anti-aliasing is set to OFF in the settings), not against true native resolution.
This is an important and under-discussed issue in gaming today, as most players aren't aware of it. The decline in graphics progress in recent years isn't due to diminishing returns but to increased reliance on TAA and upscalers to save time and costs. For example, current-gen consoles with roughly 12 teraflops barely outperform previous-gen systems with under 2 teraflops, due to poor optimization and a dependence on upscalers like DLSS/FSR to reach playable performance.
TL;DR: DLSS often looks better than 'native' because modern engines apply heavy TAA that blurs native resolution. This is a development and optimization shortcut. While DLSS is great for modern games, older titles or engines without a TAA-centered design will look better at true native resolution (see the render-scale sketch after the definitions below).
*TAA = Temporal Anti-aliasing.
*FSR = AMD's upscaling solution, similar to DLSS.
*REDengine = The engine used to develop Cyberpunk 2077.
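For context on what DLSS is actually upscaling from, here is a small sketch using the commonly published per-axis render-scale defaults (Quality 66.7%, Balanced 58%, Performance 50%, Ultra Performance 33.3%). Individual games and DLSS versions can override these, so treat the numbers as illustrative, not authoritative.

```python
# Widely cited DLSS per-axis render-scale defaults; games can override these.
DLSS_SCALES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
    """Resolution DLSS would render at before upscaling to (out_w, out_h)."""
    scale = DLSS_SCALES[preset]
    return round(out_w * scale), round(out_h * scale)

for preset in DLSS_SCALES:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"4K {preset}: renders at {w}x{h}")
# e.g. 4K Quality renders at roughly 2561x1441, 4K Performance at 1920x1080
```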
720p is 11% of 4K by pixel count, 1080p is 25%, and 1440p is 44%. Or what do you mean by 'screen percentage'?
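If 'screen percentage' means the share of the 4K frame's total pixels, those figures check out. A quick sketch of the arithmetic:

```python
# Share of a 4K UHD frame's pixels covered by common source resolutions.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
target_px = 3840 * 2160  # 4K UHD pixel count

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / target_px:.1%} of 4K")
# 720p: 11.1%, 1080p: 25.0%, 1440p: 44.4% -- matching the figures above
```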