Testing the Nvidia GeForce RTX 5090 GPU in Cyberpunk 2077 at 4K, 1080p, 8K and 16K, with and without DLSS 4 and Frame Generation!
⏱ Timestamps ⏱
Intro, Specs, Stuff – 0:00
4K Ultra – 0:29
4K Ultra / DLSS Q / CNN – 2:19
4K Ultra / DLSS Q / TM – 2:53
4K Ultra / DLSS P / TM – 4:23
4K Ultra / DLSS Q / TM / FG 4x – 5:16
4K Ultra / DLSS Q / TM / FG 2x – 11:19
4K Ultra / DLSS Q / TM – 12:24
4K RT Ultra – 13:47
4K RT Ultra / DLSS Q / TM – 15:55
4K RT Ultra / DLSS P / TM – 17:37
4K RT Ultra / DLSS Q / TM / FG 4x – 18:52
4K RT Overdrive – 21:43
4K RT Overdrive / DLSS Q / TM – 24:14
4K RT Overdrive / DLSS P / TM – 25:53
4K RT Overdrive – 27:36
4K RT Overdrive / DLSS P / TM / FG 4x – 28:03
4K RT Overdrive / FG 4x – 29:51
4K RT Overdrive / DLSS P / TM / FG 4x – 32:25
4K High (top right is wrong) / FG 4x – 33:54
4K High (top right is wrong) / DLSS P / TM / FG 4x – 35:07
4K Lowest / DLSS UP / TM / FG 4x – 35:58
1080p Lowest / DLSS UP / TM / FG 4x – 38:36
8K High – 40:02
8K RT Overdrive / DLSS P / TM – 41:05
8K RT Overdrive / DLSS P / TM / FG 4x – 41:46
Attempting 8K RT Overdrive / FG 4x – 43:02
8K Low – 43:41
8K Low / DLSS P / TM – 44:42
16K Lowest / DLSS P / TM – 45:37
16K Lowest / DLSS UP / (CNN instead of TM) – 46:52
16K Lowest – 48:01
🔧 SPECS 🔧
◾️ CPU – AMD Ryzen 7 9800X3D
◾️ Cooler – Deepcool LT720 AIO
◾️ Nvidia GeForce RTX 5090 32GB – Founders Edition
◾️ RAM – 32GB DDR5 6000MHz (2x16GB CL30)
◽️ OS – Windows 11 Pro
◽️ SSD – Crucial P3 1TB
◽️ SSD 2 – Lexar PCIe 4.0 2TB (games installed here)
◾️ Motherboard – Asrock B650 PG Riptide Wifi
◾️ PSU – Corsair HX1200
◾️ Case – Phanteks P500A
🎥 Recorded with a capture card for no FPS loss.
————————————————————————
Second Channel: @KryzzpDigimoon
▶️ PC Tech Reviews – https://www.youtube.com/playlist?list=PLH6sLgdc_uJ71iS0Yek79IIsyk7eSxXqo
🟢 RTX 5090 Playlist – https://youtube.com/playlist?list=PLH6sLgdc_uJ41jh7_FxH0P46rfoWdUcyh&si=vesxbV19jqCWXXca
#nvidia
#rtx5090benchmark
#rtxon
#rtx5090
#cyberpunk2077
———————————————————————–
👥 Discord: https://discord.gg/zwormz-community-243035352187666448
📸 Instagram: https://www.instagram.com/zwormz.gaming/
Thanks for watching 🙂
DLSS 4 is SICK
saying goodbye to Bob never looked so good
In one word, absolutely crazy. The 5090 is much better than the 4090.
Try very old games like Assassin's Creed 2 or Rollitos at 40K. On my 4070 I managed to run it at 10K at 180 FPS with 8x MSAA.
PLEASE ALAN WAKE 2 ON 5090 PLEASE
It's meant to be played… not like that. This frame-generation crap is killing the whole game industry. It falsifies the work of the artists and studios and leads to crippled, unoptimized, nearly uncompressed games. Without a day-1 patch and the occasional driver update, I sometimes ask myself why people still pay someone who pees in their pocket and wants to get paid for work that isn't done. Frame generation should be renamed publisher mode or alpha mode.
When are you gonna test Minecraft?
As someone else said, try MINECRAFT (OpenGL) with the 5090. Maybe it has a higher resolution limit than 16K.
Bode HAHAHA. I'm Brazilian and I love that!
this video would kill a GT710 user
Can we get 32k at 1 fps please.
goodbye bob compilation when?
So the only thing I'm gaining over my 4090 is more FPS that are generated? Because my 4090 can already max this game out at 100+ FPS, I think I'm good. This card doesn't impress me. And when it comes to native, this card only beats my 4090 by 10 FPS. 🤦🏻♂️
Honestly I don't mind seeing some jitter around asses 😂
They have video games where you can shoot police? No wonder we have such crazy people.
I cannot even fathom playing this game at 4k 400fps I would freak out with joy and definitely cry
We want Red Dead Redemption 2!
Wait a moment!!!! The DLSS Transformer model on Quality has the same FPS as native (with better image quality, of course), even on the 5090 (5000 series). This is so sick, I hope they optimize it.
Your statement about using the money to build a high-end CPU setup instead doesn't make sense. If you don't get the GPU, what GPU are you going to run that is considered "high end"?
It would look more realistic if they dimmed the headlights a bit.
What is the difference between VRAM.A and VRAM.U in performance metrics, and what do they specifically represent?
Everything but the PhysX😉
Saying 4K 60 isn't playable is ridiculous; console games can't even do 4K 60. That's why I went PC.