GE66 Tested Cyberpunk 2077 Bottleneck Exposed! (DLSS, RTX ON + OFF)



Read more about Cyberpunk 2077➜ https://cyberpunk2077.mgn.tv

Welcome to Tested! Tested is a Bob Of All Trades channel created to showcase details during laptop testing before a final Review Video is published.

The Playlist is organized per laptop with each laptop having its own series of individual tests.

https://www.patreon.com/BobOfAllTrades

#Cyberpunk2077
#RTX3070
#Cyberpunk2077_Ray-Tracing
#RTXON


18 thoughts on “GE66 Tested Cyberpunk 2077 Bottleneck Exposed! (DLSS, RTX ON + OFF)”

  1. Love these short videos. I was briefly looking at the GP66; I wonder if it isn't best to just punt on ray tracing this generation. I'd even consider an AMD laptop GPU, but no idea what we're getting or when.

  2. Simply awesome to see this in real time. This particular powerful 125W GPU is definitely being held back by the CPU, especially for the price these laptops go for. So you believe the AMD 5900 mobile series won't remove most of the bottleneck then, Bob?

  3. I think I may have said something about this in a previous video on this laptop, but those CPU temps (I understand it’s an 8-core CPU) would not make me feel comfortable playing anything remotely intensive on the CPU.

    As an example of why that worries me: on my Alienware 17 R5 with the i9-8950HK, I have the CPU clocked at 4.3 GHz on all cores (I've been testing 4.5 to see if it can handle that on a regular basis), and it takes a complete load, meaning 100% CPU utilization, for it to reach into the 90s C. (I've also mentioned this before, but I did repaste with TG Kryonaut, and I have an undervolt set at -150 mV, so that's part of the reason temps remain fairly low, around the 60 C range, in workloads where it's not as crucial but the CPU still puts in a good amount of work.) The CPU can pull anywhere from 35 watts to 85-90 watts depending on what I put it through (in a fully loaded situation it can do up to 110 watts, and that's where temps can start to spike, but it takes that kind of workload). Here, the CPU is pulling 60 watts and clocking at around 4 GHz on all 8 cores, and it peaked into the 90s C in this footage; 60 watts shouldn't get the CPU that hot in a high-performance laptop like this.

    I seriously hope the 11th gen Intel H45 CPUs will be much more thermally efficient at the same wattage that 10th gen H CPUs are pulling. That's where I may plan my upgrade, as the RTX 2080 Ti I'm using in the AGA (also a small reason for better CPU temps, since the GTX 1070 dGPU is not being used) is proving to be maybe a little too much for the 8950HK to handle in terms of pushing out frames, and I do plan on getting a high refresh rate monitor at some point and maybe dabbling in competitive gaming a little bit.

    Sorry for the small rant here…

  4. Wow! I've never seen quite such beefy CPU utilization in Cyberpunk. With DLSS on at Full HD you're probably rendering at something like 720p, which is almost a CPU-bound scenario. Thank goodness 2021 lets us choose laptops with 1440p resolution; sadly, this isn't among them.

  5. So instead of eGPUs, now we're going to need eCPUs then? Lol
    On a serious note, does this make 30 series GPUs basically worse to buy, since they're ultimately limited unless you have something like an i9-class processor? Is there any point, or not, when a 20 series can already be completely saturated? I can only hope this Resizable BAR thing can help alleviate some of this with time and tweaking by Nvidia.

  6. Nice video Bob, and great new channel as well! Btw, I'm considering getting a GP66 with the same specs as this GE66. Would undervolting the GPU/CPU like in your Afterburner tutorial help performance? I haven't seen the RTX 30 series being undervolted by most users. Thank you for the videos!

  7. Bob, you're really damn good at what you do. Thanks for all the information. Quick question I want to hear your opinion on: what's the best Power Mode option to use for gaming while plugged in? Is it the middle one (Better Performance) or the far right (Best Performance)? I noticed that when I set it to Best, the CPU clock speeds stay high and don't keep changing up and down depending on the load.

