It really only makes sense to compare stock-stock or OC-OC.
https://www.3dmark.com/3dm/52352929?
Anyway, yeah the RTX "3080" mobile is basically a 3070.
-
-
I would've compared it to Prema's OC scores, but that's kind of useless since his GPU scores haven't been replicated by anyone else (800+ points higher than everyone else), which is quite a bit in lappy land.
Case in point, the 3080 even while gimped surpasses the unreplicated scores.
Once mine arrives next month, I'll be sure to do a thorough analysis w/ actual gaming benchmarks, OC'ed to the max, from 1080p to 4K.
-
-
-
-
-
Mine arrives tomorrow. Happy to test anything you guys want to see. Going to undervolt the CPU and OC the GPU as high as I can.
-
For those that have received their R4 with a 4K display, what are your thoughts on it? I ask because I received my Alienware m17 R4 on Feb 15th and ended up returning it. I had the i9, RTX 3080, 32GB version with the 4K display. The main problem was the 4K display: the colors were off and it had a yellowish tint, almost like the blue light filter was on. My daughter has a 2020 Razer Blade 15 and my son has an ASUS TUF Gaming laptop, and both of those displays looked way better than the 4K on the Alienware. I compared all 3 laptops side by side with the same pictures, articles, and videos pulled up, and the 4K on the Alienware just looked bad. I'm not sure if there was something wrong with my display, but I uninstalled/reinstalled drivers and even removed the hard drive, added my own, and did a fresh Win 10 install, yet nothing improved the display.
So I returned it and ordered the m15 R3 during their President's Day sale. I got the i7-10875H, 32GB, RTX 2070 Super, 300Hz display in Lunar Light for $1699 ($1300 cheaper than the m17 R4 I had returned) and hope that display will at least be comparable to my daughter's Razer Blade. I was going to go with the m15 R4, but I couldn't use any coupons on it (which is funny because I was able to use a 10% off coupon on the m17 R4), so that's what led me to the sale page where I found the deal on the m15 R3.
-
-
-
I understand that Prema's score (he's a professional benchmarker) is out of our league, and I agree that my score of 12600 vs 13989 is a fairer comparison, but... still, it's only about an 11% improvement. At least it's an improvement, and I understand you are excited about these results. Personally, I think it's just OK.
Where it hurts is the in-game results we have so far. This is the only review I could find of the m17 R4's performance in games: https://www.tomshardware.com/reviews/alienware-m17-r4
They tested the 3080 with the i9-10980HK CPU in game benchmarks, and when I duplicated their settings with the previous-gen Nvidia card, the 2080 Super gave me better FPS in the two tests I did today.
I also tested RDR2 and averaged 83.5 FPS, as opposed to the 3080's 79 FPS.
I will be testing Metro Exodus soon.
Sure, in some games the 3080 should outperform the 2080 Super, but it shouldn't even be a discussion. It should beat the previous gen by a big margin in almost every game.
My guess is Dynamic Boost is only effective in benchmarks (I could be wrong).
So yes, in benchmarks the 3080 is roughly 11% stronger, but in games I don't see that. I see the opposite. Temperatures? Dynamic Boost? I don't know.
Edit: I fixed the link for the results: https://www.tomshardware.com/reviews/alienware-m17-r4 -
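If you want to sanity-check those percentages, the arithmetic is straightforward; here's a quick sketch using only the figures quoted above (the helper name is just illustrative):

```python
# Percent difference between two results, relative to a baseline.
def percent_gain(baseline: float, result: float) -> float:
    return (result - baseline) / baseline * 100

# Benchmark score: the 3080's 13989 vs my 2080 Super's 12600
print(round(percent_gain(12600, 13989), 1))  # -> 11.0
# RDR2: the 2080 Super's 83.5 FPS vs the 3080's 79 FPS
print(round(percent_gain(79.0, 83.5), 1))    # -> 5.7
```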
-
I put in an order today for the m15 R4, i7-10870H, RTX 3080, and 4K OLED. I spent days debating over the panels, but seeing as I really don't play FPS games, I don't think I'll be missing much with the 60Hz. Besides, even though these are intended to be plugged in for their main purpose, I still find the idea of getting under 2 hours of battery life browsing the web or watching Netflix a bit distasteful, as exhibited by the 300Hz panel.
Considering that my last gaming notebook has a GTX 680M, it's been a while, so this is exciting. Current ETA is 3/2 - 3/5. If it gets here that quickly I'll be pretty impressed.
-
Nevertheless, it's only the beginning. Once the lappy gets into more capable hands, I think the discussion ends there, to be frank. Would I have liked a 200W version of the 3080? Sure. It would've been great to put it up against my shunt-modded 3090 just for ****s and giggles.
I've got a few more weeks until I get mine (w/ the 10980HK). Hopefully it ships on time! -
-
-
pathfindercod Notebook Virtuoso
-
-
If you read within the numbers, you can see your higher FPS from graphics test 1 dropped significantly in graphics test 2. Not the same for the 2080 Super, because it runs at a lower temperature; the drop would be bigger with a hotter card. You know Nvidia's boost clocks start dropping in roughly 5°C steps from around the 40°C mark? Yeah, the scores from test 1 to test 2 will drop more for the 3000 series. How much the FPS drops depends on how much the temperature rises from test 1 into test 2. The colder you can keep the card running, the better. It has been like this since Pascal.
Edit.
Put it the other way: the 3080 should be stronger on paper in both graphics test 1 and 2, but a 2080 can eat into that disadvantage if it runs a lot colder. A higher temperature would do the opposite, and yeah, the 3080 would win test 2, the way it should have been. -
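To put rough numbers on that boost-bin behaviour, here's a minimal sketch. The ~40°C starting point, ~5°C per step and ~15 MHz per bin are approximations of how Nvidia's GPU Boost is commonly observed to behave, not official figures:

```python
# Rough illustration of Nvidia GPU Boost temperature bins (approximate values).
def estimated_boost_clock(max_boost_mhz: float, gpu_temp_c: float,
                          bin_start_c: float = 40.0,    # assumed first-drop point
                          bin_step_c: float = 5.0,      # assumed degrees per bin
                          mhz_per_bin: float = 15.0) -> float:  # assumed MHz per bin
    """Estimate the sustained boost clock at a given GPU temperature."""
    if gpu_temp_c <= bin_start_c:
        return max_boost_mhz
    bins_dropped = int((gpu_temp_c - bin_start_c) // bin_step_c) + 1
    return max_boost_mhz - bins_dropped * mhz_per_bin

# Example: a card that holds 1900 MHz when kept cold
for temp in (38, 45, 55, 65, 75):
    print(f"{temp} C -> {estimated_boost_clock(1900, temp):.0f} MHz")
```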
So I guess that's why I sometimes see pro overclockers (like you) benching with a 40°C average temp in Time Spy. Cooling the GPU at or below 40°C would be the ideal target for Nvidia GPUs. -
Edit... See also
The NVIDIA GeForce RTX 3000 series also has a Hotspot... -
Great, now let's get smarter and read this. -
-
-
What you're seeing is power throttling, and if you had a 3000 series card, you'd know this. It's not apples to apples w/ Nvidia's previous cards (9 series & up). All 3000 series cards show similar dropped bins, regardless of temps. On average, the 2000 series is much more likely to drop bins due to temp than power, which is the exact opposite of the 3000 series. This is pretty well known in the OC community. And in general, the 3000 series gains more in the 1st test and drops harder in the 2nd test than the 2000 series does, percentage-wise.
Take any 3000 series Timespy score (air, chilled, wc'ed, etc.) and this'll prove my point. 99.9% of them show similar drops on the 2nd test, again, regardless of temps. They are power limited before being temp limited, due to how hard they drop, the architecture changes, and how much more power the 2nd test pulls than the first.
I can tell you first hand it's not an apples-to-apples comparison when it comes to overclocking the 3000 series versus the 2000 series. Will bins drop? Absolutely. Is temperature the prime reason for the percentage drop in score from the first test? Absolutely not.
Care to explain the dropped bins for the cards below, or really any 3000 card on 3DMark's website, when all show similar percentage drops from the first test regardless of temp? You can't have it both ways.
https://www.3dmark.com/spy/14869884
https://www.3dmark.com/spy/18511575
As I said in my previous posts, what you're saying used to hold true. Not anymore w/ the 3000 series. These are extremely power hungry cards.
Edit: I don't like coming off as a know-it-all. It's just that I've had enough experience, along w/ many others on overclock.net etc., with the 3000 series to know. As I said before, it's too bad we won't get a 200W 3080 so it can stretch its legs a bit more. -
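If anyone with a 3000 series (or 2000 series) card wants to check which limiter is actually kicking in during the 2nd test, something along these lines should work; it just polls nvidia-smi's throttle-reason fields once a second (field names as I remember them, and they can vary by driver version):

```python
# Log whether the power cap or the thermal slowdown is active during a run.
# Assumes nvidia-smi exposes the clocks_throttle_reasons.* query fields;
# names may differ slightly across driver versions. Stop with Ctrl+C.
import subprocess, time

FIELDS = ("clocks_throttle_reasons.sw_power_cap,"
          "clocks_throttle_reasons.hw_thermal_slowdown,"
          "temperature.gpu,power.draw,clocks.gr")

while True:
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout.strip()
    # Example line: "Active, Not Active, 68, 160.21 W, 1350 MHz"
    print(time.strftime("%H:%M:%S"), out)
    time.sleep(1)
```

Run it in the background during the Time Spy graphics tests and see which reason flips to "Active" first.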
All I'm saying is that the cards will drop boost as they run hotter. Better cooling equals better performance. -
Why not add the other 3080 laptop bench I posted? He boosted lower... with colder temps. He's power limited. It's from the same guy.
https://www.3dmark.com/spy/18511575
Also, that 5900HX laptop is vBIOS limited w/ no MUX switch. Now what about the other link w/ the 3090s? And what about the drop in relation to the first test? All similar drops/scores percentage-wise, regardless of temp.
Like I said, all 3000 series cards experience hard drops in the 2nd test due to power more than temp. Will you lose some bins due to temps? Of course. But it's power limited more so than ever, especially when it comes to the 1% lows.
Edit: Nvm, I see you did add it! I'm on my phone. But it doesn't really help your point.
Edit 2: And I know that's what you're saying. I'm saying the drops on the 2nd test are bigger due to power limits more than temp, compared to the 2000 series. That's why no matter which 3000 series benchmark link you choose, they all show similar drops on the 2nd test, while the 2000 series shows less of a drop-off on the 2nd test, since it's more temp limited than power limited. Add the architecture changes of the 3000 series, which gains more in the first test than the 2nd in general, regardless of temp or even power, versus the 2000 series. That's all I've been trying to communicate since your original comment.
"This shows it crystal clear. Once GPU test 1 is done, the second graphics test will collapse. Expect the same behaviour in games as well." -
-
Edit: You do realize the X170 will be using the 165W 3080, right? You will see a similar percentage drop on that system as well on the 2nd test, regardless of temps. Scores will be slightly higher thanks to the desktop CPU in regards to the overall GPU score, but the hard FPS drop on the 2nd test will remain. Bank on it. And I'm unsure what you mean by them not being Ampere cards? They are all based on the same architecture, albeit with different SM and CUDA core counts along w/ even worse vBIOS limits than the desktop versions.
And when I talk about the Ampere architecture in general dropping more than Turing from the 1st test to the 2nd, percentage-wise, compared to the 2000 series, look no further. Here are some top scores for the 3000 series; temps look great, etc.
3060 TI- 40c
Drop from first test: 15.6%
https://www.3dmark.com/spy/16054727
3070- 36c
Drop from first test: 15.2%
https://www.3dmark.com/spy/17612119
3080- 39c
Drop from first test: 14.6%
https://www.3dmark.com/spy/18228743
3090- 14c
Drop from first test: 14.5%
https://www.3dmark.com/spy/18352561
3080 (notebook)- 53c
Drop from first test: 15.4%
https://www.3dmark.com/spy/17271051
2080 Super- 26c
Drop from first test: 7.3%
https://www.3dmark.com/spy/18065861
2080 ti- 19c
Drop from first test: 8%
https://www.3dmark.com/spy/17271051
2080 super (notebook)- 58c
Drop from first test: 9.5%
https://www.3dmark.com/spy/18421309 -
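For reference, the drop figures above are presumably just (GT1 - GT2) / GT1; a one-liner if you want to work it out for your own runs (the example FPS values below are made up, not taken from the linked results):

```python
# Percentage drop from Time Spy graphics test 1 to graphics test 2.
def ts_drop_percent(gt1_fps: float, gt2_fps: float) -> float:
    return (gt1_fps - gt2_fps) / gt1_fps * 100

# Hypothetical example values:
print(round(ts_drop_percent(120.0, 101.5), 1))  # -> 15.4
```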
Anyway, stepping back, it is still not obvious by what mechanism a standard Ampere desktop card would power throttle on a second run. The fact that these cards can be shunt modded to use more power is obviously irrelevant.
Yes, I realize the X170 uses the mobile chip... still, it will have the only non-soldered Ampere mobile card on the market - everything else out there is soldered, although AW takes the infamous Golden Solder prize by also making the RAM and WiFi non-upgradeable on their new flagship. -
Wait a sec, wasn't it an earlier post that used a single benchmark run to prove you can expect major thermal throttling in games? You guys can't have it both ways. Want me to gather some high-temp scores as well, showing similar differences? I mean, I could do this all day.
I've shown all the proof that is needed, based on the original comment, which used the same premise of a single run. When it comes to the 3000 series, desktop or laptop, you don't base in-game performance off the 2nd test and compare it to the 2000 series; it's not an apples-to-apples comparison. There are obvious differences, which I've gone over countless times. Simple as that. Understandably, it's harder for some to comprehend when most people still can't get their hands on a 3000 series GPU.
Nevertheless, it definitely looks like I ruffled some feathers here, if anything. So I'll be sure to tune in once that X170 w/ the 3080 launches next month. -
-
bigtonyman Desktop Powa!!!
Might have to pick up one of the m17 R4s with the 4K screen and the 3080 once the Comet Lake refresh happens. It seems to be one of the better built and better performing 3000 series laptops from what I've seen so far. I love my desktop, but I'm not home often due to work. I'm also not happy with my m15 R2's performance or battery life, and I wouldn't mind a bigger machine as my needs have changed.
-
-
Am I absolutely crazy here, or is something seriously wrong with this gen of screens? Still using 4K 60Hz panels, now with a 25ms response time (what?), or an FHD 360Hz with a 5ms response time. The 15" with the OLED seems the "best" buy, but it's 60Hz. Razer even has a 4K/120 screen out already on their models. I want something nice with a 3080; 11th gen doesn't matter too much to me, but these screens just seem ridiculously under-spec'd relative to what you are buying.
-
-
4K 120Hz panels were available as of Q4 2019. If they aren't gimped for some reason, it would be nice if more vendors used them, particularly now that the Xe GPUs in Tiger Lake should support those refresh rates (pretty sure on that).
-
bigtonyman Desktop Powa!!!
-
-
-
There's a new vBIOS available to download on the support site. No changes in performance from it.
Can anyone check whether Resizable BAR shows as supported in their Nvidia Control Panel? Not sure if it was yes or no before, but it's yes now. -
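If you'd rather check from a script than dig through the Nvidia Control Panel, a rough heuristic is the BAR1 aperture size nvidia-smi reports: with Resizable BAR active it is usually close to the full VRAM size instead of the legacy 256 MiB window. This is an informal check, not an official one:

```python
# Informal Resizable BAR heuristic: read the BAR1 aperture size from nvidia-smi.
import re, subprocess

out = subprocess.run(["nvidia-smi", "-q", "-d", "MEMORY"],
                     capture_output=True, text=True, check=True).stdout
m = re.search(r"BAR1 Memory Usage\s*\n\s*Total\s*:\s*(\d+)\s*MiB", out)
if m:
    total_mib = int(m.group(1))
    hint = ("Resizable BAR likely enabled" if total_mib > 1024
            else "likely the legacy 256 MiB window")
    print(f"BAR1 total: {total_mib} MiB ({hint})")
```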
both the 3060 & 3070.
Edit: My bad, I was confusing a secondary/alternative 3080 140W vBIOS in the package for the 3060.
Contrary to the readme, there apparently is no 3060 vBIOS contained in the updater. -
-
-
https://www.notebookcheck.net/Dell-...0HX-APU-and-RTX-3070-mobile-GPU.524301.0.html
Oh, so Alienware + Ryzen may be a thing. However, just assume it'll be like the m15 R4 but without Thunderbolt. At least then the AGA makes sense as an eGPU route (even if it's EOL). -
-
bigtonyman Desktop Powa!!!
-
Let's call it what it is >>> throttle.
All 3 vBIOS use a 15W lower fallback throttle value, if the EC and/or driver signal that they are no longer "happy" with the power limit.
3080 drops from 165W to 150W
3070 drops from 140W to 125W. -
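If you want to see which limit the card is actually being held to after flashing, nvidia-smi can report the current, default and max board power limits (these query fields exist, though some laptop GPUs report N/A for them):

```python
# Query the current, default and max board power limits plus live draw.
import subprocess

fields = "power.limit,power.default_limit,power.max_limit,power.draw"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True).stdout.strip()
# Example output: "150.00 W, 165.00 W, 165.00 W, 142.31 W"
print("current / default / max / draw:", out)
```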
-
It would appear that BAR support exists on the R4s at least.
AW m17 R4 Resizeable BAR supported : Alienware (reddit.com)