You started the wrong way - you purchased a BGA. Oops.
-
Hopefully he will give it a try -
I am running Fire Strike with a -100mV undervolt and it's holding at 87C. Might need to repaste it. Had it set to 3.5GHz on all cores. Hit 93C on the physics test.
-
Edit: 3DMark Sky Diver and 3DMark 11 will load your processor a bit more. -
This is with -125mV: http://www.3dmark.com/3dm/15414062
Scores are lower... but the GPU was at 71C and the CPU was in the low 70s with a peak of 81C.
@Papusan, how do you check the stock load voltage? -
Optimizing the thermals via a software solution is easier than pulling apart a laptop, which is one of the things I enjoy about software over hardware hacking.
It's best to take a methodical, one-step-at-a-time approach, characterizing each possibility for improving the thermals and noting its effect.
Undervolt, maximum frequency (the 4-core multiplier), cache multiplier, fan curves, and even application tuning.
Then, if the result is enough to stop the throttling while gaming or other use, you can save the mechanical solution (re-pasting) for later in the life of the *brand new laptop*.
There is still a chance of infant mortality in the hardware components; something might fail in the first days, weeks, or months.
It would suck to rip everything apart on day one, only to find out something else needs fixing (RMA involved), only to have the RMA rejected because the vendor finds out you pulled apart their carefully constructed laptop.
Sometimes that goes OK after a nice conversation, promising you are an expert in such things and didn't actually muck things up yourself during the disassembly / reassembly.
Sometimes it doesn't go so smoothly.
It's better to keep one's leverage in one's favor, rather than going out on a ledge and getting caught without a net.
The first time I copied a file, then deleted it, and restored it from backup, I realized it was far easier to do that than to re-wire a fix by hand on the wire-wrap board.
Software is always the best first solution, if you have been given a handle that lets you control things that way.
Otherwise you have to break out the sticks and stones, or the wire-wrap gun, to make the fix.
Introduction to Wire Wrap
Still in use today
Wire Wrap Tools - 2016
http://www.digikey.sg/product-search/en/tools/wire-wrap/1245295 -
Did you undervolt both the CPU cores and the cache? Or just the cores?
You can try making a change to the CPU cache undervolt - add it, or remove it - and rerun.
Make sure there isn't something else running, like a browser with active tabs, or a game control app - like Steam/GOG/etc.
You can use HWiNFO64 to see the temps, voltages, and power draw live, and enable logging for finer-grained monitoring.
I usually quit/exit the XTU systray app when running HWiNFO64; you only need one monitoring program accessing the probes at a time. Plus you avoid the CPU usage of the XTU systray app's monitoring.
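If you want to crunch a logged run afterwards, something like this rough Python sketch can pull the peaks out of an exported HWiNFO64 CSV - the log filename, encoding, and column labels here are just assumptions, so swap them for whatever your export actually shows:

```python
# Rough sketch: pull peak values out of an exported HWiNFO64 CSV log.
# The filename, encoding, and column labels below are assumptions -
# match them to whatever your HWiNFO sensor names / export actually use.
import csv

LOG_FILE = "hwinfo_log.csv"  # hypothetical filename
COLUMNS = ["CPU Package [°C]", "Core VIDs [V]", "CPU Package Power [W]"]  # assumed labels

peaks = {name: None for name in COLUMNS}

with open(LOG_FILE, newline="", encoding="latin-1") as f:  # HWiNFO logs are often not UTF-8
    for row in csv.DictReader(f):
        for name in COLUMNS:
            try:
                value = float(row.get(name, "").strip())
            except ValueError:
                continue  # blank cells, trailing units rows, etc.
            if peaks[name] is None or value > peaks[name]:
                peaks[name] = value

for name, peak in peaks.items():
    print(f"{name}: peak {peak if peak is not None else 'n/a'}")
```

Kick off a run with logging on, stop it, then point this at the CSV and you have your numbers saved for later comparison.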
Nice first result, keep going - from others' reports you might get up to a -200mV undervolt on the cores; somewhere along the way you will BSOD.
Make sure you aren't doing important work or doing disk copies when running on a new test setting - if you do BSOD, do a disk check on C: on next boot - reboot to do that check immediately. -
And, gaining data before re-pasting gets you a good base for comparison after re-pasting.
Try leaving the cores at stock, and only doing the undervolting first at CPU defaults. It's better to change 1 thing at a time.
Cores undervolt, then cache undervolt, then see how much you can increase the core multipliers at best stock CPU undervolt.
Then you can reduce the undervolt as you OC the CPU cores past stock.
Find the limits now, and again when you get around to re-pasting.
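To keep that baseline handy, you could jot each run into a results file - here's a minimal sketch (the filename and fields are placeholders, track whatever you actually care about):

```python
# Minimal sketch: append one row per test run so you can compare
# before/after re-pasting. Filename and fields are placeholders.
import csv, datetime, os

RESULTS_FILE = "tuning_runs.csv"  # hypothetical filename

def log_run(core_offset_mv, cache_offset_mv, multiplier, max_temp_c, score, note=""):
    new_file = not os.path.exists(RESULTS_FILE)
    with open(RESULTS_FILE, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["timestamp", "core_offset_mV", "cache_offset_mV",
                             "multiplier", "max_temp_C", "score", "note"])
        writer.writerow([datetime.datetime.now().isoformat(timespec="seconds"),
                         core_offset_mv, cache_offset_mv, multiplier,
                         max_temp_c, score, note])

# Example entry using the numbers reported earlier in the thread;
# fill in your own score once the run finishes.
log_run(-100, 0, 35, 87, "", "Fire Strike physics, stock paste baseline")
```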
Are you using a laptop cooling pad? At least lifting up the rear an inch or so to improve air-flow? -
-
Only change one thing at a time, and you can correlate effects more precisely - you are sure an effect is due to one specific change, and don't need to "guess" which of the two or more changes is responsible - you have a good chance of guessing wrong. -
Seeing all this temp stuff, OC'ing and UV'ing, I'm tempted to see what the new AW can do with the new cooling system.
Hmmm... not bad temps for out of the box. I'm up for the challenge. Just say go, @hmscott. Lol....
A family member does need a laptop and was headed to get an MSI, but let me divert that course... (college student and games.)
::iunlock:: -
-
This is tempting...repaste, repad, UV and OC... I'd run the heck out of it to see what it can do. Hmmm
::iunlock:: -
Save up some powder for the 1080 when it arrives in November. -
You have attracted several really capable people to help you out; hopefully they can stick around and assist you now, and later too.
Have fun -
That's a good idea @Papusan / @iunlock - if you have time, maybe help him get control over how the AW software is tuned right now, and explore the Windows tuning options as well.
If the AW software profiles work against your XTU settings it can be confusing, with one setting up the CPU initially and then the other coming in afterwards making changes on its own.
Right now you want the power profile to have the CPU minimum/maximum processor state set to 100%/100% - the Windows High Performance setting - unknown what the AW software would call it...
See y'all later. -
What about noise levels and GPU temps? Last-gen AW had the same CPU temps but great sub-70C GPU temps even on the 980M; try to run the Unigine Valley ExtremeHD bench. Does it have G-Sync?
Thanks) -
I already feel sorry for it before it even arrives, because you know how I drive lol...
Come Nov. I'll get the AW 1080 as well, as that's the one we all have our eyes on to see how it stacks up against the many others currently available. I'm curious to see what kind of numbers it can produce with the GTX 1080. -
He needs to record the stock voltage at stock clocks first to see how much power it's eating up. It'll likely be a very thirsty chip, as they all pretty much are across the board, which means there should be enough voltage there to run the 6820HK at higher clocks no problem, without messing with the voltage.
I would not recommend using the stock profiles that Dell has as presets. If it's anything like the R3, those settings are complete rubbish.
If anyone needs help and you're willing to screen share, I'll help you do some quick tests to see if your unit can live up to the claims, because if it can't run at 4.0GHz at the least then I'd exchange it for a unit that can. If you get lucky, there are chips that can handle 4.1GHz no problem (6820HK). This is totally up to the individual, but to be honest I'd be upset to have paid for something that claims a capability and then doesn't deliver it.
There are preliminary tests and things one should always do when they first get their computer. Seeing if the CPU can handle what it claims is just one of the many.
....Now I really can't wait to get my hands on one. -
Here is a thread that could help you with visuals and to get some ideas. Although it's mostly about liquid metal, the info found on there can be helpful.
[Liquid Metal Showdown] Thermal Grizzly Conductonaut vs Cool Laboratory Liquid Ultra / Pro
Here's one on CPU and Voltages:
[6820HK | 6700HQ] CPU Voltages at Idle and Max. (Stock + OC)
Here's one specifically on Under Volting:
[Undervolt Results] 6700HQ △ 6820HK (What's your MPG?)
This should help you out with answering a lot of questions....
-
http://imgur.com/u152Nb6
Here are the temps and clocks at -150mV and 3.6GHz across all cores. -
Package power will directly correspond to cooling capacity / paste job.
Edit: It also seems like the BIOS is adding too much voltage under load... 1.225V is pretty excessive for 3.6GHz. You should be able to push more on the undervolt... try to land around 1.15V, or keep pushing the undervolt offset until it crashes, then back off by 20mV.
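To make that "back off" rule concrete, here's a tiny illustrative helper - the test results in it are made-up numbers, the idea is just: take the most aggressive offset that survived your stress runs and give 20mV back:

```python
# Illustrative only: pick a "daily" undervolt offset from manual stability tests.
# The offsets/results below are made-up examples, not measurements.

def recommended_offset(results_mv, margin_mv=20):
    """results_mv: {offset_in_mV: True if the stress run passed, False if it crashed}."""
    stable = [off for off, passed in results_mv.items() if passed]
    if not stable:
        return 0  # nothing stable: stay at stock voltage
    best = min(stable)               # most aggressive (most negative) stable offset
    return min(best + margin_mv, 0)  # back off toward stock, never above 0

tests = {-100: True, -125: True, -150: True, -170: True, -190: False}
print(recommended_offset(tests))  # -> -150
```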
And focus on stock clocks for now, in my opinion. Just leave your multipliers alone and focus on max undervolt... when you change multipliers the BIOS will automatically add more voltage to compensate. -
Thanks) I strongly suggest opening up the laptop and repasting it. 79C and 90C between the coldest and hottest cores is too much of a spread.
Also, fire up the XTU benchmark 2 times and record the temp and score at the end. It stresses the CPU quite a bit.
-
http://imgur.com/a/aGdnP -
double post
-
So it's throttling. Recorded speed was 3.4GHz. Try and test at -150 and -130 for the sake of experiment. But at 3.6GHz with no throttling it should get you to the 1200 mark. What RAM are you using?
-
http://www.3dmark.com/3dm/15420046
This is with -150mV, +200 GPU clock, +400 mem. CPU temp max was 78C and GPU was 72C.
Not bad; with a little paste I think the temps will be a lot better. -
-
99.99% of the time, uneven core temps like that mean a terrible paste job. This is even more common with it being just the bare die... i.e., little room for error compared to the IHS on a desktop chip.
A repaste is mandatory, as stock paste deteriorates quickly and leaves much to be desired as time goes on... all paste will seem decent when fresh, but reality will hit shortly depending on how crappy the paste is, and with Dell?... I'd rank them just above toothpaste.
@quickie, repaste with Gelid GC-Extreme and/or Grizzly Kryonaut when able. I'll have a teardown of it once I get mine in. Hang in there if able... -
-
@quickie, could you please run a Fire Strike Ultra/Standard and post the power pull (Watts) on the GPU with and without OC? (Use HWiNFO, plus run these in dGPU mode, not Optimus mode.) I'd like to know the GPU TDP and the real-world power usage. You can install RivaTuner and check the (RTSS) OSD tab in HWiNFO settings for custom settings with games/gfx programs to fetch these values and monitor them in real time.
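If setting up the OSD is a hassle, one possible shortcut is polling nvidia-smi from a small script while the benchmark runs - a sketch only, and it assumes nvidia-smi is on the PATH and that the notebook 1070 actually reports power.draw (not every notebook vBIOS does); if it doesn't, log the GPU power sensor from HWiNFO instead:

```python
# Sketch: poll GPU power draw once a second while a benchmark runs.
# Assumes nvidia-smi is on the PATH and the GPU reports power.draw;
# some notebook vBIOSes don't, in which case use a HWiNFO log instead.
import subprocess, time

samples = []
duration_s = 120  # roughly one Fire Strike graphics test; adjust to taste

for _ in range(duration_s):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True)
    try:
        samples.append(float(out.stdout.strip()))
    except ValueError:
        pass  # "[Not Supported]" or similar - nothing to record
    time.sleep(1)

if samples:
    print(f"avg {sum(samples)/len(samples):.1f} W, peak {max(samples):.1f} W "
          f"over {len(samples)} samples")
else:
    print("No power readings - this GPU/driver may not expose power.draw")
```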
Also, you can use MSI Afterburner for undervolting the GPU. Since it's a shared heatsink, undervolt the Skylake voltage too - it's too much. (The sad part is the HK and HQ use too much voltage compared to the 6700K: that K-series CPU clocks up to 4.6-4.7GHz at 1.4V, while these need more than 1.3V at 4.2-4.3GHz or even 4.0GHz. That's Intel BGA.) Then run Cinebench 11.5 and wPrime 1024 to test stability. You can also run Fire Strike in loops for stability. And a little cut off the 1070's stock voltage might also let you drop temps further. With Maxwell it was not possible to undervolt the GPU voltage, so being able to control the GPU voltage with the stock vBIOS is truly remarkable - give it a try. If it works with the Dell vBIOS then you are lucky.
Dell/MSI/<any OEM> stock pastes are the worst you can imagine. Period. I *STRONGLY* recommend a re-paste; else you will get too much dust (if fans are running on Auto), high temps, and a degraded UX. It's pretty simple once you get the hang of it: an X or 2 lines on the GPU and one single line on the CPU works. Get Gelid GC Extreme - I've been using it for 2-3 years and it works like a charm. More info here.
You can control the fan speed from HWiNFO on Alienwares; try to max the fans out when benching and gaming. Also make a note of the CFM rating on the fans while you re-paste, and please let people know the values. Finally, you can start AIDA64 when running these tests and check the voltage/power graph: if it says AC line only, then there's no stupid hybrid battery boost gimping.
Thanks and have fun with your machine. Best of luck. -
Mmm, I have the impression that the 6820HK heats up more than the 6700HQ at stock (even accounting for the factory-added frequency boost); does Intel apply more voltage on the 6820HK?
I think the GPU temps are awesome for Pascal, was used to hitting 89°C in 3DMark and Unigine valley very rapidly with my old (now returned) GL502VS.
Regarding CPU temps, they are a bit higher than what I was expecting at stock; I was expecting 80-84ish°C.
Anyone know why my software undervolts with XTU are not always persistent on my Dell XPS? I apply -75mV, but sometimes when I boot up the computer the XTUservice process is running in the background as usual, yet when I open the XTU app my profile is at -0mV stock configuration. It's not frequent, but I don't know why it happens - any idea? -
So far the GPU temps out of the box seem pretty decent. With a repaste it could hold the crown for the lowest-temp GPU with the new cooling system. Going by the first Chinese reviews that rolled out, there is definitely room for a lot of improvement, from what is decent to perhaps the best-cooled GPU in gaming laptops. After all, the new cooling system may turn out to be worthy, and I'm feeling optimistic about it...
CPU temps are easy to tame... work on your undervolting and plan a repaste soon if you so desire. Highly recommended.
XTU and TS are like sworn enemies lol. They don't seem to get along all the time. However, the main issue that I know of is that with TS, if you make changes and save it, it'll create a .ini file in the TS folder, which will load every time you launch TS and can cause some butting of heads with any other tuning software. Therefore, I'd highly recommend that you delete the .ini file if you don't want it interfering with XTU or any other program. (You'll have to go to View -> Show hidden files to see the .ini file.)
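If you'd rather not dig through hidden files by hand, a couple of lines of Python can move the file out of the way instead of deleting it, so you can restore it later - this assumes the default ThrottleStop.ini name and that you point FOLDER at wherever you unzipped TS:

```python
# Sketch: move ThrottleStop's saved settings out of the way (reversible).
# Assumes the default ThrottleStop.ini name; set FOLDER to your TS folder.
from pathlib import Path

FOLDER = Path(r"C:\Tools\ThrottleStop")   # hypothetical install location
ini = FOLDER / "ThrottleStop.ini"

if ini.exists():
    backup = ini.with_name("ThrottleStop.ini.bak")
    ini.replace(backup)                   # rename, so you can restore it later
    print(f"Moved {ini} -> {backup}")
else:
    print("No ThrottleStop.ini found - nothing to do")
```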
Tip: Use XTU to set your multipliers and nothing else. Then use TS to adjust your voltage.
Hope that helps. -
-
Can you do gaming benchmarks? I believe that's important. Games that need to be benched are:
a) Witcher 3
b) Arkham Knight
c) Crysis 3 -
-
Because TS isn't perfect, and neither is XTU. I've never felt comfortable OC'ing via TS due to funny business. It's great for adjusting voltage, but as much as I don't care for XTU either, it seems more trustworthy for adjusting multipliers... after all it is made by Intel, and I think Skylake is made by Intel? Or is it AMD?
-
Is it AMD? -
I didn't say that it doesn't work. Personal preference.
Yea it's bloat lol..
I think Skylake is AMD FX100000.
::iunlock:: -