Hi everyone, I just registered to ask one question. I am really interested in this laptop, but there's one more option: this Asus GL702ZC, or a Clevo barebone with an i7-8700 desktop CPU + GTX 1060 6GB. Please tell me your thoughts. I'm going to play games and render videos on it. Clevo link below.
https://www.dubaro.de/Gaming-Notebook/Dubaro-Gaming-Notebook-N957TP6-i5-8GB-240SSD::3617.html
-
If that's the case, it would be a tough call.
Depends on your preferences really.
The Intel/Nvidia laptop might be more efficient and just as fast as the GL702ZC with the 1700, because the Intel CPU is built on a better manufacturing process, which allows all 6 cores to clock higher than the 1700 does when all cores are stressed.
However, the Intel CPU might also hit thermal throttling a lot more easily (due to the bad thermal paste under the lid), so that could easily hurt performance on the Clevo laptop.
Also, you can easily undervolt and overclock the 1700 in the GL702ZC using Ryzen Master to increase its efficiency and performance (I usually run 3.3GHz on all cores when I'm using software that stresses them to that level, and I could likely increase that to 3.4 or 3.5GHz at stock or slightly lower voltages out of the box). Intel's CPUs don't seem to overclock well without delidding (and, as I already mentioned, they seem to throttle easily)... though they could probably be undervolted out of the box as well using the proper software, so if you did that, you wouldn't have to delid or re-apply the thermal paste (at least, not initially).
As for the GPU comparison... the RX 580 and the GTX 1060 (6GB) perform nearly identically to each other. The RX 580 uses less power than the mobile 1060 but produces pretty much equal performance (plus, the latest drivers are working great)... and of course, there's Freesync on the GL702ZC for smooth gameplay.
So, the main differences between the two systems are:
1. you get Freesync on GL702ZC
2. you can easily undervolt and overclock the 1700 to increase performance and reduce power consumption (though the i7-8700 should at least be undervoltable too).
3. you can easily undervolt the RX 580 too to further decrease power consumption and reduce noise during gaming.
Mind you, Asus did say they will release a BIOS with security updates for Ryzen; however, they also said they won't include an update that would let us upgrade to the Ryzen 2700.
Nothing was said about releasing an update to support Zen 2 though (7nm).
I'm hoping that if we GL702ZC owners can show higher adoption of desktop components in laptops (such as the upcoming Acer with a 2700 and Vega 56), Asus 'might' be prompted to include future BIOS microcode updates so we can upgrade the CPU to a 7nm 8c/16t part (which should run at 4.2GHz stock and boost to 4.4-4.6GHz across all cores, with 1 core likely boosting to 5GHz) in the same 65W TDP.
I'd also look into whether the Clevo offers upgrades to future CPUs and GPUs.
Most likely you'll be stuck with the 8700 CPU... but do check whether the GTX 1060 is replaceable. -
On the video creation side, it mostly depends on the software you will be using. If you use a CPU-based or CPU+GPU-accelerated program, then currently there's no laptop that beats this one with the Ryzen 1700 in terms of rendering and export speed. However, if you are using a GPU-accelerated program that doesn't take advantage of OpenCL and is focused on CUDA, then your best bet is the Clevo.
-
The Intel machine will be stuck on that Coffee Lake platform, whereas AMD will support AM4 through to 7nm Zen 2. Whether or not ASUS releases a BIOS with support is another story - although myself and others are working on modding the existing BIOS 303, just like I did for virtualization support pre-303. It should be possible - I've currently slipstreamed updated microcode and AGESA files into 303, flashed it to the machine, and it seems to be working, though I've not had enough free time to stress test it. And I also need to get a Ryzen 7 2700 to test with.
On another note re: the GPU, I've been playing with clock speeds and voltages. Does anyone have much information about improving/overclocking the desktop RX 580? I'll try to improve this further.
I'll probably start a GitLab page or something for proper discussion and input on this, and on where we'd like to get the BIOS/feature set to.
Cheers -
Wow, I wasn't expecting such convincing comments about this laptop. Now I'm all for this laptop, but only the GPU gives me pause. An underclocked RX 580 with 4GB of VRAM against a GTX 1060 with 6GB of VRAM - how much of a loss can it be?
-
Don't let the RAM quantity fool you. It won't impact performance in any significant (or really any) capacity. -
-
I'm not a serious gamer - I play once or twice per month. My main use is 3D architecture and 3D design work with V-Ray rendering and GIMP.
-
You can also set Frame Rate Target Control to 60FPS in Radeon Software (drivers) to prevent the GPU from using needless resources (because the refresh rate of the laptop screen is 60Hz, displaying more frames than that is useless and results in 'wasted frames').
Additionally, you can undervolt the GPU using MSI Afterburner to reduce GPU thermals and improve its efficiency. My stable undervolt is -93mV on the core - that way the fans don't get too loud (if you want to game without headphones). -
-
I have a few questions about this model; I would really appreciate your thoughts.
1) Is it possible to mod (replace) the screen with a better one? One thing that's holding me back from getting this is the poor sRGB rating - Notebookcheck has it at 83% sRGB. Perhaps I could source another 17.3" screen and install it (I'd void the warranty, of course).
2) The other thing is 'frying' the mobo. There were about 10 pages in this thread trying to decipher what had caused the laptop to die, and I can't see whether the issue was ever diagnosed correctly - any ideas what caused this?
3) Heatpipes. There were pictures back in this thread of 2 configurations of heatpipes - I believe one prior to getting an RMA and the other config after. The one post-RMA had the heatpipes shifted so that RAM was more easily accessible. Does anyone know if they are manufacturing these models with the heatpipe adjustments? - should I be looking for any dates of manufacture to ensure I get the latest heatpipe config?
4) RAM. It looks like people have had success with installing faster RAM - what's the fastest RAM that's safe/worthwhile to buy for this laptop?
That's about it. Other than being able to possibly upgrade the CPU in the future (which seems to largely depend on this community pushing Asus to include a BIOS update for it, or even on people here modding BIOSes themselves), this laptop looks pretty solid and has pulled me away from thinking seriously about purchasing an 8750H laptop for price/performance. -
1. Screen change... I guess it's possible but unlikely, and not worth it; there is no other variant of this display other than the 144Hz refresh rate one that Asus never delivered. And going for a shady third-party one from AliExpress or similar sellers may result in a waste of money and time. Given that you require better sRGB coverage, it's more useful to buy a 4K external monitor factory-calibrated to your needs. I'm guessing you are going to run video or photo creation software on it, which will both take advantage of a higher-resolution monitor with better calibration (plus this GPU can easily handle a 4K monitor).
3. Regarding the cooling you saw: every Ryzen 1700-based version of this laptop has the heatpipes running over the RAM sticks. The other layout is the cooling design for the Ryzen 1600 version. A few members claim that returning the 1700 laptop will result in a complete change of the heatpipes to match the 1600 one, but the jury is still out on that - no real confirmation (although it's possible).
4. The fastest RAM is 2666MHz; some have had success with higher-frequency sticks and reported that they still operate at 2666. Maybe a BIOS update or a CPU upgrade could change that (don't hold your breath).
2. To be continued... no official problem has been reported; it's just a matter of how well your machine was put together, and buying this laptop with a warranty is mandatory. -
All (OK - mostly all) laptop screens are 6-bit, not even 8-bit, even if they're "upscaled" to 16M colors. The factory screen is pretty good. I don't know what you could install here. For color work etc. you need an external monitor anyway (my personal opinion). For casual usage I'm happy with the factory one (AMD's limited RGB is a cool feature).
-
Actually, the color calibration of the screen is a bit off. Using a Spyder calibration tool to properly calibrate the screen, I found that (at least for my unit) the screen was a bit too warm. Once I switch the calibration on, the screen goes to a greener hue at first glance; after I get used to it and turn the calibration off, it hits you like Windows 10's night mode filter. An external monitor for graphics work is mandatory.
-
I am also very curious about this mobo "fry" thing. Can someone please explain that?
-
Question... would those microcode updates also include support for faster RAM?
And how would you go about installing said BIOS onto the laptop (and is there a method to recover the old one should something fail)? And how did you manage to get it to work on this motherboard, considering it's a modified desktop one?
P.S. Where did you get the needed microcode updates for slipstreaming into the existing BIOS? Furthermore, if it's that easy, why doesn't Asus bother with it? -
2) Probably the inadequate cooling in early units and not so well designed cooling to begin with.
It wasn't a 'fried mobo' technically, just the component that regulated auto fan spin ups and downs.
3) The configuration with the heatpipes not going over the RAM sticks is actually how the laptop was designed for the 1600 CPU version. The cooling with the heatpipes going over the RAM slots is designed for the 1700 CPU version of the laptop.
-Whether Asus decided to switch the cooling units to the 1600 version remains to be seen.
4) I think the safest that works thus far would be 2666MHz RAM... your best bet would be getting low-latency 2666MHz RAM in that case. -
I haven't even looked at RAM yet, so no comment, sorry.
The BIOS is installed via an external flasher (a Raspberry Pi Model B in my case). When using an external flasher you can recover from a brick easily. There is also a method of flashing from the Unix command line (from a running desktop) with flashrom, if you remove the write lock ('allow signed vendor BIOS only') when you flash externally for the first time - that saves disassembling the laptop each time you want to flash. Yes, you can revert to the stock BIOS anytime you wish, either by flashing with the Raspberry Pi or via the Asus flash tool in the BIOS, etc.
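To make the external flashing step a bit more concrete, here's a minimal sketch of the backup-then-write flow run from the Pi (Python wrapping flashrom; the SPI device path, SPI speed and file names are assumptions for illustration - verify them for your own clip and chip before trusting any of it):

```python
import subprocess

# Assumed SPI device exposed by the Raspberry Pi; check with `ls /dev/spidev*` first.
PROGRAMMER = "linux_spi:dev=/dev/spidev0.0,spispeed=8000"

def flashrom(*args):
    cmd = ["flashrom", "-p", PROGRAMMER, *args]
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# Dump the existing BIOS twice and compare; if the reads differ, the clip
# contact or wiring is bad and you should not attempt a write.
flashrom("-r", "backup1.bin")
flashrom("-r", "backup2.bin")
if open("backup1.bin", "rb").read() != open("backup2.bin", "rb").read():
    raise SystemExit("Backup reads differ - fix the clip/wiring before flashing")

# Write the modded image (e.g. BIOS 303 with slipstreamed microcode/AGESA);
# flashrom verifies the write against the chip contents afterwards by default.
flashrom("-w", "modded_bios.bin")
```

Keeping those two matching backups around is also what lets you get back to stock later if a mod goes wrong.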
Asus etc. don't do BIOS work themselves. It's outsourced to the likes of AMI (American Megatrends) at $$$ cost. Then Asus tests it internally at further $$$ cost, and then it may need further tweaking at more $$$ cost - all this before public release, where if it bricks machines or causes issues it can cost them more money. Laptops like ours have a small userbase compared to desktop boards, hence the more frequent releases for desktops. And for hackers/tweakers like us, we don't have to spend money, but we also get to keep the pieces if things break (not anything to worry about if you're just doing some sane BIOS mods).
I've taken microcode and AGESA files from an ASUS B350 desktop board (roughly the same as ours) and also from the Linux firmware repository that AMD etc. commit microcode to.
Speaking of mods, how much of an overclock do we think is achievable with the RX580? And what voltages and power would we need for it? If we can start a discussion on this then I can do some testing.
Cheers -
Thanks for all the responses. All sounds positive.
I also read in an Amazon review that "the HDMI output produces an overscanned picture" and that there's "no Radeon driver to resize the HDMI output image"... is this true?
Re the screen: it's pretty important to me, and I think that 83% sRGB is actually the result of it being calibrated already, so it's not ideal for my needs - I have a great panel for my desktop, but I'd need better accuracy for travelling work, as it would just end up creating more work when I get home. I've seen that the Lenovo ThinkPad P71 has a great screen at the same size. I think I'll scout around to see what the dangers of swapping out the screen are; maybe it's been done on other laptops. Also, a replacement Lenovo screen on eBay is only 50 euros, so it may be worth a try. -
Good evening everyone.
So about a month ago, I was on the fence about upgrading my old desktop with a Ryzen-based system, or getting the ASUS GL702ZC, and finally determined that portability was going to be the deciding factor. Plus, this is really the only modern laptop that totes actual desktop hardware in it.
So far my experience has been better than expected. One of the first things I did was remove the crappy stock thermal... goop, and put Arctic Silver on everything. I do plan to Liquid Metal the CPU and GPU later on after a few more things.
I originally kept combing this board to see what the maximum RAM speed this machine can run is. With RAM prices still being completely stupid, and since I'm not exactly fond of single-channel memory, I wanted to upgrade that next. It seems the consensus is that 2666MHz RAM is the safe maximum, so thanks to all of you for that.
However... ASUS' customer support is... less than desired. My machine came with a broken P key. So I naturally emailed ASUS about it, to see if I could at least get a new key or board as a replacement. The only option is to send it in and pay something like $500 USD. Really? So I ordered a replacement off eBay - it's for the other model numbers in the series (without the ASUS Game Center button) - and I'm going to see if that's compatible. I didn't have much luck with a keyboard replacement on my old MSI laptop, so I'm hoping that at least this one will work.
I rarely use the onboard keyboard since this is a desktop replacement, so I have a Type-C hub I use as a dock.
Now, one thing that does amaze me is how well my Ryzen 7 1700 behaves. I used the Ryzen Master utility and set a profile to default it up to 3.7GHz, which is its rated speed anyway, since they have it set to 3.2GHz stock. With just a fan mat and the thermal paste re-application, this processor sustains self-boosts between 4.5 and 4.7GHz. I do understand that the older version of SenseMI only overclocks 2 cores, but I was playing Cities: Skylines, which is a decently CPU-intensive game.
Now I'm going to keep my eye on things here on this forum too, because if there is a way to at least unlock the RX 580 on this thing, I'd be willing to try it. The most I can do is set the GPU's power limit to +50%. I clearly have tons of thermal headroom, and probably a smidge more once I liquid metal the processor and graphics chip. Granted, I play many games maxed out at good framerates at 2560x1080, and even some titles like Far Cry 5 maxed out at 3440x1440, so a boosted GPU isn't strictly necessary, but if I can do it, I will.
That's really all I've got to say as my introduction to the owners' club here. It's a solid machine, even if the build quality is less than par and the support isn't very impressive - but what do you expect? It's the only desktop-grade laptop ASUS made, and I highly doubt they'll support it any more, which is why I always look up to owners and modding communities like this one, who are amazing at what they do. -
The software is likely reporting your clocks incorrectly (by about 1GHz too much).
The Ryzen 1700 has a standard base clock speed of 3GHz.
All-core boost is restrained to 3.2GHz (the default for this CPU even on desktop), while 3.7GHz is the single-core boost.
At stock voltages, you could get the CPU to run at 3.7GHz across all cores... some 1700s can do that with lower-than-stock voltage... for some reason I think Asus might have used low-quality silicon in this laptop, at least for the first batch.
Aside from that, you won't get a lot of thermal headroom on this laptop.
Putting in LM might help things in that regard, yes, but you'd need to make sure to apply proper thermal pads onto the VRMs and VRAM chips. LM would likely be best used on the CPU and GPU.
As for overclocking the RX 580... maybe if we unlock the BIOS... but even then I think the best bet would be to try and flash a desktop RX 580 4GB BIOS onto this one... and I wouldn't recommend overclocking it since Asus never really paid that much attention to cooling here.
They restricted the RX 580 to 68W... and even then, GPU temps could go up as high as 88 degrees C when the GPU is fully stressed without an undervolt.
Undervolting the GPU can drop those temperatures by 10 degrees C easily (to 77/78 degrees C).
Your experience with their customer service is atrocious though.
Didn't you get a standard warranty with the laptop that should cover this sort of thing? I think in the USA it's 2 years (the ones here in the UK are usually 1 year, but I was fortunate to get mine from LaptopsDirect and at the time they offered a 2-year warranty right out of the box).
Then again, I suspect Asus is requesting $500 off you because they might think you damaged the keyboard as opposed to the laptop coming in like that (and they have no way of verifying it).
Honestly that was really bad of them. A laptop as expensive as this one should get far superior customer service (not to mention it should have had superior cooling implemented from the get go).
If it helps, I was able to unlock Wattman capability under Windows by following instructions from someone else on the AMD community forums.
I can't affect the core voltages through it, because once I hit Apply, the frequencies and voltages go back to what they were originally set to; however, I can affect the memory voltage (I was able to drop it from 1000mV to 968mV at the stock 2000MHz).
I can couple that with MSI Afterburner, which allows a core undervolt of -93mV.
I'm copy-pasting the answer:
I got it: after installing MSI Afterburner and playing with it, it didn't seem to work because everything was enabled and there was no change to the missing Wattman. But after I fully disabled all options (and I mean all, one by one) in Afterburner's main settings page, then re-enabled them as seen in the images (enabling the IO settings had to happen first, followed by restarting Afterburner; then, after reloading Afterburner, I disabled ULPS and rebooted the PC), when it came back up Wattman had reappeared and Afterburner was also working without conflict - though I intend to remove Afterburner for now, as my desired outcome was to have a working Wattman again.
But for the life of me I can't figure out how he got all 7 P states to show up.
Might be a flashed VBIOS from a desktop RX 580 4GB, but not sure.
-
The Asus.com website apparently has updated drivers posted (dated 20th April 2018).
https://www.asus.com/ph/Laptops/ROG-Strix-GL702ZC/HelpDesk_Download/
Of particular note are the updated utilities (all of them... including the Windows BIOS flash utility).
Asus also seems to have slapped in an updated chipset driver version (also dated 20th April 2018).
Not sure if that version might have better success in unlocking all 7 P-states in Wattman, but I'd just as soon keep using more up-to-date stuff from AMD itself.
I already updated the Bluetooth and Wifi card drivers along with all utilities.
Everything seems to be behaving fine.
-
I'll have to see whether any other software can see those numbers or not. The Ryzen Master utility allowed me to slam all 8 cores to a 3.7GHz base with a slight voltage bump - and it does adjust the voltage, because I did go too high at one point and the system crashed running Cinebench. I got 1530 points with the 8-core push to 3.7GHz, compared to just about 1400 points when left alone.
-
Not sure if the thermals are worth it.
I'd rather keep it at 3.35GHz across all cores and 1.075V if I have large workloads to run in 3ds Max. -
Do you still recommend this version over the Nvidia Asus?
I heard that the 580 does a better job at 3D design and CAD than the 1070. Not sure if that's true.
Is there any 2018 version with the 12V anti-dust cooling fans?
-
The fact that AMD also keeps up with Nvidia's solutions in gaming performance (and sometimes surpasses them) is pretty good... though the Maxwell and Pascal architectures did improve on pro software utilization, if I'm not mistaken.
It also depends on the CAD software you intend on using.
Professional GPUs tend to be better in viewport performance, but both consumer AMD and Nvidia GPUs should do the trick nicely.
Some third-party renderers might leverage CUDA more than OpenCL, but these days they tend to utilize AMD GPUs for accelerating rendering tasks too with little or no effort, so you should be good in that regard. -
Because AMD has been out of the battle for quite a while, there are many programs that don't take full advantage of this line of GPUs. My test results show that these programs are not all yet well optimised for AMD GPUs:
- 3ds Max works perfectly and surpasses most Nvidia cards in its class and even in its price range.
- AutoCAD and Revit work better on AMD than on an Nvidia Quadro with Nvidia's certified drivers.
- ArchiCAD and Artlantis together offer a poorer experience with this GPU specifically than a GTX 960M, meaning that every few minutes the GPU downclocks to a low-power-consumption mode, to the point that you have to minimize the software, wait a few seconds, and then restore it to get a much better experience than before.
- Lumion 8 and Twinmotion work decently, but this GPU doesn't render faster than a GTX 880M and doesn't show some effects live until you hit render.
- The whole Adobe suite works as promised, no complaints.
- GIMP doesn't use any type of hardware acceleration in Windows (maybe in Linux it's a different story); Nvidia GPUs really do accelerate GIMP in Windows.
- Blender works flawlessly in Windows, maybe even better on Linux.
So far that's what I've tested, but as a disclaimer, I haven't spent much time researching ways to optimize or improve things by tweaking the drivers - though the programs I used that are sub-par in performance don't seem to be affected by driver tampering in the Radeon Adrenalin software.
Here comes the conclusion: newer releases of these programs seem to have more and more glitches ironed out and seem to be more optimized for AMD as it gains market share.
AMD does a lot of work for developers; it's only a matter of time until they hit it, as long as they don't go M.I.A. again.
-
I can't comment on CAD etc., but in terms of compute power (for hash cracking, mining, etc.) AMD is very good. And with tools now available to convert CUDA into OpenCL, I've seen some instances where the converted code has run better on AMD & OpenCL than it did on NVIDIA & CUDA. Again, YMMV - it depends what you want from it. AMD fully supports and heavily develops completely free and open-source drivers on Linux - that is a hacker's wet dream for tinkering (which I'll come to in a minute...), whereas NVIDIA's entire stack is a closed-up black box.
Back to another subject - since I've been running Linux kernel 4.17 and testing 4.18 recently, I've played with the Wattman functionality that's now in the kernel. Our 'mobile' RX 580 can be undervolted, overvolted, underclocked and overclocked with it. The functionality is a little rough around the edges (and entirely command-line) right now, but it is new and will improve. I'm currently testing a 10% overclock, and trying to see how close I can get it to the 'full fat' desktop RX 580.
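For anyone who wants to poke at the same knobs, the kernel exposes this through the amdgpu OverDrive table in sysfs. Here's a rough sketch of the idea (it assumes the kernel was booted with an amdgpu.ppfeaturemask value that enables OverDrive, that card0 is the RX 580, and that you run it as root - the clock/voltage numbers are placeholders, not tested values):

```python
from pathlib import Path

# Assumes card0 is the RX 580 and OverDrive was enabled via amdgpu.ppfeaturemask.
OD_TABLE = Path("/sys/class/drm/card0/device/pp_od_clk_voltage")

# Print the current sclk/mclk P-state tables and the allowed OverDrive ranges.
print(OD_TABLE.read_text())

def set_sclk_state(state: int, mhz: int, mv: int) -> None:
    """Adjust one core-clock P-state ('s <state> <MHz> <mV>'), then commit ('c')."""
    OD_TABLE.write_text(f"s {state} {mhz} {mv}\n")
    OD_TABLE.write_text("c\n")

# Placeholder numbers only - read the table printed above for your card's stock
# values first, then nudge the top P-state (state 7) and stress test.
set_sclk_state(7, 1185, 1000)
```

Writing 'r' to the same file should reset everything back to defaults if a value turns out to be unstable. -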
You rock dude, nice to hear that
-
I posted earlier; I was just wondering whether there is an issue using a second screen with the HDMI port of this laptop - I read that there were scaling issues with it. If someone could confirm they are using a second screen and it works fine, that would be great. Thanks
-
I need help. I followed the advice to undervolt the GPU with MSI Afterburner, to no avail. The version of Afterburner I have - the newest - looks like an RC controller or something. With summer coming to the S.F. Bay Area, and Far Cry 5 running the RX 580 hotter than any other game in my library, I am very interested in this procedure. Any and all input is welcome.
-
Restart MSI Afterburner once you've done that and then you should be able to lower the voltage on the core.
Try dropping it by -93mV... or -87mV. -
A friend just bought this laptop and I've told him he should stress test it to make sure everything is running fine, so that if there's an issue he can use the supplier's 14-day return policy rather than RMA it to Asus down the line, which would be more inconvenient. I'm pretty noob, but I thought of Prime95 for the CPU, though I'm not sure what settings to advise him to use for it - and I also heard it may not be a good idea to use that test on the 1700 CPU, I don't know. As for the GPU, I've no idea what would be best for testing the RX 580. It would be good to know what stress tests (and what settings, if needed) the community recommends for this laptop. Thanks in advance.
-
Also, try to test the laptop before gaming and keep monitoring temperatures later on, to see if there is some sort of degradation over time.
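If he wants a record instead of eyeballing sensor readouts, a simple logger running alongside the stress test does the job. Here's a rough sketch (Python with psutil; note that psutil only exposes temperature sensors on Linux, so on Windows he'd use HWiNFO's or GPU-Z's built-in logging instead, and the sensor names will differ per machine):

```python
import csv
import time

import psutil  # pip install psutil; sensors_temperatures() works on Linux only

# Log every temperature sensor psutil can see, once per second, to a CSV.
# Start it, run the stress test, then stop it with Ctrl+C and graph the file.
with open("stress_temps.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "chip", "label", "celsius"])
    start = time.time()
    while True:
        for chip, readings in psutil.sensors_temperatures().items():
            for r in readings:
                writer.writerow([round(time.time() - start, 1), chip, r.label, r.current])
        f.flush()
        time.sleep(1)
```

Comparing a run from week one against a run a few months later makes any cooling degradation obvious.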
-
1. Download the latest drivers from both ASUS (17.something.something) and AMD (18.5.1/2).
2. Boot into Safe Mode and run DDU.
3. Restart into normal Windows 10.
4. Install the ASUS version first. Notice that this version will not have the option to update via Radeon Settings.
5. DO NOT RESTART. I learned this the HARD way.
6. Install 18.5.1 on top of the old version. Radeon Settings will now accept updates for some reason.
7. Restart.
8. Freesync is now enabled. -
So I finally sat down and worked on the laptop to fix the keyboard. I purchased a replacement keyboard off eBay for the other GL702 series; the only difference is that the Num Lock key is back in place of the Gaming Center launch key. This was actually the most pain-in-the-butt keyboard installation of any laptop I have ever done. They held the keyboard in with about 25 plastic rivets, and you have to go in through the motherboard to do it. However, even without the plastic rivets the keyboard sits securely in its spot, so it worked out well in the end.
-
Has anyone else noticed that we now have the Wattman option in Radeon Settings? Now I just have to figure it out. I just got MSI Afterburner, although these settings are finicky. Hopefully, with Wattman, we'll have more than two choices: -87mV or -91mV.
-
If so... that might be different, but I don't think this is the case.
However, I cannot affect core clocks or their voltages.
The only voltage I can affect is VRAM voltage... and that goes down to about 965mV (where it seems to be working for me).
I still need to use MSI Afterburner for undervolting the core.
And Wattman is a bit unstable, since any change to voltages while actively running a GPU benchmark seems to crash Windows on my end... so I have to stop the benchmark before applying the undervolt. -
-
Found it, and... it was for Ryzen APU support in "Radeon Software Adrenalin Edition Q2 2018". The Twitter post had the correct new version; the main site still had an older/different version:
https://twitter.com/AMDRyzen/status/997502429674557440
"Radeon Software Adrenalin Edition Q2 2018 is now available with driver support for the Ryzen 5 2400G and Ryzen 3 2200G. Download now: http://bit.ly/2rMNZCT "
Leaving these here to illustrate the point:
I have an AMD GPU and I'm installing the chipset drivers for my Ryzen CPU. Why does AMD mix both the chipset drivers and GPU drivers like this?
https://www.reddit.com/r/Amd/comments/8suqmd/i_have_an_amd_gpu_and_im_installing_the_chipset/
JayzTwoCents' video finding/solving the driver fullscreen color corruption issue - he found it was because the download link on the main site had an older version than the Twitter feed's download link:
A question for Ryzen APU owners regarding latest Win10 update
https://www.reddit.com/r/Amd/comments/8sqkmw/a_question_for_ryzen_apu_owners_regarding_latest/
The Ryzen 2200G-based cheap gaming PC is fixed | JayzTwoCents
https://www.reddit.com/r/Amd/commen...p_gaming_pc_is_fixed/?st=JIODO749&sh=10c72e9a
The AMD section on Reddit is so busy these days I can go pages deep and still be within 1 day of new threads. -
I enabled Wattman functionality via the MSI Afterburner procedure I described before, and my voltage on the 1077MHz P-state is set to 906mV by default... If I try to lower it, it just reverts to the default after hitting the Apply button.
The only voltage in Wattman I can affect is the VRAM one. That one is set to 1000mV by default; I can drop it to 965mV and it seems stable.
Looks like my GPU has worse silicon quality than yours anyway... that's why my default voltage is higher.
I can lower my voltages to 800mV on both the core and memory, but they don't seem to affect the temperatures in the slightest (and yes, I made sure the profile for FurMark is active), which doesn't make sense.
Something is not right.
I may have to reinstall the GPU drivers to get a clean slate and test out the voltages via Wattman like that.
What about the power limit?
Have you changed that? -
Hey,
I just ordered the laptop this past week since it was on sale here for 1.1k euros. Reading through some of this thread and others on the internet, I have a few questions - I know there were problems early on with installing new drivers, with some special procedures needed through Device Manager. Also, some people recommend reinstalling Windows to avoid OEM bloat - can I do this through system recovery in Windows settings, or do I have to make a separate USB key for that?
I guess my main question boils down to this: which drivers are the most up to date (the ones on the Asus website or the AMD desktop variants), how do I properly install them, and do you have any other tips for a fresh owner of one of these laptops?
Thanks! -
Asus, however, updated lots of drivers on their webpage, so you can use those updated versions for WiFi, LAN, Bluetooth, sound, etc.
For GPU and chipset drivers, just download them off the AMD website and do an Express install over the drivers that came pre-installed. Freesync should continue to work like that.
As for re-installing Windows... I had issues with games being unable to produce saves with the OEM Windows and with updating the OS (and after I managed to update it, the recovery partition stopped working).
Bloat is not really an issue for Asus because they only include a few things that might be useful (such as the ROG Gaming Center).
If you plan on re-installing Windows, my advice would be to first back up your OEM OS installation to a USB drive and have it on standby just in case.
As for a clean install, you can easily install Windows 10 x64 Home (the same version that came pre-installed on the laptop) with the latest updates slipstreamed into it... and Windows should automatically activate, because these days the activation key is embedded in the BIOS/UEFI. -
There's an excessive number of GPU BIOS tweaking tutorials for the RX 580 that cryptocoin miners made to get their cards more efficient - maybe one of them can be used to unlock things on this laptop's GPU so it can be undervolted and underclocked?