Because maybe it's just one game, and if those users never come across it they won't see it the way you do.
-
Meaker@Sager Company Representative
Does SOM even support SLI yet? Or do you have to hack it in still?
1440p going up to 144hz with G-Sync should be an excellent pairing for a pair of overclocked 980M cards. -
Here is an example of it playing pretty well for the most part, but this is 1080p. When I ran it higher... it took a serious hit.
Turn volume down first though.
Last edited by a moderator: May 12, 2015 -
Yeah, that's just silly. Anything you can do on 3K or 4K you can do just as well on 1080p, just way faster. I get why people (including me) like 3K and 4K, but I have no problem with liking 1080p at all... nothing wrong with it, especially on a laptop with an 18" or smaller screen. I hate 3K and 4K on a small screen... it sucks. Needs to be, at bare minimum, 17" to not be ridiculous to try to use. The text is too tiny at 100-125% scaling and above 125% scaling it looks like crap (everything out of proportion, text too big to fit inside of dialogue boxes and menus, etc.) so you may as well go back to 1080p at 100% and have something pleasant to look at.
-
Just ran 1440p and it was nowhere near 140+ frames, and neither was a single desktop 980, so I don't see it faring better with two mobile cards. And that's not even UHD. That is also with the Nvidia control panel set to "use application settings".
-
Sorry to ask an unrelated question in the middle of the discussion. Yesterday I flashed svl7's 770M OC ROM on my notebook and lost my internal LCD on the EFI option in the BIOS; only the DVI output worked when I changed the option to EFI.
I messed around with the CSM settings and then both displays were gone for good. The system beeped every 30 seconds and nothing happened (even removing the BIOS battery didn't reset anything to default; I wonder why they use EEPROM for the BIOS).
After 20 minutes of horror, I realized the system actually boots after the 3 beeps but has no output, so I pressed F2, F3 (restore defaults), F4 (save and exit) blind, and thank God I got my LCD back.
Then I had to flash my original vBIOS back to get everything working.
Has this happened to any of you, or did I do something wrong? I mean, was there something I should have done before or after flashing??? -
That was a pretty bloody game you were playing there lol
-
Yeah, it is. I completed it already, but going back to get all the stuff I missed the first time.
-
Meaker@Sager Company Representative
980Ms are going to blast past a single 980M
Takaezo likes this. -
I think you mean a single DT 980, though. And you might get past one card in "some" cases, but you sure as heck will never get past 2, 3, or 4 cards.
-
... 980M plug in ... driver nvcvi.inf mod ... driver install ... ready for gaming
And the best: no upgrade problems with the first mainboard revision v2.1(A) as with 780M or 880M :thumbsup:
-
Guess what, you are not the only ones who ignore me. It happens in other forums too.
-
Meaker@Sager Company Representative
Solariseir likes this. -
Meaker@Sager Company Representative
-
-
Hi, first time poster though I've been following the thread for some time. I'm not exactly the brightest bulb in the bunch when it comes to this kind of stuff so I wanted to ask a few questions to confirm.
So I have a Sager NP9570 with 880M GTX SLI, and I really want to upgrade to 980M GTXs. Problem is, I guess Sager has discontinued the NP9570 and so likely won't officially support the 980M GTX. However, from reading through the threads, particularly Meaker's posts, it seems like it's possible to upgrade to these cards. My first question is to confirm that it is possible to put the 980M into this laptop. Second, if it is possible, will it be as simple as swapping out the cards if I order two 980Ms, or will it require a bit more to make them compatible (BIOS flashing, custom drivers, etc.), and if so, what steps would need to be taken? Again, I'm not the smartest guy out there when it comes to this stuff, so I apologize in advance if the answers are obvious. -
Meaker@Sager Company Representative
No I just intend to frame them on my wall
(It's good to ask questions and I am only joking) You need to redo the pads, put on a good paste job and use modified INF drivers. Otherwise it's fairly plug and play.
-
Waiting on 20nm graphics cards from Nvidia and AMD? Don't. | TechSoda
Maybe wait till 2016 for a 16nm GPU and some unlocked Skylake CPU, or perhaps Clevo will make something by then :/ -
Meaker@Sager Company Representative
I'll be happy enough for now with some 144hz 1440p gaming while I wait for it all to be revealed
unityole likes this. -
Thanks for the quick reply. I guess I'll be ordering those 980m's after all.
-
Why go for 144Hz? What difference does it make at high resolution? -
Meaker@Sager Company Representative
144Hz means the display can show 144 frames per second, so all those extra frames the GPUs render actually get used, producing a super fluid experience (especially when combined with G-Sync).
Spread over 27 inches, 1440p gives a great image. -
@Meaker, since you recently installed a 27in 1440p display, I would like to ask your expertise on some confusion I have regarding the DVI-I output on the P570. The spec sheet says it is DVI-I (single link), which has a maximum output resolution of 1920x1080, yet the female plug on the P570 is distinctively that of DVI-I (dual link), which supports 2560x1440 (which I am not able to achieve). Researching this issue, it is very apparent that the number of pins and the pin configuration of the two types are distinctly different. Thanks for your help. Which is it?
-
Meaker@Sager Company Representative
It's a single-link DVI port; Clevo has just always used that connector regardless of what the output is on the PCB.
-
Thank you... what a pain in the ass to mess with this for half a day trying to figure it out.
-
Meaker@Sager Company Representative
The HDMI should be able to do that resolution and the displayport will do 4k.
Takaezo likes this. -
Yeah, I have been using the HDMI and the DP since 2012, but for some reason I thought I could slave the DVI-I port to the second Quadro K5000M using the EDID options in the Nvidia control panel. However, those options only fool the system into identifying a display in order to change or hot-swap displays. I can get the EDID to register on the second Quadro, but I cannot remove the other EDID from the first one; I think it only fools the system rather than actually applying the slave. Anyone know more about this? This is what I read: http://nvidia.custhelp.com/app/answers/detail/a_id/3569/kw/EDID
-
Meaker@Sager Company Representative
All the outputs are hard wired into the primary card so there is no way to actually get the secondary card to drive any display directly.
Prema likes this. -
I read here (I think) a while back about a customized power supply for the P570; could someone point me to that thread or link?
I recently spent about 20 hours OCing my system, with mixed results. My previous best OC included an XTU score of 1465 and Cinebench 15 at 105.4fps/1201, at 4.91GHz on all 6 cores, with temps of 12C-67C. Since we have had unusually cool weather, I decided it would be a good time to try to break the 5.0GHz barrier. In about twenty hours I was able to boot and begin an XTU session at 5.0GHz only twice, but was never able to complete the XTU run before a BSOD. Those sessions included a new low core temp, which I got down to 6.1C before starting XTU on the two 5.0GHz attempts.
The most important thing I learned was how the amperage across the cores determines whether an XTU run can complete at this level. I noticed that at the highest voltage I set (about 314 volts in XTU), 94 amps was the idle draw across the core; as core temps increase while benchmarking, this number drops into the 55-75 amp range. Even though my highest temps were only about 72C, with no throttling, if the amps across the core drop below 50, a BSOD follows.
This leads me to conclude there is a lack of available power: I never exceed 90C (the thermal limit), a minimum of 50 amps to the core is required to maintain minimum system function, and some 94 amps are available at start-up, so roughly 44 of those amps are consumed or lost to thermal capacitance during a 5.0GHz attempt. Because I cannot boot in excess of about 94 amps (314 volts), the only way to gain more than the 44 amps that will POST but won't bench is to have additional power available. Finally, I was able to OC to 4.6GHz with three external monitors running and every port on the system in use (except the ExpressCard slot/microphone), but watching the amps drop below 50 with a BSOD every time at the 4.7GHz level, I realize there is not enough power being supplied, as the system is easily capable of 4.9GHz stability.
I was able to bench 4.91GHz with an XTU score of 1661 with minimal peripherals, so I know my system is still in a similar state as before (no reduced performance).
I should note that I do not use XTU for OCing, only for checking stability and benchmarking; I use the BIOS alone to OC the CPU. For some reason XTU cannot POST the same settings the BIOS can. -
Meaker@Sager Company Representative
Dual 330W is currently the highest output I am aware of. If your GPUs are idle then there is enough power. Are you cooling your VRMs now?
-
You can connect 3 power supplies, but I'm not sure the second converter box's wires can handle the current.
-
Meaker@Sager Company Representative
Ah, box into box. No, you would probably want to replace the lead out of the final box with a thick but short cable, since it gets warm as it is. However, like I said, if your GPUs are idle during CPU testing you are not using up both bricks with just the CPU.
Takaezo likes this. -
Just a suggestion:
maybe it's better to take out one of the VGAs to make sure it isn't eating into the CPU's power budget.
-
@Meaker, yeah, I have been cooling them with the RAM sinks like you suggested since August. When I ordered my P570 only dual 300W was available (Dec 2012). The GPUs (at least #1) heat up a little during an XTU bench. I have not thought of monitoring the GPU voltage while running an XTU bench, but I would assume there is a reason they offer dual 330W systems....
@Solar, that is a good idea, to pull one of the cards.
Still, wasn't there a tutorial by Meaker or jonkss or n=1 or Mr. Fox on making an exceptionally stable power supply box/hack?? -
Anyone else here buy a P570WM from Malibal?
-
You can use any high-current power supply and adjust it to 19.5 volts, but I don't think it's a good idea. If the mainboard were supposed to support that much current, they would have made a more powerful power supply for this laptop; it doesn't seem they have any problem with size. -
They do make a more powerful set of dual 330W PSUs (I have dual 300s, see sig) that I can't afford right now, so I was looking for a cheaper alternative. Thus the question.
-
Meaker@Sager Company Representative
Ah, I see. Can't beat the Delta 330W, and that's likely your problem then, as that extra 60W usually keeps the nose clean.
Takaezo likes this. -
-
Hey, Meaker! Saw your overclocked Fire Strike result with 980M SLI installed. Pretty good.
Here's a GTX 970 SLI run at 1530/8200 for comparison; a 23k GPU score should be pretty reachable with modded 980Ms.
NVIDIA GeForce GTX 970 video card benchmark result - Intel Core i7-4770K,MSI Z87-G45 GAMING (MS-7821) -
Meaker@Sager Company Representative
Working on that
we will see on the score front, a couple of hurdles still to go.
-
What is the limit for overclocking the GPU? I mean, how many MHz can you add to the base clock or memory clock, and how much additional voltage, before it becomes dangerous to the card?
Is temperature the only issue, or do other factors matter too? -
Well not yet.
-
I transferred the money on Nov 11; they started working on it on the 13th, I guess.
It was shipped on Nov 19th but has been stuck in Syracuse for three days now.
I'm frustrated not only because the package was sorted as "HAZMAT", but because they didn't ship one from Nevada.
Here's the tracking # they gave me.
UPS: Tracking Information -
Support.3@XOTIC PC Company Representative
-
Meaker@Sager Company Representative
Total score 16249
GPU score 20518
2x GTX980M @ 1306/1480 (boost clocks)
4930K @ 4.3ghz
That's about all you will get out of the stock power limit, and it is likely "safe" in the sense that it won't massively increase chip degradation.
The rule of thumb is that damage goes up with the square of voltage, linearly with clocks, and somewhere in between for temperature.
Solariseir likes this. -
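For anyone curious, that rule of thumb can be sketched as a toy formula. Everything below is a hand-wavy illustration, not real silicon data: the baseline voltage, clock, and temperature are placeholder values, and the temperature exponent of 1.5 is just one assumed "in between" value.

```python
# Toy sketch of the degradation rule of thumb quoted above:
# wear scales with the square of voltage, linearly with clock,
# and with an assumed in-between exponent (1.5) for temperature.
# All baseline values (v0, clk0, temp0) are made-up placeholders.
def relative_wear(volts, clock_mhz, temp_c,
                  v0=1.05, clk0=1038.0, temp0=70.0, temp_exponent=1.5):
    return ((volts / v0) ** 2
            * (clock_mhz / clk0)
            * (temp_c / temp0) ** temp_exponent)

# At the baseline operating point the relative wear is 1.0 by definition;
# a 10% voltage bump alone multiplies it by about 1.21 (1.1 squared),
# while doubling the clock alone only doubles it.
```

The takeaway matches the quote: voltage is the most expensive knob to turn, clocks the cheapest.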
Do you have the 135mhz limit removed?
I'm finally a P570WM owner.
Did a 3dmark run Firestrike: Generic VGA video card benchmark result - Intel Xeon Processor E5-2680 v2,Notebook P570WM
3dmark11: Generic VGA video card benchmark result - Intel Xeon Processor E5-2680 v2,Notebook P570WM
Can't wait to have SLI.
Specs if anyone's interested:
Eurocom Panther 5 (P570WM)
AUO B173HW01 v4 90% NTSC FHD Display
Intel Xeon 2680 v2
16GB (4x4GB) 2133Mhz Ram
nVidia GeForce 970M
2x Samsung 840 Pro 256GB Raid0
8x DVD Burner Tray Load
Windows 8.1 Pro
Just have to figure out how to get my RAM to work at 2133MHz instead of 1866MHz now. Otherwise, awesome device. :thumbsup: -
Meaker@Sager Company Representative
You need Prema's system BIOS, then you can set 2133MHz.
Prema likes this. -
Everyone gets better results than me.
3dmark11
I feel poor. -
*** Official Clevo P570WM | P570WM3 / Sager NP9570 Owners Lounge ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by jclausius, Feb 5, 2013.