The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    GX640 Owners - Post Your Temps!

    Discussion in 'MSI' started by fadegs, Apr 23, 2010.

  1. fadegs

    fadegs Notebook Geek

    Reputations:
    8
    Messages:
    92
    Likes Received:
    1
    Trophy Points:
    16
    Just got my GX640 and my CPU is idling around 55C and GPU around 68C! My GPU temp seems really high for idle. Do I have a defective unit??

    What are the temps for other GX640 owners?
     
  2. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Mine idles at 48/42 for CPU and 63 for GPU when at default clocks of 625/1000. Downclocking the GPU gets me around 54 degrees, though.
    However, this is with an ambient temperature of 23 degrees Celsius. What's your ambient temp?

    Your temperatures under load are more important, though.
     
  3. Molius

    Molius Notebook Consultant

    Reputations:
    23
    Messages:
    275
    Likes Received:
    0
    Trophy Points:
    30
    My temperatures are roughly the same as those of lackofcheese. Ambient temperature is similar; CPU idling at 1.2GHz at ~44C, GPU idling at 61C with stock clocks, 54-55C downclocked.
     
  4. hiryuswift

    hiryuswift Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    CPU: ~50C
    GPU: ~63C

    @ room temperature

    Mine was also a bit hot during idle when I first unboxed it, but after heavy usage for a day my idle temps dropped down a bit. Probably the thermal compound needs to settle.
     
  5. fadegs

    fadegs Notebook Geek

    Reputations:
    8
    Messages:
    92
    Likes Received:
    1
    Trophy Points:
    16
    How do I check ambient temp? I'm using HWMonitor. Is it the ACPI? Mine reports 60 C
     
  6. BenLeonheart

    BenLeonheart walk in see this wat do?

    Reputations:
    42
    Messages:
    1,128
    Likes Received:
    0
    Trophy Points:
    55
    For ambient temp, check any thermometer in your house :\
     
  7. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Room temperature is not a very well defined term. Could you be more specific?
     
  8. fadegs

    fadegs Notebook Geek

    Reputations:
    8
    Messages:
    92
    Likes Received:
    1
    Trophy Points:
    16
    Just ran 3dMark06 and my max GPU temp was 93 C and CPU at 83 C. These seem way high.
     
  9. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    The CPU isn't too bad, but the GPU temp is somewhat high. For reference, my max temps in 3DMark06 were 81 for CPU and 86 for GPU.

    However, if it's, say, 30 degrees Celsius in the room you're sitting in, the temps you're getting wouldn't be anything out of the ordinary.
     
  10. fadegs

    fadegs Notebook Geek

    Reputations:
    8
    Messages:
    92
    Likes Received:
    1
    Trophy Points:
    16
    My ambient is 25 C.

    Do you guys think it's because I just got my laptop and like the poster above said, the thermal compound hasn't settled yet?
     
  11. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Well, the best way to know for sure is to give it another day or two.
    93C for the GPU with an ambient of 25C is definitely on the high side of things.
     
  12. hiryuswift

    hiryuswift Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    0
    Trophy Points:
    15
    Typically, a new application of thermal compound takes a bit of time and heat to fill in the uneven spaces or gaps between the heatsink and chipset surface. The compound "thins" and fills when heated, while it "thickens" when it cools.

    edit:
    here's a link to a review as an example of how some thermal compounds perform over time from initial use
    [H]ard|OCP - Thermal Paste Shootout - Q209
     
  13. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    No, for a stress test on a 15/16 inch machine with an enthusiast-level card, 93C is a reasonable level, and 25C ambient is quite high too.
     
  14. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Well, it's probably still an acceptable temperature, it's just higher than what other people have been getting with the GX640 - I got around 86C during 3Dmark06. I don't think 3DMark06 is that much more stressful than a typical game, however, especially because the tests aren't very long.

    The GX640 is a great laptop, though, and even though the cooling could be better, it's definitely preferable to a 7.5 pound Sager NP8690, even though the Sager has an HD 5870.

    Also, for reference, I hit 101C running Furmark for 8 minutes (Full screen at 1680x1050); see my post in the Owners' thread.

    The strange thing is that the 5850 didn't downclock, but the fan kicked in at an even higher speed when it got that high, and yet the fan wouldn't run at full speed below 100C.

    I don't intend to run FurMark again, though, at least not for 8 minutes.
     
  15. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Yeah, I've found this with my machine too; the fan doesn't really key off the GPU alone but off all the components, which is quite strange.
     
  16. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    downclocked gpu to 450/450 is 49/55/50
    at standard clocks my gpu idles at 56/62/58

    cpus in the mid 50s

    don't have a thermometer around lol... I'd guess ambient is 22-25...
     
  17. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    You guys should also post the temperature monitoring software you use. These programs aren't reading a thermometer; each one assumes a TjMax value for each sensor, and often they get it wrong.
     
  18. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    I use gpu-z for gpu and a mix of speedfan and hwmonitor for cpu

    speedfan for quick reference and just because I can incorporate it into a samurize config, even though I know it's not the most precise reading
     
  19. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    SpeedFan doesn't work for the CPU. The TjMax set in there is 90 or something, whereas it should be 105, so all reported temperatures are 15 degrees too low.
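The arithmetic behind that claim can be sketched as follows; the function name and example values are just for illustration, using the 90-vs-105 TjMax figures from this post:

```python
# A tool derives core temperature as TjMax minus the DTS "distance to TjMax"
# it reads from the chip. If the tool assumes the wrong TjMax, every reported
# temperature is shifted by the difference between real and assumed values.
def correct_reading(reported_c, assumed_tjmax=90, actual_tjmax=105):
    """Shift a reported temperature to account for a wrong TjMax setting."""
    return reported_c + (actual_tjmax - assumed_tjmax)

# A SpeedFan reading of 58C on a chip whose real TjMax is 105C:
print(correct_reading(58))  # 73
```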
     
  20. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    I wouldn't say they're THAT far off. If HWMonitor says 60 and SpeedFan says 58, I don't think I should be assuming 75.
     
  21. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Use the AMD sensor tool; that'll tell you. Or get a manual reader.
     
  22. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    Ah... you have an i7 quad core. The issue I noted was for the i5, and it's a commonly known thing. Mine says it idles at 35C, but that can't be right.
     
  23. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    Alright, I just interpret it as whatever SpeedFan says +/- 3-5; idling at 70 would be problematic.
     
  24. peekaboom

    peekaboom Notebook Consultant

    Reputations:
    4
    Messages:
    246
    Likes Received:
    1
    Trophy Points:
    31
    Wait, do the i7's run cooler than the i5's in general?
     
  25. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    Definitely not; the i5 is definitely cooler. But by how much I'm not sure; there's 10 more watts of power to dissipate on the quad.
     
  26. min2209

    min2209 Notebook Deity

    Reputations:
    346
    Messages:
    1,565
    Likes Received:
    3
    Trophy Points:
    56
    Yes, that's right. I'm just saying, though, there is a program... it escapes my mind which one... that simply displays something like degrees C to TjMax. Then you can just go to Intel's site and look up the TjMax for your particular processor. That should be accurate.
     
  27. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Here's something cool I put a bit of effort into:
    I ran 3DMark06 and got a score of 11500 at stock clocks, which is nothing special for this card. However, what I did do was make an awesome graph of all the important status figures for the CPU and GPU.
    I spent a few hours working out the best way to set it up, but as an Engineering student, if I can't make a decent graph, what good am I? Of course, now that I have my tools set up (including a Python script to automatically generate the graph), I can make this kind of graph for anything I like.
    In any case, the graph follows (thumbnailed):
    [​IMG]
    Ambient temperature was my usual 23C.

    What's really awesome is you can see which data correspond to which tests.
     
  28. Ryan

    Ryan NBR Moderator

    Reputations:
    2,320
    Messages:
    2,512
    Likes Received:
    17
    Trophy Points:
    56
    Neat!

    Kudos to lackofcheese for an awesome work...

    Makes it hard to call myself an engineer too... but I'm learning, so I'll get there!


    Anyways,

    what were your GPU's downclocked clocks?

    Everybody mentions these 'downclocks' but nobody actually says what clocks..
     
  29. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Well, PowerPlay downclocks to 100/1000, which doesn't seem to make much of a difference to the level of heat - I think the memory plays quite a big role in the card's idle power consumption. However, because PowerPlay seems to cause trouble, people have been manually underclocking. From experimentation, 300/300 is a good choice; going any lower doesn't seem to make much difference in temperatures.
     
  30. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Very nicely done there, cheese. Didn't realise the C states influenced it that much; you can see quite a temp change between the two.

    The temperatures are also nice; it's pleasant to know other parts of the GPU run cool while the memory and core run at respectable temperatures. Has the machine had any thermal repasting?
     
  31. Ryan

    Ryan NBR Moderator

    Reputations:
    2,320
    Messages:
    2,512
    Likes Received:
    17
    Trophy Points:
    56
    [​IMG]

    Well,

    I guess I meant something different...

    I meant underclocking? ...or downclocking, I do not know...

    but the stock clocks at 700/1000 made me hit ~105C in Furmark.

    Now the temp stabilizes around 99C with the clocks at 600/900...
     
  32. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    If you mean the C0/C1 at the top, they're the two CPU cores, #0 and #1. I was conserving space for the labels, though as it turned out there was enough space to call them Core #0 and Core #1 anyway. I'll change my Python script.

    By the way, if anyone's interested I can distribute my Python script, but I want to improve it and comment it properly first.

    No, I bought a stock config GX640 and it hasn't been opened.

    @meraki1990
    105C in Furmark is high indeed, but the main thing to take out of that is not to run Furmark. Once my script is nicely done up you can do a 3DMark06 run similar to mine so we can compare temperatures.
    I guess your 920XM isn't helping your temps; you'd be much better off reducing the CPU's power consumption than the GPUs if you want good performance.
     
  33. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Ahh, I thought they were C states... what an idiot, haha. But it's interesting to see that 3DMark is multi-core processing but not trying anything parallel.

    The temps I think are good for a stock 5850. If I had one I'd be taking off the heatsink and replacing the pads with paste; Death2theworld really dropped his temps with that.

    Really looking forward to the completed Python script too. If released, it'll be a tool I definitely use.
     
  34. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Okay, here's the python script:
    http://pastebin.com/WYBJEV12

    First of all, the tools you need:
    - Python 2.6 with the Numpy and Matplotlib addons (they're nice for graphing)
    - GPU-Z and RealTemp from TechPowerUp. The reason I chose these is that they provide relatively full GPU and CPU monitoring data respectively, and are able to save data to logfiles.

    In GPU-Z you will need to tick "Log to file" in the Sensors tab.
    In Real Temp, there is a Log File option in the settings; tick it for best results. I also recommend ticking the "TM Load" option, which means the CPU load will be represented the way it is in Task Manager, because the default load measurement is something different. You should also verify that the TJMax setting matches your CPU (mine defaulted to 105C which is correct for my Arrandale CPU).

    Finally, you need to change the file addresses at the beginning of the script to match up to wherever the log files happen to be on your system.

    To avoid trouble with file conflicts, close both GPU-Z and RealTemp before running the script. Additionally, it's best to make a copy of your log files so you can regenerate the same graphs, or modify them, whenever you like - the raw data is much more valuable than the graphs for keeping on your system.

    When you run the script, it should read the data from the files and produce a graph.


    If you have any questions, they'll probably have to wait until I wake up. Hopefully, I'll see some awesome graphs from other people when I do.



    EDIT: I think one aspect that needs a little work is how it handles having GPU-Z and/or RealTemp off or when they miss readings. In my experience, they both seem to do it every so often. I can't seem to find a way to stop it from joining any points that have a gap of more than 1s between them, which would presumably be the best solution.
    Currently, it has plots in the load graph that spike up to show when GPU-Z and RealTemp miss readings, and while this is good for when they are turned off altogether, it doesn't look too good when they seem to occasionally miss one reading.

    EDIT2: I've fixed and improved a number of things. In particular, I've added generation of min/avg/max statistics to the script. I could add these to the plot window later, but at the moment I can't be bothered working on fitting the text in. The only major improvement left is what I mentioned before - leaving gaps if there are no readings from GPU-Z or RealTemp for a certain period of time. I think I know how to do it, but it will require some extra effort. Another thing I could do is label the x axis with actual times instead of the number of seconds from start.
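A minimal sketch of the kind of processing described in this post - reading a log, computing min/avg/max, and inserting NaNs so matplotlib leaves visible gaps where readings were missed. The CSV column names here are illustrative placeholders, not GPU-Z's or RealTemp's actual log format:

```python
import csv
import math

def load_log(path, time_col="time_s", temp_col="gpu_temp_c"):
    """Read (elapsed-seconds, temperature) pairs from a CSV sensor log.

    The column names are placeholders; real GPU-Z/RealTemp logs use their
    own headers, so the names (and timestamp parsing) would need adjusting.
    """
    times, temps = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row[time_col]))
            temps.append(float(row[temp_col]))
    return times, temps

def stats(values):
    """Min/avg/max summary of a series of readings."""
    return min(values), sum(values) / len(values), max(values)

def insert_gaps(times, values, max_step=1.0):
    """Insert a NaN wherever consecutive samples are more than max_step
    seconds apart, so the plotted line breaks instead of joining them."""
    out_t, out_v = [], []
    for i, (t, v) in enumerate(zip(times, values)):
        if i and t - times[i - 1] > max_step:
            out_t.append((times[i - 1] + t) / 2)
            out_v.append(math.nan)
        out_t.append(t)
        out_v.append(v)
    return out_t, out_v

def plot_temps(times, temps, out_png="temps.png"):
    """Plot the series; matplotlib skips NaN samples, which is what
    produces the visible gaps. Import is local so the rest of the module
    works without matplotlib installed."""
    import matplotlib.pyplot as plt
    t, v = insert_gaps(times, temps)
    plt.plot(t, v, label="GPU temp (C)")
    plt.xlabel("seconds from start")
    plt.ylabel("degrees C")
    plt.legend()
    plt.savefig(out_png)
```

The NaN trick is the standard way to get matplotlib to leave gaps in a line rather than connecting across missing data.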
     
  35. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    http://pastebin.com/E03Eg8L6

    Updated version. Now there will be gaps in the graph whenever GPU-Z or RealTemp miss readings.

    [​IMG]
    A screenshot of the results of the script - the top is the Python window with statistics, and the bottom is the graph window. I prefer to full screen the graph window when making graphs, though, especially ones like this one considering it covers a few hours and many thousands of data points. Matplotlib gives you some nice options, including the ability to save the graph directly to png which is what I did for my previous one. That graph was my GX640 mostly idling (video card downclocked to 300/300) for a few hours while I went to sleep yesterday, though uTorrent was probably on at the time, and maybe some other stuff.

    I look forward to criticism and/or modifications to my code. If anyone has any suggestions on how to improve it, that would be cool too. I think one thing that would improve it is adding framerates to the graph, but I'd have to find a framerate logging tool.
     
  36. BenLeonheart

    BenLeonheart walk in see this wat do?

    Reputations:
    42
    Messages:
    1,128
    Likes Received:
    0
    Trophy Points:
    55
    Temps are low..
    thats nice...
    85C while on full 100% load...
     
  37. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Here's something that should make everyone here happy. It would seem that the temperature we (or rather, GPU-Z, HWiNFO32, Furmark and HWMonitor) thought was the core was likely not. The AMD GPU Clock Tool only sees three sensors, and the fact that the extra sensor in the other tools matches the MemIO very closely suggests that it's just the same sensor. The slight differences between them are a little strange, though.

    In particular, this would be a good explanation for why my GX640 merely spun the fan faster when I hit 100C in Furmark - the core was still significantly cooler than that value, and so obviously the GPU didn't throttle or shut down.

    While 100C for MemIO seems high, according to ziddy123:
     
  38. peekaboom

    peekaboom Notebook Consultant

    Reputations:
    4
    Messages:
    246
    Likes Received:
    1
    Trophy Points:
    31
  39. Tree_Burner

    Tree_Burner Notebook Deity

    Reputations:
    952
    Messages:
    1,708
    Likes Received:
    29
    Trophy Points:
    66
    my gpu hit 67 playing Warcraft III for an hour. haha
     
  40. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Thanks. I found that GPU-Z logs fan speed, but the data seemed to be meaningless and didn't represent the actual fan speed for me. It might work for your setup, though, and if so it wouldn't be a problem to add another graph in.
     
  41. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    GPU-Z likes to sometimes show RPM and sometimes not, as well as constantly saying my fan is spinning at 30%.
     
  42. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    If you are going to tell your GPU temps, you ought to either....

    Post screenshot of HWInfo32. If you play a game, keep it running and then post after you are playing so we can see the MAX temps.

    or

    Post screenshot with the AMD GPU Clock tool, same method as above.

    This thread is sort of meaningless without proof. Anyone can say whatever for their GPU and CPU temps.
     
  43. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    because why would we lie?

    kind of demanding for someone who just started posting on these boards. We've uploaded screenshots everywhere
     
  44. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    None have been posted in this thread and this thread is specifically about it. And it's not demanding at all, print screen, post picture.

    As for lying, yeah people lie about their notebook all the time. It's not intentional, but they just guess based on memory or they exaggerate.

    Ok, just an example. I just played a few races in Dirt 2 - three of them. GPU overclocked, core 800, memory 1,100. As you can see, it doesn't matter whether you use HWInfo32 or the AMD GPU Tool; the memory readings are identical. TSS0 = GPU DispIO = GPU Core. TSS1 = MemIO = Memory Controller. TSS2 = GPU Shader = Shader Core.

    There was someone in the G73 thread asking questions about reading the temperatures, a GX640 owner. The important one is the Core, which is TSS0 or DispIO. The Memory Controller sensor and the shader sensor are both on the Core btw. Memory controller does not equal video ram temperatures. There aren't any sensors on those, never have been on any mobile GPU, ever...

    My observation tracking G73JH temps from various owners is that the IDLE temps vary among us. Some have idle core temps around 49C, others as high as 57C. Some have Core and MemIO temps close, others far apart. But under load, when we are gaming, our temps are stable around 77-79C for the Core, 88-92C for the MemIO, and around 79-82C for the Shader. You guys may observe the same thing: idle temps varying, but load temps the same, which is the important part anyway, right?

    Core: 79C Shader: 81C Memory Controller: 90C

    [​IMG]
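The sensor correspondences laid out in this post can be written down as a simple lookup table, e.g. for relabeling logged columns. The friendly names are just the equivalences from the post; the function is hypothetical:

```python
# TSS* sensor IDs from the AMD GPU Clock Tool, mapped to the names the
# other monitoring tools use, per the equivalences described above.
SENSOR_NAMES = {
    "TSS0": "GPU DispIO (core)",
    "TSS1": "GPU MemIO (memory controller)",
    "TSS2": "GPU Shader (shader core)",
}

def relabel(sensor_id):
    # Fall back to the raw ID for anything unrecognized.
    return SENSOR_NAMES.get(sensor_id, sensor_id)

print(relabel("TSS1"))  # GPU MemIO (memory controller)
```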
     
  45. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    What, so someone couldn't photoshop their temps if they wanted to? A screenshot doesn't really constitute proof once you get down to it.
     
  46. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    If you want to post your temperatures, just post the screenshots. It's not a hard thing or time consuming thing to do... You can almost always tell if someone photoshops a screenshot also.
     
  47. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    I'm happier posting a nice graph like I did on the previous page, for the most part. It gives you much more information than just min/avg/max.
     
  48. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Well, your temp sensors look to be set up the same as mine, and I'll post a screenshot to make you happy at some point later, but in the meantime I'd like to point out that there *is* no TSS3.
    Looking at your screenshot, it's clear that TSS0 = GPU DispIO, TSS1 = GPU MemIO = GPU Thermal Diode, and TSS2 = GPU Shader, because those figures quite clearly match up between GPU Clock Tool and HWiNFO32.
     
  49. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    Thanks for the heads up. BTW, this isn't to be annoying or to compare. I just want to see how effective the one-fan cooling solution from MSI is. In the future, if MSI continues to use a one-fan solution and it's effective, then MSI will be at the top of my list when I need to upgrade again. And I'm sure future MSI owners will want to know too. I'd rather have a thin notebook than a fat one.

    If this is such a hassle then don't bother. But also mind you, my GPU clock is 800 mhz and memory is 1100 mhz :D
     
  50. BenLeonheart

    BenLeonheart walk in see this wat do?

    Reputations:
    42
    Messages:
    1,128
    Likes Received:
    0
    Trophy Points:
    55
    I'm also interested in whether MSI's one-fan solution is great...
    on either the GX640 or the GX740...

     