The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    R9 m275 How good is this GPU?

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by refllect, Jul 5, 2014.

  1. refllect

    refllect Notebook Consultant

    Reputations:
    155
    Messages:
    121
    Likes Received:
    6
    Trophy Points:
    31
    It's in the Lenovo y40. How does it compare to the GTX 850m and GTX 860m?
     
  2. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Looks like a Bonaire core with higher clocks. I don't think it matches up very well against the 860m Maxwell when it comes to power efficiency or rendering. Going by Notebookcheck, Maxwell 860m should be about 75% faster than the m275 while the 850m Maxwell should be 50% faster. However, the Maxwells are known to be extremely cool and can overclock very well.
     
  3. XxxKing YBxxX

    XxxKing YBxxX Notebook Evangelist

    Reputations:
    2
    Messages:
    698
    Likes Received:
    92
    Trophy Points:
    41
    Yup as marksman said it gets trounced by both 860m and 850m. Y40 is really not a good laptop IMO, not great bang for buck and bad specs for a supposed gaming laptop.

     
  4. Truth34

    Truth34 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Even the 4GB DDR3 850M?
     
  5. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
The fact that they paired a dual-core i7 with the Y40 makes it a bit questionable. Granted, the dual-core i7 is still capable, but a lot less so than an i7-4700MQ, especially an overclocked one (overclocking isn't possible on ULV CPUs).
     
  6. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
That's much harder. I have a general "avoid GDDR3 at all costs" policy, but Maxwell is supposedly less sensitive to VRAM bandwidth. Look, your best bet is to avoid Bonaire and the GDDR3 Maxwell. It's basically planned obsolescence.
     
  7. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
Nobody really uses GDDR3; it's an older version of GDDR5.

    DDR3 =/= GDDR3

    Oh, on a side note, DDR3 has about 1/4 of GDDR5's bandwidth per clock rate. You would need 4000 MHz DDR3 to match 1000 MHz GDDR5.

    You might be able to run 1080p with the DDR3, but that means turning down AA, anisotropic filtering, and any other graphics features that require memory bandwidth.
     
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
Not sure what you mean by "nobody really uses GDDR3" because it's been used in GPUs for a decade and counting. GDDR3 SGRAM is not related to DDR3 SDRAM; it's based on DDR2 SDRAM. GDDR5 SGRAM is based on DDR3 SDRAM; it's not related to GDDR3.

I think there is some confusion here. 4000 MHz DDR3 is referring to its data rate or "effective" clock speed, not its real clock speed, which would be 2000 MHz. Likewise, GDDR5 with a real clock speed of 1000 MHz has a data rate of 4000 MHz, and CPU-Z reports the real clock speed of my 1600 MHz DDR3 as 800 MHz.

A good rule of thumb to remember is that all DDR/GDDR memories have an effective data rate that is 2x their real clock speed, except GDDR5, which is 4x its real clock speed. This is because GDDR5 does 4 data transfers per clock cycle, so it's "quad-pumped." GDDR3 does 2 data transfers per clock cycle, so it's "dual-pumped."

    Therefore, at the same clock speed and bus width, GDDR5 would have twice, not four times, the memory bandwidth of GDDR3.
     
    LanceAvion likes this.
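    The real-clock vs. effective-rate relationship described above can be sketched as a quick Python check (a hedged sketch; the figures are illustrative, with the 800 MHz DDR3 case mirroring what CPU-Z reports for DDR3-1600, not measurements from any specific card):

    ```python
    # Effective data rate = real clock x transfers per clock cycle.
    # The DDR family and GDDR3 are dual-pumped (2x); GDDR5 is quad-pumped (4x).
    PUMP = {"DDR2": 2, "DDR3": 2, "GDDR3": 2, "GDDR5": 4}

    def data_rate_mts(mem_type, real_clock_mhz):
        """Effective transfer rate in MT/s from the real clock speed."""
        return PUMP[mem_type] * real_clock_mhz

    print(data_rate_mts("DDR3", 800))    # DDR3-1600: 800 MHz real clock -> 1600
    print(data_rate_mts("GDDR3", 2000))  # "4000 MHz" GDDR3 is 2000 MHz real
    print(data_rate_mts("GDDR5", 1000))  # 1000 MHz real clock -> 4000 MT/s
    ```

    This also shows why quoting "MHz" for VRAM is ambiguous: GDDR5 at a 1000 MHz real clock and GDDR3 at a 2000 MHz real clock both end up at 4000 MT/s effective.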
  9. Loney111111

    Loney111111 Notebook Deity

    Reputations:
    396
    Messages:
    828
    Likes Received:
    28
    Trophy Points:
    41
    I was talking about DDR3.

    I haven't seen any recent laptops that use GDDR3. It's either DDR3 or GDDR5 for GPUs nowadays.
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
That's because it's a misnomer. Discrete GPUs use GDDR. It's only integrated graphics that use DDR system (CPU) RAM.
     
  11. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
    Both you and Loney111111 are correct. There is a distinction between GDDR3 and gDDR3.

GDDR3 uses a higher voltage (1.8V to 2V), is based on DDR2 signalling technology (which, as you mentioned, is correct) and has looser timings (i.e. it is bandwidth-optimized). This means GDDR3 can scale easily up to 2800 MHz, the highest commercially available SKU (courtesy of Samsung), though the highest implemented in a GPU was 2484 MHz on the desktop GeForce GTX 285. The GDDR3 chip has more pins and is also capable of concurrent read/write I/O (the data was fuzzy as quoted in the Samsung spec sheet, but I interpreted this to mean GDDR3 can attain a larger portion of its peak theoretical bandwidth). It has pretty much stopped being manufactured since the last of the GT200 legacy GPUs went EOL.

DDR3 used in graphics is pretty much identical to the kind used in system memory. The only real difference is that it usually runs at much looser timings to guarantee good bandwidth performance without having to use expensive binned ICs. When used for graphics, memory speeds top out at 2100 MHz (the highest bin officially offered by Samsung as of 2013). Obviously, we've all seen desktop DDR3 go as high as 3000 MHz, but those parts are not cost-effective as high-volume VRAM. It's slightly inferior to true GDDR3 at the same clock speed due to the lack of the concurrent read/write feature, but it only uses a fraction of the power. Plus, it is extremely cheap: you can get 1Gb for less than $5 AUD at bulk rates.

Budget GPUs produced thereafter (after the AMD 4870 era) would never use GDDR3, as it's much more expensive than regular gDDR3 (about $25 in bulk for 1Gb). In other words, all modern low-bandwidth implementations use graphics-optimized DDR3 (hence the small g: gDDR3).

DDR3 RAM is double-pumped, meaning it sends 2 transfers per clock. GDDR5 is quad-pumped, meaning 4 transfers per clock. So GDDR5 has twice the bandwidth of DDR3 at the same base clock speed and memory bus width.
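    As a quick sanity check on that last point (a hedged sketch; the 128-bit bus and 1000 MHz base clock are hypothetical figures for illustration, not any particular card):

    ```python
    # Peak theoretical bandwidth in GB/s:
    # real clock (MHz) x transfers per clock x bus width (bits) / 8 bits-per-byte / 1000.
    def bandwidth_gbs(real_clock_mhz, transfers_per_clock, bus_bits):
        return real_clock_mhz * transfers_per_clock * bus_bits / 8 / 1000

    # Same 1000 MHz base clock, same hypothetical 128-bit bus:
    ddr3  = bandwidth_gbs(1000, 2, 128)  # double-pumped DDR3/gDDR3
    gddr5 = bandwidth_gbs(1000, 4, 128)  # quad-pumped GDDR5
    print(ddr3, gddr5, gddr5 / ddr3)     # 32.0 64.0 2.0
    ```

    At matched base clock and bus width, the quad-pumped GDDR5 comes out at exactly twice the double-pumped part's bandwidth, not four times.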
     
  12. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Very interesting read. I knew nothing about any of this! Thank you for informing us all!
    Do you work in that industry? Amazing knowledge!

    +rep!
     
  13. Marksman30k

    Marksman30k Notebook Deity

    Reputations:
    2,080
    Messages:
    1,068
    Likes Received:
    180
    Trophy Points:
    81
Nah, just mostly trawling pages upon pages of Samsung spec and product sheets, back when I was researching the possibility of swapping out my laptop W110ER's gDDR3 VRAM for GDDR5.