The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
 Next page →

    M11xR2 very important benchmark info.

    Discussion in 'Alienware M11x' started by freeman, Jun 10, 2010.

  1. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    I know many of you have heard that the M11xR2 isn't that significant, and some people even cite the benchmark comparison from Dell's website. Well, after about half an hour to an hour of research, I came to the conclusion that that information is not accurate, or at least is being quoted inaccurately. I made a new post because I believe this is important for people trying to decide which revision of the M11x they should buy, or whether to keep or cancel their current M11xR1. Because the misquoted Dell numbers have spread so widely, I'm giving you my analysis here with the links to trace back, so you can verify the results yourself. The truth is buried somewhere in the mud, and a search will mostly turn up the pile of wrong benchmark numbers that spread like wildfire instead. I wrote it here first, so that's where you should visit, but the continuing discussion of the M11xR2 should stay here.
    Addendum: BatBoy (mod) wants the discussion of the M11xR2 benchmarks to stay here, so I guess benchmark comments/remarks stay in this thread.
     
  2. Wiggy Fuzz

    Wiggy Fuzz Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    Reading that, there is no way L4D only gets 33 fps with no AF and no AA on the M11x R1.

    Hell, I get 35 fps while running the custom stereoscopic 3D drivers.
     
  3. Bendak

    Bendak Notebook Evangelist

    Reputations:
    36
    Messages:
    589
    Likes Received:
    14
    Trophy Points:
    31
    Yeah, I averaged 35 fps on high settings with 2xAA/2xAF.
     
  4. unreal25

    unreal25 Capt. Obvious

    Reputations:
    1,102
    Messages:
    2,373
    Likes Received:
    0
    Trophy Points:
    55
    I don't have Left 4 Dead, but the 3DMark Vantage scores are likely taken from the R1, because to a large extent they depend on the graphics card and not so much on the CPU (versus, say, 3DMark06).
     
  5. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    I don't think some people here understand how this works. The GPU doesn't run the game, the CPU does; the GPU runs the 3D engine. That means something has to feed data into the 3D engine, and that something is the CPU. So, if the CPU is underpowered and only fast enough to feed in 35 frames, that's what you're going to get out of it regardless of how good the GPU is. Try imagining the GPU only being taxed at 80% while running a game because of the bottleneck: what happens when you replace the CPU and remove that bottleneck? The 20% of GPU capacity that wasn't usable before now shows up, and this is especially true when the game is designed to tax the GPU heavily. So, let me go back and point out that in one of the reviews I linked to earlier, there is a framerate graph for the M11xR1 running MW2 which shows the results of the game running with 2xAA versus 4xAA. The graph shows relatively little change in performance when the graphics quality is pushed up. That indicates there is slack in GPU power and a CPU bottleneck. So, comparing that with Dell's published MW2 framerates, it does make sense that the M11xR2 makes a significant improvement simply by replacing the CPU, at least in MW2.
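    To make the bottleneck argument concrete, here is a minimal Python sketch of the idea (the fps caps below are illustrative assumptions, not measurements from either M11x):

        # Effective framerate is capped by whichever stage is slower:
        # the CPU preparing frames or the GPU rendering them.
        def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
            return min(cpu_fps_cap, gpu_fps_cap)

        # Hypothetical R1-like case: CPU can only feed ~35 fps, GPU could render ~44 fps.
        print(effective_fps(35, 44))   # 35 -> CPU-bound, GPU still has ~20% headroom
        # Swap in a faster CPU (hypothetical ~55 fps cap) and the GPU becomes the limit.
        print(effective_fps(55, 44))   # 44 -> GPU-bound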
     
  6. Wiggy Fuzz

    Wiggy Fuzz Notebook Consultant

    Reputations:
    0
    Messages:
    141
    Likes Received:
    0
    Trophy Points:
    30
    I think you're thinking too hard about it ;)

    We'll all see with the official benchmarks. For me, synthetic benchmarks are completely different from real-life situations. For games, I can see overall CPU speed making much more of a difference. It would be nice if Dell let us overclock the R1 processors higher, if the cooling can take it.
     
  7. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    It's actually interesting you should note the change in AA settings - that's a very good indicator of wasted GPU potential (where the CPU can't keep up). If you can increase the AA substantially in a game with little framerate drop then you know that it's your CPU holding your framerate back at the lower AA levels. M11xR1 owners could try this out on any games that struggle to run to see if it's the CPU at fault (and therefore whether R2 will help in these games or not).
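    That AA test can be written down as a simple rule of thumb. A rough Python sketch (the 5% threshold and the fps values are assumptions for illustration, not measurements from the thread):

        # If raising AA costs almost no fps, the GPU had headroom and the CPU
        # was probably the limiting factor at the lower AA setting.
        def looks_cpu_bound(fps_low_aa: float, fps_high_aa: float,
                            max_drop_fraction: float = 0.05) -> bool:
            drop = (fps_low_aa - fps_high_aa) / fps_low_aa
            return drop <= max_drop_fraction

        print(looks_cpu_bound(36, 35))   # True  -> ~3% drop, likely CPU-bound
        print(looks_cpu_bound(36, 27))   # False -> 25% drop, the GPU is the limit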
     
  8. erawneila

    erawneila Company Representative

    Reputations:
    468
    Messages:
    463
    Likes Received:
    0
    Trophy Points:
    30
    I have all of the systems here if you guys want me to try something specific... Let me know.
     
  9. miXwui

    miXwui Notebook Consultant

    Reputations:
    1
    Messages:
    109
    Likes Received:
    5
    Trophy Points:
    31
    You have the M11xR2 with you right now?!

    And this should provide some insight into the new i5-520UM CPU's performance compared to the old SU7300:

    Intel Core i5-520UM benchmarked on Asus UL30JT
     
  10. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    Crysis, Bad Company 2 and DiRT 2 would be interesting to look at. Crysis and Bad Company 2 are two games that could probably benefit from a faster CPU for playability reasons (at medium settings and higher) and DiRT 2, whilst already playable at high, still seemingly benefits from a faster CPU at lower detail levels so that could be another game to check for differences in framerate at medium settings. The present tests on NotebookCheck do point to the R1 CPUs being underpowered for the GPU, but AA tests should help to confirm this.

    @ miXwui - I think he simply means he has the presently released ones. And the laptop you linked to would of course be a good relatively direct comparison for tasks that don't make use of the graphics card.
     
  11. Polytonic

    Polytonic Notebook Consultant

    Reputations:
    1
    Messages:
    131
    Likes Received:
    0
    Trophy Points:
    30
    I would LOVE to see Battlefield: Bad Company 2 performance on the M11x-R2 (i7-640UM), especially with the four threads that BC2 will be able to utilize. A YouTube video with a FRAPS counter would be amazing.

    Also, can anyone clarify what "Overclockable" means specifically on the Arrandales? On the configuration page, it shows "- Overclockable" after all the Turbo information.

    [attached images: screenshots of Dell's configuration page showing the overclock figures]

    The Rep I spoke to via Dell Chat was totally clueless, thought all i7s were Quad Cores, thought there was a 6-Core i7 mobile option, and claimed the m11x could reach "3GHz at your own risk" saying that it was possible to reach over 4GHz on the Extreme Edition i7-640UM. Which... doesn't exist.

    Is there some sort of extra overclock option supported to reach 2.66GHz?




    Edit: Random thought... I don't know why everyone uses MW2 to benchmark. I can get a decent 45 FPS on my integrated Intel X3100 (GMA 965) with dips into the 20s, depending on when my chipset starts to throttle itself. Granted it's tweaked and everything at the lowest, still... MW2 was at least well coded (I don't really like MW2 as much, but I do give Infinity Ward credit for good coding).
     
  12. SparhawkJC

    SparhawkJC Notebook Evangelist

    Reputations:
    170
    Messages:
    430
    Likes Received:
    0
    Trophy Points:
    30
    Lol looks like I was beaten to the punch on the Bad Company 2 request. Could you run through some multiplayer matches to see the performance? Also if it's not too much trouble maybe some WoW performance, maybe running through Dalaran or some instances?
     
  13. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    I read elsewhere that the base clock can go from the regular 133MHz up to 166MHz when overclocked (someone in the other thread quoted it from the manual), but I can't get that to equal 2.66GHz no matter whether I add the overclock before or after Turbo Boost, so I'm a bit confused...
     
  14. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    The 2.66GHz number (wait, is it 2.66 or 2.26?) is based on a non-overclocked CPU running on a 133MHz bus in Turbo Boost mode, so raising the bus would push the clock speed higher both at the non-Turbo Boost 1.2GHz and in Turbo Boost.
     
  15. Arklight

    Arklight Notebook Evangelist

    Reputations:
    81
    Messages:
    417
    Likes Received:
    1
    Trophy Points:
    31
    Yes, Bad Company 2 and DiRT 2. Well, we can't test StarCraft 2 right now. Command and Conquer 4...

    Or better yet, what's the base overclocked frequency of the i7-640UM? That's specific.

    Thanks.
     
  16. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    I get what you are saying. Personally, I see a benchmark as a tool to convey a certain piece of information, and it has no way of describing the actual experience of gameplay. That said, the original reason I created this thread was stated in the first message: people misquoting benchmark numbers.
     
  17. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    wPrime, 3DMark06 and 3DMark Vantage plz :D :D :D :D :D

    More off topic:

    You can say whatever you want, guys... but I'm expecting less than a 15% performance increase comparing the Core 2 Duo SU7300 vs. the i7-640UM...

    The full OC'd "2.26GHz" performance IS NOT like a regular i7 @ 2.26GHz; also, I read somewhere that the top OC rarely shows up on the Core i7s... Also, at 2.26GHz it will consume more energy and produce more heat. The i7-640UM has a TDP of 18W at NORMAL usage; I expect the TDP to be higher at 2.26GHz... I hope I'm wrong and we see OMG performance :D.

    In my opinion 15-20% is a NICE performance boost, but you know the price is another topic... with that price tag I could get a basic M15x and upgrade later :).
     
  18. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    See the black image above for the 2.66GHz figure. I must have managed to type the wrong number into the onscreen calculator because it turns out it does work out:

    133MHz*8 = 1.06GHz (Core i5 stock speed)
    166MHz*8 = 1.33GHz (Core i5 max overclock)
    133MHz*14 = 1.86GHz (Core i5 max Turboboost)
    166MHz*14 = 2.32GHz (Core i5 max overclock and Turboboost)
    133MHz*9 = 1.20GHz (Core i7 stock speed)
    166MHz*9 = 1.49GHz (Core i7 max overclock)
    133MHz*16 = 2.26GHz (Core i7 max Turboboost)
    166MHz*16 = 2.66GHz (Core i7 max overclock and Turboboost)
     
  19. DR650SE

    DR650SE The Whiskey Barracuda

    Reputations:
    7,383
    Messages:
    8,222
    Likes Received:
    182
    Trophy Points:
    231
    If you could get a shot of the BIOS and what the overclock on the Core i CPUs is, and maybe a screenshot of CPU-Z both overclocked and non-overclocked, that would be good to see, as the SU4100/SU7300 had some issues as far as 1.6GHz vs. 1.73GHz goes. Thanks again for taking the time to help us out.
     
  20. luffytubby

    luffytubby Notebook Deity

    Reputations:
    354
    Messages:
    829
    Likes Received:
    10
    Trophy Points:
    31
    GTA4 must take priority over the other games -

    It is known as the most unoptimized PC port of perhaps the last ten years. It's a great game, but horribly CPU-demanding.
    It was coded for the PS3, with its Cell CPU, which has many smaller cores, each assigned to do a certain thing. Developers will often use one core solely for physics, another core for textures, another for the soundscape, and so on.
    When they ported GTA4 to PC to get it out for the Christmas rush, they did a poor job, and it's still reflected in the product today.
    The Xbox 360 and PS3 both had much more powerful CPUs than GPUs. It's always been like this for consoles, and thus many console ports are very heavy on the CPU side, but poor porting jobs also often put unneeded stress on the GPU.


    If the machine can run GTA4 smoothly at decent settings, then it truly is a capable machine. It is generally believed that the only people who should bother are those with quad-cores or the most powerful desktop dual-cores.


    Besides that?

    Source games were also made to run more on the CPU than the GPU. Back in '04, something like 70% of the world's computers ran on integrated graphics. I remember HL2 being decent on a crappy Nvidia FX 5200 *shudders*
     
  21. Arklight

    Arklight Notebook Evangelist

    Reputations:
    81
    Messages:
    417
    Likes Received:
    1
    Trophy Points:
    31

    But it wouldn't really max out at 2.66GHz, right?
    Indeed, it is still a good performance boost.

    I hope erawneila shares the numbers! :D
     
  22. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    It's not that we like to use MW2 as a benchmark, but at the moment MW2 is the only number we know for sure actually comes from an M11xR2 with the Core i7-640UM. If you have other numbers that you can show with a citation, please do share.

    No, the 18W TDP is the max TDP running non-overclocked. As said, 2.26GHz is not an overclocked mode, just Turbo Boost mode, meaning that even at 2.26GHz in Turbo Boost mode it's still at 18W or below.
     
  23. Polytonic

    Polytonic Notebook Consultant

    Reputations:
    1
    Messages:
    131
    Likes Received:
    0
    Trophy Points:
    30
    I was actually wondering why Dell decided on MW2 of all games, other than to perhaps... I can't really describe it in another way... statpad? Camouflage its shortcomings?
     
  24. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    I'm not sure; that doesn't explain why the M11x-R2 has worse battery life ;) (20/30 minutes less, I just talked with a rep)...

    But yeah, lol, I meant to say Turbo Boost instead of OC :eek: .

    Edit: wooooooooooha, 1000 posts!!!!!
     
  25. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    Well, I don't know the reason for that either and don't want to speculate without doing any more research. Since I don't have time to dig any deeper right now, maybe somebody else can: look into the 3D engine that MW2 uses, do more research on that engine and compare it to other games. Somebody else is already doing research in a different direction (props for taking the time to do that, and please post results here, I'd like to know too), which is finding out how much of a CPU bottleneck there is and how much spare GPU power is left on the M11xR1 when running various games at different detail settings.
     
  26. Arklight

    Arklight Notebook Evangelist

    Reputations:
    81
    Messages:
    417
    Likes Received:
    1
    Trophy Points:
    31
    I agree with you. Benchmarks will also answer the question "Should I upgrade or not?", or "Is it worth the price?"
     
  27. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    We're talking literally about minimum running power draw when Dell cites the battery life. With the old model it was 9.0 hours, with the new model it's 8.5 hours. Breaking this down, considering the battery is still 85Wh (meaning it would go from full to empty in one hour if powering a device with a draw of 85 watts), for the old model to last 9.0 hours, that would be an average wattage of 9.4W and for the new model to last 8.5 hours that would be 10.0W; an increase in draw of 0.6W. The new Core i processors however have a TDP that is 8.0W higher than that of the Core 2 Duo processors, but that's including their integrated graphics and so it's anyone's guess as to how much more power they actually use.

    EDIT: Turns out that it's actually a 63Wh battery; I was thinking of the M15x >.<

    So, in correcting, the old model would have seen an average draw of 7.0W for Dell's figure and the new model would have seen a draw of 7.4W, making a 0.4W difference.
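    The arithmetic behind those figures is just capacity divided by rated runtime; a quick Python sketch of the same calculation, using the corrected 63Wh capacity from the post:

        # Average power draw implied by Dell's battery-life claims.
        def average_draw_w(capacity_wh: float, rated_hours: float) -> float:
            return capacity_wh / rated_hours

        capacity = 63.0                          # Wh
        r1 = average_draw_w(capacity, 9.0)       # old model, ~7.0 W
        r2 = average_draw_w(capacity, 8.5)       # new model, ~7.4 W
        print(round(r1, 1), round(r2, 1), round(r2 - r1, 1))   # 7.0 7.4 0.4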
     
  28. popypopy

    popypopy Notebook Evangelist

    Reputations:
    25
    Messages:
    301
    Likes Received:
    0
    Trophy Points:
    30
    Gee, biggest selling entertainment product ever produced... maybe?
     
  29. Polytonic

    Polytonic Notebook Consultant

    Reputations:
    1
    Messages:
    131
    Likes Received:
    0
    Trophy Points:
    30
    I think The Sims, World of Warcraft, Starcraft, and Counter Strike would like to contest that statement.
     
  30. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    Solitaire would like to contest that statement, since it ships with every Windows OS. Most Linux distros have it too. Basically, it has the largest installed base of any game that has ever existed.
     
  31. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    Fact of the matter is, even just including PC sales of "real" games, it's still probably lower than tenth place.

    Basically, it shouldn't be the benchmark because it's simply not a game that'll stress test a gaming PC and any gamer would know that. It's a pointless game to test.

    (Also, I'm pretty sure Gordon Freeman wants a word too... and that's saying something, 'cos he doesn't like to talk)
     
  32. Cpt.Zero

    Cpt.Zero Notebook Consultant

    Reputations:
    58
    Messages:
    245
    Likes Received:
    0
    Trophy Points:
    30
    I just hope that a retail-release M11x R2 gets reviewed soon... I just want an actual benchmark of the old M11x pitted against the M11x R2 before I buy this new version and give my current M11x to my wife :D...
     
  33. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    Here is an interesting tidbit provided by AnandTech. It's actually a review of the M11xR1 from March 30, but the reviewer weighs in on the future changes coming with the upgrade to the Core i5/i7. Again, it's still a prediction, but it's from an actual professional reviewer.
    If you are interested in reading the whole article, go here. The opinion on the i5/i7 is on page 8.
     
  34. vorob

    vorob Notebook Deity

    Reputations:
    83
    Messages:
    1,140
    Likes Received:
    59
    Trophy Points:
    66
    I still don't understand what this thread is about...
     
  35. unreal25

    unreal25 Capt. Obvious

    Reputations:
    1,102
    Messages:
    2,373
    Likes Received:
    0
    Trophy Points:
    55
    It's about absolutely nothing (except speculations) at the moment, since no one posted any R2 benchmarks yet. :)
     
  36. Grimgrak

    Grimgrak Notebook Geek

    Reputations:
    5
    Messages:
    85
    Likes Received:
    0
    Trophy Points:
    15
    As the world err thread turns with erawneila
     
  37. erawneila

    erawneila Company Representative

    Reputations:
    468
    Messages:
    463
    Likes Received:
    0
    Trophy Points:
    30

    I got busy today and didn't have a chance to run any benchmarks, but I'll try them on Monday. The M11xR2 has overclocking options - 15 steps increasing by 2MHz each, so everyone should be able to overclock to some degree. Not everyone will get the max...
     
  38. TimeConsumer

    TimeConsumer Notebook Guru

    Reputations:
    2
    Messages:
    59
    Likes Received:
    0
    Trophy Points:
    15
    Don't forget, the 2.26GHz Turbo Boost applies to only one core. It's something like 1.8GHz Turbo Boost with both cores active.
     
  39. freeman

    freeman Notebook Deity

    Reputations:
    126
    Messages:
    741
    Likes Received:
    0
    Trophy Points:
    30
    It was originally about people posting benchmark numbers taken from Dell's website and claiming those numbers are from the M11xR2, even though the Dell website didn't specify that they weren't from the M11xR1; and given when those benchmarks were done, it's unlikely they came from an M11xR2.

    Now we're just looking at M11xR1 benchmarks and trying to determine how much of a bottleneck the SU7300 was, and how much would have been lifted if that bottleneck weren't there (yes, speculation works). We're also waiting for new, actual M11xR2 benchmarks as those numbers start to arrive.
     
  40. Polytonic

    Polytonic Notebook Consultant

    Reputations:
    1
    Messages:
    131
    Likes Received:
    0
    Trophy Points:
    30
    Not to harp on you or anything, but since Dell's page is advertising it as overclockable to 2.66GHz, wouldn't I, hypothetically speaking as the customer, be entitled to keep replacing the unit until I get one that can hold up at 2.66GHz?
     
  41. Arklight

    Arklight Notebook Evangelist

    Reputations:
    81
    Messages:
    417
    Likes Received:
    1
    Trophy Points:
    31
    The 1.8GHz Turbo Boost max on 2 cores is without the overclock, right? So we'd get a max of around 1.9-2.2GHz when OC'd? (Just a guess.)
     
  42. Mackan

    Mackan Notebook Evangelist

    Reputations:
    121
    Messages:
    691
    Likes Received:
    0
    Trophy Points:
    30
    Cool, thanks for the info. :) I would look forward to a 3DMark06 score to compare with the R1.
     
  43. Shokz

    Shokz Notebook Enthusiast

    Reputations:
    0
    Messages:
    44
    Likes Received:
    0
    Trophy Points:
    15
    133MHz*14 = 1.86GHz, which is the max the i7 goes up to with both cores running. It would therefore follow that (133MHz + 2MHz*15)*14 = 2.28GHz is the max speed with the overclock and both cores enabled.
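    A quick Python sketch of that arithmetic, using the 15 steps of 2MHz that erawneila described and the x14 dual-core multiplier assumed above:

        # Max dual-core speed = (base clock + overclock headroom) * multiplier
        BCLK_MHZ = 133         # stock base clock
        STEP_MHZ = 2           # size of each overclock step
        STEPS = 15             # steps available, per erawneila's post
        DUAL_CORE_MULT = 14    # dual-core Turbo multiplier assumed in the post

        max_bclk = BCLK_MHZ + STEP_MHZ * STEPS                 # 163 MHz
        print(round(max_bclk * DUAL_CORE_MULT / 1000, 2))      # 2.28 GHz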
     
  44. Mackan

    Mackan Notebook Evangelist

    Reputations:
    121
    Messages:
    691
    Likes Received:
    0
    Trophy Points:
    30
    The question is whether we would get anywhere near those values. The Turbo Boost frequencies depend on some "factory-configured" values for maximum allowed CPU current, power consumption, temperature, etc., according to Intel's white paper.

    I've no experience with these new Core i CPUs and Turbo Boost, but it would be interesting to study the CPU frequency and active cores under gaming. If it throttles up and down because some of these "factory-configured" values are exceeded, I can't imagine it being a good gaming experience.

    I've seen a couple of other threads regarding Core i CPUs and throttling... so I am suspicious. It remains to be seen whether it is really possible to hold 2 cores overclocked at a steady 2.28GHz through a complete gaming session.
     
  45. ninja2000

    ninja2000 Mash IT

    Reputations:
    434
    Messages:
    1,674
    Likes Received:
    268
    Trophy Points:
    101
    We need someone to run Prime on both one and four threads to see what it maxes out at!
     
  46. mk1freak

    mk1freak Notebook Evangelist

    Reputations:
    44
    Messages:
    349
    Likes Received:
    1
    Trophy Points:
    31
    I find that WoW 25-man raids are the most CPU intensive (and I've done them on many different laptops), and I was disappointed that the M11xR1 can't even do 25-man ICC (or other 25-mans) at minimum settings.

    People can quote in any thread that "this lappy played WoW on Ultra with FPS in Dal no lower than 20fps", but that's not a real indicator of performance IMO.

    I spoke to a few different reps about upgrading the CPU on the machine, and at least the common theme is that, for those of us going that route, it won't be available for at least a month or so.

    Will the CPU upgrade give us better 25-man performance in the M11x? That's yet to be determined, and until I see video benchmark proof with my own eyes (or Blizz codes their games better), I don't expect it to.
     
  47. unreal25

    unreal25 Capt. Obvious

    Reputations:
    1,102
    Messages:
    2,373
    Likes Received:
    0
    Trophy Points:
    55
    Actually, I played ICC25 quite well. Not on Ultra, but at medium-high settings, with basically shadows toned down a bit, particles a little and ground stuff a bit. The only completely unplayable game I've tried so far on the M11x (R1) was the APB beta.
     
  48. YodaGoneMad

    YodaGoneMad Notebook Deity

    Reputations:
    555
    Messages:
    1,382
    Likes Received:
    12
    Trophy Points:
    56
    I really think a lot of the problems people have with the M11x are due to poor configuration of their settings - things like turning off the page file for WoW, or taking shadows down.

    The M11x is short on bandwidth more than anything else, so if you get slowdowns you need to cut things like shadows, which eat bandwidth. There is just no way a game like WoW is CPU dependent. I ran 40-man Molten Core on all high settings years ago on a P4 and (I think) a Radeon 9700 Pro.

    - Intel Pentium 4 1.3 GHz or AMD Athlon XP 1500+
    - 512 MB or more of RAM (Vista requires 1 GB or more of RAM)
    - 3D graphics processor with Hardware Transform and Lighting with 32 MB VRAM, such as an ATI Radeon 7200 or NVIDIA GeForce 2 class card or better

    Seriously guys, it only needs a P4 1.3GHz. A single core of the SU7300 is easily twice as fast.
     
  49. Fuzzyhead

    Fuzzyhead Notebook Geek

    Reputations:
    14
    Messages:
    81
    Likes Received:
    0
    Trophy Points:
    15
    You are so wrong, it's incredible.

    You may have noticed that present-day WoW has way better graphics than early-release WoW.

    Please try to play WoW with a P4 1.3GHz; good luck mate, you will need it.

    Some don't admit it, but WoW is a hardware-hungry game, and crowded areas especially can kill your machine.
     
  50. luffytubby

    luffytubby Notebook Deity

    Reputations:
    354
    Messages:
    829
    Likes Received:
    10
    Trophy Points:
    31
    The shadow effects and improved spell effects since then, and just the fact that the later dungeons are so much more full of visuals, make me think that it's changed quite a bit.
    Particularly with the shadows, you need a great computer to play it smoothly with no drops. I can totally see why the shadow work would be related to the CPU.

    But wait a minute, guys... can you even play WoW 25-man? The raid UI must be so small it would be impossible to see anything!? Lol... actually I would love to see a screenshot, if any of you have one from a 25-man ICC :)
     
 Next page →