The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    4K gaming on Alienware 18 with 780m SLI possible?

    Discussion in 'Alienware 18 and M18x' started by l701x, Jun 19, 2014.

  1. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Hi there,

    I have had my Alienware 18 for a while now and am loving it. I have seen the price of good 4K monitors come down hugely recently and am looking at purchasing the SAMSUNG U28D590D monitor.

    I believe the laptop has DisplayPort 1.2 so in theory it should be able to run at 60Hz on the monitor (although the connection is a minidisplayport on the laptop so I suppose I'd need a converter cable).

    Are there any games that would give acceptable frame rates at 4K with 780M SLI? I am particularly hoping Bioshock Infinite will, as I love that game!

    My CPU is the base 4700MQ, so I'm wondering whether that could be a potential bottleneck. My system also has 24GB of RAM if that makes any difference.

    Any input would be much appreciated :)

    Thanks,
    Charlie.
     
    dawsonkm likes this.
  2. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It has a 28-inch screen and a resolution of 3840 x 2160 pixels. That's a hell of a nice monitor.

    780M SLI is roughly equivalent to a desktop GTX 690, performance-wise. You should be able to run games at lower settings: medium to high, or a mixture of both.
     
  3. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks for your reply. Do you think it'll be a better experience playing with medium settings at 4K as opposed to ultra settings at 1080p on a monitor of the same size? I know running at 4K means there is little need for anti-aliasing, but will textures and such look better on a 1080p monitor with ultra settings and anti-aliasing, or on a 4K monitor with medium settings?

    Thanks again.
     
  4. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Honestly, I don't have any idea. If I had to guess, I'd say it will look the same. You're lowering settings, which means less detail. But at the same time, you're playing on a monitor that is meant to "bring out the detail" of whatever you're doing. It's sort of contradictory, if you ask me.

    Monitors like this are really meant for desktop GPUs like the Titan Black and Titan Z. Games would look amazing with either of those GPUs and this monitor. Gaming laptops come with 1080p displays because mobile GPUs aren't designed to handle higher resolutions yet.

    For things other than gaming, it will obviously look better on the monitor because the GPUs won't be taxed so much.
     
  5. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Will the CPU hold back the GPUs any more at 4K than it does at 1080p? It's usually my graphics cards working at close to 100%, not my CPU. Also, is there any potential overclock on the GPUs I could achieve? Temperatures are usually around the low 70s on each card.

    I may get the 4K monitor just for work, but it would be nice to think I could play games on it if I wanted to.

    Thanks again for your input.
     
  6. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    The 4700MQ won't hinder performance in any way.
     
  7. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Nevermind.
     
  8. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I meant the GTX 690. I owned a GTX 690 not too long ago, and it performed about the same. It averaged 80 FPS on Battlefield 4 at 1080p on the Ultra preset. So, I suppose it's arguable that it's in between both - perhaps a bit better than the 690, but not as good as the 780TI.
     
  9. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Weird, comparing 3DMark11 Xtreme scores, 780M SLI does indeed match a 690. But a single 780M is only about on par with a desktop 660 Ti and nowhere near a desktop 680, yet 780M in SLI matches the 690. Scratching my head here.
     
  10. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    NVIDIA SLI scaling sucks badly, especially with the GTX 690. Oddly enough, two discrete GPUs in SLI scale better than a single dual-GPU card.

    The 780M performs like an underclocked GTX 680, not a 660 Ti.
     
  11. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I honestly don't think it's enough for 4K gaming... not at a decent quality. 1440p, yes. But 4K does scale into 1080p nicely, so there's that.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I can confirm that with 95% scaling on two 850/5000 780Ms in a game that's good for SLI, they perform about the same as a 780Ti, according to my friend's reported framerates in games. If you overclock them to 950/6000, then once you hit ~80% scaling in a game (one that actually USES it, mind you; not like DayZ, which is server-limited) you'll match a 780Ti's stock performance.

    Of course... most games won't give constant 99% scaling, so it's very mileage-may-vary. But it's definitely better than a 690 if you use those overclock speeds, for sure.
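    For anyone who wants to sanity-check that claim, here's a minimal back-of-the-envelope sketch (Python, relative units only). It assumes a simple model where the second card adds its scaling fraction on top of the first, and it reuses the clock x shader-count figures posted further down the thread; actual games will deviate.

    ```python
    # Toy model: a second GPU in SLI contributes `scaling` x one card's throughput.
    # Relative units are core clock (MHz) x shader count, as in the raw-numbers post below.
    def sli_effective(single_card: float, scaling: float) -> float:
        return single_card * (1 + scaling)

    GTX_780_TI = 2_520_000   # 875 MHz x 2880 shaders
    M780_STOCK = 1_305_600   # 850 MHz x 1536 shaders
    M780_OC    = 1_459_200   # 950 MHz x 1536 shaders

    # Stock 780M SLI at ~95% scaling lands right around a stock 780 Ti...
    print(sli_effective(M780_STOCK, 0.95) / GTX_780_TI)  # ~1.01
    # ...and at 950/6000, ~80% scaling is already enough to get there.
    print(sli_effective(M780_OC, 0.80) / GTX_780_TI)     # ~1.04
    ```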
     
  13. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Anyway, with that reply out of the way...

    YES. You can 4K game with it. In fact, a lot of games already give you 4K options. Ever run Sniper Elite V2 and turn SSAA up to 4x? That's 4K downsampling if your game is at 1080p. Tried Sleeping Dogs at 1080p with "extreme" anti-aliasing? That's a 4K downsample. So yes, you can easily run those games without issue at 4K, and a lot of other games at 4K too. But you will likely need to turn off anti-aliasing, and you will not be able to run the more demanding or unoptimized games at 4K easily. Like Watch Dogs? Forget it. GTA 4? HA. DayZ? LOLOLOLOL. Nope. But if you're sticking to, say... CoD: Black Ops 2 with AA turned off? Or Dark Souls 2? Sure. In fact, I've downsampled Dark Souls 2 with GeDoSaTo from 4K and been able to play it perfectly fine. I had to turn SMAA off and remove the bokeh filter, but it worked, and decently well too. I'm fairly certain that if I turned off a couple of other extra GeDoSaTo features, it'd give me a constant 60fps. I was getting a mostly constant 60 at 4K (and in borderless windowed too, which knocks ~10% main-GPU scaling off SLI, so it would have worked a little better in fullscreen).

    So basically, to answer your question: most of your games would work. Maybe not the games you WANT to work, but most will. As I said, you'll probably need to turn off AA, but it shouldn't be all that necessary with such a high pixel density. Just the best-looking games (like BF4) will have a problem, and the unoptimized ones (Watch Dogs, Ghosts, Titanfall, etc.) will also be problematic. I also suggest running a 950/6000 overclock on your cards if you can get it stable. It WILL help when gaming at that resolution. Anyway, even desktops won't be truly ready for 4K gaming until Maxwell launches its new flagships and people grab, like, two of those and shove them into a PC. Most games these days that look good are also so bloody unoptimized that they won't work well no matter WHAT you throw at them.
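    As a quick sanity check on the SSAA point above: 4x supersampling means four samples per displayed pixel, i.e. the game internally renders twice the width and twice the height, which at 1080p is exactly a 4K frame being downscaled. A tiny illustrative sketch (Python, assuming an ordered 2x2 sample grid):

    ```python
    import math

    def ssaa_render_resolution(width: int, height: int, samples: int) -> tuple:
        # N-sample SSAA takes `samples` samples per displayed pixel; on an ordered
        # grid that's sqrt(samples) x the resolution on each axis.
        factor = math.sqrt(samples)
        return int(width * factor), int(height * factor)

    print(ssaa_render_resolution(1920, 1080, 4))  # (3840, 2160): a 4K frame per 1080p frame
    ```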
     
    TBoneSan likes this.
  14. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Thanks very much for your replies.

    It seems the 780Ti gets between 30 and 34 fps in Bioshock Infinite at 4K running on Ultra settings with DOF enabled. So maybe there is a little hope for 4K gaming with my laptop :)

    May wait a bit to see if any cheaper/better 4K monitors are released, although the Samsung I mentioned in the original post does seem like a very good deal, even though it's a TN panel.
     
  15. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Also, if my temps are staying under ~75°C most of the time, would you say there is some headroom for overclocking the cards? And what would be a safe temperature for the 780M to be running at? Thanks again guys.
     
  16. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    @ D2Ultima - well written. I concur :)

    The max I overclock for gaming is around +110/400 and that's only in 2 games - Skyrim (modded) and Watchdogs. Everyone feels different about temps they are comfortable playing at. Some people don't even like OCing for gaming. Generally you really want to keep them under 80 degrees. Personally I don't like to see temps exceed 75.
     
  17. jlyons264

    jlyons264 Notebook Evangelist

    Reputations:
    380
    Messages:
    474
    Likes Received:
    31
    Trophy Points:
    41
  18. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I'm always referring to stock as a baseline for performance. It matches a GTX 690 almost perfectly at stock.

    I don't doubt that, if you overclock, you'll reach "base" stock 780Ti performance levels. But since the 780Ti is not overclocked, I don't think it's fair to say that 780M SLI performs at its level when you have to overclock to get there. When you overclock the 780Ti, you're going to blow 780M SLI out of the water...

    He'll be able to play games on High, but it won't be as good as he is hoping. On a monitor like that, lowering settings (even a little bit) really impacts the perceived level of detail, because you can literally see every little change and difference; the screen brings them out that much. In other words, it's really noticeable. At 1080p, it's not.
     
  19. nightdex

    nightdex Notebook Evangelist

    Reputations:
    189
    Messages:
    436
    Likes Received:
    153
    Trophy Points:
    56
    +1 on this. I've seen mid-range texture settings on a 4K screen. Suffice to say, it looks damn ugly. A 780 Ti can't even look all that impressive in-game on a 4K screen. Like you said earlier, 4K was meant for either the Titan Black or the Titan Z. Nothing below those two cards is going to come remotely close to looking beautiful. 1080p is the max I would personally go to in the Alienware realm.
     
  20. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Really, to play the demanding games like Crysis, Metro, Watch Dogs etc. on max settings, not even 2 x 295X2s (yes, quad) can do the job at 60fps, and that's without AA on. Tiny Tom Logan did a good review and most games got close, at around 50fps. Not saying it's not worth getting 4K though. A lot of other good games like L4D2 could get there on 2 x 780M.
     
  21. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Will the monitor I mentioned scale nicely to other 16:9 resolutions like 2880x1620 and 2560x1440? I think these resolutions would be more achievable with my notebook's cards, but will the scaling cause the image to look "weird"? I say weird because I have never run below native res on a screen and don't entirely know what to expect.
     
  22. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I know 4k scales perfectly into 1080p. Maybe someone can chime in on other resolutions.
     
  23. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Yeah, 1080p would look just like a 1080p picture on a 4K display because it can just illuminate 4 pixels for every 1 that is rendered. But with other 16:9 resolutions there won't be a clean whole-pixel multiple.

    Anyone else have any ideas? Thanks again for all your help guys :)
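    The integer-multiple point is easy to check: divide the panel resolution by the render resolution on each axis. Only 1080p divides 3840x2160 evenly (2x per axis, so each rendered pixel maps to a clean 2x2 block of panel pixels); 1440p and 1620p leave fractional factors, so the scaler has to interpolate. A quick sketch (Python):

    ```python
    PANEL = (3840, 2160)  # UHD "4K" panel

    for w, h in [(1920, 1080), (2560, 1440), (2880, 1620)]:
        sx, sy = PANEL[0] / w, PANEL[1] / h
        clean = sx.is_integer() and sy.is_integer()
        kind = "integer (pixel-doubled)" if clean else "fractional (interpolated)"
        print(f"{w}x{h}: scale {sx:.3f} x {sy:.3f} -> {kind}")
    ```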
     
  24. l701x

    l701x Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    5
    Trophy Points:
    16
    Just ran Bioshock Infinite at 2880x1620 using downsampling and my frame rates were between 70 and 110fps.

    Temps were pretty high as the cards were working very close to 100%, 80 degrees and 78 degrees C respectively.

    Locking the frame rate to 60 cooled things down.

    Would be good to be able to play at a resolution like that on a 4K monitor.
     
  25. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Hehe 4k is a Pandora's box I probably won't open for a couple more years
     
  26. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    If you want the monitor, go buy it. Just don't expect to be able to play every game on the highest settings like you can with the laptop itself. It's a great monitor, and within the next 2-3 years, laptops will be able to handle 4K gaming without a problem. Hell, some may even come with 4K displays by then... who knows.

    Right now we're at the point where it's possible but not ideal. We're somewhere in the middle between "there's no way it will handle it" and "it will easily handle it."
     
  27. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I know I keep coming back to this and I'm sorry, but:

    -single 780M about on par with desktop 660 Ti, itself slower than desktop 680
    -2x780M matches 690, which is 2x680 slightly downclocked
    -780 Ti outperforms 2x780M stock for stock

    So a 780 Ti outperforms a 690, and yet, due to SLI scaling, 780M SLI catches up with the 690 even though a single 780M only has the performance of a 660 Ti.

    Ok my mind is truly blown.
     
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I've already tried it. 1080p on a native 4K screen is still fuzzier around the edges than 1080p on a native 1080p screen of the same size 'cause it applies bilinear interpolation on the upscaled image. Windows and games can't seem to do nearest-neighbor upscaling from 1080p to 4K; that would actually make the upscaled 1080p image razor-sharp. I tested it on a 28" 4K monitor though, so the image degradation from the upscale should be less noticeable on a much smaller and higher PPI laptop screen. If anything, the slight blur from the interpolation would just be a bit of free anti-aliasing. :p
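    To illustrate the difference being described: an integer 2x upscale *could* be done nearest-neighbor (each 1080p pixel becomes a 2x2 block, so edges stay hard), but the scaler applies bilinear interpolation, which blends neighboring pixels and softens edges. A toy NumPy sketch, illustrative only:

    ```python
    import numpy as np

    # Toy "1080p" patch: a hard black/white vertical edge.
    src = np.array([[0, 0, 255, 255],
                    [0, 0, 255, 255]], dtype=float)

    # Nearest-neighbor 2x upscale: every source pixel becomes a 2x2 block,
    # so the edge stays razor-sharp.
    nearest = src.repeat(2, axis=0).repeat(2, axis=1)

    # Bilinear-style resample of one row: new sample positions fall between
    # source pixels and get averaged, which is where the softness comes from.
    positions = np.linspace(0, src.shape[1] - 1, src.shape[1] * 2)
    bilinear_row = np.interp(positions, np.arange(src.shape[1]), src[0])

    print(nearest[0])    # [  0.   0.   0.   0. 255. 255. 255. 255.]
    print(bilinear_row)  # intermediate values around the edge -> visible blur
    ```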
     
  29. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    I think I'm diving into 4k when we get the new cards :)
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Here's what: I'll list some raw numbers here. The numbers are *EXACTLY* the raw horsepower of the video cards, made by multiplying the core clock by the shader (CUDA core) count. I did this back in the day when the 470M and 480M came out and I was confused because the 470M looked like it was stronger than the 480M, only for NBR to go crazy about it with testing about a week later XD.

    These numbers are *NOT* some kind of pixel fillrate or anything. Also remember that some games use memory less than others, so differing memory bandwidth can be negligible or hugely impactful depending on the game. There's a reason lots of people took the 660Ti instead of the 670 =3 despite a lot less memory bandwidth =3.

    1,305,600 / 160 GB/s - 780M (stock, 850/5000)
    2,611,200 / 320 GB/s - 780M (stock SLI 99% scaling)
    1,459,200 / 192 GB/s - 780M (OC, 950/6000)
    2,918,400 / 384 GB/s - 780M (OC SLI 99% scaling)
    1,229,760 / 144.5 GB/s - 660Ti (stock, 915/6000)
    1,229,760 / 192 GB/s - 670 (stock, 915/6000)
    1,545,216 / 192 GB/s - 680 (stock, 1006/6000) == 780M (OC, 1006/6000)
    2,810,880 / 384 GB/s - 690 (stock, 915/6000, SLI 99% scaling)
    2,520,000 / 336 GB/s - 780Ti (stock, 875/7000)

    Therefore, depending on your game, a stock 780M SLI can actually slightly outperform a 780Ti at stock. Usually games do not scale THAT well, but it is quite easy to get in the ballpark. I know for a fact that my friend with an AMD Radeon R9 290 *CANNOT* overclock his 290 to beat my 780Ms at 950/6000. He kept having me run a bunch of benchmarks like Unigine Heaven and he couldn't surpass me even once, even though he OC'd/OV'd his card so much that it started failsafe shutting down into blackscreen and causing him to have to restart his PC. So yeah, 780Ms are quite strong. The 690 was better than I'd thought though, didn't realize it still beat a 780Ti.

    Unfortunately, I don't know how to calculate raw potential of AMD cards, so I can't cross-compare other than with benchmark software etc.

    EDIT: Checked the Titan Black. Only difference is about 14MHz extra clock speed (yeah, only 14) and 3GB extra memory over the 780Ti. So unless you're gonna use it for stuff that makes use of its double-precision calculating power (I think I have the term right) OR you want the extra memory and don't wanna wait for the rumored 780Ti 6GB version... the Titan Black is quite literally a money-sink, and won't really benefit 4K gaming over a 780Ti.
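    For reference, those "raw" figures reproduce exactly as core clock (MHz) times CUDA-core count, with the dual-GPU entries doubled for the assumed ~99% scaling. A small sketch that regenerates the table (Python; the shader counts are the cards' public specs):

    ```python
    # name: (core clock MHz, CUDA cores, GPU count). Dual-GPU/SLI rows assume the
    # ~99% scaling from the post, i.e. effectively double the single-card figure.
    CARDS = {
        "780M (stock, 850/5000)":      (850, 1536, 1),
        "780M SLI (stock)":            (850, 1536, 2),
        "780M (OC, 950/6000)":         (950, 1536, 1),
        "780M SLI (OC)":               (950, 1536, 2),
        "660 Ti (stock, 915/6000)":    (915, 1344, 1),
        "670 (stock, 915/6000)":       (915, 1344, 1),
        "680 (stock, 1006/6000)":      (1006, 1536, 1),
        "690 (stock, 915/6000, dual)": (915, 1536, 2),
        "780 Ti (stock, 875/7000)":    (875, 2880, 1),
    }

    for name, (clock, cores, gpus) in CARDS.items():
        print(f"{name:30s} {clock * cores * gpus:>10,}")
    ```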
     
  31. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Yes, in some cases it is better. Some not. Scaling is a big factor in determining "real-world" performance with multiple GPU configurations.

    Either way, 780M SLI is not meant for 4K gaming. Neither is 880M SLI. The Maxwell 880MX [in SLI] (or whatever it's called) will potentially be enough to handle it. As I mentioned above, we're almost at the point where laptops can handle gaming on 4K monitors. 20nm Maxwell may be the first generation of GPUs that can do so without lowering settings.

    Pascal will definitely be able to handle gaming on 4K displays. I presume by 2016, we may actually see some laptops have 4K displays themselves.
     
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, the thing is, no cards can handle it right now. Unless you cram three or four 780Ti or Titan Black cards into a machine, the best-looking games aren't gonna be running at 4K on any kind of high settings. Crysis 3 at 4K is pretty much impossible with current tech too. So that's why I say it depends on the game. Running CoD 4? SURE. Running Bioshock 1? Hell yes. Dark Souls 2? Why not! Running Sleeping Dogs or Sniper Elite V2? They already super-sample to 4K in-game! Have at it! Two 780Ms will eat that up for breakfast, lunch, dinner, give you 60fps and burp satisfactorily now that you actually made them break a sweat. Running BF4, Crysis 3, Watch Dogs, BF3 (purely due to its unoptimization, rather than graphical fidelity), Star Citizen, Arma 2 and/or any DayZ mod, STALKER: Call of Pripyat? No way dude, game over. Those will sputter and die halfway close to max settings, and most of those aren't even because of super good visuals. In this respect, no current GPU is meant for 4K. And if you believe DICE, no current CPU is ready for 4K either, because BF4 is so CPU-bound and dependent that a guy with an overclocked i7-3930K and an AMD 6970 gets the same performance on ultra that I do with my 3.5GHz 4800MQ and OC'd 780Ms, both games at 1080p.

    In essence, you and I have a similar idea about it, just that I think absolutely nothing is ready for 4K, and it's not particularly the GPUs' fault, but the games'. If games were properly optimized, I'm fairly sure fewer than 20 of them would dip under 60fps with two 950/6000 780Ms at 4K, let alone with two Titans or two of any stronger card. That just isn't the case right now, unfortunately. And with the way games have been since the current-gen consoles' release, wasting resources and blaming it on the graphics being "just that good" (which they're not), I think even Pascal will have issues maintaining 60fps at 4K.
     
  33. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    I don't think anything is ready for 4K gaming either. I was speaking in regards to 2016-2017. By then, they will be.

    We'll have to wait and see what NVIDIA does. From the rumors of Maxwell, it should be playable. I'm not saying it will perform flawlessly - that's absurd. The GTX Titan Black and Z are the only cards that handle 4K gaming, and even those two cards aren't performing that great. Reviews show they have serious FPS dips and scaling issues.

    If the rumors are true, a single 880MX (or whatever) should perform similarly to a GTX 780 or 780TI at stock.
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    An 880MX matching a 780Ti at stock would be a welcome improvement. If Maxwell's hype is to be believed, a mobile card matching a 780 from the last gen is nothing big. Remember, it's to be expected that 2-3 way SLI of the last gen's flagship = the new gen's flagship. That's how it was with the 580 to the 680, and I think how it was with the 285 to the 480. So we'll actually be falling behind if we're *only* at 780 performance.

    And yeah, I know. The Titan Black is essentially a 6GB 780Ti as I put in my post; it isn't any big improvement, unless you're using the double-precision encoding block on it. The Titan Z should just be two Titans chucked together, I think. Not sure if it's two Titan Blacks or regular Titans, but since three regular Titans can't play Crysis 3 at 4k without dropping under 30fps... Yup.
     
  35. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,295
    Likes Received:
    3,042
    Trophy Points:
    431
    My 870M can play some games fine at 3200x1800 (obviously not cranking everything). I would imagine you can do some damage at 4K with 780M SLI.

    That said, 1080p and 1440p look great, no reason to fret.
     
  36. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    780M SLI overclocked benches like an overclocked 780 Ti.
     
    D2 Ultima likes this.
  37. Glzmo

    Glzmo Notebook Deity

    Reputations:
    476
    Messages:
    822
    Likes Received:
    86
    Trophy Points:
    41
    The higher resolution your screen has, the more you will need higher resolution textures for it to look good. This basically means that you'll probably want to play games on a "4K" screen with at least the highest possible texture settings, if not all settings maxed out (debatable and subjective, I guess).
    As for Anti-Aliasing, I still prefer playing games at lower resolutions with Anti-Aliasing (at least 4x RGSSAA or SGSSAA) as compared to higher resolutions without Anti-Aliasing as it gets rid of shimmering and other things. But it's very subjective, so it's probably best if you can try it out.
    Of course, running games maxed out on a "4K" screen with Anti-Aliasing will be ideal, but you aren't likely going to get that from the 780Ms in most of the more taxing games. Even top-end multi-card desktop setups struggle with that at times.
     
  38. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    That will be something for tweaked next gen cards.
     
  39. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    You'll need 3x Titan Black to game with a modicum of comfort at 4K with AA on, that's $3000+ in graphics processing power alone. All the power to you if you have the means or just don't care about money, but I'll wait for next gen. :p
     
  40. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Yes, you could get ripped off by NVIDIA, or you could buy a pair of dual-GPU 290X cards for that, or just run one, lol.
     
    octiceps likes this.
  41. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Hmmm that's actually pretty interesting to consider. For $3K, Titan Black tri-SLI or 2 x R9 295X2 (290X quad-Fire)? Which would be better at 4K and beyond? Would the 6GB frame buffer allow the Titan Blacks to meet or even pull ahead of the quad-Fire despite less theoretical GPU horsepower? And what about scaling and SLI being superior to CrossFire historically in terms of frame time variation and microstutter?
     
  42. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    This AnandTech review and this TweakTown review should help answer some questions.

    Basically, R9 295X2 CrossFire is a bit spotty in terms of performance: some games improve, some stay the same, and some just flat out fail.

    I'm assuming you mean the 295X2? Fair enough, although you may get better/more consistent performance with 2x 780 Ti in that case.
     
  43. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The AnandTech review is of a single R9 295X2.

    Yeah, CrossFire just seems pretty spotty in general, but man, when it works, it seems to work pretty well. Tomb Raider tripled in performance. Dafuq?

    [Chart: TweakTown "4K Showdown" - Sapphire Radeon R9 295X2 8GB vs. 295X2 8GB in CrossFireX]

    The PCPer review also showed about half the games they tested scaling 80-90%, which would already be very good for two GPUs, to say nothing of four.

    Radeon R9 295X2 CrossFire at 4K - Quad Hawaii GPU Powerhouse | PC Perspective

    He's talking about two R9 295X2 which is four 290X's in quad-Fire. It costs $3000 just like three Titan Blacks.

    Almost every single benchmark I've seen has shown 290X beating 780 Ti at 4K while 780 Ti wins at 2.5K and 2K.
     
  44. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,180
    Likes Received:
    17,889
    Trophy Points:
    931
    Personally I would go triple 290X with waterblocks if going for an insane system or a single 295x for an ultra compact.
     
  45. Glzmo

    Glzmo Notebook Deity

    Reputations:
    476
    Messages:
    822
    Likes Received:
    86
    Trophy Points:
    41
    AMD cards certainly perform better at higher resolutions or high-quality FSAA settings. CrossFireX scaling is usually also better than SLI scaling. Note, however, that AMD's drivers can be quite a hassle compared to NVIDIA's.
     
  46. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I have never heard before that CrossFire scaling is better than SLI... I have heard that AMD cards seem to handle ultra-high resolutions a bit better though, which is likely due to their greater memory bandwidth. The 780 and above cards should have dealt with that though.

    NVIDIA's scaling is also vastly superior in that it works in windowed mode. I *heard* that one game allowed borderless windowed to work with CrossFireX, but it's not supposed to work in windowed modes of any kind, so I am unsure if that game was somewhat of a fluke. As much as you may say "who cares about windowed while gaming?", a lot of people do XD. If possible, I run games in borderless windowed all the time, and I know quite a few people who only run games in windowed mode because they are always alt-tabbing for other stuff.
     
  47. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Gaming is the one thing I NEVER multitask on. Firstly because it's gaming so it's sacred, and second because when I'm gaming I'm too focused to do anything else LOL.
     
  48. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    AMD has had many instances of better scaling. However, how many of those frames were runt frames...?
    Now that their frame times are reportedly fixed, I'd be interested to see some comparisons.
     
  49. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Now, an AMD Gaming Evolved title is likely to have better scaling performance; I do agree with this. I think the best tests would be non-manufacturer-sponsored games and/or older NVIDIA-biased games (say, pre-2012), because as far as I remember, NVIDIA titles usually had equal performance with AMD cards, but weaker AMD cards would almost always beat NVIDIA cards in AMD games (like how Far Cry 3 favoured the living daylightballs out of AMD cards at launch; not sure about its comparisons now, as people usually forget about revisiting games a month or two after release).
     
    TBoneSan likes this.
  50. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I don't mean to sound like a fanboy, but SLI has always been the superior multi-GPU technology. Nvidia has had hardware frame metering built into its GPUs since at least G80, while it has taken AMD until last October, with the release of the Hawaii GPU and its XDMA CrossFire implementation, to fix their frame pacing woes. And it's not even a complete fix, as DX9 games have yet to receive the frame pacing improvements and probably never will. I'm guessing AMD just expects them to fade away.

    Take this from someone who returned a 3850 X2 for a cheaper and "technically" inferior 8800 GT back in the day, which ended up providing a much better and smoother experience despite the nominally lower FPS. The microstuttering on the dual-GPU Radeon card, even when the FPS counter showed 60+ (a lie, as we now know), drove me nuts. I vowed that would be the last time I would ever use CrossFire. I mean, I still get microstutter on my current SLI setup at 30-40 FPS, but it's just the microstutter inherent to AFR and not as severe or persistent. The 3850 X2 was probably tearing and dropping/runting frames left and right, too bad tools like FCAT didn't exist back then to show us this.

    This is probably why anecdotal evidence of microstutter seems to come more from CrossFire users than from SLI users, or at least this is what I've seen in forums over the years.

    Although I do have to give AMD credit for being much more flexible with which cards they allow you to CrossFire, while Nvidia is quite restrictive. (Could asymmetric GPU setups actually be part of the reason for CrossFire's poorer consistency? Interesting to consider; too bad no FCAT tests of discrete asymmetric setups have been done, only APU+dGPU.)

    So CrossFire is more flexible, but SLI is the more consistent experience, which is what I think ultimately matters most.
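    For anyone curious what the runt-frame/microstutter issue looks like in numbers: the FPS counter only reflects the average, while frame-to-frame variation is what you actually feel. A hypothetical sketch (Python, made-up frame times and an illustrative threshold; real FCAT analysis flags runts by how few scanlines a frame occupies on screen, not by a fixed millisecond cutoff):

    ```python
    import statistics

    # Hypothetical per-frame times in milliseconds (not real capture data):
    # a "60 FPS average" hiding an alternating long/short microstutter pattern.
    frame_times_ms = [28.0, 5.0, 27.5, 5.5, 28.2, 4.8, 27.8, 5.2, 28.1, 4.9]

    avg_ms = statistics.mean(frame_times_ms)
    print(f"Average FPS: {1000 / avg_ms:.1f}")  # looks like a healthy ~60 FPS

    # Frame-to-frame swing is what registers as microstutter.
    deltas = [abs(a - b) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    print(f"Mean frame-to-frame swing: {statistics.mean(deltas):.1f} ms")

    # Flag suspiciously short frames as likely "runts" (illustrative cutoff only).
    runts = [t for t in frame_times_ms if t < 0.5 * avg_ms]
    print(f"Likely runt frames: {len(runts)} of {len(frame_times_ms)}")
    ```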
     
    Mr. Fox and TBoneSan like this.