After years of trying, have AMD and Nvidia finally cracked the multi-GPU conundrum?

Manufacturers have been doubling up graphics chips to boost performance for almost as long as the 3D accelerator has been around, with 3dfx first enabling SLI on the Voodoo2 back in 1998. However, with very few exceptions, such as the GeForce 7950 GX2 in 2006, most dual-GPU cards have been beset with problems, from ridiculously noisy coolers to costing more than two single-GPU cards.
The biggest problem for multi-GPU graphics, though, has always been software support, in particular micro-stuttering. This issue evaded accurate description for a long time, because common benchmark tools, such as FRAPS, only log the frame rate once a second, so these micro-stutters simply weren't recorded.
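To see why a once-a-second frame rate log misses the problem, here's a minimal Python sketch (not FRAPS output; the alternating 10ms/23ms frame times are invented to mimic a micro-stuttering dual-GPU setup). The per-second average looks like a healthy 60fps, while the individual frame times swing wildly - exactly the pattern your eye picks up as stutter.

```python
# Simulated frame times: fast/slow alternation typical of micro-stutter
frame_times_ms = [10, 23] * 30  # 60 frames, roughly one second of footage

total_s = sum(frame_times_ms) / 1000
avg_fps = len(frame_times_ms) / total_s            # what a per-second log reports
worst_frame = max(frame_times_ms)                  # what the eye actually notices
spread = max(frame_times_ms) - min(frame_times_ms) # frame-to-frame inconsistency

print(f"average: {avg_fps:.0f} fps")       # looks like smooth ~60fps gameplay
print(f"worst frame: {worst_frame} ms")    # some frames take over twice as long
print(f"frame-time spread: {spread} ms")   # large spread = visible micro-stutter
```

The average alone says everything is fine; only per-frame timing reveals the judder, which is why later tools moved to logging every individual frame time.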
However, thanks to our ancient ancestors being prey animals, the human eye is remarkably sensitive to small movements, so while FRAPS can't pick up micro-stutters, they're still very evident and irritating to most people. As a result, for many years this very magazine would only extremely rarely recommend buying a dual-GPU card or multiple graphics cards, instead advising that you buy the fastest possible single-GPU card and overclock it.
Wind forwards to 2014, and it looks as though AMD and Nvidia have cracked the multi-GPU conundrum. Nvidia was the first firm to announce a new dual-GPU card, the Titan Z, its first since the GeForce GTX 690 way back in April 2012. As you might imagine from its name, the Titan Z sports two full-fat GK110 Titan GPUs, although to keep the power consumption and heat under control, Nvidia has, according to a leaked Asus press release, dropped the clock speed of the GPUs considerably.
In contrast, AMD has combined two hot and power-hungry GPUs on its new Radeon R9 295X2. Together, these GPUs require a ludicrous 50A of power and have their own hybrid air and water-cooling system to keep temperatures under control.
However, the biggest improvement in multi-GPU graphics isn't these monster cards per se; it's in the drivers and the work behind the scenes with game developers. That's because, with a few exceptions, most notably Battlefield 4 at 4K, you're unlikely to see much, if any, micro-stuttering on either of these cards. This is a huge deal, and arguably far more exciting than the announcement of two new dual-GPU cards.
Having spent a lot of time experimenting with both cards over the past two months, I wondered whether these improvements in software would translate into reduced micro-stuttering on other configurations too. After all, long experience has shown that dual-GPU cards, most notably the GeForce 9800 GX2, often receive different (and better) drivers than dual-card configurations.
I headed back to the Scan warehouse, armed with a huge shopping trolley and a long list of graphics cards to test. Suitably equipped, I soon found that AMD and Nvidia haven't just given their new dual-GPU cards an edge; most dual-GPU configurations also provided a healthy performance boost without micro-stuttering. The only real exception was three-way SLI, which is still prone to micro-stuttering in some games.
Dual-GPU cards are still overpriced, but the fact that dual-GPU configurations are now largely free from micro-stuttering issues is a huge step forwards, effectively meaning that 4K gaming is one step closer to becoming a problem-free experience.