NVIDIA has undoubtedly ruled the graphics computing market. The last two years have been a perfect storm: PC gaming has enjoyed a huge resurgence, and people who might have otherwise overlooked the segment found a purpose, thanks in no small part to the cryptocurrency boom. The 10 Series graphics cards from 2016 are the company's best-selling hardware lineup to date, but the world, and eager fans, seem divided on where GPU innovation is headed next.
To answer that, NVIDIA has unleashed two new contenders for your precious dollars: the GeForce RTX 2080 Ti and GeForce RTX 2080. This review focuses on the former, which is considerably more powerful, and considerably more expensive. If you're just after a more pristine gaming experience, then either card will make your current rig shine to some degree, but those willing to dish out the extra cash should know what they're getting into.
The GeForce RTX 2080 Ti attempts to break the mold with a host of new features, including ray-tracing and improved efficiency, all powered by the company's latest Turing architecture. It's an extraordinary push forward for both NVIDIA and the industry as a whole, and after witnessing the possibilities firsthand I came away cautiously optimistic.
The real question is mainstream adoption, and how willing consumers are to trade up from graphics cards they may be perfectly content with.
Decades In The Making
So what are you actually getting for the money? Obviously, ray-tracing is the biggest capability NVIDIA wants buyers to concentrate on for the immediate future, with claims that this technology is the 'holy grail' of graphical prowess. It realistically simulates the lighting of a scene and its objects through a hybrid implementation, pairing traditional rasterization with physically correct reflections, refractions, shadows, and indirect lighting rendered in real time. This was something deemed impossible for consumer graphics cards until now.
Deep Learning Super-Sampling (DLSS) is arguably the more attractive feature. It's a new form of antialiasing (AA) that utilizes deep learning and the RTX cards' dedicated tensor cores, producing a smoother and incredibly detailed image when a model is trained for a specific title. In other words, it's supposed to exceed current antialiasing methods (TAA, SMAA): NVIDIA trains a neural network against each supported game on its own supercomputers, then distributes the resulting model back to end-users. The immediate payoff of DLSS AA is a better-looking image and performance improved by roughly 30%-45%, theoretically reducing the amount of rendering power and physical resources involved.
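To make the performance claim concrete, here is an illustrative sketch (not NVIDIA's actual pipeline) of why rendering fewer pixels and intelligently upscaling saves work, under the simplifying assumption that shading cost scales roughly linearly with pixel count; the 1440p internal resolution is a hypothetical example, not a documented DLSS figure.

```python
def shading_cost(width, height):
    """Relative shading workload, proportional to pixels rendered."""
    return width * height

# Full native 4K render vs. a hypothetical lower internal resolution
native = shading_cost(3840, 2160)
internal = shading_cost(2560, 1440)

savings = 1 - internal / native
print(f"Pixels shaded at 1440p vs native 4K: {savings:.0%} fewer")  # → 56% fewer
```

A saving in that ballpark, minus the cost of running the upscaling network on the tensor cores, is broadly consistent with the 30%-45% uplift quoted above.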
NVIDIA also addresses numerous criticisms of its older Founders Editions. These reference versions are meant to be centerpieces showing how the company thinks its cards should appear, with an understated design that fits within any serious machine. However, the exteriors of previous cards, though tasteful, weren't great when it came to cooling due to their reliance on a single blower-style heatsink fan (HSF). Older units were loud and had a strict aversion to overclocking.
That old layout is ditched in favor of a dual-fan setup that breathes easier and runs a lot more quietly. The external appearance also scales back the flashiness for subdued practicality, with an aluminum backplate that's less angularly flamboyant and helps dissipate heat. One trait that is faithfully retained is the illuminated top edge that proudly displays the name, although now it reads "GeForce RTX".
All the Ports
You get one HDMI (2.0b) port and three DisplayPort (1.4a) outputs with HDCP 2.2 DRM encryption; the latter are theoretically able to handle resolutions up to 8K (7680×4320) over a single cable via DSC 1.2. VR users are also accounted for with an additional VirtualLink USB-C connector meant for next-generation headsets.
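Some quick arithmetic shows why compression is needed for that single-cable 8K claim: 8K carries exactly four times the pixels of 4K.

```python
# Pixel counts for the resolutions mentioned above.
pixels_4k = 3840 * 2160   # 8,294,400 pixels
pixels_8k = 7680 * 4320   # 33,177,600 pixels

print(pixels_8k // pixels_4k)  # → 4
```

Pushing four times the 4K pixel load down one DisplayPort 1.4a link is what makes DSC 1.2 stream compression part of the spec here.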
Specs and Power
Compared to Pascal-based GeForce/TITAN X models, the RTX is a moderate step up when it comes to technical figures. The reference specs for the RTX 2080 Ti are plentiful: 4352 CUDA cores, a 1350MHz base clock with a 1545MHz (1635MHz OC) boost, 11GB of Micron GDDR6 at 14Gbps, and 616GB/s of memory bandwidth over a 352-bit interface. What is unique is the addition of RTX-OPS, rated at 78 trillion (FE) or 76 trillion (AIB), alongside 10 Giga Rays/s, metrics provisioned solely for ray-tracing performance.
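The quoted bandwidth figure checks out from the other two memory specs: a 352-bit bus moving 14 gigabits per second per pin works out to exactly 616 gigabytes per second.

```python
# Memory bandwidth sanity check for the RTX 2080 Ti's GDDR6 specs.
bus_width_bits = 352    # memory interface width
data_rate_gbps = 14     # effective GDDR6 rate per pin, in Gbps

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # bits → bytes
print(bandwidth_gbs)  # → 616.0
```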
Another thing to note is required power, with the RTX 2080 Ti needing quite a bit of it to run properly. NVIDIA recommends a 650W minimum PSU, since dual 8-pin PCIe power connectors must be hooked up in order to boot; something like an EVGA SuperNOVA 1000 G+, or at minimum an 800W power supply, is ideal. Power draw at initial load is 260 watts, and idling draws a noticeable 45 watts. That's a lot of juice, although there are plans to bring the power draw down later on.
NVIDIA wanted to know how I was going to test the RTX, because these cards aim to be the absolute choice for both 4K gaming and HDR fidelity, whether you have a benchmark DIY machine or want a legitimate alternative to a regular home entertainment arrangement. For this, I borrowed both a 27-inch ASUS ROG Swift PG27UQ monitor and an Epson Pro Cinema LS10000 4Ke Laser Projector, as these displays could accentuate the demands of current AAA titles. This is expensive gear largely reserved for the immersive PC/AV elite, and the timely loaners paired with my current computer to push the cards to their extremes.
So how well does it run in 4K? I'll say up front that if you're comfortable with the 1440p standard for the indefinite future, you might be happier saving your money and getting the RTX 2080, or sticking with the existing Pascal GPUs on a tighter budget. Whether you look at the GTX 1070, the 1080 Ti, or especially the TITAN Xp, the overall experience probably doesn't warrant an upgrade here; those are excellent cards today. That said, the RTX 2080 Ti also puts those cards to shame if you want every enhancement enabled beyond maximum with no questions asked, so it's going to be a matter of preference if you want the newest thing in your life.
I tested a number of games, including Shadow of the Tomb Raider, Call of Duty: Black Ops 4, Final Fantasy XV, DOOM, and Far Cry 5. Technical demonstrations, namely the Star Wars Reflections DLSS demo, the Final Fantasy XV benchmark, and Epic Games' Infiltrator DLSS demo, were also provided to experience the RTX in action under controlled conditions.
That said, the RTX models can handle 4K PC gaming and handily trounce their predecessors, but they're somehow still inadequate for graphically intensive titles at 4K/60Hz. In fact, only a few of the games were able to hit and maintain a 4K/60fps minimum at the default highest (non-customized) presets, let alone reach the coveted 144Hz mark.
Only the RTX 2080 Ti is appropriate for the task; the regular RTX 2080 merely matches the GTX 1080 Ti and certainly won't cut it without some compromise.
Tomb Raider is billed as a flagship title to show off the card's best visuals, but I wasn't able to test its full capability, as ray-tracing and DLSS won't be active until next month at the earliest (and responsibility for implementation still falls on the developers of individual titles). NVIDIA informed the press that we'll have to wait until Microsoft rolls out its Windows 10 October 2018 Update. Otherwise, Lara's latest adventure proved a challenge even for the 2080 Ti, averaging between 37-51fps at both the highest and optimized settings. We can only speculate whether the advent of ray-tracing will improve things.
Final Fantasy XV is another title being pushed by Square Enix for RTX 4K DLSS, but only through a press benchmark demo for now. Regardless, the DLSS showcase topped out at around 57fps, while actual in-game figures maxed out at 44fps. The 2080 Ti fared better elsewhere, averaging 71fps in Far Cry 5, 69fps in CoD:BO4, and 176fps in DOOM (Vulkan API).
The other RTX video demos, such as Infiltrator, had the 2080 Ti tap out at 76fps before gradually leveling off at 55fps for the remainder. Meanwhile, the entertaining Star Wars Reflections real-time DLSS video was locked at 24fps but looked great whether it was viewed at 2560×1440 or 3840×2160. The takeaway is that both consumer-grade cards appear comfortable processing monster workload combinations of soft shadows, reflective lighting, and hue intensity all at once.
After testing and playing around with the GeForce RTX 2080 Ti for a week, it's apparent that NVIDIA is betting a lot on the pure 'potential' of its Turing architecture. Since its global debut at Gamescom, there's been growing concern that the RTX series is more proof-of-concept than worthwhile investment. These assumptions aren't helped by the fact that people are eagerly picking up the current GTX lineup and getting much of the performance they want right now. This is either an amazing time for the majority of PC gamers out there, or a very uncertain jump for enthusiasts.
Another sticking point is the ray-tracing technology itself, or the lack of it at the time of this writing. The headlines made everything sound like the final mile in graphical hardware, and the added implementation of DLSS does look impressive. However, the practical benefits remain unknown beyond the meticulous demos I've seen so far. The RTX also faces the unusual predicament of competition, not from AMD this time, but from NVIDIA itself. All of this rests on the promise of better, more realistic presentations sometime later, with 25 games in the pipeline so far. That's a gamble if you already own a 1080 Ti or any contemporary iteration of the TITAN.
Currently, both cards perform wonderfully and should please most who adopt them; they just don't represent that profound an evolution of high-end graphics at the moment. Regardless, you'll be paying for the privilege, since the Founders Edition models cost a few hundred dollars extra at launch: $1,199 for the RTX 2080 Ti and $799 for the RTX 2080, for the chance at pure 4K gaming. I know NVIDIA is hard at work to deliver on the RTX, and it is an improvement. For now, a little patience might go a long way in seeing how things turn out later this year.