But beyond the pure raw performance expressed as a combination of core throughput and memory bandwidth, NVIDIA is promoting four rendering “features” that it has worked on. None of them are key hardware features; rather, they are very well polished software features that have a true impact on gaming. Let’s go over the list.
This is a new twist on anti-aliasing (AA), or the ability to suppress and filter out the unwanted “jaggies” that give away the fact that an image is rendered in real time. Over the years, 3D vendors have invented quite a few techniques, which have been steadily improved upon and augmented.
The overall principle is to detect where AA needs to be applied, then blend in the surrounding pixels (or “samples”) to make the image as smooth as possible without making it blurry in the process. This is hard stuff, and NVIDIA now licenses its MSAA technology. Multi-Frame AA (MFAA) still works on the same principle, but the surrounding pixels can be read from multiple frames.
This should provide a significant performance boost (30%, says NVIDIA) because instead of using more “samples” in the current frame, you can read data from a previous frame. That’s pretty smart, and not really obvious to implement (lots of potential caveats), even if you know how the camera has moved from one frame to the next. Also, there may be cases where no usable samples exist in the previous frame, in which case it is always possible to revert to regular MSAA…
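To make the idea more concrete, here is a minimal sketch (in C++, with made-up names, and ignoring camera motion and reprojection entirely) of how two samples per frame, with the sub-pixel pattern alternating every frame, can be blended with the previous frame’s result to approximate a four-sample resolve:

```cpp
// Illustrative sketch of the multi-frame idea: two sub-pixel samples per
// frame, with the pattern swapped every other frame, blended 50/50 with the
// previous frame's resolved pixel. Names and numbers are made up; a real
// implementation would reproject the history using camera motion and fall
// back to plain MSAA where no usable history exists.
#include <cstdio>
#include <vector>

struct Color  { float r, g, b; };
struct Offset { float x, y; };

// Hypothetical stand-in for shading the scene at a sub-pixel position.
Color shade(float x, float y) { return { x * 0.01f, y * 0.01f, 0.25f }; }

Color resolve_pixel(int px, int py, int frame_index,
                    const std::vector<Color>& history, int width)
{
    // Two offsets per frame; over two frames they cover a 4-sample pattern.
    static const Offset even[2] = { {0.25f, 0.25f}, {0.75f, 0.75f} };
    static const Offset odd[2]  = { {0.75f, 0.25f}, {0.25f, 0.75f} };
    const Offset* pattern = (frame_index % 2 == 0) ? even : odd;

    // Average this frame's two samples.
    Color current = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 2; ++i) {
        Color s = shade(px + pattern[i].x, py + pattern[i].y);
        current.r += 0.5f * s.r;
        current.g += 0.5f * s.g;
        current.b += 0.5f * s.b;
    }

    // Blend with last frame's resolved pixel (no reprojection in this sketch).
    const Color& prev = history[py * width + px];
    return { 0.5f * (current.r + prev.r),
             0.5f * (current.g + prev.g),
             0.5f * (current.b + prev.b) };
}

int main()
{
    const int width = 4, height = 4;
    std::vector<Color> history(width * height, Color{0.5f, 0.5f, 0.5f});
    Color c = resolve_pixel(1, 1, /*frame_index=*/0, history, width);
    std::printf("resolved pixel: %f %f %f\n", c.r, c.g, c.b);
}
```

The hard part, of course, is deciding when the history can be trusted, which is exactly where the fallback to regular MSAA mentioned above comes in.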
What NVIDIA means by Dynamic Super Resolution (DSR) is that the image is rendered in a large buffer (let’s say 4K), then down-sampled to 1080p by applying a filter that takes advantage of the extra information when computing the final image seen by the user. This is not unlike the super-sampling AA technique from a while ago, and the principle has proven to work: with more information, it is possible to compute a much cleaner image, no question about it. The good news is that this is something that can be turned on and off in the driver.
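To illustrate the principle (and not NVIDIA’s actual filter, which is smarter than this), here is a bare-bones C++ sketch of rendering into a buffer twice the size of the target and averaging 2×2 blocks back down:

```cpp
// Illustrative sketch of the DSR principle: render into a buffer larger than
// the display, then filter it down to the display resolution. A simple 2x2
// box filter is used here as a placeholder for the real filter.
#include <cstdio>
#include <vector>

// Downsample a (2*w) x (2*h) single-channel image to w x h by averaging 2x2 blocks.
std::vector<float> downsample_2x(const std::vector<float>& hi, int w, int h)
{
    std::vector<float> lo(w * h);
    const int hw = 2 * w;  // width of the high-resolution buffer
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sum = hi[(2 * y)     * hw + 2 * x] + hi[(2 * y)     * hw + 2 * x + 1]
                      + hi[(2 * y + 1) * hw + 2 * x] + hi[(2 * y + 1) * hw + 2 * x + 1];
            lo[y * w + x] = 0.25f * sum;  // four high-res samples feed one output pixel
        }
    }
    return lo;
}

int main()
{
    // Pretend the renderer produced an 8x8 "4K" buffer for a 4x4 "1080p" target.
    std::vector<float> hi(8 * 8, 1.0f);
    std::vector<float> lo = downsample_2x(hi, 4, 4);
    std::printf("downsampled pixel (0,0) = %f\n", lo[0]);
}
```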
As its name indicates, this is a technique that lets developers use global illumination (which means that light bounces from one surface to another and is “transported” across the scene) to compute much more realistic images. Back in the day, even pre-rendered computer graphics (CG) movies often emulated global illumination by manually placing lights in key locations. That kind of manual work is certainly not desirable in real-time rendering, so having the real thing is the best way to go.
Voxel Global Illumination (VXGI) is not something that can be turned on in the driver, since it is more of a library that developers can use. The technique is based on the work of Cyril Crassin, an NVIDIA researcher who has published on global illumination (GI) since at least 2011. Here’s a better overview of his work.
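To give a rough idea of what “voxel” GI means in practice, here is a heavily simplified C++ sketch of the gather step described in that line of research: direct lighting is stored in a 3D voxel grid (plus coarser, pre-filtered copies of it), and indirect light is collected by stepping cones through the grid, reading from coarser copies as the cone widens. Everything below is illustrative; VXGI itself is a far more sophisticated, production-grade library.

```cpp
// Toy version of voxel cone tracing: march a cone through a voxel grid that
// stores radiance and opacity, switching to coarser mips as the cone widens,
// and accumulate radiance until the cone is occluded. Values are dummies.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Voxel { float radiance; float opacity; };

// One mip level of the voxel grid (coarser levels have fewer, bigger voxels).
struct VoxelMip {
    int size;                 // voxels per axis
    std::vector<Voxel> data;  // size^3 voxels
    Voxel sample(float x, float y, float z) const {
        int ix = std::min(size - 1, std::max(0, int(x * size)));
        int iy = std::min(size - 1, std::max(0, int(y * size)));
        int iz = std::min(size - 1, std::max(0, int(z * size)));
        return data[(iz * size + iy) * size + ix];
    }
};

// Accumulate radiance along one cone in normalized [0,1] scene coordinates.
float trace_cone(const std::vector<VoxelMip>& mips,
                 float ox, float oy, float oz,   // cone origin
                 float dx, float dy, float dz,   // unit direction
                 float aperture)                 // how fast the cone widens
{
    float radiance = 0.0f, transmittance = 1.0f, t = 0.05f;
    while (t < 1.0f && transmittance > 0.01f) {
        float diameter = aperture * t;
        // Wider cone -> read from a coarser (pre-averaged) mip level.
        int mip = std::min(int(mips.size()) - 1,
                           std::max(0, int(std::log2(std::max(1.0f, diameter * mips[0].size)))));
        Voxel v = mips[mip].sample(ox + dx * t, oy + dy * t, oz + dz * t);
        radiance      += transmittance * v.radiance;
        transmittance *= (1.0f - v.opacity);
        t += 0.5f * diameter + 0.01f;  // take bigger steps as the cone widens
    }
    return radiance;
}

int main()
{
    // Two mips filled with dummy values: a 16^3 grid and its 8^3 average.
    std::vector<VoxelMip> mips = {
        { 16, std::vector<Voxel>(16 * 16 * 16, Voxel{0.1f, 0.05f}) },
        { 8,  std::vector<Voxel>(8 * 8 * 8,    Voxel{0.1f, 0.10f}) },
    };
    float indirect = trace_cone(mips, 0.5f, 0.5f, 0.5f, 0.0f, 1.0f, 0.0f, 0.5f);
    std::printf("indirect light for one cone = %f\n", indirect);
}
```

In a real renderer, several such cones are traced per pixel (or per group of pixels) to approximate the hemisphere around the surface normal, which is where most of the cost and the cleverness lie.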
The project eventually ended up becoming a library that NVIDIA licenses to its partners. The research is quite impressive and has solved or worked around nearly all of the biggest problems associated with real-time GI, so it is production-ready. It is also somewhat complicated, and it was considered for, then removed from, Unreal Engine 4 at some point. Others like Crytek use simpler forms of GI, such as light propagation volumes, with several evolutions. I believe that Fable uses a technique similar to Crytek’s.
In any case, and assuming that Epic’s decision was based on the fact that current game consoles are underpowered for the technique, this should work really well on PC, especially going forward as GPU speed and memory capacity continue to increase quickly.
With virtual reality (VR) in full swing, it’s nice to see NVIDIA add extra support for this very specific use case. According to NVIDIA, VR Direct tries to reduce latency as much as possible, which is key because latency is the “mortal enemy” of VR. NVIDIA had already addressed latency with some success in the context of game streaming, and it continues to attack the problem, this time for VR. Latency is the main culprit behind VR motion sickness, so it’s a big deal! NVIDIA also points out that it will offer multi-GPU support in the context of virtual reality. Since stereoscopic rendering is involved, having more GPUs can always help boost performance.
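Just to illustrate the structure of the multi-GPU idea (and not NVIDIA’s actual VR Direct or SLI API, whose names and interfaces are different), here is a toy C++ sketch in which each eye’s view is produced by its own worker, then both images are handed off for presentation as late as possible:

```cpp
// Toy sketch of "one GPU per eye": the left and right views are largely
// independent, so they can be rendered in parallel and composited just
// before scan-out. std::thread stands in for submitting work to two GPUs;
// all names here are made up for illustration.
#include <cstdio>
#include <thread>

struct EyeImage { int eye; /* pixel data would live here */ };

// Hypothetical stand-in for submitting one eye's frame to a specific GPU.
EyeImage render_eye_on_gpu(int gpu_index, int eye)
{
    std::printf("GPU %d renders eye %d\n", gpu_index, eye);
    return EyeImage{eye};
}

int main()
{
    EyeImage left{}, right{};

    // Render both eyes concurrently, one worker ("GPU") each.
    std::thread left_job ([&] { left  = render_eye_on_gpu(0, 0); });
    std::thread right_job([&] { right = render_eye_on_gpu(1, 1); });
    left_job.join();
    right_job.join();

    // Hand both images to the headset as late as possible to keep
    // motion-to-photon latency low.
    std::printf("presenting eyes %d and %d\n", left.eye, right.eye);
}
```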
It is clear that the new GeForce GTX 970 and GeForce GTX 980 GPUs deliver a significant speed increase at a reasonable power budget, thanks to architectural hardware improvements. However, it is equally clear that NVIDIA is aggressively moving (and has been for some time) into making its software much smarter and more accessible to developers. Before, “graphics software” was often associated with “drivers”, which essentially tried to run apps faster, even if that meant doing things behind developers’ backs.
Today, “software” means that NVIDIA has to leverage its huge pool of graphics R&D talent to enable developers with things like physics, global illumination and more, in a package like GameWorks. That is the right move, as it is becoming increasingly difficult for many developers to justify the cost of doing such research themselves, and therefore to use GPUs to their full potential. As a side effect, developers who use those technologies give NVIDIA a built-in advantage that compounds whatever advantage it may already have in technical terms.