Nvidia Reflex Latency Analyzer
tl;dr: Nvidia focuses on system latency in games, i.e. the delay between command input and image output: 360 Hz monitors with the Reflex Latency Analyzer make this latency visible, and the Reflex SDK can specifically reduce it in selected games. A look at the possibilities and the great potential.
Update 10/22/2020 9:38 p.m.
The common currency for assessing a system's performance in games is frames per second (FPS) and the variance of the intervals between individual frames (frame times). Both key figures can easily be measured and understood with the appropriate tools, or even directly in the game: the higher the FPS and the more even the intervals between frames, the smoother the game feels.
FPS can be understood as a measure of system throughput. There is, however, another currency that many players are familiar with but which is usually not measurable: latency, i.e. the delay between the input of a command (e.g. a mouse click for a shot) and the output of the resulting action on the screen. Latency is the measure of how responsive the system is. Latency and FPS should not be considered in isolation; the two are related to each other.
Latency has always been omnipresent in the question "VSync on or off?". Activated VSync, i.e. outputting one complete frame per refresh cycle of the display, tends to increase latency ("input lag"): fully rendered frames containing (by then old) mouse and keyboard inputs wait for the next refresh cycle, or, in extreme cases, the same frame has to be shown again because the PC cannot yet deliver a new one for the display refresh. On a 60 Hz monitor, this can add 16.7 ms, or even a multiple of that, to the system latency. The discussion also touches on the latency of the mouse sensor and its polling rate, and on the speed of the panel.
As a rule, however, players can only feel latency in the game; measuring it requires recording both the time of the input and the time of the output with millisecond accuracy.
The following graphic illustrates the complexity of the chain from mouse click or keyboard input to visible output on the screen. The delay and its influencing factors can be roughly divided into four areas:
It becomes very clear that the actual rendering process is only a small link in the chain: if a game runs at 100 FPS, rendering one frame on the GPU takes only 10 ms. The delay added on a 60 Hz display by a just-missed refresh cycle with VSync active is, at 16.7 ms, almost 70 percent greater.
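The arithmetic behind this comparison can be sketched in a few lines; the helper name below is illustrative, not taken from any Nvidia tool:

```python
def frame_time_ms(fps: float) -> float:
    """Time one frame takes at a given frame rate (or refresh rate)."""
    return 1000.0 / fps

render_ms = frame_time_ms(100)        # 10.0 ms per rendered frame at 100 FPS
vsync_penalty_ms = frame_time_ms(60)  # ~16.7 ms per missed refresh at 60 Hz

# The missed-refresh penalty exceeds the render time by roughly 67 percent.
overhead = vsync_penalty_ms / render_ms - 1.0
print(f"render: {render_ms:.1f} ms, vsync penalty: {vsync_penalty_ms:.1f} ms, "
      f"overhead: {overhead:.0%}")
```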
Even without knowing the latency of each link in the chain, some measures to reduce the overall latency are obvious: a faster CPU, a faster GPU, a faster display, or disabling VSync. However, a look at the chain also makes it clear that its links can influence each other, and that changing one link can have far greater effects than an isolated observation suggests.
A prominent link in this context is the queue of frames handed from the CPU to the graphics card for rendering, the so-called "render queue". The weaker the GPU, the longer this queue potentially becomes, and the older the user inputs contained in the frames are by the time they are rendered. It becomes apparent that a faster GPU can reduce not only the delay of the rendering itself but also the delay caused by the queue. The influence of the operating system, driver, and game engine on latency is also obvious. What was missing so far was transparency.
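A toy model of this effect, with invented numbers purely for illustration: every frame already waiting in front of the GPU adds one GPU frame time to the age of the inputs baked into the frame that is finally displayed.

```python
def queued_input_age_ms(queue_depth: int, gpu_frame_ms: float) -> float:
    """Latency added by the render queue plus the render itself."""
    return (queue_depth + 1) * gpu_frame_ms

# A weak GPU (20 ms/frame) vs. a fast GPU (10 ms/frame), each with
# three frames already queued: the faster GPU halves BOTH the render
# time and the queueing delay.
slow = queued_input_age_ms(3, 20.0)   # 80.0 ms
fast = queued_input_age_ms(3, 10.0)   # 40.0 ms
print(slow, fast)
```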
Nvidia Reflex targets precisely this point. With the Reflex Latency Analyzer and the new overlay in GeForce Experience, latencies become visible for the first time without expensive external equipment, and the Reflex SDK is supposed to shorten the render queue by intervening directly in the engine.
Describing Nvidia Reflex is easy at first glance. Nvidia itself says: Reflex comprises technologies for optimizing and measuring latencies and thus targets esports. Because these technologies range from software measuring instruments to SDKs for game optimization to new monitors and mice, a few more words are needed to fully answer the question.
Basically, Nvidia Reflex technologies can be divided into three categories: measuring system latency (the Reflex Latency Analyzer in new 360 Hz monitors and compatible mice), displaying latencies (the new performance overlay in GeForce Experience), and reducing latency (the Reflex SDK for game engines).
Technologies from all three categories can be used together, but do not have to be. An example: Fortnite.
The title integrated the Nvidia Reflex SDK right at the platform's launch. All users of a GeForce GTX 900 (Maxwell) or newer can use this mode to reduce latency, and with the new GeForce Experience the render latency can also be displayed. However, only one of the new 360 Hz monitors with Nvidia G-Sync can measure the effect of the Nvidia Reflex SDK on the overall system latency.
The display of the render latency is immediately available on every system that supports GeForce Experience, while the new monitors and mice can determine the latency on any computer regardless of the installed graphics card and software, i.e. also on the Windows desktop or with graphics cards from AMD.
With the Latency & Display Analysis Tool (LDAT), Nvidia equipped selected editorial offices (ComputerBase was not among them) with equipment to measure system latency. The system records the time of the mouse click and of a resulting high-contrast change on the screen.
The first 360 Hz monitors with Nvidia G-Sync, announced for the end of 2020, will include such a measuring device in the form of the Reflex Analyzer. For this purpose, the mouse is connected to the system via a special USB port on the display, and an area of the screen in which the display should watch for changes, usually a muzzle flash resulting directly from the mouse click, can be defined via the OSD. The monitoring function resides in the new G-Sync module.
The monitors can display the delay between the mouse input and the output of the result on screen. Alternatively, the result is reported via the performance overlay in GeForce Experience.
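Conceptually, the measurement works as described above: record the click timestamp, watch a screen region for a high-contrast change, and report the difference. The following sketch uses invented frame data and thresholds purely for illustration:

```python
CLICK_TIME_MS = 100.0
BRIGHTNESS_THRESHOLD = 200  # region counts as "flashed" above this value

# (timestamp in ms, mean brightness of the watched region) per refresh
frames = [(104.2, 35), (107.0, 40), (109.8, 38), (112.6, 230), (115.4, 180)]

def measure_latency(click_ms, frames, threshold):
    """Return the delay from click to the first bright frame, or None."""
    for t, brightness in frames:
        if brightness > threshold:
            return t - click_ms
    return None

# Here the muzzle flash appears 12.6 ms after the click.
print(round(measure_latency(CLICK_TIME_MS, frames, BRIGHTNESS_THRESHOLD), 1))
```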
Asus ROG SWIFT PG259QNR (Image: Asus)
The new performance overlay for latencies is available from version 3.20.6.5, available for everyone to download from October 21. From a GeForce GTX 900 (Maxwell) onwards, the overlay shows the current FPS and the render latency ("render queue" plus rendering) in games. If one of the new monitors with Nvidia Reflex is used, the combined PC + display latency is added, and potentially the mouse latency as well. The latter is either reported by a mouse with Nvidia Reflex itself or taken as an average value from a database of 30 known esports mice. GeForce Experience also averages the last 20 measured values for the combined PC + display latency and, including the average mouse latency, for the system latency.
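The averaging over the last 20 measured values can be sketched with a simple rolling mean; this is an assumption about the mechanism, not Nvidia's code:

```python
from collections import deque

class RollingLatency:
    """Rolling average over the most recent latency samples."""

    def __init__(self, window: int = 20):
        self.samples = deque(maxlen=window)  # oldest samples drop out

    def add(self, latency_ms: float) -> None:
        self.samples.append(latency_ms)

    def average(self) -> float:
        return sum(self.samples) / len(self.samples)

meter = RollingLatency()
for ms in range(1, 31):      # 30 samples arrive, only the last 20 are kept
    meter.add(float(ms))
print(meter.average())       # mean of samples 11..30 = 20.5
```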
The new latency overlay for GeForce Experience
The third pillar of the Nvidia Reflex ecosystem is an SDK available to game developers. With the Nvidia Reflex SDK, the Ultra Low Latency mode introduced in the driver in August 2019 moves into the game engine: games themselves should use Reflex to request input for the next frame to be rendered at the latest possible moment, namely shortly before the GPU is ready for the next render job.
Ideally, Nvidia wants to eliminate the so-called "render queue" entirely, i.e. the queue of frames waiting in front of the GPU whose input commands grow older from frame to frame. Reflex acts like G-Sync between game/CPU and GPU. Relieving the CPU should also allow the input commands to be sampled later.
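The effect described above can be illustrated with a simple latency budget; the millisecond figures are hypothetical and chosen only to show the mechanism:

```python
def pipeline_latency_ms(cpu_ms: float, queue_ms: float, gpu_ms: float) -> float:
    """Latency contribution of CPU work, render queue, and GPU render."""
    return cpu_ms + queue_ms + gpu_ms

# Without Reflex: inputs sampled at the start of CPU work also age
# through the render queue before the frame is rendered.
traditional = pipeline_latency_ms(cpu_ms=5.0, queue_ms=15.0, gpu_ms=10.0)

# With Reflex, per the description above: the queue is ideally empty and
# input is requested just before the GPU is ready for the next render job.
reflex = pipeline_latency_ms(cpu_ms=5.0, queue_ms=0.0, gpu_ms=10.0)

print(traditional, reflex)   # 30.0 vs. 15.0 ms in this toy budget
```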
Gamers can activate Nvidia Reflex Low Latency in the graphics menu of supported games. The greatest benefit should occur in the GPU limit. The reason is obvious: if the GPU is the bottleneck, render jobs pile up in front of the graphics processor, and the player's inputs that are taken into account grow older and older in the queue.
The FPS remain unaffected, because the GPU bottleneck itself remains untouched. Alternatively, Reflex can also be activated together with "Boost", which slightly overclocks the GPU. That can then slightly affect the FPS.
On the next page: test results, benchmarks and conclusion