The Nvidia Geforce RTX 3000 series has hidden hotspot sensors

As circuits have gained increasingly advanced turbo functions, the number of sensors has skyrocketed. Everything from energy consumption to temperatures is constantly measured to find an optimal balance between these parameters and to offer the highest possible clock frequencies. In circuits consisting of billions of transistors, this is a science in itself, and it is therefore not obvious which values should be reported to users.

Now Igor’s Lab has taken a closer look at Nvidia’s graphics circuit GA102 and managed to read values from its hotspot sensors. As the name suggests, these sensors sit in the hottest parts of the circuit and thus form an important basis when Nvidia’s algorithms regulate voltages and clock frequencies. That Nvidia’s graphics circuits have such temperature sensors is no surprise, as they have been present in circuits for many years.

Traditionally, temperatures are measured and reported on the underside of the circuit or along its edges, something AMD, for example, calls “Edge Temperature”. It is also this value that Nvidia uses to control the fan curve on Geforce graphics cards, and the value you as an end customer can read in monitoring programs such as GPU-Z and HWiNFO.


In connection with the 2019 launch of the Radeon VII, AMD chose to start reporting hotspot temperatures, something the company itself calls “Junction Temperature”. That was also the starting shot for values from these sensors being used to control the fan curve on the company’s graphics cards.

Temperature measurements from SweClockers’ test lab

Graphics card              Temperature (edge)   Temperature (hotspot)
AMD Radeon VII             76 °C                110 °C
AMD Radeon RX 6900 XT      82 °C                94 °C
AMD Radeon RX 6800 XT      77 °C                96 °C
AMD Radeon RX 6800         67 °C                80 °C
AMD Radeon RX 5700 XT      84 °C                100 °C
AMD Radeon RX 5700         78 °C                89 °C
Nvidia Geforce RTX 3090    68 °C                –
Nvidia Geforce RTX 3080    76 °C                –
Nvidia Geforce RTX 3070    74 °C                –

The difference from the traditional way of measuring is in some cases enormous. When AMD launched the Radeon VII, the delta was a massive 34 °C, while for the more modern Radeon RX 6000 series the difference is 12–19 °C. That AMD has shrunk the delta between edge and junction is down to an optimized circuit design, where the company has avoided packing too high a transistor density, with high voltages and clock frequencies, onto too small an area.
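The deltas quoted above follow directly from the table; a minimal sketch in Python (values taken from the table, names chosen for illustration) shows the calculation:

```python
# Edge and hotspot temperatures (°C) for AMD's cards, from the table above.
readings = {
    "AMD Radeon VII":        (76, 110),
    "AMD Radeon RX 6900 XT": (82, 94),
    "AMD Radeon RX 6800 XT": (77, 96),
    "AMD Radeon RX 6800":    (67, 80),
    "AMD Radeon RX 5700 XT": (84, 100),
    "AMD Radeon RX 5700":    (78, 89),
}

# Delta = hotspot minus edge temperature for each card.
deltas = {card: hotspot - edge for card, (edge, hotspot) in readings.items()}

for card, delta in deltas.items():
    print(f"{card}: delta = {delta} °C")
```

Running this confirms the 34 °C delta for the Radeon VII and the 12–19 °C range for the RX 6000 series.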

When Igor reports results for a Geforce RTX 3090, the difference between the traditional value and the hotspot is between 11 and 14 degrees, that is, roughly the same difference as on AMD’s graphics cards. The practical difference, however, is that AMD uses its hotspot sensors to regulate the fan curve, while Nvidia uses the traditionally measured value for that purpose.


There is no right or wrong in which value is reported. It can, however, be noted that this type of hotspot sensor has been used by AMD, Intel and Nvidia to an increasing extent since the introduction of turbo frequencies. The difference is that AMD has chosen to start reporting these values and uses them in practical applications relevant to end users. Whether Nvidia will follow suit or continue to rely on the traditional way of measuring to control fan curves remains to be seen.

For those who want to enjoy even more temperature sensors on Nvidia’s graphics cards, an upcoming version of HWiNFO will gain support for reading them.
