Igor’s Lab dissects Intel graphics cards

Two is well on its way to becoming three as Intel shifts into gear toward entering the graphics card market alongside AMD and Nvidia. Intel’s scalable Xe graphics architecture has been on the agenda for a couple of years, and more information has leaked at regular intervals. At present, the architecture appears both in server applications and as integrated graphics in the mobile “Tiger Lake” family. Variants that appeal to performance-oriented SweClockers members, however, are still missing.

In November, however, Intel took action and launched its first dedicated graphics chip in over two decades: the Iris Xe Max. With its low-power Xe-LP architecture and 96 execution units (EU), it matches its integrated sibling. Since the launch, it has emerged that dedicated graphics cards based on the chip, code-named DG1 and built both by Intel and by partners, will be available exclusively in ready-made systems. Now Igor Wallossek has managed to get his hands on one.

The veteran behind Igor’s Lab procured the hardware through contacts, but to protect the source, the published images and information omit details that would identify the surrounding system. Wallossek describes it as a locked Core i7 model with integrated graphics paired with a Z390 motherboard in Mini-ITX format, which gives an idea of the system’s capability on the processor side.


One of the most striking findings of the test, however, is that the prebuilt system is absolutely necessary for the card to work at all. The locks mean the card must be paired with specific Intel processors and motherboards with display outputs, which cements its role in OEM systems.

Moreover, all 3D tests crashed either right after loading or shortly thereafter with fatal errors. Real game benchmarks of older, undemanding titles? No luck. Office work, browsing and a bit of media playback were all that could be coaxed out of the DG1 with any stability; nothing more worked, whether 3DMark or anything else.

The overall experience is no bed of roses either: Wallossek notes, for example, that the card’s own display outputs cannot be used – the screen must be connected to the motherboard. Drivers and performance tests also run into trouble. 3DMark and the GPGPU mode in Aida64 are among the things that refuse to run, while lighter media and office tasks work.

Under the hood of the DG1

Igor Wallossek also takes out the screwdriver and opens up the Intel-branded graphics card to take a peek under the hood. It is clearly a sparsely populated circuit board, with four LPDDR4 memory chips providing a total of 8 GB of video memory at 2,133 MHz. The Xe chip sits in the center, with components for voltage regulation scattered across the surface.


One short side is adorned with the four unusable HDMI and DisplayPort connectors, while the long side carries a PCI Express 4.0 x16 connector. This handles all power delivery and communication, though it is limited to eight PCI Express lanes – a reminder of the card’s mobile heritage. On top of the chip rests a heat sink and a metal shroud fitted with an 80 mm fan.

At low load the fan spins at 850 RPM and the chip holds a temperature of 30°C, ticking along at 600 MHz with a power draw of about 4 watts. Under load, the clock frequency rises to a modest peak of 1,550 MHz. At about 20 watts for the chip alone, the temperature climbs to 50°C, despite the backing of the now loud 1,800 RPM fan.

DG1 – a dubious product

Whether Intel has done itself any favours with the DG1 can genuinely be doubted in view of the lousy (non-)performance. This is by no means a figurehead, but rather a cautionary example of building something you then can’t even use for lack of suitable drivers. Pointlessly wasted resources, that’s all. I really would have preferred to write something more edifying, and in the end I can only hope that Xe will perform much better and more reliably as a component of Intel’s new mobile CPUs. As a dedicated graphics card, I wouldn’t want the thing even as a gift, not even for the office.

After the test, Intel’s DG1 leaves Wallossek with more questions than answers. The purpose of the dedicated card is unclear and the experience is unpolished and unreliable. He hopes the integrated graphics will run into fewer problems, and he underlines his dissatisfaction by saying he would not accept a DG1 card even as a gift.


Source: Igor’s Lab
