Intel Arc Graphics are finally here!

After so many years of teases and leaks, Intel has finally unveiled the new Arc Alchemist graphics officially. There is a lot to talk about here, but first let's go through the specs and then look at Intel's claims about this new product.

For me, the timing of this reveal is very interesting. Alongside its new mobile Ryzen 6000 series CPUs, AMD revealed completely new integrated and discrete graphics options, with the new Radeon 680M iGPU taking the spotlight by, for the first time, offering a glimpse of a future where gaming without a discrete GPU is viable. Intel's Iris Xe iGPU was completely outclassed by this new competitor, so a response was expected, and here it is.

In terms of naming, Intel has opted for a very simple tier designation:

Arc 3 is positioned as the less powerful, budget-oriented tier and will be readily available from this month onwards. Arc 5 and Arc 7 will start becoming available from the end of Q2 this year.

Intel was very careful in this presentation, giving an objective indication of where these products sit in the market and never claiming to be out to take down Nvidia's or AMD's products. Baby steps towards more competition in the market, to the joy of us consumers.

Now for the dies that will form the basis of these new GPUs. There will be two different dies, codenamed ACM-G10 and ACM-G11, detailed below.

Note that both of these dies will be built on TSMC's N6 node. A slight conversion is needed here, as one may not know or remember what exactly 32 or 8 Xe cores represent: each Xe core packs 16 execution units, so 32 Xe cores correspond to 512 execution units and 8 Xe cores to 128, and grouping these cores also yields a certain number of render slices and machine-learning (XMX) engines. The rest of the die specification is pretty self-explanatory.

Another clear distancing from the upper end of the GPU market can be seen in the die dimensions. The G10 (the bigger of the two) comes in at around GA104 numbers on paper, the die used in the RTX 3070 Ti, for example. Intel also avoids the rather silly mistake of dropping the PCIe lanes to x4, as AMD did on its lacklustre entry-level RX 6500 XT. Another aspect that stands out is that, despite the G11 die being far less powerful on paper, it still contains the same Media Engine and Display Engine as the bigger G10, which should keep it viable for a lot of content creators. In terms of size it sits between AMD's Navi 24 and Nvidia's GA107, and given that it only has 33% more execution units than the 12th Gen Alder Lake iGPU (96 EUs), GPUs built on this die should sit firmly in the entry-level market, even if they bring some advantages such as ray tracing units.
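
As a quick sanity check on that conversion, here is a minimal sketch of the arithmetic, using the 16-execution-units-per-Xe-core figure implied by Intel's own numbers:

```python
# Quick arithmetic check: Alchemist groups 16 execution units per Xe core.
EUS_PER_XE_CORE = 16

dies = {"ACM-G10": 32, "ACM-G11": 8}  # Xe core counts from Intel's slides

for die, xe_cores in dies.items():
    print(f"{die}: {xe_cores} Xe cores -> {xe_cores * EUS_PER_XE_CORE} execution units")
# ACM-G10: 32 Xe cores -> 512 execution units
# ACM-G11: 8 Xe cores -> 128 execution units
```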

The image above shows Intel's SKU line-up. The first thing to note in order to fully comprehend these specs is that Arc 3 uses the G11 die, while both Arc 5 and Arc 7 use the G10 die. One odd thing that immediately stands out is that, even though Intel lists the G11 die as capable of running a 96-bit bus, the SKUs based on it are shown with a 64-bit bus. This can be explained by memory capacity: a 96-bit bus would require 3 or 6 GB of GDDR6, so Intel opted to cut the bus down to accommodate 4 GB of memory.

Just before that we see the Graphics Clock. What does that mean? Basically, it is the equivalent of AMD's Game Clock: it corresponds to the average clock speed measured across a series of workloads, not the maximum frequency the chip can reach. Usually the Graphics Clock is the one achieved at the lowest TDP the graphics card supports; for example, the A550M should read 900 MHz at 60 W. This explanation from Intel is very welcome and transparent. The Arc 7 A770M takes full advantage of the G10 die configuration and looks like a very capable SKU, with 16 GB of VRAM, a 256-bit bus and a 150 W maximum power limit. One can predict that at 150 W we may see a sustained 2 GHz clock on the GPU.
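
To make the bus-width reasoning concrete, here is a minimal sketch of the arithmetic, assuming 32-bit-wide GDDR6 devices in the common 1 GB and 2 GB densities (the device width and densities are my assumptions, not something Intel stated):

```python
# Each GDDR6 device occupies a 32-bit slice of the memory bus and,
# in this sketch, comes in 1 GB or 2 GB densities.
DEVICE_BUS_WIDTH = 32
DEVICE_DENSITIES_GB = (1, 2)

def capacity_options(bus_width_bits: int) -> list[int]:
    """Possible VRAM capacities for a given bus width, using one density per board."""
    devices = bus_width_bits // DEVICE_BUS_WIDTH
    return [devices * density for density in DEVICE_DENSITIES_GB]

print(capacity_options(96))  # [3, 6]  -> no clean 4 GB option on a 96-bit bus
print(capacity_options(64))  # [2, 4]  -> 4 GB fits, hence the cut-down bus
```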

For performance claims, the Arc 3 A370M was shown beating, very comfortably, the Iris Xe graphics found in a new 12th Gen Alder Lake CPU. However, given the quality settings and resolutions claimed, these results appear to put the A370M merely trading blows with the new Radeon iGPUs, which is perhaps not what Intel was hoping for with this launch. We will have to wait for proper numbers to evaluate all of this.

The presentation then proceeded to show other capabilities of the new Arc GPUs. The main points I found relevant were:

  • A full hardware encode and decode pipeline on the Xe Media Engine, including AV1 hardware encoding acceleration (unlike competitors, which currently offer only AV1 decoding), something that should attract even more attention from content creators (see the sketch after this list).
  • The bizarre omission of HDMI 2.1 support from the Display Engine connectivity specification, which stands out even more considering that AMD and Nvidia have been releasing HDMI 2.1-compatible graphics cards since 2020.
  • XeSS technology (Intel's equivalent of DLSS and FSR) is launching exclusively on Intel GPUs. Given that AMD will be launching FSR 2.0, which should be supported by all graphics cards, this feature may end up irrelevant before it even launches. One may think: but Nvidia did the same with DLSS. However, Nvidia could afford that risk because it already had a substantial market share.
  • Intel followed the same path as Nvidia and AMD by announcing Dynamic Power Share, which balances CPU and GPU wattage according to the workload at hand.

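On the AV1 point, the practical upside for creators is that a standard toolchain can hand the encode off to the Media Engine. Below is a minimal sketch, assuming an ffmpeg build with Intel Quick Sync Video (QSV) support that exposes the av1_qsv encoder and a placeholder clip named input.mp4; exact flags and availability will depend on the driver and ffmpeg version.

```python
# Illustrative example: hand an AV1 encode off to the Xe Media Engine via ffmpeg.
# Assumes an ffmpeg build with Quick Sync (QSV) support and an Arc GPU present.
import subprocess

cmd = [
    "ffmpeg",
    "-i", "input.mp4",        # placeholder source clip
    "-c:v", "av1_qsv",        # hardware AV1 encoder exposed through Quick Sync
    "-b:v", "8M",             # target bitrate, purely illustrative
    "output_av1.mkv",
]
subprocess.run(cmd, check=True)
```
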
Before wrapping this article up, it is worth mentioning that several YouTubers have already left worthwhile impressions that can help fellow customers better understand what Intel is offering.

So, to wrap up this Intel Arc article, here are some thoughts shared by us and the rest of the community. More competition is very good, and Intel appears to have a very solid foundation on which to build its GPU market share over the years. We were also very pleased by Intel's transparency: the presentation was straightforward, with no crazy graphs or marketing excess.

However, it feels as though Intel rushed this launch to fulfil its promises. Launching only Arc 3, with Arc 5 and 7 still months away, is disappointing, and some of the features shown are underwhelming or even bizarre. To be clear, though, Intel is joining the GPU fight, and we as consumers should be happy about that and appreciate the effort. Any criticism of this article is most welcome, to keep the playing field level for readers and critics alike.