One of the most important specs for the first Intel Xe discrete graphics card, the DG1, has now been confirmed by a development kit listing on the EEC database. The listing confirms the rumoured 96 execution units previously speculated as the base configuration for the initial Xe GPU, the same count found in the upcoming top Tiger Lake processors.
The Intel Xe graphics cards are launching this year. It’s pretty wild being able to type that, safe in the knowledge that it’s probably true. The first of the Intel Xe discrete GPUs will appear on the DG1 card expected to be released in the summer. And, with the latest iterations of development kits looking closer to final launch-spec cards, it definitely seems like Intel is still on track to deliver by the middle of 2020.
Interestingly, the latest listings (via Komachi) don’t just detail the individual DG1 cards going out to developers in the wild; there are also two full software development platform kits, one with an eight-core CPU and another with a six-core chip inside it. And that could suggest something far more interesting than Xe’s DG1 being just another low-end graphics card…
The solo DG1 developer kit makes sense, enabling folk to create software or driver stacks that work with Intel’s new discrete GPU. But why would Intel need to ship out software development platforms (SDPs) with both an eight-core and a six-core CPU? My suggestion is that these are unreleased processors designed to work especially well with the new GPU architecture. Maybe they’re engineering samples of eight- and six-core Tiger Lake CPUs, and maybe the Xe GPU isn’t always quite as discrete as we once thought.
We’ve previously seen Intel developers talking about having to rewrite driver code to support multiple GPUs, which has led to speculation that multi-GPU support could be the killer feature for Intel’s Xe graphics cards. If you can double the available GPU hardware by seamlessly linking the DG1’s 96 execution units with a Tiger Lake CPU’s own 96 EUs then you’re looking at a far more tantalising gaming prospect.
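There is, incidentally, already a mechanism for exactly this kind of thing: Vulkan 1.1 lets a driver advertise several physical GPUs as a single "device group" that an engine can address as one logical device. Whether Intel’s driver would actually expose the Tiger Lake iGPU and DG1 together in one group is pure assumption on my part, but if it did, the check from a game engine’s side would look something like this minimal C sketch:

```c
// Minimal sketch: enumerate Vulkan 1.1 device groups and report how many
// physical GPUs each one bundles together. A driver linking iGPU + dGPU
// "seamlessly" could surface them as one group with two devices.
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void) {
    VkApplicationInfo app = { .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
                              .apiVersion = VK_API_VERSION_1_1 };
    VkInstanceCreateInfo ci = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
                                .pApplicationInfo = &app };
    VkInstance inst;
    if (vkCreateInstance(&ci, NULL, &inst) != VK_SUCCESS) return 1;

    // First call gets the group count, second fills the properties.
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(inst, &count, NULL);
    if (count > 8) count = 8;
    VkPhysicalDeviceGroupProperties groups[8];
    for (uint32_t i = 0; i < count; ++i) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    vkEnumeratePhysicalDeviceGroups(inst, &count, groups);

    for (uint32_t i = 0; i < count; ++i)
        printf("group %u: %u physical device(s)\n",
               i, groups[i].physicalDeviceCount);

    vkDestroyInstance(inst, NULL);
    return 0;
}
```

The appeal for developers is that a group reporting two physical devices can be driven much like a single GPU; a driver that instead exposes two separate one-device groups pushes all the multi-GPU work back onto the game. Keep that distinction in mind for the "invisible dividing line" argument below.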
That could make both the Tiger Lake CPUs and Xe GPUs far more attractive options if they’re genuinely better together. A 192 EU graphics pool could make for a powerful array for high-end 1080p gaming. It probably needs to be said that this isn’t really a desktop platform – Intel’s Tiger Lake is the successor to the current 10nm Ice Lake mobile range. But pairing these laptop chips with a Xe GPU could make for a genuinely powerful little gaming ultrabook setup.
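For a rough sense of scale, here’s a back-of-the-envelope throughput estimate. Both inputs are assumptions, not confirmed Xe specs: the 16 FP32 operations per EU per clock matches Intel’s previous Gen11 design, and the 1.1GHz clock is a placeholder.

```c
// Back-of-the-envelope FP32 throughput for a pooled 192 EU array.
// Assumed, not confirmed: 16 FP32 ops per EU per clock (two SIMD4 FMA
// pipes, as in Gen11) and a placeholder 1.1 GHz sustained clock.
#include <stdio.h>

int main(void) {
    const double eus_per_gpu       = 96.0;
    const double ops_per_eu_clock  = 16.0;
    const double clock_ghz         = 1.1;

    double one_gpu = eus_per_gpu * ops_per_eu_clock * clock_ghz / 1000.0;
    double pooled  = 2.0 * one_gpu; // DG1 + Tiger Lake iGPU, 192 EUs total

    printf("96 EUs : ~%.2f TFLOPS FP32\n", one_gpu);  // ~1.69
    printf("192 EUs: ~%.2f TFLOPS FP32\n", pooled);   // ~3.38
    return 0;
}
```

On those assumed numbers the pooled 192 EU array lands around 3.4 TFLOPS, in the same ballpark as today’s entry-level discrete laptop GPUs – interesting for an ultrabook, rather than revolutionary.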
The Intel DG1 chip is said to be designed with a 25W TDP, and if that little extra juice enables you to double your graphics array you could end up with some tasty little notebooks with serious gaming chops. And if Intel wants to wrest the mid-range gaming laptop market away from Nvidia and AMD then offering the Xe GPU at a bargain price to bolster design wins seems like a smart play.
A 96 EU discrete DG1 card on the desktop, however, is going to look like pretty weak sauce. Dropped into a standard PC it’s not exactly going to set the world alight, and with the Gen12 Xe GPU architecture not filtering into desktop processor graphics any time soon, any potential multi-GPU shenanigans Intel wants to pull in the laptop space aren’t going to work on the desktop.
So maybe DG1 is a mobile-only part? That would make a whole lot more sense than shipping an add-in board that no-one wants.
But all this potentially positive Xe promise is predicated on Intel actually being able to get multiple graphics chips working together seamlessly. And that’s something neither AMD nor Nvidia has been able to do successfully in all their years making GPUs.
Making the dividing line between discrete and CPU graphics invisible is the vital component in taking the burden of support away from game developers… which is where the other two graphics giants have placed it since effectively retiring both SLI and CrossFire. And that approach has basically killed any reason for creating a dual-GPU system today.
If Intel can’t make the 192 EU pool of graphics goodness appear as a single addressable graphics chip – one device group, in the terms of the Vulkan sketch above – then Xe’s multi-GPU feature will end up being DOA. Intel might be able to convince (read: pay) one developer to code support for the feature, but without a larger install base it would struggle to encourage other devs to spend the time and money necessary to factor it into their games.
And then it will just be another one of those Intel features that gets championed around launch, and is never seen again…