DirectX 12 arrives at last with Ashes of the Singularity, AMD and Nvidia go head-to-head
Ever since Microsoft announced DirectX 12, gamers have clamored for hard facts on how the new API would impact gaming.
Unfortunately, hard data on this topic has been difficult to come by — until now. Oxide Games has released an early version of its upcoming RTS game Ashes of the Singularity, and allowed the press to do some independent tire-kicking.
Before we dive into the test results, let’s talk a bit about the game itself. Ashes is an RTS title powered by Oxide’s Nitrous game engine. The game’s look and feel somewhat resemble Total Annihilation, with large numbers of units on screen simultaneously and heavy action between ground and flying units. The game has been in development for several years, and it’s the debut title for the new Nitrous engine.
An RTS game is theoretically a great way to debut an API like DirectX 12. On-screen slowdowns when the action gets heavy have often plagued previous titles, and freeing up more CPU threads to attend to the rendering pipeline should be a boon for all involved.
Bear in mind, however, that this is a preview of DX12 performance — we’re examining a single title that’s still in pre-beta condition. That said, Oxide tells us it has been working closely with both AMD and Nvidia to develop drivers that support the game effectively, and to ensure the rendering performance in this early test is representative of what DirectX 12 can deliver.
Nvidia really doesn’t think much of this game
Nvidia pulled no punches when it came to its opinion of Ashes of the Singularity. According to the official Nvidia Reviewer’s Guide, the benchmark is primarily useful for ascertaining whether your own hardware will play the game. The company also states: “We do not believe it is a good indicator of overall DirectX 12 performance.” (emphasis original). Nvidia also told reviewers that MSAA performance was buggy in Ashes, and that MSAA should be disabled when benchmarking the title.
Oxide has denied this characterization of the benchmark in no uncertain terms. Dan Baker, co-founder of Oxide Games, has published an in-depth blog post on Ashes of the Singularity, which states:
“There are incorrect statements regarding issues with MSAA. Specifically, that the application has a bug in it which precludes the validity of the test. We assure everyone that is absolutely not the case. Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.
“So what is going on then? Our analysis indicates that any D3D12 problems are quite mundane. New API, new drivers. Some optimizations that the drivers are doing in DX11 just aren’t working in DX12 yet. Oxide believes it has identified some of the issues with MSAA and is working to implement workarounds in our code. This in no way affects the validity of a DX12-to-DX12 test, as the same exact workload gets sent to everyone’s GPUs. This type of optimization is just the nature of brand new APIs with immature drivers.”
AMD and Nvidia have a long history of taking shots at each other over game optimization and benchmark choice, but most developers choose to stay out of these discussions. Oxide’s decision to buck that trend should be weighed accordingly. At ExtremeTech, we’ve had access to Ashes builds for nearly two months and have tested the game at multiple points. Testing we conducted over that period suggests Nvidia has done a great deal of work on Ashes of the Singularity over the past few weeks. DirectX 11 performance with the 355.60 driver, released on Friday, is significantly better than what we saw with 353.30.
Is Ashes a “real” benchmark?
Baker’s blog post doesn’t just refute Nvidia’s MSAA claims; it goes into detail on how the benchmark executes and how to interpret its results. The standard benchmark does execute an identical flyby pass and tests various missions and unit match-ups, but it doesn’t pre-compute the results. Every aspect of the game engine, including its AI, audio, physics, and firing solutions, is executed in real time, every single time. By default, the benchmark records frame time data and produces a play-by-play report on performance in every subsection of the test. We only had a relatively short period of time to spend with the game, but Ashes records a great deal of information in both DX11 and DX12.
Ashes of the Singularity also includes a CPU benchmark that can be used to simulate an infinitely fast GPU — useful for measuring how GPU-bound any given segment of the game actually is.
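The logic behind that comparison can be sketched with a hypothetical calculation (the function and frame rates below are our own illustration, not Oxide's actual tooling): comparing the CPU-only frame rate against the real measured frame rate gives a rough sense of how much of a segment's frame time is spent waiting on the GPU.

```python
def gpu_bound_fraction(actual_fps: float, cpu_only_fps: float) -> float:
    """Rough estimate of how GPU-limited a segment is.

    cpu_only_fps is the frame rate with an 'infinitely fast GPU'
    (the CPU benchmark); actual_fps is the real measured rate.
    Returns 0.0 when the CPU is the sole bottleneck, approaching
    1.0 as the GPU dominates frame time.
    """
    if actual_fps >= cpu_only_fps:
        return 0.0  # fully CPU-bound: the GPU keeps up with the simulation
    return 1.0 - (actual_fps / cpu_only_fps)

# Hypothetical numbers: a segment that runs at 45 fps for real, but would
# hit 90 fps with an infinitely fast GPU, spends roughly half its frame
# time waiting on the GPU.
print(gpu_bound_fraction(45.0, 90.0))  # 0.5
```

A segment whose CPU-only frame rate barely exceeds its real frame rate is CPU-limited, and that is exactly the case DirectX 12's lower driver overhead is meant to help.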
In short, by any reasonable meaning of the phrase, Ashes is absolutely a real benchmark. We wouldn’t recommend taking these results as a guaranteed predictor of future DX12 performance between Red and Green — Windows 10 only just launched, the game is still in pre-beta, and AMD and Nvidia still have issues to iron out of their drivers. While Oxide strongly disputes that its MSAA path is bugged by any meaningful definition of the word, the studio acknowledges that gamers may want to disable MSAA until both AMD and Nvidia have had more time to work on their drivers. In deference to this view, our own benchmarks have been performed with MSAA both enabled and disabled.
Test setup
Because Ashes is a DirectX 12 title, it presents different performance considerations than we’ve previously seen, and unfortunately we only have time to address the most obvious cases between AMD and Nvidia today. As with Mantle before it, we expect the greatest performance improvements to show up on CPUs with fewer cores or weak single-threaded performance. AMD chips should benefit dramatically, as they did with Mantle, while Intel Core i3s and Core i5s should still see significant improvements.
With that said, our choice of a Core i7-5960X isn’t an accident. For these initial tests, we wanted to focus on differences in GPU performance. We tested the Nvidia GTX 980 Ti using the newly released 355.60 drivers, which dramatically boost Ashes of the Singularity performance in DX11 and are a must-download if you plan on playing the game or participating in its beta. AMD also distributed a new beta Catalyst build for this review, which was used here. Our testbed consisted of an Asus X99-Deluxe motherboard, a Core i7-5960X, 16GB of DDR4-2667, a Galax SSD, and the aforementioned GTX 980 Ti and R9 Fury X video cards.
We chose to test Ashes of the Singularity at both 1080p and 4K, with 4x MSAA enabled and disabled. The game was tested at its “High” default preset (note that the “High” preset initially sets 2x MSAA as default, but we changed this when testing with MSAA disabled).
Batches? We don’t need no stinkin’ batches!
As we step through the game’s performance, we should talk a bit about how Oxide breaks the performance figures down. In Ashes, performance figures are given as a total average of all frames as well as by batches. Batches, for our purposes, can be thought of as synonymous with draw calls. “Normal” batches contain a relatively light number of draw calls, while heavy batches are those frames that include a huge number of draw calls. One of the major purposes of DirectX 12 is to increase how many draw calls the system can handle simultaneously without bogging down.
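As a rough illustration of that bucketing idea (the thresholds and sample numbers below are invented for the example; Oxide has not published its exact cutoffs), per-frame draw-call counts might be grouped and averaged like this:

```python
from statistics import mean

# Hypothetical draw-call thresholds -- Oxide has not published its cutoffs.
def classify_frame(draw_calls: int) -> str:
    if draw_calls < 2000:
        return "normal"
    if draw_calls < 6000:
        return "medium"
    return "heavy"

def fps_by_batch_type(frames):
    """frames: list of (draw_calls, frame_time_ms) samples.

    Returns the average fps per batch bucket, mirroring how the
    benchmark breaks results out by batch weight.
    """
    buckets = {"normal": [], "medium": [], "heavy": []}
    for draw_calls, frame_time_ms in frames:
        buckets[classify_frame(draw_calls)].append(1000.0 / frame_time_ms)
    return {name: round(mean(rates), 1) for name, rates in buckets.items() if rates}

# Made-up samples: light frames render fast, heavy frames bog down.
samples = [(500, 10.0), (1500, 12.5), (4000, 20.0), (8000, 40.0)]
print(fps_by_batch_type(samples))  # {'normal': 90.0, 'medium': 50.0, 'heavy': 25.0}
```

Breaking results out this way shows whether a card's average frame rate is being propped up by easy frames while it collapses under draw-call-heavy ones, which is precisely the case DX12 targets.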
Test results: DirectX 11
We’ll begin with DirectX 11 performance between the AMD Radeon R9 Fury X and the GeForce GTX 980 Ti. The first graph shows the overall frames-per-second average between AMD and Nvidia, while the next two break performance out by batch type.
Overall performance.
Batch performance at 1080p
Batch performance at 4K
Nvidia makes hash of AMD in DX11. Looking at the graph breakdowns for Normal, Medium, and Heavy batches, we can see why – Nvidia’s performance lead in the medium and heavy batches is much greater than in normal batches. We can see this most clearly at 4K, where Nvidia leads AMD by just 7% in Normal batches, but by 84% in Heavy batches. Enabling 4x MSAA cuts the gap between AMD and Nvidia, as has often been the case.
Overall performance with MSAA enabled
Batch performance at 1080p
Batch performance at 4K
Note that while these figures are comparatively stronger for AMD on the whole, they still aren’t great. Without antialiasing enabled, Nvidia’s GTX 980 Ti is 1.42x faster than AMD at 4K and 1.78x faster at 1080p. With MSAA enabled, those gaps fall to 1.27x and 1.69x respectively. The batch breakouts show the same trends, though it’s interesting that the Fury X closes to within 13% of the GTX 980 Ti at 4K in Medium batches.
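For clarity, those multipliers are simple frame-rate ratios. With hypothetical numbers (not our measured data):

```python
def speedup(fps_a: float, fps_b: float) -> float:
    """How many times faster card A is than card B, as a ratio of frame rates."""
    return round(fps_a / fps_b, 2)

# Hypothetical frame rates, not our benchmark results: a card averaging
# 71 fps against one averaging 50 fps is 1.42x faster.
print(speedup(71.0, 50.0))  # 1.42
```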
The gap between AMD and Nvidia was smaller last week, but the 355.60 driver improved Team Green’s performance by 14% overall and by up to 25% in some of the batch-specific tests. Oxide told us it has worked with Nvidia engineers for months to ensure the game ran optimally on DirectX 11 and 12, and these strong results bear that out.