Amidst talk that the PC's days as a major gaming platform may be numbered, blockbuster titles such as Assassin's Creed are welcome signs that show just the opposite.
Sadly, it is very likely that this game will be remembered for a controversy surrounding a strange decision to remove support for DirectX 10.1, which handed an initial performance advantage held by ATI's Radeon cards over to Nvidia. Did Nvidia have its hands in this one? We looked a bit closer to find out.
In the beginning, everything looked perfect. The DX10.1 API used in Assassin's Creed enabled anti-aliasing in a single pass, which allowed ATI's Radeon HD 3000 hardware (which supports DX10.1) to flaunt a competitive advantage over Nvidia (whose hardware supports only DX10.0).
But Assassin's Creed had problems. We noticed various reports citing issues such as broken widescreen scaling, camera loops and crashes - mostly on Nvidia hardware.
Ubisoft became aware of these complaints, which ultimately led to the announcement of a patch. According to Ubisoft Montreal, this patch will remove support for DX10.1 - and it was exactly this decision that caused Internet forums to catch fire.
So, what is it that convinced Ubisoft to drop the DirectX 10.1 code path? Here is the official explanation:
"We're planning to release a patch for the PC version of Assassin's Creed that addresses the majority of issues reported by fans. In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin's Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly."
We certainly could take this statement with a grain of salt, but the game developer did not address a statement made by ATI's Developer Relations team, released when the game was introduced:
"Ubisoft [is] at the forefront of technology adoption, as showcased with the fantastic Assassin's Creed title. In this instance our developer relations team worked directly with the developer and found an area of code that could be executed more optimally under DX10.1 operation, thus benefiting the ATI Radeon HD 3000 Series."
Let's get this straight: The game was released, worked better on ATI hardware, and supported an API that neither Nvidia nor Intel supported (and still do not support). Then the developer announced a patch that will kill that advantage. This brought back memories of Tomb Raider: The Angel of Darkness, a TWIMTBP-supported game that worked better on ATI hardware at the time of release, because Nvidia's GeForce FX series was plagued by performance issues.
Assassin's Creed is an Nvidia-branded "The Way It's Meant To Be Played" title, and it didn't take long until rumors about possible foul play by Nvidia started surfacing. Some voices on Internet forums allege that Nvidia threatened Ubisoft and requested that the developer remove DirectX 10.1. We kept our eyes on this development, but when we started to receive e-mails from graphics card manufacturers (both ATI and Nvidia customers), adding to the already heated discussion of what may have happened in the background, we decided to shift our investigation into a higher gear and talk to all parties involved.
DirectX 10.0 vs. DirectX 10.1 in Assassin's Creed effects
The difference that the developers failed to explain is the way anti-aliasing is handled in DirectX 10.0 versus 10.1. In DX10.0, a shader cannot access individual samples from a multisampled depth buffer, which forces a costly extra pass for anti-aliasing operations. DX10.1 allows shader units to access all anti-aliasing sample buffers directly. All of this was brought to the limelight by an article over at Rage3D.
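The pass-count difference can be sketched conceptually. The snippet below is illustrative Python pseudologic, not actual Direct3D code; the function name and structure are our own invention, meant only to show why per-sample depth access removes a pass:

```python
# Conceptual sketch of why DX10.1 can save a render pass for an
# anti-aliased post-effect. This models the logic only; it is not
# real Direct3D API usage.

def post_effect_pass_count(api_version: str) -> int:
    """Return the number of render passes needed for an AA'd post-effect."""
    passes = 1  # the post-effect pass itself
    if api_version == "10.0":
        # DX10.0 shaders cannot read individual samples of a multisampled
        # depth buffer, so the engine must first resolve the depth data
        # in a separate (costly) pass before the post-effect can run.
        passes += 1
    elif api_version == "10.1":
        # DX10.1 (Shader Model 4.1) lets the shader read per-sample depth
        # directly, so no separate resolve pass is required.
        pass
    else:
        raise ValueError(f"unknown API version: {api_version}")
    return passes

print(post_effect_pass_count("10.0"))  # 2
print(post_effect_pass_count("10.1"))  # 1
```

This matches the developer quotes below: the same effect takes two passes under DX10.0 and one pass under DX10.1.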
Judging from the following three quotes from software developers, this effect applies to all DirectX 10 titles, and there is a good chance that you've already played their games. A (DX10.0) game developer close to Ubisoft, who requested to remain anonymous, told us that Ubisoft's explanation walks on thin ice. Here is his response to our inquiry and his take on Ubisoft's statement:
"Felt you might want to hear this out. Read the explanation and laughed hard ... the way how DX10.1 works is to remove excessive passes and kill overhead that happened there. That overhead wasn't supposed to happen - we all know that DX10.0 screwed AA in the process, and that 10.1 would solve that [issue]. Yet, even with DX10.0, our stuff runs faster on GeForce than on Radeon, but SP1 resolves scaling issues on [Radeon HD 3800] X2."
We received a second reply from another game developer, who is currently working on a DirectX 10.1 title that is fully compatible with DX10.0 hardware:
"Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes."
A third email reply reached us from a developer at a multiplatform development studio:
"Our port to DX10.1 code does not differ from DX10.0, but if you own DX10.1-class hardware from either Nvidia or ATI, FSAA equals performance jump. Remember "Free FSAA"?"
Michael Beadle, senior PR representative at Ubisoft, and Jade Raymond, producer of the game, told us in a phone call that the decision to remove DX10.1 support was made by the game developers. Both told us that there was no external influence, which would mean that Nvidia did not participate in this decision. It was explained to us that features were implemented and tested on a platform with DirectX 10.1 graphics during the development process, which led to their inclusion in the final code. However, that code went untested on a large number of DX10.0 systems, and that ultimately led to crashes or system instability.
Ubisoft's explanation indicates a classic case of QA failure, one that already happened to EA's Crysis as well: Unfinished code was released as the final version. The changes the developer made caused instabilities on GeForce hardware, but owners of older ATI products (the Radeon HD 2900, for instance) should also be affected and can expect crashes or camera lockups.
Money? What Money? Oh, that money.
There is no word on whether DX10.1 will be re-implemented, and that fact makes the story look fishy. There are rumors that Nvidia may have threatened to pull out of co-advertising deals with Ubisoft, which are said to be worth less than $2 million. As a sane businessman, you don't jeopardize a cooperation because of one title - and those $2 million are just one component in the cooperation between these two companies. Of course, we asked both companies for comment and received two different answers.
Derek Perez, director of public relations at Nvidia, said that "Nvidia never paid for and will not pay for anything with Ubi. That is a completely false claim." Michael Beadle from Ubisoft stated that "there was a [co-marketing] money amount, but that [transaction] was already done. That had nothing to do with development team or with Assassin's Creed."
So, Nvidia appears to have some sort of financial relationship with Ubisoft, just like EA, Activision and other top publishers. Yes, AMD has a similar cooperation in place, but it is not as extensive as Nvidia's program.
We leave it up to you to draw your own conclusion. Our take is that the Ubisoft team could have done a better job of bringing the game to the PC platform. The proprietary Scimitar engine showed a lot of flexibility when it comes to advanced graphics, but the team developed the DirectX 10.1 path without checking the stability of the DirectX 10.0 parts, causing numerous issues on the most popular DX10 hardware out there - the GeForce 8 series.
The new patch will kill DX10.1 support, and it is unclear when DX10.1 will see another implementation. The first "victim" of this battle was Nvidia (on a title it supports); the second victim was AMD. The real losers here, however, are gamers, who are expected to pony up $50 or $75 (depending on where you live) for a title that was not finished.
Sadly, this is the way PC gaming is: A great game is burdened with technical issues and ends up getting caught in marketing wars. We have no doubt that the development team made a mistake in the QA process and integrated a feature that caused instabilities on Nvidia cards. Given Nvidia's market share, Ubisoft had to patch it. What we do not understand is the reason for an explanation that left more questions than answers.
But then again, only the developers at Ubisoft know what the Scimitar engine can and cannot do.
The solution is simple: If you run ATI hardware, just don't patch the game (once the patch comes out) and enjoy all the advantages of DirectX 10.1 - e.g. one less pass to render.