A war of words recently broke out between AMD and nVidia over Eidos using nVidia's code for the Anti-Aliasing implementation in Batman: Arkham Asylum. There is a large discussion going on across various Internet forums, but we decided to take a deep breath and get to the bottom of Batmangate.
In the past 48 hours, we talked with Chris Hook [Senior Manager Public Relations, AMD], Richard Huddy [WW Developer Relations Manager, AMD], Bryan Del Rizzo [GeForce Public Relations Manager, nVidia] and Brian Burke [Public Relations Manager, nVidia]. We have also discussed the situation with Tim Sweeney, the creator of the game engine, and with three game developers who commented on the matter under the condition of anonymity. In order to keep matters unbiased, one developer comes from a team which recently released a The Way It's Meant To Be Played title that scored excellent reviews and runs great on both ATI and nVidia hardware. In fact, it runs on ATI's Eyefinity technology even though the developer didn't work with ATI on the Eyefinity implementation. The second developer is working on a DirectX 11 title set to be released in 2010. And finally, the third developer may even come on-the-record [in that case, we'll update the article].
Batman: Arkham Asylum
Developed by Rocksteady Studios Ltd. and published by Eidos Interactive Ltd., Batman: Arkham Asylum was designed as a multiplatform AAA title with all the bells and whistles modern technology can offer.
The effort placed in this title is a direct consequence of a decision brought forward by Warner Brothers five years ago. Jason Hall [Senior VP, Warner Bros Interactive Entertainment] stated that "the game industry has had its time to exploit movie studios all day long and to get away with producing inferior products. But, with Warner Brothers, no more. Those days are over. And we mean it. This isn’t just lip service. Honestly, the bad games are over."
Warner Bros decoupled the usual requirement that the game be just a tie-in to the movie release and went for a unique experience instead. The approach was criticized by other Hollywood studios and many game developers, but with Batman: Arkham Asylum it was quite clear that the path taken half a decade ago was the right one.
From what we saw while playing Batman: Arkham Asylum, this is an excellent example of how a game can be enhanced on the PC platform, regardless of the hardware being used. The game runs PhysX on the CPU on consoles [PlayStation 3, Xbox 360], while on the PC platform GPU-accelerated PhysX requires an nVidia GPU. We have played the game on a PS3 console and on various hardware from both parties, with PhysX GPU-acceleration both activated and de-activated, and with 3D Vision glasses.
Do note that on the PC platform you cannot use a combination of old AGEIA PhysX cards and ATI Radeon cards, due to interesting vendor lock mechanisms nVidia deployed of late. It is exactly those Vendor ID mechanisms that are the topic of today’s analysis.
Tim Sweeney comments on Unreal Engine 3 and MSAA implementation
Before you read any of the sides involved in this feud, I would kindly ask you to read what Tim Sweeney had to say about his engine. This analysis will go in depth into both the AMD and nVidia sides, and I asked Tim whether he could explain how the MSAA baseline looks inside Unreal Engine 3.
According to Tim, he was not briefed on the specifics of how nVidia implemented the MSAA code in Batman: Arkham Asylum. However, this is how Unreal Engine 3 works:
- UE3 does not support MSAA on Windows DirectX 9, because UE3’s use of deferred shadowing and post-processing techniques requires explicit support for frame buffer resolves, which DirectX9 lacks.
- UE3 does support MSAA on Windows DirectX 10, using its explicit support for frame buffer resolves. The support is general to all compliant DirectX 10 hardware. However, because of the way it operates on full-resolution MSAA buffers, there’s room for improvement, and good chance a typical UE3 licensee shipping a major Windows game will customize it.
- UE3 does support MSAA on Xbox 360, using its explicit support for frame buffer resolves. This is very well optimized for our usage scenario (MSAA scene rendering followed by non-MSAA shadowing and post-processing).
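To make Sweeney's "explicit frame buffer resolve" concrete: after the scene is rendered into a multi-sampled buffer, each pixel's sub-samples must be filtered down into a single color before the non-MSAA shadowing and post-processing passes can run. Below is a minimal CPU-side sketch of what a plain box-filter resolve does; real engines perform this step on the GPU [e.g. via Direct3D 10's ResolveSubresource or a custom shader pass], and the data structures here are purely illustrative:

```python
def resolve_msaa(msaa_buffer):
    """Box-filter resolve: average each pixel's sub-samples into one color.

    msaa_buffer is a 2D grid of pixels; each pixel is a list of N sample
    colors (RGB tuples). Returns an ordinary single-sample frame buffer.
    """
    resolved = []
    for row in msaa_buffer:
        out_row = []
        for samples in row:
            n = len(samples)
            out_row.append(tuple(sum(c[i] for c in samples) / n for i in range(3)))
        resolved.append(out_row)
    return resolved

# A 1x1 image at 4x MSAA: a geometry edge covers half of the sub-samples.
pixel = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_msaa([[pixel]])[0][0])  # (0.5, 0.5, 0.5): the smoothed edge color
```

The entire Batmangate dispute is about who writes, QAs, and owns the relatively small amount of game-side code wrapped around this step in a shipping UE3 title.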
The root of this controversy lies in the second point Tim made: while Epic's Unreal Engine 3 does support Multi-Sample Anti-Aliasing out of the box on DirectX 10 hardware, it is up to a third party to write the code that exposes AA inside the game on any of the supported platforms. In the case of Batman: Arkham Asylum, there are three parties that could have written that AA code:
- Rocksteady Studios / Eidos
- Advanced Micro Devices
- nVidia Corporation
Note that you can force Anti-Aliasing in any Unreal Engine 3-based game from the nVidia ForceWare or ATI Catalyst control panels. Thus, this whole affair is a matter of user convenience and the ability to select an AA setting inside the game.
Next Page: AMD Perspective with Chris Hook and Richard Huddy
For starters, we talked to Chris Hook, who told us that AMD's standpoint is: "Eidos had been compensated by nVidia to put a piece of code in the game that would recognize an ATI Vendor ID and then activate what was a very average implementation of Anti-Aliasing on nVidia hardware alone.
The controversy raised questions such as whether AMD's ISV team [Independent Software Vendor] was being aggressive enough, was prepared enough, and if games really run better on nVidia hardware. We'd like to address those questions publicly."
The reason for Batmangate: Different Control Panels for nVidia and ATI
Over the past couple of years, we have continually asked why AMD is so reluctant to speak on-the-record about these issues [when they appear], and we finally received the following answer: "AMD has been a little reluctant to tell our side of the story, because unlike nVidia, we prefer to work with our partners to find and settle these issues privately. Today, we are telling our side of the story, what really went on in the background, and why we think that perhaps some of Eidos customers who are buying Batman: AA and running it on ATI hardware are being disadvantaged, perhaps not so much by Eidos but by nVidia."
As you can imagine, hearing such harsh statements from AMD is not something we are used to, so we decided to dig deeper. According to the information given to us, Eidos went shopping for a co-marketing deal around the GDC 2009 conference in San Francisco [March 23-27, 2009]. This is a standard business practice; co-marketing contracts are usually signed six months prior to a title's release. Eidos offered the deal to both AMD and nVidia on the graphics side, with Dolby already signed on the audio side. On March 28, 2009 Richard Huddy received confirmation inside AMD that the company had taped out its DirectX 11 chips and that the situation was looking really good for the company. As a consequence, AMD decided "not to engage heavily with Batman: Arkham Asylum from Rocksteady and EIDOS as we have done with other DirectX 11 game developers."
Naturally, we asked for a definition of how exactly AMD stayed in touch with both Rocksteady and Eidos, as nVidia could easily cry foul on this one. According to Richard Huddy, "It should not come as a surprise that [Batman is] one of the games that we have not quite walked away from, but decided it wasn't our priority to work with them on the real day-to-day, engineer-on-site kinda basis that we have with DirectX 11 games. So, we did continue to receive builds coming from both Rocksteady and Eidos, and we have worked directly with Rocksteady on solving issues on AMD hardware. In fact, one of AMD's very best and most proactive engineers, Mr. Nick Thibieroz, maintained a regular dialog with Rocksteady, with a lot of back-and-forth feedback on how to tune the game for AMD hardware in all sorts of scenarios, DirectX 9 and DirectX 10 hardware, with Nick taking a quiet look at how it behaved on our then-upcoming DirectX 11 hardware, too. That dialogue progressed in an absolutely natural and cooperative kinda way up until around mid-August."
As you might imagine, mid-August seems to be the time when the 'sh** hit the fan'. Richard continued, saying that "We got the build and it was the first time that MSAA [Multi Sample Anti-Aliasing] was supported inside the game. On the front end, it was labeled 'MSAA Trademark NVIDIA Corporation'."
Naturally, seeing code that claims MSAA as a trademark of any company would raise the alarm among competing technologies. According to Richard, both ATI and nVidia recommend the following standard technique for enabling MSAA: "storing linear depth into the alpha channel of the RenderTarget and using that alpha value in the resolve pass. You only need to linearize it if you are storing something like 8-bit color so that you can get a depth approximation, stuff it into the alpha channel, and then when you've finished rendering at high resolution you simply filter down the color values and use the depth value to maintain some kind of quality in your Anti-Aliasing, so you don't just average."
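The technique Richard describes can be sketched in a few lines. What follows is our illustrative reading of his quote, not Rocksteady's or nVidia's actual code: linearized depth is written into the alpha channel while rendering, and the resolve pass then consults that depth so samples belonging to a distant background are not blindly averaged into a foreground edge:

```python
def depth_aware_resolve(samples, depth_tolerance=0.1):
    """Resolve one pixel's sub-samples using linear depth stored in alpha.

    Each sample is (r, g, b, a) where a holds linearized depth. Instead of
    a blind average, only samples close to the nearest surface contribute,
    so foreground edges are not polluted by far-away background colors.
    """
    nearest = min(s[3] for s in samples)
    kept = [s for s in samples if s[3] - nearest <= depth_tolerance]
    return tuple(sum(s[i] for s in kept) / len(kept) for i in range(3))

# Two foreground samples (red, depth 0.2) and two background ones (blue, 0.9).
px = [(1, 0, 0, 0.2), (1, 0, 0, 0.2), (0, 0, 1, 0.9), (0, 0, 1, 0.9)]
print(depth_aware_resolve(px))  # (1.0, 0.0, 0.0): the background is rejected
```

The `depth_tolerance` threshold is our own placeholder; the point is merely that the depth value "maintains some kind of quality" in the filter, exactly as described in the quote above.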
We asked our "panel of independent judges", i.e. the developers, about this method to ensure the objectivity of this article, and in both cases our developers told us that the description above is a standard technique advocated by both AMD and nVidia. With the description above explained to us as "standard", it turns out that the code labeled "MSAA Trademark NVIDIA Corporation" was identical to the one recommended by both sides, with one very important difference: nVidia's code features a Vendor ID lock that allows the AA menu inside the game only if nVidia hardware is detected.
What got AMD seriously aggravated was the fact that the first step of this code is done on all AMD hardware: "’Amusingly’, it turns out that the first step is done for all hardware (even ours) whether AA is enabled or not! So it turns out that NVidia’s code for adding support for AA is running on our hardware all the time – even though we’re not being allowed to run the resolve code!
So? They’ve not just tied a very ordinary implementation of AA to their h/w, but they’ve done it in a way which ends up slowing our hardware down (because we’re forced to write useless depth values to alpha most of the time…)!"
The explanation given by Rocksteady was that nVidia "used some special technique that couldn't reliably be expected to work on AMD hardware and that EIDOS has done some sort of QA test on AMD hardware and found problems." On August 12, 2009, less than 48 hours after the initial version of the code marked with "MSAA Trademark NVIDIA Corporation" was uploaded to AMD's FTP server, Rocksteady uploaded a new version of the code that still didn't allow for in-game MSAA on ATI hardware.
AMD contacted Rocksteady and Eidos again, and the immediate response from the developer was that there was at least one more build before they went final, and that Rocksteady would try to get that [code] in. The odd situation here is that the code claimed as proprietary to nVidia is identical to the standard advice both companies have given ever since DirectX 10.1 came out and sorted out Anti-Aliasing calls. As we all know, DirectX 10.1 removes excessive passes and kills the overhead that plagued the original DirectX 10.0. Even on DirectX 10.0 hardware, in most cases that overhead can be avoided if developers adjust their approach to MSAA.
By late August, Rocksteady got back to AMD and stated "that they [Rocksteady] would go back and try to sort that one out for the Day One patch, but that it was too late to hit the Golden Master. We were expecting that the Day One patch would contain that MSAA-enabler. They made a clear statement that it is Rocksteady's intention to enable MSAA on our hardware. When the Day One patch arrived, we were disappointed that it didn't contain that support."
Rocksteady gave a clear indication of its intention to enable the in-game AA code on AMD cards with a statement that found its way to the Beyond3D Forum: "The form of anti-aliasing implemented in Batman: Arkham Asylum uses features specific to NVIDIA cards. However, we are aware some users have managed to hack AA into the demo on ATI cards. We are speaking with ATI/AMD now to make sure it's easier to enable some form of AA in the final game."
Things turned even more interesting when the game was released. On the Batman: Arkham Asylum forums, people experimented with the nVidia code and got it to run on ATI hardware in a very trivial way: by changing the vendorID from ATI to nVidia. On September 11, AMD's Ian McNaughton published a very interesting post claiming that by changing the VendorID from ATI to nVidia, you would not only get the AA selection inside the game, but also higher performance. There was a thread on Eidos' forums that explained how to enable in-game AA on ATI hardware, but it went the way of the dodo:
"We don't know what happened there; we don't know whether nVidia went back and stated: 'Hey, we wrote this code and pushed it through our own QA, and it is not O.K. for you to loosen it up and let it run on anyone's hardware but ours.'
"We don't know if EIDOS or Rocksteady went back and came up with some sort of excuse. I don't know, but the proprietary claim, I suspect, comes from some kind of track record from nVidia: that the code was made by them and that they didn't want to allow it to run on our hardware.
I am not sure why Eidos came to the conclusion that there was a rendering problem. People have been able to experiment with running that nVidia code on AMD hardware and, as far as I am aware, never had a problem making it run. If you go to the EIDOS discussion forum about this at the moment, where they have posted how to turn Anti-Aliasing on through the Catalyst Control Center, you will notice that one forum poster posted immediately afterwards how to make the in-game control work on AMD hardware. As far as I know, there is no issue about it. So, I suspect that EIDOS must have gotten confused; they probably had reports of corruption because of some other problem we had on that particular build of the game. They also might have had pressure, it is my guess, from nVidia not to allow this MSAA code to run on our hardware."
As you can see from Ian's statement, there was enough dynamite for fanboys to blow the thing out of proportion, but the core of the matter lies in the fact that if you change the vendor ID from AMD to nVidia, you get that in-game option for MSAA. Richard Huddy went on to allege that nVidia may have played a murky game: "by giving them that MSAA support extremely late in the game, which is a trivial, standard implementation that both nVidia and we would recommend any time in the last year or longer, being that late in the game they put EIDOS and Rocksteady in the middle of a very difficult situation."
Richard continued on to say that "I have sat down with EIDOS on multiple occasions and said okay, we fix this now, we are happy there is a way forward, you might as well get a build out to players in the future. But the fact that it isn't fixed now doesn't mean it won't happen in the future."
According to Richard, who is the former European head of Developer Relations at nVidia [it is little known that Richard Huddy was nVidia's Employee #94 and that he stayed for years until departing the company 'on moral grounds'], this is very much nVidia's style:
"When I was at nVidia, I was instructed to do these kinds of things. It is very much a style they tend to be involved in. You could say it's just me whining and coming to the conclusion that nVidia are bad people and so on, but if I am looking for differences in style, the one really obvious one is this: there I was, with my team, working primarily on DirectX 11 titles. And there is nVidia working on Batman: Arkham Asylum. I could try to lock DX11 functions and content to AMD hardware. I can do that today on the basis that nVidia can't do any QA on their DirectX 11 hardware. Is it irresponsible? I could come up with this speech about nVidia not having good hardware, but ultimately, it is irresponsible and plain wrong. We are spending the time with DirectX 11 developers; they use our hardware, so should we not allow that code to run on nVidia DX11 hardware when it appears next year? No one can do QA on something like Unigine unless they are using AMD hardware. No nVidia hardware can be used for S.T.A.L.K.E.R.: Call of Pripyat, none can be used for DiRT 2, because their hardware is still in the development phase. If I were to follow their style, I could argue that DX11 games should not be able to run on future nVidia hardware. In my opinion, it would just be an irresponsible thing to do."
Do note that AMD is not doing the same thing, and if AMD were to turn rogue, you could use Richard's quote against them: "We haven't locked a single line of code to AMD hardware. Not a fragment of code anywhere that I am aware of is locked to any AMD hardware. That is a really big difference in style. Blocking functions that are absolutely standard DX code to nVidia hardware, putting a games developer, Rocksteady, and a games publisher, EIDOS, into the middle of a kinda marketing battle between two GPU manufacturers, is doing a disservice to gamers. That's my position, obviously."
From the explanation above, it looks like AMD is crying foul for all the right reasons. This reminds us of the shady games Intel used to play with its compilers, requiring the "GenuineIntel" vendor string in order to enable standard x86 multimedia extensions such as SSE. All in all, the move looked quite backward to us. Nick Thibieroz of AMD immediately contacted Rocksteady directly and gave them "feedback on how it was a trivial matter to open that [code] to run on AMD hardware, and it would in fact run on all AMD hardware I can think of, since the introduction of the 9700. Everything looks supported: DirectX 9, DirectX 10, DirectX 10.1, DirectX 11... There would be no problem."
Next Page: Technical issue turns into a legal one, AMD-EIDOS e-mail exchange
A technical issue turns into a legal one: EIDOS-AMD e-mail exchange
Two days ago, nVidia's own Lars Weinand came out on the Hexus forums and published a statement which re-ignited the flame wars: "Batman AA is not our property. It is owned by Eidos. It is up to Eidos to decide the fate of a feature that AMD refused to contribute to and QA for their customers, not NVIDIA.
If it is relatively trivial, Mr. Huddy should have done it himself. The Unreal engine does not support in game AA, so we added it and QA'ed it for our customers. As Eidos confirmed (Not allowed to post links here, but check PCper for Eidos' statement) AMD refused the same opportunity to support gamers with AA on AMD GPUs. I'm sure Mr. Huddy knows how important QA is for game developers. I recommend AMD starts working with developers to make their HW work in a proper way. That's not our job. We added functionality for NVIDIA GPUs into the game. We did not lock anything out. AMD just did not do their work. This happened with previous UE3 engine titles before, where ATI owners had to rename the executable to make AA work on that title (Bioshock, for example). It's not NVIDIA to blame here."
As you can see, an nVidia representative stated that the Batman AA code is not nVidia's property, but Eidos'. Secondly, Lars stated that nVidia "did not lock anything out." Unfortunately, neither of those two statements is true.
Richard Huddy then followed up with a post revealing the following comment: "AMD received an email dated Sept 29th at 5:22pm from Mr. Lee Singleton, General Manager at Eidos Game Studios, who stated that Eidos' legal department is preventing Eidos from allowing ATI cards to run in-game antialiasing in Batman: Arkham Asylum due to NVIDIA IP ownership issues over the antialiasing code, and that they are not permitted to remove the vendor ID filter.
NVIDIA has done the right thing in bowing to public pressure to renounce anti-competitive sponsorship practices and given Eidos a clear mandate to remove the vendor ID detect code that is unfairly preventing many of Eidos' customers from using in-game AA, as per Mr. Weinand's comments. I would encourage Mr. Singleton at Eidos to move quickly and decisively to remove NVIDIA's vendor ID detection."
We asked AMD to release the complete e-mail conversation to us, and the company provided the following e-mail thread to Hexus and BSN*:
From: Lee Singleton
Sent: 29 September 2009 18:06
To: Huddy, Richard
Subject: RE: Multisampling Anti-Aliasing in Batman: Arkham Asylum
I have taken legal advice from our general counsel, who has advised us not to pursue a route which involves changing code that nVidia wrote. I am not prepared to go into any further details and share privileged information. We are working very hard to find a solution for ATI so please respect our position in this situation.
From: Huddy, Richard
Sent: 29 September 2009 17:30
To: Lee Singleton
Subject: RE: Multisampling Anti-Aliasing in Batman: Arkham Asylum
Can you please be very specific about what your legal staff has recommended.
I believe that NVIDIA's code path is using only the DirectX API (except where it specifically shuts out AMD hardware), so I'm unclear why this would be an issue.
Ideally I'd like to understand this response so I can explain it to my management, rather than simply parrot it to my management. I suspect that if I just recite this to my management they'll treat it with great skepticism.
Richard "7 of 5" Huddy
Worldwide Developer Relations Manager, AMD's GPU Division
From: Lee Singleton
Sent: Tuesday, September 29, 2009 5:22 PM
To: Huddy, Richard
Subject: RE: Multisampling Anti-Aliasing in Batman: Arkham Asylum
We have worked closely with our local legal team today and we have been advised that we should not reuse or change the code written by nVidia. If ATI have robust sample code we can use it will accelerate any fix, if not Rocksteady will need to start from scratch.
From: Huddy, Richard
Sent: 29 September 2009 17:09
To: Lee Singleton
Subject: RE: Multisampling Anti-Aliasing in Batman: Arkham Asylum
I believe this technique is very closely related to a technique which we've seen NVIDIA recommend before now - so actually it may well fit very well with the code that they've given you...
Richard "7 of 5" Huddy
Worldwide Developer Relations Manager, AMD's GPU Division
If you re-read the e-mail communication, you can see two things. First of all, Batman's AA code is nVidia's property, not Eidos', thus rebutting the statement at the top of this page.
Secondly, Eidos asked AMD to provide "robust sample code". To this date, AMD has failed to do so, arguing that nVidia's method is the same as AMD's sample code. Given the fact that you can turn on in-game AA by changing the vendor ID to nVidia, nothing provided by Eidos or nVidia proves otherwise.
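For readers wondering what a "vendor ID filter" amounts to in practice, here is a hypothetical sketch. The PCI vendor IDs are real [0x10DE belongs to nVidia, 0x1002 to ATI/AMD], but the function and menu logic are our invention for illustration; the sketch also shows why simply spoofing the reported ID, as forum users did, unlocks the in-game AA option:

```python
NVIDIA_VENDOR_ID = 0x10DE  # PCI vendor ID assigned to nVidia
ATI_VENDOR_ID = 0x1002     # PCI vendor ID assigned to ATI/AMD

def ingame_aa_available(reported_vendor_id):
    """Hypothetical vendor ID lock: the in-game AA menu is exposed only
    when the driver reports an nVidia GPU."""
    return reported_vendor_id == NVIDIA_VENDOR_ID

print(ingame_aa_available(ATI_VENDOR_ID))     # False: menu hidden on Radeons
print(ingame_aa_available(NVIDIA_VENDOR_ID))  # True

# Why the forum "hack" works: spoof the reported ID and the check passes,
# even though the rendering still runs entirely on ATI silicon.
spoofed_id = NVIDIA_VENDOR_ID  # a Radeon setup tweaked to report 0x10DE
print(ingame_aa_available(spoofed_id))        # True, on ATI hardware as well
```

Note that nothing in such a check touches the rendering code itself, which is exactly why flipping the ID yields working AA rather than the corruption Eidos reportedly feared.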
Assassin's Creed and the DirectX 10.1 Affair
As you can see from the e-mail exchange above, there are multiple ways to interpret what is going on. But the underlying problem of this whole Batmangate affair is that a similar situation happened a year and a half ago. In spring 2008, Assassin's Creed was the hot title, and at first glance it had everything: superb gameplay, excellent graphics, use of the DirectX 10.1 API... and it was an nVidia TWIMTBP title that ran better on ATI hardware. My analysis of the Assassin's Creed DirectX 10.1 issue was published on May 8th on TG Daily and Tom's Hardware.
No question about the statement above: DirectX 10.0 hardware does another pass when AA is enabled, and in the case of 4x AA, use of the DirectX 10.1 code path saves around 25%. By some odd coincidence, this was the difference in performance between the ATI Radeon 3800 and the nVidia GeForce 8800/9800 generation. Ubisoft reacted in a very interesting way: the company announced that the code was breaking apart on nVidia hardware due to the use of DX10.1 and that the first patch would disable DirectX 10.1. Indeed, when the patch arrived, any issues that nVidia hardware had disappeared, but so did DX10.1 support.
During that case, I spoke with Michael Beadle [PR, Ubisoft] and Jade Raymond [Producer, Assassin's Creed], who went on-the-record and stated that nVidia was not a factor in the DX10.1 removal. Unfortunately for both Ubisoft and nVidia, the situation didn't develop well when Derek Perez, then Director of Public Relations at nVidia, stated that "nVidia never paid for and will not pay for anything with Ubi[soft]. That is a completely false claim."
The inconvenient truth came out when Ubisoft's own Michael Beadle stated that "there was a [co-marketing] money amount, but that [transaction] was already done. That had nothing to do with the development team or with Assassin's Creed." Back then, we were party to a heated off-the-record conversation in which I was told that Roy Taylor, then the lead man for the TWIMTBP program, contacted Ubisoft and threatened to pull marketing support for all Ubisoft titles, a sum of around two million dollars, unless the case of Assassin's Creed was remedied. You can treat this off-the-record conversation as hearsay or the truth, but neither nVidia nor Ubisoft issued a public rebuttal of the analysis published on May 8, 2008.
Was Batman: Arkham Asylum and this whole MSAA affair the case of Assassin's Creed in 2009? AMD is crying foul as the party that was damaged in the past. Unfortunately for AMD, every story has two sides.
Yes, Ubisoft moved to remove the DirectX 10.1 path, but the issues experienced by nVidia users were nothing to sneeze at: one of the render passes went missing as a consequence of the DX10.1 path, and that render pass featured dust particles; lights were bleeding through walls; and there were reported cases of instability, i.e. the game crashing on nVidia hardware. Naturally, Ubisoft had to move to correct those issues. The inconvenient truth for AMD is that the company cried foul over nVidia instead of stepping up to the plate and fixing the DirectX 10.1 code. It was much easier to create a stir among AMD fanboys and the press than to work closely with the developer to get DirectX 10.1 working properly. The question of those two million dollars will remain open for eternity, though.
Next page: What does nVidia think about all of this?
When the issue of MSAA menu first arose, nVidia issued the following statement:
"A representative of AMD recently claimed that NVIDIA interfered with anti-aliasing (AA) support for Batman: Arkham Asylum on AMD cards. They also claimed that NVIDIA's The Way It's Meant to be Played Program prevents AMD from working with developers for those games.
Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
Games in The Way It's Meant to be Played are not exclusive to NVIDIA. AMD can also contact developers and work with them.
We are proud of the work we do in The Way It's Meant to be Played. We work hard to deliver kickass, game-changing features in PC games like PhysX, AA, and 3D Vision for games like Batman. If AMD wants to deliver innovation for PC games then we encourage them to roll up their sleeves and do the same."
Now, we would call that statement nice PR: it calls out AMD's FUD strategy on a lot of things, but it also does not reveal what happened a while ago with Assassin's Creed. What actually matters is a phone call we had with nVidia last night. According to nVidia, the problem that Eidos allegedly discovered is a "lack of Anti-Aliasing support on FP16 textures in R:G:B:A format on specific ATI hardware."
This is a somewhat big issue for both companies. All ATI hardware between the Radeon 9700 and the Radeon X1000 series indeed didn't support MSAA while running HDR [High Dynamic Range], as stated by Tim Sweeney himself. The ATI Radeon X1800 was the first GPU that supported "HDR+AA", which was public information at the time the first Radeon X1800 reviews came out. Unfortunately for nVidia, the company didn't support HDR+AA until the launch of the GeForce 8800, one year and two weeks after ATI launched its HDR+AA part. As you might have already guessed, Batman: Arkham Asylum utilizes HDR for a better gaming experience, and if you start Batman: Arkham Asylum on old hardware such as a Radeon 9700/9800/X800/X850 or, indeed, an nVidia GeForce FX5900/6800/7800/7900, you will get the same problem: AA yes, HDR no. Or HDR yes, AA no.
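The HDR+AA timeline above can be condensed into a small lookup table. This is a simplified sketch covering only the generations named in this article, not an exhaustive hardware database:

```python
# Whether a GPU generation could combine FP16 HDR rendering with MSAA,
# per the timeline described above (simplified and illustrative only).
HDR_PLUS_AA = {
    "ATI Radeon 9700-X850": False,  # no MSAA on FP16 render targets
    "ATI Radeon X1800":     True,   # first GPU advertising "HDR+AA"
    "nVidia GeForce 6/7":   False,  # FP16 blending, but no MSAA on FP16
    "nVidia GeForce 8800":  True,   # HDR+AA supported from G80 onward
}

def can_run_hdr_and_aa(gpu):
    """Look up a GPU generation; unknown hardware is assumed incapable."""
    return HDR_PLUS_AA.get(gpu, False)

print(can_run_hdr_and_aa("ATI Radeon X1800"))    # True
print(can_run_hdr_and_aa("nVidia GeForce 6/7"))  # False: HDR yes, AA no
```

The key point for Batmangate is that no card on this "False" list is hardware Batman: Arkham Asylum targets in 2009, which is what makes the FP16 explanation so hard to swallow.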
Furthermore, we checked Microsoft's DirectX table of all supported graphics cards and Anti-Aliasing modes, and we could not find a single card from the past three years that does not support AA on FP16 RGBA textures. We also asked game developers to answer our queries, and the answer from a developer at an AAA company that has delivered multiple TWIMTBP titles was quite interesting: "If Rocksteady implemented an FP16 texture that ATI hardware does not support, it is just something that would be implied by NVidia. In any case, if true - I call this move bullshit. How many UE-based games have AA regularly supported by both ATI and NVidia when enabling AA in the Control Panel? This is too murky."
You might wonder who this developer is, and we hope to get him on the record, in which case the situation would really get complicated.
Finally, following our talk with Bryan Del Rizzo and Brian Burke, we were referred to the following quote published on PC Perspective:
"In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an e-mail conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined."
We can debate this over and over and over again: was nVidia right to use a Vendor ID trick to lock AMD out of the nVidia-built AA, or not? After all, if the issue was so trivial, why didn't AMD offer the same code, but without the vendor lock?
The problem with this situation is: what if AMD decides to retaliate and lock a specific Tessellation code path out of Fermi? In all benchmarks so far [even the non-AA ones], ATI produces lower results than nVidia hardware as a direct result of the way nVidia hardware runs games powered by Unreal Engine 3. Do bear in mind that the old Unreal Tournament 3 ran at 45 frames per second with all the bells and whistles at 1920x1200 on a GeForce 8600 GTS 256MB. This was nothing but a wet dream on an ATI card from the equivalent performance range.
Additionally, nVidia repeatedly asked about all the other titles where AMD failed to shine. The latest example is HardOCP's article comparing gaming performance in Resident Evil 5. As you probably imagined, the results are less than stellar for ATI hardware, including the latest Radeon HD 5870. In the conclusion, Mark Warner, as another independent journalist, wrote: "While we are sure AMD currently has a better GPU than NVIDIA, this game goes to show that AMD needs to step its game up when it comes to drivers. 45 days after launch and here we still have a AAA title with graphical bugs galore on its new hardware."
Bryan commented that it "seems that their decision to not invest in any actual developer relations initiatives is coming back to bite them." He followed with a well-positioned question: "So, why are we responsible for their misdeeds, exactly?"
Game Developers speak out
As you might agree, this discussion can go on and on until somebody cuts the Gordian Knot of Batmangate. Our sword is none other than the game developers themselves. As always when this kind of issue arises, I went and asked several key developers for their comments. In fact, this article was almost published yesterday, but I decided to wait for the answers from the people who understand all aspects of this situation.
Now, all the developers we spoke with have shipped titles measured in hundreds of thousands and even millions of units over the years, and even though they all requested to remain anonymous, I cannot express my gratitude enough for the openness of their comments. This is what an engine developer of multiple TWIMTBP titles had to say about Batmangate: "Yep, nVidia is being 'brilliant' all over again! But don't forget this is not the first time, you also have that Crysis affair that never got serious media traction." Personally, I think Crysisgate wasn't chewed over nearly as much by anybody - given the fact that the game didn't even work properly on nVidia hardware, while nVidia invested heavily in the development and moved their engineer to Germany for a couple of months. But then again, we all know the story of how EA screwed over Crytek and used the wrong [un-optimized] version for the gold master. The rest, as they say, is history.
On the subject of Vendor ID locking, their words were even less kind: "Personally, I despise such things and if they came to us with such an attitude of us putting a Vendor ID lock in the game, I would throw them out of the office. They can f**k off with such a low attitude... but I guess they know that, so they're not even daring to ask! :)"
The question we asked also touched on the subject of the Shader AA [software] versus ROP [fixed-function] approaches. Back in the day, ATI was known to push for Shader AA because its execution was botched on nVidia hardware, so ATI scored nicely in titles that used it - but ATI's own shader-based experiment fell flat with the release of the Radeon HD 2000 series. Interestingly, two developers commented on the subject, the first saying: "ATI learned that ROP Units are much better [for AA] than shaders with the 2900XT."
The re-appearance of fixed-function Anti-Aliasing on subsequent ATI architectures confirms the matter, as the other developer put it: "When it comes to maturing in the proper direction they learned their lessons on Xenos [Xbox 360] having hardware AA and R600 [HD 2000] going for shader AA. They fried their arses on R600 and went running back to ROP AA as fast as they could."
Another developer, working for a publisher that licensed Unreal Engine 3.0 for its titles, got back to us and gladly admitted that they use nVidia's help in developing their code: "we work on multiples [consoles, PCs] and their [nVidia's] support is invaluable to us. Console debugging is worth a million - seriously. But no, nobody from nVidia told me a damn thing about putting FSAA code in. Their Control Panel is good for us."
The discussion even brought some developers out of the woodwork who went on the record and disclosed the ways nVidia supports them through the TWIMTBP program. On XtremeSystems, a developer known as DilTech said that he "can tell you first hand NVidia supports developers with things besides money. Certain people here will tell you about the toy they had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as the consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi have never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on."
On the subject of that particular toy, I still have it. If we are on the same page, that "toy" used a very cool-looking dual-slot cooler and was clocked so high that nobody believed it. Naturally, it could never pass FCC certification or the OEM qualification process.
"They have a test lab with just about every possible configuration when it comes to nVidia hardware to test your application on to make sure it's going to work across a wide spectrum of hardware. You also have to remember that nVidia employ a lot more people than ATI. They have people, whose sole job IS testing said applications, finding the bugs, verifying if it's the game code or the driver, and giving a list of possible fixes to the developer. If anything, nVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money the consumer spends on their video card... How is that bad for the consumer?"
During my past 14 years of work experience, I had the privilege of working for two major publishers on their simulation titles and had a stint as CTO [Chief Technology Officer] at a Croatian start-up developer. Back in the day [2003-2004], I worked with Intel, AMD, ATI and nVidia, so I tasted the experience of working with all four vendors. Being a small start-up that had yet to sign a publisher, we didn't exactly have high hopes of getting that level of support from the giants of the industry.
After we contacted AMD, the company offered us AMD Athlon 64 based systems for development at no cost, effective as soon as we signed a publisher. Intel replied with an application to the Intel Developer Program. By paying a $500/year membership, we were entitled to use all of Intel's software [regardless of what you may or may not think - Intel has the best C++ compiler in the industry] and stood to receive an Intel development machine every 12 months. The machines consisted of a top-of-the-line CPU and motherboard, memory filled to the brim, multiple hard drives and a top-of-the-line graphics card. We are talking about a three-grand system for 500 bucks. For some reason, Intel preferred to use ATI cards even back then. Coming to the GPU vendors, things get really interesting, and you might get some context of what happened back in 2003 - and compare it with what happened with Batman: Anti-Aliasing... pardon, Arkham Asylum.
We received visits from Cyril and Karen from nVidia, and Richard and Kevin from ATI. The results of those first visits were interesting: nVidia sent us an application to TWIMTBP even though we didn't have a publisher. We were told that nVidia usually doesn't do that [offer TWIMTBP membership to start-ups unless they look promising], but those first visits resulted in nVidia sending us a shipload of hardware - each of our systems ended up with a GeForce FX 5850 or a 6800, and we even got a Quadro SDI for post-production. ATI also sent out several cards, but the level of involvement was nowhere near the attention we got from nVidia, followed by Intel and AMD.
To this day, that developer has yet to receive a single dime of marketing money from nVidia, yet the support the team received was invaluable in creating a casual game vendor that has released eight games since 2003 [and several others under anonymity contracts]. This independent team cannot compete on AA or AAA titles, yet nVidia is there to support teams like this in that crucial incubation stage.
Number of games AMD is currently focusing on - DX11 titles for 2009 and 2010
To round off this part: nVidia's TWIMTBP program features over 380 titles on the nZone website, and we know they missed a few [add-ons for WoW, for instance], while AMD is touting nine DirectX 11 titles the company is supporting. Then again, independent analysts such as Jon Peddie cited AMD's effort as a key instrument in the fastest adoption of a new API in the history of Microsoft Windows as a gaming platform. We tend to agree with Jon on this one - though another reason is that more than 10% of all Steam users ran Windows 7 beta and RTM versions months prior to release. nVidia, for its part, is currently focusing on no less than several dozen titles to be released in 2010.
Conclusion: My own two cents
There are only two conclusions you can draw. The first concerns the subject of this analysis, Batmangate. After carefully reviewing the statements released by all sides and talking to developers, there isn't much left to debate. Forum members on the Batman: Arkham Asylum forums and AMD themselves all changed the vendor ID on their cards and got functionality equal to that experienced on nVidia cards. This happened at the same time nVidia representatives claimed both that the Batman AA code is proprietary to Eidos and that no vendor ID lock is implemented in the code. "Hand in the cookie jar" is the only parallel we can draw here.
With its engineering resources spread across DirectX 11 titles, it is natural that AMD could not dedicate itself to making Batman: AA code. But what is unforgivable on both sides is the fact that Unreal Engine 3 has been out for four years now and neither AMD nor nVidia developed a way to select AA in-game for the engine itself. Then again, ultimately the responsibility for FSAA lies with neither nVidia nor AMD. For a feature that became a standard in 2000, i.e. nearly 10 years ago, it is irresponsible that Epic Games didn't offer built-in AA support as new versions of Unreal Engine rolled out. Forced to develop multi-platform titles to survive, developers such as Rocksteady have thin resources to spend on developing PC-only features, especially in a recession year when so many great studios closed their doors.
Secondly, if AMD is going to constantly criticize nVidia as a company and the strategies it deploys, it would be nice to finally hear some positive AMD-related experiences from developers. Naturally, AMD needs to earn that respect first - while on the USS Hornet, AMD disclosed the list of DirectX 11 titles that are coming out, and if those developers don't start publicly saying that AMD's support is great, something will have to change.