When the issue of the missing MSAA menu first arose, nVidia issued the following statement: "A representative of AMD recently claimed that NVIDIA interfered with anti-aliasing (AA) support for Batman: Arkham Asylum on AMD cards. They also claimed that NVIDIA’s The Way It’s Meant to be Played Program prevents AMD from working with developers for those games. Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing. Games in The Way It’s Meant to be Played are not exclusive to NVIDIA. AMD can also contact developers and work with them. We are proud of the work we do in The Way It’s Meant to be Played. We work hard to deliver kickass, game-changing features in PC games like PhysX, AA, and 3D Vision for games like Batman. If AMD wants to deliver innovation for PC games then we encourage them to roll up their sleeves and do the same."
We called the statement a nice piece of PR: it calls out AMD's FUD strategy on a number of points, but it also avoids revealing what happened a while ago with Assassin's Creed. What actually matters is a phone call we had with nVidia last night. According to nVidia, the problem that Eidos allegedly discovered is a "lack of Anti-Aliasing support on FP16 textures in R:G:B:A format on specific ATI hardware."
This is a fairly big issue for both companies. All ATI hardware from the Radeon 9700 up to, but not including, the Radeon X1000 series indeed did not support MSAA while running HDR [High Dynamic Range], as stated by Tim Sweeney himself. The ATI Radeon X1800 was the first GPU to support "HDR+AA", which was public information by the time the first Radeon X1800 reviews came out. Unfortunately for nVidia, the company did not support HDR+AA until the launch of the GeForce 8800, one year and two weeks after ATI launched its HDR+AA part. As you might have guessed, Batman: Arkham Asylum uses HDR for a better gaming experience, and if you start the game on old hardware such as a Radeon 9700/9800/X800/X850 or, indeed, a GeForce FX 5900/6800/7800/7900, you hit the same problem. AA yes, HDR no. Or HDR yes, AA no.
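The split described above can be summed up in a short sketch. This is an illustrative Python snippet, not real engine or driver code: the table of generations restates the facts above, and the helper function is hypothetical. On Windows, an engine would actually probe this capability via IDirect3D9::CheckDeviceMultiSampleType against an FP16 format such as D3DFMT_A16B16G16R16F.

```python
# Hypothetical illustration of the HDR+AA capability split described
# in the text; the dictionary and helper are ours, not a real API.

HDR_PLUS_AA = {
    # ATI: MSAA on FP16 render targets arrived with the X1800 (R520)
    "Radeon 9700": False, "Radeon X850": False, "Radeon X1800": True,
    # nVidia: not until the GeForce 8800 (G80), over a year later
    "GeForce 7900": False, "GeForce 8800": True,
}

def render_modes(gpu: str) -> str:
    """Return which combination of HDR and MSAA a given GPU can run."""
    if HDR_PLUS_AA.get(gpu, False):
        return "HDR + AA"
    return "HDR or AA, not both"

print(render_modes("Radeon X1800"))  # HDR + AA
print(render_modes("GeForce 7900"))  # HDR or AA, not both
```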
Furthermore, we checked Microsoft's DirectX table of supported graphics cards and anti-aliasing modes, and we could not find a single card from the past three years that does not support AA on FP16 RGBA textures. We also asked game developers to answer our queries, and the answer from a developer at an AAA company that has delivered multiple TWIMTBP titles was quite interesting: "If Rocksteady implemented FP16 texture that ATI hardware does not support, it is just something that would be implied by NVidia. In any case, if true - I call this move bullshit. How many UE-based games have AA regularly supported by both ATI and NVidia when enabling AA in Control Panel. This is too murky."
You might wonder who this developer is; we hope to get him on the record, since that would really complicate the situation.
Furthermore, following our talk with Bryan Del Rizzo and Brian Burke, we were referred to the following quote published on PC Perspective: "In the case of Batman's AA support, NVIDIA essentially built the AA engine explicitly for Eidos - AA didn't exist in the game engine before that. NVIDIA knew that this title was going to be a big seller on the PC and spent the money/time to get it working on their hardware. Eidos told us in an e-mail conversation that the offer was made to AMD for them to send engineers to their studios and do the same work NVIDIA did for its own hardware, but AMD declined."
We can debate this over and over again - was nVidia right to use a Vendor ID check to lock AMD out of the nVidia-built AA code or not? After all, if the issue was so trivial, why didn't AMD offer the same code, just without the vendor lock?
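For context, a Vendor ID lock is mechanically trivial: the game reads the GPU's PCI vendor ID (0x10DE for nVidia, 0x1002 for ATI/AMD - these IDs are real) and exposes the feature only when it sees nVidia's ID. The sketch below is our hypothetical illustration, not Rocksteady's actual code; on Windows, the ID would come from IDirect3D9::GetAdapterIdentifier.

```python
# Real PCI vendor IDs; the gating function is a hypothetical
# illustration of how such a lock works, not the game's actual code.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI    = 0x1002  # ATI/AMD

def msaa_menu_enabled(vendor_id: int) -> bool:
    """Expose the in-game MSAA menu only on nVidia hardware --
    the kind of Vendor ID lock debated above."""
    return vendor_id == VENDOR_NVIDIA

print(msaa_menu_enabled(VENDOR_NVIDIA))  # True
print(msaa_menu_enabled(VENDOR_ATI))     # False
```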
The problem with this situation is what happens if AMD decides to retaliate - and lock out a specific tessellation code path on Fermi, for instance. In all benchmarks so far [even the non-AA ones], ATI produces lower results than nVidia hardware as a direct result of how nVidia hardware runs games powered by Unreal Engine 3. Bear in mind that the old Unreal Tournament 3 ran at 45 frames per second with all the bells and whistles at 1920x1200 - on a GeForce 8600 GTS 256MB. That was nothing but a wet dream for an ATI card in the equivalent performance range.
Additionally, nVidia repeatedly asked about all the other titles where AMD failed to shine. The latest example is HardOCP's article comparing gaming performance in Resident Evil 5. As you probably imagined, the results are less than stellar for ATI hardware, including the latest Radeon HD 5870. In his conclusion, Mark Warner, another independent journalist, wrote: "While we are sure AMD currently has a better GPU than NVIDIA, this game goes to show that AMD needs to step its game up when it comes to drivers. 45 days after launch and here we still have a AAA title with graphical bugs galore on its new hardware."
Bryan commented that it "seems that their decision to not invest in any actual developer relations initiatives is coming back to bite them."
This was followed by a well-positioned question: "So, why are we responsible for their misdeeds exactly?"

Game Developers speak out
As you might agree, this discussion can go on and on until somebody comes along and cuts the Gordian Knot of Batmangate. Our sword is none other than the game developers themselves. As always when this kind of issue arises, I went and asked several key developers for their comments. In fact, this article was almost published yesterday, but I decided to wait for answers from the people who understand all aspects of this situation.
All of the developers we spoke with have shipped titles measured in hundreds of thousands of units, and even millions, over the course of years. Even though they all requested to remain anonymous, I cannot express enough gratitude for the openness of their comments. This is what an engine developer of multiple TWIMTBP titles had to say about Batmangate: "Yep, nVidia is being 'brilliant' all over again! :( But don't forget this is not the first time, you also have that Crysis affair that never got serious media traction."
Personally, I think Crysisgate was never really chewed over by anybody - given that the game didn't even work properly on nVidia hardware, even though nVidia invested heavily in its development and relocated an engineer to Germany for a couple of months. But then again, we all know the story of how EA screwed up CryTek and used the wrong [un-optimized] build for the gold master. The rest, as they say, is history.
On the subject of Vendor ID locking, their words were even less kind: "Personally, I despise such things and if they would appear with such attitude of us putting a Vendor ID lock in the game, I would throw them out of the office. They can f**k off with such a low attitude... but I guess they know that so they're not even daring to ask! :)"
Our questions also touched on the subject of Shader AA [software] versus ROP [fixed-function] anti-aliasing. Back in the day, ATI was known to push for Shader AA because its execution was botched on nVidia hardware, so ATI scored nicely in titles that used it - but ATI's own Shader AA experiment fell flat with the Radeon HD 2000 series. Interestingly, two developers commented on the subject. The first: "ATI learned that ROP Units are much better [for AA] than shaders with the 2900XT."
The reappearance of fixed-function anti-aliasing on subsequent ATI architectures bears that out, as the other developer confirmed: "When it comes to maturing in the proper direction they learned their lessons on Xenos [Xbox 360] having hardware AA and R600 [HD 2000] going for shader AA. They fried their arses on R600 and went running back to ROP AA as fast as they could."
Another developer, who works for a publisher that licensed Unreal Engine 3.0 for its titles, got back to us and gladly admitted that they use nVidia's help in developing their code: "we work on multiples [consoles, PCs] and their [nVidia's] support is invaluable to us. Console debugging is worth a million - seriously. But no, nobody from nVidia told me a damn thing about putting FSAA code in. Their Control Panel is good for us."
The discussion even brought some developers out of the woodwork to go on the record and disclose the ways nVidia supports them through the TWIMTBP program. On XtremeSystems, a developer known as DilTech said that he "can tell you first hand NVidia supports developers with things besides money. Certain people here will tell you about the toy they had in our hands back when the 7800GTX 512MB came out. They give developers hardware that never sees the light of day as far as the consumers know, just to make sure they have a way to test their code that isn't a software renderer. I can tell you right now that ATi have never put a single piece of hardware in our hands... and it's not like we haven't asked for anything to test on."
On the subject of that particular toy, I still have it. If we are on the same page, that "toy" used a very cool-looking dual-slot cooler and was clocked so high that nobody believed it. Naturally, it could never pass FCC certification or the OEM qualification process. ;)
DilTech continued: "They have a test lab with just about every possible configuration when it comes to nVidia hardware to test your application on to make sure it's going to work across a wide spectrum of hardware. You also have to remember that nVidia employ a lot more people than ATI. They have people whose sole job IS testing said applications, finding the bugs, verifying if it's the game code or the driver, and giving a list of possible fixes to the developer. If anything, nVidia does more for the game industry than ATi has ever dreamed of doing, and pays for it with the money the consumer spends on their video card... How is that bad for the consumer?"
Over the past 14 years, I have had the privilege of working for two major publishers on their simulation titles, and I had a stint as CTO [Chief Technology Officer] at a then-young Croatian start-up developer. Back in the day [2003-2004], I worked with Intel, AMD, ATI and nVidia, so I have tasted the experience of working with all four vendors. Being a small start-up that had yet to sign a publisher, we didn't exactly have high hopes of getting much support from the giants of the industry.
After we contacted AMD, the company offered us AMD Athlon 64 based development systems at no cost, to be delivered as soon as we signed a publisher. Intel replied with an application to the Intel Developer Program. By paying a $500/year membership, we were entitled to use all of Intel's software [regardless of what you may or may not think, Intel has the best C++ compiler in the industry] and stood to receive an Intel development machine every 12 months. The machines consisted of a top-of-the-line CPU and motherboard, memory filled to the brim, multiple hard drives, and a top-of-the-line graphics card. We are talking about a three-grand system for 500 bucks. For some reason, Intel preferred to use ATI cards even back then. Coming to the GPU vendors, things get really interesting, and you will get some context for what happened back in 2003 - and can compare it with what happened with Batman: Anti-Aliasing... pardon, Arkham Asylum.
We received visits from Cyril and Karen from nVidia, and from Richard and Kevin from ATI. The results of those first visits were interesting: nVidia sent us an application to TWIMTBP even though we didn't have a publisher. We were told that nVidia usually doesn't offer TWIMTBP membership to start-ups unless they look promising, but those first visits resulted in nVidia sending us a shipload of hardware - each of our systems ended up with a GeForce FX 5850 or 6800, and we even got a Quadro SDI for post-production. ATI also sent out several cards, but the level of involvement was nowhere near the attention we got from nVidia, followed by Intel and AMD.
To this date, that developer has yet to receive a single dime of marketing money from nVidia, yet the support the team got from the company was invaluable in creating a casual-game vendor that has released eight games since 2003 [and several others under anonymity contracts]. This independent team cannot compete on AA or AAA titles, but nVidia is there to support teams like this at that crucial incubating stage.

[Image: Number of games AMD is currently focusing on - DX11 titles for 2009 and 2010]
To round off this part: nVidia's TWIMTBP program features over 380 titles on the nZone website - and we know they missed a few [add-ons for WoW, for instance] - while AMD is touting nine DirectX 11 titles the company is supporting. Then again, independent analysts such as Jon Peddie have cited AMD's effort as a key instrument in the fastest adoption of a new API in the history of Microsoft Windows as a gaming platform. We tend to agree with Jon on this one - though another reason is that more than 10% of all Steam users were running Windows 7 beta and RTM versions months before its release. For its part, nVidia is currently focusing on no fewer than several dozen titles slated for release in 2010.
© 2009 - 2013 Bright Side Of News*, All rights reserved.