Titan has arrived. The world’s largest and most advanced piece of consumer silicon is becoming a retail product. Just weeks after the chip codenamed GK110 (G = GPU, K = Kepler, 110) became available as the Tesla K20, K20C and K20X, Nvidia unveiled a gaming version dubbed the GeForce GTX Titan. With a launch sequence like that, there’s no doubt in our minds that a Quadro-branded GK110 is coming to market as well. In this article, we will focus on the new features and the differences between the Tesla K20 and the GeForce GTX Titan.

Meet the Nvidia GeForce GTX Titan

First, we have to clarify the scope of this analysis. As you can see from the marked slides in this article, Nvidia is for now releasing only information about the product itself, dubbed the GeForce GTX Titan. Actual performance benchmarks will be released on Thursday, February 21st at market open (6AM Pacific, 9AM Eastern, 3PM Central European, 10PM China).

Why (not) Titan(ium)?

Typically, the name "Titanium" in Nvidia’s product branding was reserved for special editions and extra-value items. The tradition began with the GeForce 3 Titanium 200 and 500 and continued with the Titanium branding carried over to the GeForce 4 Ti and beyond. Today, 11 years later, you can buy a Titanium edition of the GeForce GTX 660, for example. However, that is not the case with the latest board from Nvidia. The name Titan is a proper tribute to the Titan supercomputer located at the Oak Ridge National Laboratory in Tennessee.


Furthermore, if our conversations with members of the Nvidia team are correct, the project was targeted as a present for the 50th birthday of Jen-Hsun Huang, venerable co-founder and CEO of Nvidia – a present that gamers with deep pockets and video studio owners could purchase for themselves. Unfortunately, the product did not make the launch in time for the birthday (three days overdue), but the delay was a welcome one – all in a bid to ensure the best possible state of software and hardware, which you will have the opportunity to look at in our comprehensive review.

GeForce GTX Titan: The Board
At first glance, the heatsink bears a resemblance to that of the GeForce GTX 690 4GB, but this is where the similarities end. If we compare the Titan against the Tesla, we can say that "Titan is a Tesla with a beach body." The complex PCB is almost identical to the Tesla K20’s – the back is left uncovered in order to accelerate heat dissipation, while the front is a continuation of the design language introduced with the GeForce GTX 690: unlike the sturdy plastic used on Tesla and Quadro boards, this heatsink shroud is made of chromium-plated aluminum and magnesium.

Titan GPU Specifications

Beneath the heatsink, you’ll find the GK110 GPU and no less than 6GB of GDDR5 memory, the same amount as on the Tesla K20X, a $4,500 product. The back panel features two Dual-Link DVI ports, a welcome move for the rare owners of older 4K displays. Furthermore, you’ll find an HDMI 1.4 port, capable of driving a 4K panel all on its own, as well as DisplayPort 1.2. Naturally, the display limitations from the GeForce GTX 680 and 690 are still present: you can drive either four regular displays or two 4K panels. Still, this is a great improvement over the Fermi graphics architecture.
As you can see in our pictures, the board definitely carries serious oomph, especially when paired with a second or a third board.

The Heart of the Titan
There’s no doubt that the centerpiece of attention on the GTX Titan is its massive piece of silicon. The GK110 represents the finest the semiconductor industry can offer, with 7.08 billion transistors (Nvidia cites 7.1 billion for consumer audiences and 7.08 billion for engineering-minded ones). This is probably the most complex chip ever to enter TSMC’s GigaFab facilities in Taiwan. The massive array of 15 SMX clusters has just a single cluster disabled, leaving 2688 CUDA cores. The cores are clocked at 837MHz, almost 100MHz faster than the stability-focused Tesla cards. The chip also packs a 384-bit memory controller. Given that the memory operates at a full 1.5GHz QDR, we are looking at six billion transfers per second, or 288GB/s. This is identical to AMD’s Radeon HD 7970 GHz Edition, which also carries a 384-bit memory controller.
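As a back-of-the-envelope check, the 288GB/s figure follows directly from the bus width and the effective transfer rate (a quick sketch of the arithmetic, not Nvidia’s own math):

```python
# GDDR5 at a 1.5GHz command clock moves data at quad data rate (QDR),
# so each pin transfers 6 billion bits per second.
bus_width_bits = 384
transfers_per_second_g = 1.5 * 4       # 6.0 GT/s per pin
bandwidth_gb_s = bus_width_bits / 8 * transfers_per_second_g
print(bandwidth_gb_s)  # 288.0
```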

When it comes to capability, the Titan and the K20 differ greatly. It is praiseworthy that Nvidia did not touch the raw power of the card, which brings 4.5 TFLOPS of single-precision (FP32) or 1.3 TFLOPS of double-precision (FP64) compute. This is where the similarities end, however, as the company disabled the much-hyped Hyper-Q queuing feature, as well as Error Checking and Correcting (ECC) memory.
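The single-precision figure checks out against the core count and clock quoted above, assuming each CUDA core retires one fused multiply-add (two FLOPs) per cycle – a common rule of thumb, sketched here rather than taken from Nvidia’s own documentation:

```python
# Peak FP32 throughput estimate for the GTX Titan.
cuda_cores = 2688      # 14 of 15 SMX clusters enabled, 192 cores each
clock_ghz = 0.837      # base clock quoted above
flops_per_core = 2     # one fused multiply-add per core per cycle
fp32_tflops = cuda_cores * flops_per_core * clock_ghz / 1000
print(round(fp32_tflops, 2))  # 4.5
```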

Overclocker-Friendly: Overvolting with a Default 50-Watt Overclocking Range

Look, the overvolting is back

The ban on overvolting partner boards, which Nvidia’s vendors blamed on the so-called Project Green Light, has finally been overturned, and the GTX Titan is going back to its roots. The hardware now supports an "OverVoltage" feature, which has to be visibly marked for users. At the time of writing, it wasn’t known how high the boards can be overclocked, but don’t be surprised to see 1GHz+ clocks among the overclocking community. We know partners are already preparing factory-overclocked boards, with a 915MHz clock being mentioned as the "safe operating range."

Goodbye, 60Hz on LCDs! Nvidia Introduces Display Overclocking

Driving the display at 80Hz, for example.

Maybe the most important part of the GTX Titan for the gaming experience is that the GK110 display hardware will actively track the monitor you are using and adjust to it. With Titan, the company finally addresses the 60Hz gameplay bottleneck. According to Nvidia, playing games with Vsync on at a 60Hz refresh rate fixes the framerate at 60fps, and gamers never see the missing 20 or so frames their card could have delivered. Display overclocking changes that. If you drive the display with the GTX Titan, the refresh rate will follow the actual framerate achieved by the game. Thus, you can turn all the bells and whistles on and enjoy gameplay at 80Hz/80fps, should your graphics card be able to achieve 80fps. In any case, we believe this is a great feature and a continuation of Nvidia’s Vsync research, which started several years ago. If your display is capable, you get 20-30Hz "for free."
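The arithmetic behind the "missing frames" claim can be illustrated with a deliberately simplified model (it ignores frame-time variance and assumes a hard Vsync cap):

```python
# With Vsync enabled, the displayed framerate is the smaller of the
# game's render rate and the display's refresh rate.
def displayed_fps(render_fps: int, refresh_hz: int) -> int:
    return min(render_fps, refresh_hz)

print(displayed_fps(80, 60))  # 60 -- a 60Hz panel hides 20 rendered frames
print(displayed_fps(80, 80))  # 80 -- overclocking the display recovers them
```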

The Future of GTX 690

The GTX Titan at a glance… impressive specs indeed.

According to Nvidia representatives we spoke with, the company has no plans to cancel production of the GeForce GTX 690 4GB (the same PCB is used in production of the Tesla K10). Thus, the GTX Titan and the GTX 690 will both remain part of the same line-up, at the same $999 price point and with similar availability. It is up to you to decide which you prefer. If quad-GPU or more compact dual-GPU configurations are your thing, the GTX 690 might prove a worthy investment. In most other cases, the GK110 wins hands down on features and lower power consumption alone.