The website ExpertReviews has leaked the announcement of the GTX Titan along with multiple pictures confirming what has been said so far.
Perhaps someone forgot to send the memo about the postponed release of the GTX Titan to ExpertReviews! Or, more likely, they received it but forgot to reset their launch countdown, so their article hit the front page early. It didn’t take long for someone to notice. Thankfully, we have a full copy of their article, along with all the pictures they posted. They are surely NVIDIA’s official slides.
NVIDIA has confirmed the details of its cutting-edge GTX Titan, the fastest single-GPU video card in the world. The card is powered by the GK110 GPU, which holds 7.1 billion transistors. It is now confirmed that it will produce 4.5 TFLOPs of single-precision compute, which answers the questions about Titan’s compute-oriented design. It is also now official that Titan will require only 250W of power (through 8-pin + 6-pin power connectors).
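That 4.5 TFLOPs figure lines up with the GK110 shader configuration that has been circulating in rumours. As a rough sanity check (the core count and clock below are assumptions from those rumours, not from NVIDIA’s slides):

```python
# Rough single-precision throughput estimate for a GK110-based Titan.
# Core count and base clock are ASSUMED from circulating rumours,
# not confirmed by NVIDIA's slides.
cuda_cores = 2688              # rumoured: 14 SMX units x 192 cores each
base_clock_ghz = 0.837         # rumoured base clock
flops_per_core_per_cycle = 2   # one fused multiply-add per cycle

tflops = cuda_cores * base_clock_ghz * flops_per_core_per_cycle / 1000
print(f"{tflops:.2f} TFLOPs")  # prints "4.50 TFLOPs"
```

If those rumoured numbers hold, the quoted 4.5 TFLOPs is simply the theoretical peak at base clock, before any GPU Boost headroom.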
Moreover, EgyptHardware released a new performance chart comparing the GTX Titan, HD 7970 GE and GTX 680. If this chart is valid, it clearly shows why synthetic benchmarks are not always the best way to compare cards.
VSync and GPU Boost 2.0 – Keith H.
There are two features mentioned in these slides and the previous slides that are “new” to the Kepler family. The first is GPU Boost 2.0, which uses the terms Vrel (Relative Voltage) and Vrelnew (New Relative Voltage). It is still uncertain whether these are autonomous processes or user controlled. In previous Kepler architectures the maximum Power Target could be adjusted through software up to the amount specified in the GPU BIOS (each vendor could set this to any amount), but the new slides show this under “Advanced Controls”, which include Power Target and GPU Temp Target. Power Target is nothing new to anyone who uses GPU overclocking applications, but Temperature Target is. Previous Kepler cards would boost according to TDP and temperature, which is exactly what these controls would presumably adjust.

The Advanced Controls also have an “Unlock” section containing GPU Max Voltage, another setting familiar to people using GPU overclocking apps. The question, however, is whether NVIDIA has added these options to the stock controls of Titan. If so, have they been capped as with previous Kepler releases, or are they truly “unlocked” as the slide says? This is all very exciting for the overclocking crowd, because in the past, getting past the stock NVIDIA-limited Power Target meant unlocking the BIOS with something like the Kepler BIOS Editor/Unlocker (developed by XtremeSystems user “CrazyNutz”). The GPU Temperature Target has never been adjustable, and raising it could greatly increase achievable boost speeds, although setting it too high could have dire consequences for your card and wallet. These new slides make you wonder whether these previously mod-only features may now be available to Titan owners without any outside adjustment.
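To make the dual-limit idea concrete, here is a toy model of how a boost controller might gate clock changes on both a power target and a temperature target, as the leaked slides describe. This is NOT NVIDIA’s actual algorithm: the 250W power target comes from the article, but the 80°C temperature target, 876MHz starting clock, and 13MHz step size are illustrative assumptions.

```python
# Toy model (not NVIDIA's real GPU Boost 2.0 algorithm) of a boost
# controller gated on BOTH a power target and a temperature target.
# All numeric defaults besides the 250W power target are assumptions.

def boost_step(clock_mhz, power_w, temp_c,
               power_target_w=250, temp_target_c=80,
               step_mhz=13, max_clock_mhz=1006):
    """Return the next boost clock given current power draw and temperature."""
    if power_w < power_target_w and temp_c < temp_target_c:
        # Headroom on both limits: step the clock up one boost bin.
        return min(clock_mhz + step_mhz, max_clock_mhz)
    # Either limit reached: back the clock off one bin.
    return max(clock_mhz - step_mhz, 0)

print(boost_step(876, 230, 70))  # both under target -> 889 (clock rises)
print(boost_step(876, 230, 85))  # temp target exceeded -> 863 (clock falls)
```

The point of the model is the `and`: raising the user-adjustable Temperature Target widens one of the two gates, which is why an adjustable temp target could meaningfully raise sustained boost clocks.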
The second feature shown in the previous article concerns VSync, which I assume is a new version of Adaptive VSync. The slide claims you can run VSync on and “enjoy 80 fps” while the card is producing 90 fps. Does this mean NVIDIA has adopted a technology similar to Lucid Logix’s Virtual-VSync, which allows VSync to be on with no tearing at any FPS? Is this the new iteration of Adaptive VSync that has been rumored? If so, why did NVIDIA limit it to 80 fps, when Lucid Logix’s technology allows any FPS, especially if they are telling us the card is producing 90 fps? Is the displayed rate always 10 FPS below what the card renders, or is 80 FPS simply the new 60 FPS?
Designed as a single-GPU replacement for the current top-end GTX 690, which is effectively two GTX 680 cores bolted to one PCB, the GTX Titan promises improved performance while using less power and producing less heat. A redesigned cooler with an extended aluminium heat stack dissipates heat faster than Nvidia’s current design, while the 90mm fan is tied to both RPM and voltage control to more accurately determine when to kick in. With a TDP of 250W, you’ll certainly need it.
SLI is fully supported, so if you have a capable power supply and bottomless pockets you could potentially run multiple Titans for high frame rates even at multi-monitor resolutions. Although Nvidia has yet to share exact benchmark results, some rough figures suggest games like Crysis 3, Far Cry 3 and Max Payne 3 can expect roughly twice the performance over a GTX 690 setup.
Perhaps more exciting news is the addition of GPU Boost 2.0, an evolution of the software introduced with Nvidia’s 600-series graphics cards. Built into the video driver, GPU Boost 2.0 will let Titan owners overclock and overvolt their cards, with higher limits than previous cards and optimisations for water-cooling setups.
It will also allow you to “overclock” your display, running it at a faster sync rate than it officially supports to squeeze out some extra frames per second. As an example, a monitor rated for only 60Hz refresh could run at up to 80Hz, meaning twenty extra frames per second are displayed.
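Those twenty extra frames translate directly into a shorter interval between displayed frames, which is a quick calculation:

```python
# Frame interval at each refresh rate: the 60Hz -> 80Hz display
# overclock shortens the time between displayed frames by ~4.2ms.
for refresh_hz in (60, 80):
    frame_time_ms = 1000 / refresh_hz
    print(f"{refresh_hz}Hz -> {frame_time_ms:.1f}ms per frame")
# prints:
# 60Hz -> 16.7ms per frame
# 80Hz -> 12.5ms per frame
```

In other words, with VSync on, a panel pushed to 80Hz displays a new frame every 12.5ms instead of every 16.7ms, which is where the claimed smoothness gain comes from.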
The one sticking point will almost certainly be the price – Nvidia would only confirm RRP pricing with us today, as it will be up to its hardware partners to set their own prices when the cards launch later this week, but you’ll easily be paying over £800 per card. We’ll have to wait until then to see whether the benchmark scores can back up Nvidia’s claims that the Titan is the fastest card around, but the early indications look promising.