Please note that this post is tagged as a rumor.
It has been a very quiet season for NVIDIA. Unlike AMD, NVIDIA is very careful about releasing any information on its future architecture, called Pascal.
Luckily, the so-called leakers just can't live without speculating and creating their own charts, because I suppose that's exactly what we are seeing in this graph.
GeForce X
I've decided to post this chart for two reasons. First, I think the idea of GeForce X is quite interesting (though not necessarily true). The naming makes a lot of sense, but the problem starts when you think about the next series (aka GeForce 1100).
Presumably, the GeForce X family would include the following graphics cards:
| GeForce X series | GeForce X100 series | GeForce X200 series | GeForce X300 series |
|---|---|---|---|
| GeForce X80 TITAN | GeForce X180 TITAN | GeForce X280 TITAN | GeForce X380 TITAN |
| GeForce X80 Ti | GeForce X180 Ti | GeForce X280 Ti | GeForce X380 Ti |
| GeForce X80 | GeForce X180 | GeForce X280 | GeForce X380 |
| GeForce X70 | GeForce X170 | GeForce X270 | GeForce X370 |
| GeForce X60 | GeForce X160 | GeForce X260 | GeForce X360 |
It looks similar to the GeForce GTX series; we are basically using shorter names. But it's hard to see what would happen to GT, GTS, and other non-GTX branding. Also, do we really want to get rid of the GTX branding?
The move to 16nm FinFET is without a doubt a good reason to start a new naming scheme. NVIDIA skipped the desktop 800 series for a reason: to deliver something fresh with FinFET chips.
NVIDIA has the choice of succeeding the GeForce GTX 980 series with a GeForce GTX 1800 series (judging from the Quadro line), or starting something completely new, like GeForce X.
As skeptical as I am about this 'leak', it would actually explain what could happen to the TITAN series. Rather than dropping the idea altogether, the shorter naming could still 'accommodate' TITAN branding, avoiding the problem of card models sounding weird.
GeForce X80
Now let's look at the chart. According to the data, the GeForce X80 would be the first GP104-based graphics card. Presumably the GPU would have 4096 CUDA cores, which is actually not that far from our predictions.
The person who posted this chart has clearly done some homework. It is assumed that the full GP104 would feature twice as many CUDA cores as GM204, while the full GP100 is said to have 6144 CUDA cores. Does this sound familiar? That's exactly what happened with GM204 and GM200, since Big Maxwell had 1.5x the cores of GM204.
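The scaling logic above is easy to verify. A quick sketch, using the shipping GM204/GM200 core counts and the rumored (unconfirmed) Pascal figures from the chart:

```python
# Core counts: Maxwell numbers are from shipping parts,
# Pascal numbers are rumored figures from the leaked chart.
gm204_cores = 2048  # full GM204 (GTX 980)
gm200_cores = 3072  # full GM200 (Big Maxwell) = 1.5x GM204

gp104_cores = 4096  # rumored: 2x GM204
gp100_cores = 6144  # rumored: 1.5x GP104, mirroring the Maxwell ratio

print(gp104_cores / gm204_cores)  # 2.0 -- GP104 doubles GM204
print(gm200_cores / gm204_cores)  # 1.5 -- the Maxwell big-chip ratio
print(gp100_cores / gp104_cores)  # 1.5 -- the same ratio, reapplied
```

So the leaker's 6144 figure follows directly from assuming Pascal repeats Maxwell's 1.5x big-chip scaling.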
| GeForce X series | GeForce X80 | GeForce X80 Ti | GeForce X80 TITAN |
|---|---|---|---|
| GPU | GP104 | GP100 | GP100 |
| Fab. Node | 16nm | 16nm | 16nm |
| CUDA Cores | 4096 | 5120 | 6144 |
| TMUs | 256 | 320 | 384 |
| ROPs | 128 | 160 | 192 |
| Base Clock | 1000 MHz | 1025 MHz | 1025 MHz |
| Memory | 6GB GDDR5 | 8GB GDDR5 | 16GB HBM2 |
| Memory Bus | 384-bit | 512-bit | 4096-bit |
| Memory Clock | 8000 MHz | 8000 MHz | 1000 MHz |
| Memory Bandwidth | 384 GB/s | 512 GB/s | 1024 GB/s |
| TDP | 175W | 225W | 225W |
To my surprise, the leaker suggests that Pascal GP100 would support both GDDR5 and HBM2. Would NVIDIA design one chip with two different memory controllers on the same die?
Another thing to consider is the memory bus on the GDDR5 parts. Would NVIDIA compensate for the limited availability of HBM2 and GDDR5X chips by increasing the clock speed and widening the memory bus? The 512-bit bus and 8 GHz effective clock already give us Fury X bandwidth (512 GB/s), but NVIDIA has not used a 512-bit bus in a long time, so is it really necessary?
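The bandwidth numbers in the chart are internally consistent, and checking them is simple arithmetic. A minimal sketch, assuming the chart's GDDR5 "8000 MHz" is the effective data rate (as is conventional) while HBM2's 1000 MHz is the real clock that doubles via DDR:

```python
def bandwidth_gbs(bus_width_bits, data_rate_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

# GDDR5 parts: "8000 MHz" in the chart is the effective data rate (MT/s).
print(bandwidth_gbs(384, 8000))   # 384.0 GB/s  (rumored X80)
print(bandwidth_gbs(512, 8000))   # 512.0 GB/s  (rumored X80 Ti, Fury X territory)

# HBM2: the listed 1000 MHz is the real clock; double data rate gives 2000 MT/s.
print(bandwidth_gbs(4096, 2000))  # 1024.0 GB/s (rumored X80 TITAN)
```

All three results match the chart's Memory Bandwidth row, which at least shows the leaker did the math correctly.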
The 8000 MHz clock
It gets more interesting when you notice the memory clock of 8000 MHz. And that's the second reason why I'm posting this chart. I've actually found some benchmarks of GPUs running at 2000 MHz (8000 MHz effective), and these were definitely not performed by the overclocker Kingpin.
It never occurred to me that these could be new Pascal boards. The graphics scores are not impressive, and the GPU clock speeds are quite low, so these could be mobile engineering samples (usually tested on desktops).
Unfortunately, this post does not deliver a super-exclusive leak today, but it should start a discussion about the future of the GeForce series, especially what NVIDIA is going to call these cards.
We expect to see more such rumors before the GPU Technology Conference in April, where Jen-Hsun Huang delivers his opening keynote and will hopefully have something to say about Pascal.
Source: Imgur