NVIDIA GeForce X80(TI) and GeForce X80 TITAN specs hit the web, or do they?

Published: Mar 17th 2016, 08:17 GMT

Please note that this post is tagged as a rumor.

It has been a very quiet season for NVIDIA. Unlike AMD, NVIDIA has been very careful about releasing any information on its upcoming architecture, called Pascal.

Luckily, the so-called leakers just can't live without speculating and creating their own charts, and I suspect that's exactly what we are seeing in this one.

[Chart: NVIDIA GeForce X80 / X80 Ti / X80 TITAN]

GeForce X

I’ve decided to post this chart for two reasons. First, I find the idea of GeForce X quite interesting (though not necessarily true). The naming makes a lot of sense, but the problems start when you think about the next series (aka GeForce 1100).

Presumably the GeForce X family would include the following graphics cards:

GeForce X series

| GeForce X series  | GeForce X100 series | GeForce X200 series | GeForce X300 series |
|-------------------|---------------------|---------------------|---------------------|
| GeForce X80 TITAN | GeForce X180 TITAN  | GeForce X280 TITAN  | GeForce X380 TITAN  |
| GeForce X80 TI    | GeForce X180 TI     | GeForce X280 TI     | GeForce X380 TI     |
| GeForce X80       | GeForce X180        | GeForce X280        | GeForce X380        |
| GeForce X70       | GeForce X170        | GeForce X270        | GeForce X370        |
| GeForce X60       | GeForce X160        | GeForce X260        | GeForce X360        |

It looks similar to the GeForce GTX series; we are basically using shorter names, but it’s hard to see what would happen to the GT, GTS, and non-GT series. Also, do we really want to get rid of the GTX branding?

The move to 16nm FinFET is without a doubt a good reason to start a new naming scheme. NVIDIA skipped the desktop 800 series for a reason: to deliver something fresh with FinFET chips.

NVIDIA has a choice between succeeding the GeForce GTX 980 series with a GeForce GTX 1800 series (judging by the Quadro line), or starting something completely new, like GeForce X.

As skeptical as I am about this ‘leak’, it would actually explain what could happen to the TITAN series. Rather than dropping the idea altogether, the shorter naming could still ‘accommodate’ the TITAN branding while avoiding awkward-sounding model names.

GeForce X80

Now let’s look at the chart. According to the data, the GeForce X80 would be the first GP104-based graphics card. Presumably the GPU would have 4096 CUDA cores, which is actually not that far from our predictions.

The person who posted this chart has without a doubt done some homework. It is assumed that the full GP104 would feature twice as many CUDA cores as GM204, while the full GP100 is said to have 6144 CUDA cores. Does this sound familiar? That’s exactly the relationship between GM204 and GM200: Big Maxwell had 1.5x the cores of GM204.
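The scaling pattern described above can be checked with a little arithmetic. The Maxwell core counts are known shipping figures; the Pascal numbers come from the rumored chart, so treat them as speculation:

```python
# Known Maxwell core counts (GTX 980 / GTX TITAN X).
gm204, gm200 = 2048, 3072
# Rumored Pascal core counts from the leaked chart (unconfirmed).
gp104, gp100 = 4096, 6144

# Big Maxwell = 1.5x GM204 -- the established pattern.
assert gm200 == gm204 * 3 // 2
# Rumored full GP104 = 2x GM204.
assert gp104 == gm204 * 2
# Rumored Big Pascal = 1.5x GP104, mirroring the Maxwell ratio.
assert gp100 == gp104 * 3 // 2
```

If the assertions hold, the leak's numbers at least follow the same internal logic as the Maxwell generation, which is what makes the chart plausible even if fabricated.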

GeForce X series

|                  | GeForce X80 | GeForce X80 Ti | GeForce X80 TITAN |
|------------------|-------------|----------------|-------------------|
| Fab. Node        | 16nm        | 16nm           | 16nm              |
| CUDA Cores       | 4096        | 5120           | 6144              |
| Base Clock       | 1000 MHz    | 1025 MHz       | 1025 MHz          |
| Memory Bus       | 384-bit     | 512-bit        | 4096-bit          |
| Memory Clock     | 8000 MHz    | 8000 MHz       | 1000 MHz          |
| Memory Bandwidth | 384 GB/s    | 512 GB/s       | 1024 GB/s         |

To my surprise, the leaker suggests that Pascal GP100 would support both GDDR5 and HBM2. Would NVIDIA really design one chip with two different memory controllers on the same die?

Another thing to consider is the memory bus on the GDDR5 parts. Would NVIDIA try to compensate for the limited availability of HBM2 and GDDR5X chips by increasing the clock speed and widening the memory bus? A 512-bit bus at an 8 GHz effective clock already gives us Fury X bandwidth, but NVIDIA has not used a 512-bit bus in a long time, so is it really necessary?
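As a quick sanity check (my own arithmetic, not part of the leak), the bandwidth figures in the chart follow directly from bus width and effective memory clock, assuming the TITAN's 1000 MHz HBM2 figure is a base clock that gets doubled by DDR signaling:

```python
def bandwidth_gbps(bus_bits, effective_clock_mhz):
    # bytes per transfer = bus width / 8; transfers per second = effective clock
    return bus_bits / 8 * effective_clock_mhz / 1000  # GB/s

# GeForce X80: 384-bit bus, 8000 MHz effective GDDR5(X)
assert bandwidth_gbps(384, 8000) == 384.0
# GeForce X80 Ti: 512-bit bus, 8000 MHz effective -- matches Fury X's 512 GB/s
assert bandwidth_gbps(512, 8000) == 512.0
# GeForce X80 TITAN: 4096-bit HBM2, 1000 MHz base doubled to 2000 MHz effective
assert bandwidth_gbps(4096, 2 * 1000) == 1024.0
```

All three rows of the chart are internally consistent, which again suggests the author did the math rather than picking numbers at random.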

The 8000 MHz clock

It gets more interesting when you notice the memory clock of 8000 MHz, and that’s the second reason why I’m posting this chart. I’ve actually found some benchmark results of GPUs running at 2000 MHz (8000 MHz effective), and these were definitely not runs by the overclocker Kingpin.
It never occurred to me before that these could be new Pascal boards. The graphics scores are not impressive, and the GPU clock speeds are quite low, so these could be mobile engineering samples (which are usually tested on desktops).
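For context on the 2000 MHz readings (standard GDDR5 behavior, not something from the leak itself): GDDR5 transfers data four times per memory clock cycle, which is why a monitoring tool showing 2000 MHz corresponds to the 8000 MHz effective figure quoted in the chart:

```python
# GDDR5 is quad-pumped: four data transfers per memory clock cycle,
# so a 2000 MHz reading implies an 8000 MHz effective data rate.
base_clock_mhz = 2000
transfers_per_clock = 4
effective_clock_mhz = base_clock_mhz * transfers_per_clock
assert effective_clock_mhz == 8000
```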

[Screenshots: NVIDIA Pascal at 8000 MHz (1) and (2)]

Unfortunately, this post does not deliver a super-exclusive leak today, but it should start a discussion about the future of the GeForce series, especially what NVIDIA is going to call it.

We expect to see more such rumors before the GPU Technology Conference in April, where Jen-Hsun Huang has his ‘opening keynote’ and hopefully will have something to say about Pascal.

Source: Imgur
