NVIDIA announces Tesla P100 with PCI-Express interface

Published: 20th Jun 2016, 08:05 GMT


NVIDIA finally releases a PCI-Express based solution equipped with Big Pascal.

TESLA P100 with PCI-Express

Interestingly, the P100 will now be offered in three variants. Two of them are PCI-Express based, differing in memory configuration: there is a 16GB HBM2 model and a 12GB HBM2 model, which probably means the latter has three of the four HBM2 stacks enabled or present. The 12GB model also has lower memory bandwidth (540 GB/s vs 720 GB/s). Compared to the NVLink-based solution, the PCI-Express Tesla P100 will be slightly slower in computing performance: 9.3 TFLOPS in single-precision, while the NVLink model offers 10.6 TFLOPS. The PCI-Express Teslas will also use a passive cooler with a TDP of around 250W.
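The numbers above hang together arithmetically. A minimal sketch, assuming (as the article does not state) that the full part uses four 4GB HBM2 stacks and that GP100 has 3584 FP32 CUDA cores with boost clocks around 1300 MHz (PCIe) and 1480 MHz (NVLink):

```python
# Back-of-envelope check of the Tesla P100 figures quoted above.
# Assumptions (not from the article): 4 HBM2 stacks on the full part,
# 3584 FP32 CUDA cores, and the boost clocks given below.

def hbm2_bandwidth(stacks, full_stacks=4, full_bw_gbs=720):
    """Memory bandwidth scales with the number of active HBM2 stacks."""
    return full_bw_gbs * stacks / full_stacks

def fp32_tflops(cuda_cores, boost_clock_mhz):
    """Peak FP32 throughput: cores x 2 ops per clock (FMA) x clock rate."""
    return cuda_cores * 2 * boost_clock_mhz * 1e6 / 1e12

print(hbm2_bandwidth(3))                   # 540.0 GB/s -> the 12GB model
print(round(fp32_tflops(3584, 1300), 1))   # 9.3 TFLOPS  -> PCIe model
print(round(fp32_tflops(3584, 1480), 1))   # 10.6 TFLOPS -> NVLink model
```

Disabling one of the four stacks explains both the 12GB capacity and the 720 → 540 GB/s bandwidth drop in one move.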

The Tesla P100 PCI-E will be available around the fourth quarter, most likely before the NVLink models ship with DGX-1 servers. Pricing has not been confirmed.

Tesla P100: PCI-Express vs NVLink (specification table)


Source: AnandTech
