NVIDIA finally releases a PCI-Express-based solution equipped with Big Pascal.
TESLA P100 with PCI-Express
Interestingly, the P100 will now be offered in three variants. Two of them are PCI-Express-based, but with different memory configurations: a 16GB HBM2 model and a 12GB HBM2 model, the latter presumably with only three HBM2 stacks enabled or present. The 12GB model also has lower bandwidth (540 GB/s vs 720 GB/s). Compared to the NVLink-based solution, the PCI-Express Tesla P100 will be slightly slower in terms of computing performance: 9.3 TFLOPS single-precision versus 10.6 TFLOPS for the NVLink model. The PCI-Express Teslas will also use a passive cooler, with a TDP of around 250W.
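As a rough sanity check of the quoted throughput figures, peak FP32 TFLOPS can be estimated as cores × 2 × boost clock. A minimal sketch, assuming GP100's 3584 CUDA cores and boost clocks of roughly 1303 MHz (PCI-E) and 1480 MHz (NVLink) — figures not stated in this article:

```python
# Back-of-the-envelope estimate of peak FP32 throughput.
# Assumptions (not from the article): 3584 CUDA cores on GP100,
# boost clocks ~1303 MHz (PCI-E) and ~1480 MHz (NVLink).

CUDA_CORES = 3584
FLOPS_PER_CLOCK = 2  # one fused multiply-add counts as 2 FLOPs


def fp32_tflops(boost_clock_mhz: float) -> float:
    """Peak single-precision throughput in TFLOPS."""
    return CUDA_CORES * FLOPS_PER_CLOCK * boost_clock_mhz * 1e6 / 1e12


print(f"PCI-E:  {fp32_tflops(1303):.1f} TFLOPS")  # ~9.3
print(f"NVLink: {fp32_tflops(1480):.1f} TFLOPS")  # ~10.6
```

The small gap between the two models comes almost entirely from the lower boost clock needed to fit the 250W envelope.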
Tesla P100 PCI-E will be available around the fourth quarter, most likely even before the NVLink models ship with DGX-1 servers. Pricing has not been confirmed.