NVIDIA’s Turing GPU architecture
NVIDIA (NVDA) stock reached its 52-week high on October 1 as the company launched its much-awaited next-generation Turing-based GeForce RTX 2080 Ti and 2080 GPUs (graphics processing unit).
NVIDIA built the GeForce RTX series on TSMC’s (TSM) 12 nm (nanometer) process node and integrated innovations such as Tensor Cores for AI, RT Cores for ray tracing, and a VirtualLink port for next-generation virtual reality. The RTX series is also the first to use Micron’s (MU) GDDR6 (graphics double data rate 6) memory.
NVIDIA priced the GeForce RTX 2080 Ti Founders Edition at $1,200, the GeForce RTX 2080 Ti at $999, the RTX 2080 at $699, and the RTX 2070 at $499. The first three GPUs hit the shelves on September 20, while the RTX 2070 followed on October 17. The company stated that the Turing-based GeForce RTX GPUs delivered six times the ray-tracing performance of Pascal-based GPUs.
NVIDIA is offering both Turing and Pascal GPUs in the second half of the year, as the software ecosystem and games that support ray tracing start arriving during the holiday season.
Problems with NVIDIA’s Turing-based RTX gaming GPUs
Initial reviews showed that the new GeForce RTX 2080 provided only a 3% boost over the previous-generation GeForce GTX 1080 Ti in traditional gaming performance, despite higher clock speeds, a higher core count, and higher memory bandwidth. Matters worsened when early adopters of the RTX 2080 Ti and 2080 started reporting performance issues.
The NVIDIA GeForce forums and Reddit filled with reports of crashes, black screens, artifacts, blue screens, and complete GPU failures. Digital Trends noted that the majority of these issues involved the Founders Edition of the RTX 2080 Ti and some third-party 2080 Ti cards from Gigabyte and Asus.
NVIDIA replaced some GPUs, but the replacement cards suffered from similar problems, indicating that the issue was not tied to a particular production batch but could stem from the Turing architecture itself or from a component inside the GPU. We’ll dig deeper into the problem in the next article.