AMD’s GPU roadmap
Advanced Micro Devices (AMD) is moving ahead with its 2017 strategic goal of launching its next-generation computing and graphics processors.
At CES (Consumer Electronics Show) 2017, AMD unveiled its next-generation Vega GPU (graphics processing unit) for the high-end market.
First in the Vega series will be the Vega 10 GPU, which will be available in three variants: Radeon for consumer PCs, FirePro WX for workstation PCs, and Instinct for servers. Vega 10 will be built on Samsung (SSNLF) and GlobalFoundries' 14nm (nanometer) node, the same node on which Polaris 10 and 11 were built.
According to media reports, AMD will also release Vega 11 and Vega 20 later. There are rumors that AMD may launch Polaris 12 before launching Vega 11. Separate rumors state that Vega 20 may be built on a 7nm node and launched in 2018.
Key features of Vega
Vega 10 is being built on the latest 14nm GFX9 core architecture, which adopts the NCU (next-generation compute unit) design. The GPU will feature 64 compute units, or 4,096 stream processors, and the NCU design targets higher clock speeds than Polaris. Each NCU can deliver 512 8-bit operations, 256 16-bit operations, or 128 32-bit operations per clock.
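The per-clock figures above follow from packing smaller data types into 32-bit lanes. A rough sketch of the arithmetic, assuming 64 stream processors per compute unit and one fused multiply-add (two operations) per lane per clock, which are illustrative assumptions rather than published AMD specifications:

```python
# Back-of-envelope throughput for a 64-CU GPU with packed math.
# Assumptions (illustrative, not official AMD figures): each stream
# processor performs one fused multiply-add (2 ops) per clock on 32-bit
# data; packed math splits a 32-bit lane into 2x16-bit or 4x8-bit lanes.

COMPUTE_UNITS = 64
SP_PER_CU = 64
STREAM_PROCESSORS = COMPUTE_UNITS * SP_PER_CU  # 4,096

def ops_per_clock_per_cu(bit_width: int) -> int:
    """Operations per clock for one compute unit at a given precision."""
    packing = 32 // bit_width          # how many values fit in a 32-bit lane
    return SP_PER_CU * 2 * packing     # 2 ops per fused multiply-add

for bits in (32, 16, 8):
    print(f"{bits}-bit: {ops_per_clock_per_cu(bits)} ops/clock per CU")
# 32-bit: 128, 16-bit: 256, 8-bit: 512 -- matching the figures above
```

Scaling the 32-bit figure across all 64 compute units and a hypothetical clock speed gives the peak-FLOPS numbers usually quoted in GPU marketing (for example, 4,096 x 2 x 1.5 GHz would be about 12.3 TFLOPS at 32-bit precision).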
While 8-bit and 16-bit operations are ideal for tasks such as machine learning and computer vision, 16-bit operations can also be used for certain gaming workloads that don't require full 32-bit precision. For instance, Sony's (SNE) PlayStation 4 Pro supports 256 16-bit operations per clock.
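The precision trade-off is easy to see by round-tripping a value through the IEEE 754 half-precision (16-bit) format, which Python's standard `struct` module supports via the `'e'` format code:

```python
# Demonstrate the precision and range limits of 16-bit floats,
# using only the Python standard library.
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack('e', struct.pack('e', x))[0]

# Half precision keeps only about 3 decimal digits:
print(to_fp16(3.14159265))  # 3.140625

# Values beyond ~65504 don't fit in half precision at all:
try:
    struct.pack('e', 70000.0)
except OverflowError:
    print("70000 exceeds the float16 range (max ~65504)")
```

That loss of precision is acceptable for neural-network weights or some shading calculations, which is why packed 16-bit math can double throughput at little visible cost in those workloads.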
AMD’s Radeon Vega 10 versus NVIDIA’s GTX 1080
In December 2016, AMD showcased the Vega 10 GPU running 2016's Doom. The Vega 10 sustained 60-70 fps (frames per second) at "ultra" graphics quality and 4K resolution. NVIDIA's (NVDA) GTX 1080, by comparison, couldn't sustain 60 fps, according to TechSpot. This suggests that the Vega 10 is faster than the GTX 1080, which is based on NVIDIA's Pascal architecture.
Several new features differentiate Vega from its predecessor, Polaris, and help it perform better than NVIDIA's Pascal GPUs. We'll look at these features in the next part of this series.