Data center capital spending
Cloud companies are pouring capital into data center infrastructure. Micron Technology (MU) expects cloud capital spending to increase from $41 billion in 2017 to over $108 billion by 2021.
There was a slowdown in capital spending in the fourth quarter of 2018, and it’s expected to continue in the first half of 2019 as Chinese (FXI) hyperscalers optimize their purchased capacity. However, capital spending should rise again in the second half of 2019 as 5G’s rollout boosts data adoption.
Micron’s opportunity in the data center market
Micron has two opportunities in the data center market. The first is the ongoing shift of servers to the cloud, as cloud servers carry more memory content than standard servers. At its 2018 Investor Day event, Micron stated that a standard server requires 145 GB of DRAM (dynamic random-access memory) and 2 TB (terabytes) of NAND flash. It expects the cloud's growing share of server sales to increase DRAM and NAND content per server to 366 GB and 11 TB, respectively, by 2021.
The second data center opportunity is the increasing number of AI-capable servers. Micron expects AI servers to account for ~10% of total server shipments in 2021 and ~50% by 2025. AI servers require six times the DRAM and double the SSD (solid-state drive) storage of a standard server. The company expects AI training to drive DRAM and NAND content per server to 2.5 TB and 20 TB, respectively, by 2021.
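As a rough illustration, the stated multipliers can be applied to the standard-server baseline from Micron's 2018 Investor Day. This is our own back-of-the-envelope sketch, not a figure Micron published; note that Micron's separate 2021 AI-training projection (2.5 TB DRAM, 20 TB NAND) sits well above these multiples of today's baseline, implying content growth in the baseline itself.

```python
# Illustrative only: apply Micron's stated AI-server multipliers
# (6x DRAM, 2x SSD) to the standard-server baseline cited above.
STANDARD_DRAM_GB = 145  # standard server DRAM content (GB)
STANDARD_NAND_TB = 2    # standard server NAND/SSD content (TB)

DRAM_MULTIPLIER = 6     # AI server needs ~6x the DRAM
SSD_MULTIPLIER = 2      # AI server needs ~2x the SSD

ai_dram_gb = STANDARD_DRAM_GB * DRAM_MULTIPLIER  # 870 GB
ai_nand_tb = STANDARD_NAND_TB * SSD_MULTIPLIER   # 4 TB

print(f"AI server today: ~{ai_dram_gb} GB DRAM, ~{ai_nand_tb} TB NAND")
```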
Micron expands its data center product offerings
To address the growing needs of data centers, Micron is developing specialized memory solutions for AI and data analytics.
Growing data volumes are driving demand for NAND storage, while the need to process that data is driving demand for DRAM. Micron, in collaboration with Intel (INTC), has developed 3D XPoint technology, which offers the benefits of both DRAM and NAND and is well suited to big data and AI applications. Micron is set to launch its 3D XPoint products in 2019.
The increasing adoption of GPUs (graphics processing units) in AI servers has encouraged Micron to develop its own HBM (high-bandwidth memory) for use in these GPUs. At present, HBM is offered by Samsung (SSNLF) and SK Hynix.
Next, we’ll look at Micron’s Internet of Things and automotive opportunities.