Behind Intel’s Plans to Make Artificial Intelligence Mainstream



Intel aims to make artificial intelligence mainstream

In the previous part of this series, we saw that Intel (INTC) is developing AI (artificial intelligence) hardware using Nervana and other deep-learning platforms. In the GPU (graphics processing unit)-dominated AI market, Intel is looking to bring its FPGA (field-programmable gate array)-backed Xeon CPUs (central processing units) to mainstream users by engaging with open-source frameworks such as Caffe2 and Chainer.

Intel is looking to run these open-source frameworks on its hardware. It is also using these frameworks to highlight that developers can easily use AI with its Xeon CPUs.



Intel has contributed to Facebook’s (FB) new open-source, cross-platform deep-learning framework, Caffe2, which aims to optimize deep learning for cloud and mobile environments. Caffe2 is a production-ready, lightweight, high-performance, scalable deep-learning framework that focuses on portability.

In its blog, Intel stated that its MKL (Math Kernel Library) functions would boost Caffe2’s inference performance on CPUs. Intel has also optimized Caffe2 for its Skylake Xeon processors. The Skylake architecture includes a wider 512-bit AVX (Advanced Vector Extensions) engine, which provides a significant performance improvement over the 256-bit AVX2 engine in the previous Haswell/Broadwell architectures.
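As a rough back-of-the-envelope illustration (not from Intel’s blog), a wider vector engine lets each instruction operate on more data lanes at once: a 256-bit AVX2 register holds 8 single-precision (32-bit) floats, while a 512-bit AVX-512 register holds 16. The sketch below just works out that lane arithmetic.

```python
# Rough lane arithmetic: how many 32-bit floats fit in one vector
# register under AVX2 (Haswell/Broadwell) vs. AVX-512 (Skylake Xeon).
AVX2_BITS = 256
AVX512_BITS = 512
FP32_BITS = 32

avx2_lanes = AVX2_BITS // FP32_BITS      # 8 floats per instruction
avx512_lanes = AVX512_BITS // FP32_BITS  # 16 floats per instruction

# Peak per-instruction throughput doubles with the wider registers.
speedup = avx512_lanes / avx2_lanes

print(f"AVX2: {avx2_lanes} lanes, AVX-512: {avx512_lanes} lanes, "
      f"ratio: {speedup:.0f}x")  # prints "AVX2: 8 lanes, AVX-512: 16 lanes, ratio: 2x"
```

This 2x figure is a theoretical per-instruction ceiling; real-world gains depend on how well a workload vectorizes, which is why libraries such as MKL matter.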


Intel has partnered with Japan’s (EWJ) Preferred Networks, whose open-source framework, Chainer, mainly uses NVIDIA’s (NVDA) GPUs for its AI workloads. Under the deal, Chainer will support Intel’s Xeon processors alongside NVIDIA’s CUDA platform.

Intel could engage with more such frameworks to gain market share and become developers’ preferred architecture for AI. In an interview with El Reg, Barry Davis, general manager of Intel’s Accelerator Workload Group, stated that most AI workloads would come from big cloud companies such as Google, Amazon, and Microsoft. When these companies launch AI-as-a-service, Intel wants to stand alongside NVIDIA as a preferred platform.


While Intel is increasing its engagement with open-source AI frameworks, the company has reduced its funding for the OSIC (OpenStack Innovation Center). OpenStack provides companies with software tools to build cloud infrastructure, and OSIC aims to encourage more enterprises to adopt OpenStack.

Several vendors, including Hewlett Packard Enterprise (HPE) and Cisco Systems (CSCO), have scaled back their OpenStack projects in the last six months. However, the adoption of OpenStack private clouds is growing at a CAGR (compound annual growth rate) of 39%, and this market is expected to reach $5.7 billion by 2020, according to 451 Research. In addition to accelerators, processors, and software, Intel is developing memory and storage that support AI, which we’ll discuss in the next part.

