In 2009, Intel gave up on developing Larrabee, a homegrown discrete GPU targeted at PC gaming systems.
GPUs aren’t a big presence at the ongoing Intel Developer Forum, which is centered on the chip maker’s A.I. and VR strategy. Intel is highlighting its CPUs and FPGAs (field-programmable gate arrays) for those categories, and the lack of a GPU to chase those hot markets could be a hole in the company’s product line.
Intel has never been a leader in graphics, and it hasn’t competed aggressively in the area the way Nvidia and AMD have. This week, it showed some progress in its graphics technology, saying its upcoming Kaby Lake PC chips will have integrated graphics processors with 4K support.
Now, some analysts are questioning whether Intel needs its own high-performance GPU, the kind of chip driving the fast-growing gaming, virtual reality, and artificial intelligence markets.
For A.I., Intel is pitching high-performance chips called Xeon Phi, which were derived from Larrabee. At IDF, the company announced a specialized Xeon Phi chip called Knights Mill for A.I. It is also pitching FPGAs, which can be reprogrammed for specific machine-learning tasks.
With its current chip lineup, Intel doesn’t believe it needs a pure-play GPU to pursue its A.I. strategy.
“Really, what most high-performance computers need — they don’t need a GPU — they need parallel application performance, there are many ways to get that,” said Jason Waxman, corporate vice president and general manager of Intel’s Data Center Solutions Group.
But in multimedia applications, a homegrown GPU could help. If Intel had high-performance GPUs like AMD and Nvidia do, the company would be able to participate more broadly in VR and AR, said Patrick Moorhead, principal analyst at Moor Insights and Strategy.
“With what they have now, they can participate in VR and AR all the way from head-mounted displays to mainstream A.I. and VR … with their CPUs,” Moorhead said. But to use VR headsets like the Oculus Rift or HTC Vive, PCs with Intel CPUs will still need high-end GPUs from Nvidia or AMD.
Intel may not need a GPU for A.I. with its Xeon Phi and software stack, but time will tell, Moorhead said.
Having one quality GPU to tackle the A.I., gaming, VR, and AR markets could help Intel kill many birds with one stone, said Jim McGregor, principal analyst at Tirias Research.
But the company has already burned a lot of money trying to build a GPU without success, McGregor said, and it may not have the appetite for another attempt at a high-performance GPU, which is a challenging undertaking.
Moreover, Intel doesn’t believe in the idea of a co-processor, a separate chip that accelerates certain workloads. It believes in building fully capable host chips that boot up computers and take on all workloads, McGregor said.
In the meantime, it can turn to AMD, Nvidia, or even Imagination Technologies for its graphics needs, McGregor said. Intel in the past also turned to ARM for graphics on its now-defunct low-end Atom X3 chip, code-named Sofia.