How successful Intel’s efforts prove to be will be crucial not only for the company but also for the long-term future of the computer chip industry.

“We’re seeing a lot more competition in the data-center market than we’ve seen in a long time,” said Linley Gwennap, a semiconductor expert who leads a technology research firm in Mountain View, Calif.

Intel has long dominated the market for the central processing chips that control industry-standard servers in data centers. Matthew Eastwood, an analyst at IDC, said the company controlled about 96 percent of that market.

But others are making inroads into advanced data centers. Nvidia, a chip maker in Santa Clara, Calif., does not make Intel-style central processors. But its graphics-processing chips, used by gamers in turbocharged personal computers, have proved well suited for A.I. tasks. Nvidia’s data-center business is taking off, with the company’s sales surging and its stock price nearly tripling in the last year.

Big Intel customers like Google, Microsoft and Amazon are also working on chip designs. AMD and ARM, whose central processing chip designs compete with Intel's, are edging into the data-center market, too. IBM made its Power chip technology open source a few years ago, and Google and others are designing prototypes.

To counter some of these trends, Intel is expected on Tuesday to provide details about the performance and uses of its new chips and its plans for the future. The company is set to formally introduce the next generation of its Xeon data-center microprocessors, code-named Skylake. And there will be a range of Xeon offerings with different numbers of processing cores, speeds, amounts of attached memory, and prices.

Yet analysts said that would represent progress along Intel’s current path rather than an embrace of new models of computing.

Stacy Rasgon, a semiconductor analyst at Bernstein Research, said, “They’re late to artificial intelligence.”


Photo: Chips made by Nvidia, a rival of Intel. Nvidia's sales have been surging, and its stock price has nearly tripled in the last year. (Tyrone Siu/Reuters)

Intel disputes that characterization, saying that artificial intelligence is an emerging technology in which the company is making major investments. In a blog post last fall, Brian Krzanich, Intel’s chief executive, wrote that it was “uniquely capable of enabling and accelerating the promise of A.I.”

Intel has been working in several ways to respond to the competition in data-center chips. The company acquired Nervana Systems, an artificial intelligence start-up, for more than $400 million last year. In March, Intel created an A.I. group, headed by Naveen G. Rao, a founder and former chief executive of Nervana.

The Nervana technology, Intel has said, is being folded into its product road map. A chip code-named Lake Crest is being tested and will be available to some customers this year.

Lake Crest is tailored for A.I. programs called neural networks, which learn specific tasks by analyzing huge amounts of data. Feed millions of cat photos into a neural network and it can learn to recognize a cat — and later pick out cats by color and breed. The principle is the same for speech recognition and language translation.
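The learn-from-examples principle can be sketched in a few lines of code. The toy example below is purely illustrative, not Intel's or Nervana's software: it trains a single artificial neuron, the simplest building block of a neural network, to separate two clusters of labeled points, the same adjust-the-weights-from-data process that, at vastly larger scale, lets a network recognize cats.

```python
import math
import random

# Illustrative sketch only: one artificial neuron learns to separate two
# clusters of labeled points by repeatedly nudging its weights toward
# whatever reduces its prediction error on the training examples.

random.seed(42)

# Labeled training data: points near (0, 0) are class 0, points near (1, 1)
# are class 1 (stand-ins for "not cat" and "cat" examples).
data = [((random.gauss(0, 0.1), random.gauss(0, 0.1)), 0) for _ in range(50)] \
     + [((random.gauss(1, 0.1), random.gauss(1, 0.1)), 1) for _ in range(50)]

w1, w2, b = 0.0, 0.0, 0.0   # weights and bias, initially zero
rate = 0.5                   # learning rate: how big each nudge is

def predict(x1, x2):
    """Squash the weighted sum into a 0-to-1 confidence score."""
    return 1.0 / (1.0 + math.exp(-(w1 * x1 + w2 * x2 + b)))

for _ in range(200):         # repeated passes over the training data
    for (x1, x2), label in data:
        err = predict(x1, x2) - label   # how far off the guess was
        w1 -= rate * err * x1           # nudge each weight to shrink the error
        w2 -= rate * err * x2
        b  -= rate * err

correct = sum((predict(x1, x2) > 0.5) == bool(label) for (x1, x2), label in data)
print(f"{correct}/{len(data)} training points classified correctly")
```

A real neural network stacks millions of such units in layers and tunes them on far larger data sets, which is exactly the kind of repetitive arithmetic that accelerator chips like Lake Crest are built to speed up.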

Intel has also said it is working to integrate Nervana technology into a future Xeon processor, code-named Knights Crest.

Intel’s challenge, analysts said, is a classic one of adapting an extraordinarily successful business to a fundamental shift in the marketplace.

As the dominant data-center chip maker, used by a wide array of customers with different needs, Intel has loaded more capabilities into its central processors. It has been an immensely profitable strategy: Intel had net income of $10.3 billion last year on revenue of $59.4 billion.

Yet key customers increasingly want computing designs that parcel out work to a collection of specialized chips rather than having all of it flow through the central processor. A central processor can be thought of as part brain, doing the logic processing, and part traffic cop, orchestrating the flow of data through the computer.

The outlying, specialized chips are known in the industry as accelerators. They can do certain things, like data-driven A.I. tasks, faster than a central processor. Accelerators include graphics processors, application-specific integrated circuits (ASICs) and field-programmable gate arrays (FPGAs).

A more diverse set of chips does not mean the need for Intel’s central processor disappears. The processor just does less of the work, becoming more of a traffic cop and less of a brain. If this happens, Intel’s business becomes less profitable.

Intel is not standing still. In 2015, it paid $16.7 billion for Altera, a maker of field-programmable gate arrays, chips that are more flexible than fixed designs because they can be repeatedly reprogrammed with software.

Mr. Gwennap, the independent analyst, said, “Intel has a very good read on data centers and what those customers want.”

Still, the question remains whether knowing what the customers want translates into giving them what they want, if that path presents a threat to Intel’s business model and profit margins.
