By DON CLARK
Chip makers soon will deliver one of the biggest advances in years in the technology that powers laptop and desktop computers. But how much consumers—and the chip companies—will benefit is in question.
The design trend, expected to be the focus of announcements by Intel Corp. and Advanced Micro Devices Inc. at the Consumer Electronics Show early next month, is based on bringing together two long-separate classes of products: microprocessors, the calculating engines that run most PC software; and graphics processing units, which render images in videogames and other programs.
Putting the two technologies on one piece of silicon reduces the distance electrical signals must travel and speeds up some computing chores. It also lowers the number of components computer makers need to buy, cutting production costs and helping to shrink the size of computers. Such integrated chips are expected to allow low-priced systems to carry out tasks that currently add hundreds of dollars to the price of a personal computer, such as the ability to play high-definition movies and videogames and to convert video and audio files to different formats quickly.
The approach "is going to change the way people build PCs and buy PCs," Paul Otellini, Intel's chief executive, predicted at an investor conference early this month.
But the benefits won't be measurable until after the CES show, when computer makers are expected to disclose their plans for using the technology. And some industry executives insist that many PC users will continue to seek even better performance by picking systems with separate graphics-processing-unit chips.
Intel, which supplies roughly four-fifths of the microprocessors used in PCs, is using the event to introduce a broad overhaul of its flagship Core product line based on a design code-named Sandy Bridge. The products add GPU circuitry that Intel previously offered in companion chipsets, as well as video processing and other undisclosed features aimed at improving the visual experience of using PCs—technologies Intel plans to market as part of a campaign called Visibly Smart.
Mr. Otellini said demand is "very, very strong" for the chips, which are expected to be used in hundreds of new designs for laptop and desktop PCs at various price points. Intel also is expected to offer a new version of a technology known as WiDi, which allows laptop users to wirelessly display images on high-definition TV sets.
The trend is at least as important for AMD, perennial underdog to Intel in the microprocessor market. AMD spent $5.4 billion in 2006 to buy ATI Technologies, one of two big makers of GPUs, and vowed then to combine that technology with its microprocessors by early 2009 in an initiative it calls Fusion.
That effort took longer than the company anticipated. AMD is using the CES trade show to introduce microprocessors with GPU circuitry that are targeted at laptops in the $200 to $500 range. But it doesn't expect to offer high-end Fusion chips that could directly compete with Intel's overhauled Core line until the middle of next year.
AMD expects the chips being introduced at the CES show to add much better capabilities for playing games and high-definition videos to a low-end portable category known as netbooks, a market Intel has dominated. "We are bringing just this incredible amount of visual and computing power to segments where it hasn't been seen before," said Rick Bergman, an AMD senior vice president who is general manager of its products group.
The third player affected by the trend is Nvidia Corp. The Silicon Valley company competes fiercely with AMD in sales of GPUs, but agrees with its rival on one point: The graphics circuitry added in Sandy Bridge—though an improvement over Intel's past efforts—still isn't adequate for many applications.
Both companies point out that the new Intel chips don't support a Microsoft Corp. programming technology called DirectX 11, needed for some popular videogames, while their products do. An Intel spokesman responded that many widely used games will run fine on Sandy Bridge, which the company predicts will make GPUs unnecessary in low-end PCs.
Nvidia says many PC makers don't seem to agree with Intel's assertion, with more than 200 forthcoming models based on Sandy Bridge also including its GPUs.
"We have more design wins in Sandy Bridge than any other platform," said Nvidia CEO Jen-Hsun Huang.
Mr. Huang says the new Intel chips with built-in graphics, instead of hurting Nvidia, will help the company by driving demand for PCs—largely because of other technology improvements. "I think this is the best microprocessor that's been built for quite a long time," he said.
Intel hasn't disclosed performance estimates for the new chips, which are expected to start with high-end models that have the equivalent of four calculating engines.
One person who has tested the technology is Kelt Reeves, president of the gaming-PC maker Falcon Northwest. While the graphics performance won't satisfy gamers, in Mr. Reeves's opinion, the four processors on Sandy Bridge chips top the performance of six processors on existing Intel products. The chips are "ridiculously good," he said.
Write to Don Clark at email@example.com