Of the three leading GPU vendors for the PC, Intel’s story is perhaps the most peculiar. Intel’s presence in graphics starts with Lockheed Martin in the 1990s. The military has always been a leading customer for graphics technology, with applications ranging from flight simulators to visualization. In 1995, Lockheed Martin decided to adapt its graphics expertise to the consumer market and spun off Real3D. Early partners included SGI, Intel, Chips & Technologies and Sega, and the company was one of the first adopters of the AGP bus. Unfortunately, the i740 discrete graphics card proved to be unattractive, and by 1999 the company had gone under. Real3D was purchased by Intel, although some engineers departed to start ATI’s Florida design team.
Despite the i740’s failure as a discrete graphics card, it was integrated into Intel’s 810 chipset for the Pentium III. The team at Intel has continued developing the Gen graphics architecture over the years, through several major iterations. The GMA 900 was the first DX9 design, although it relied on the CPU for transform, lighting and vertex shading. The GMA X3000 series was Intel’s first DX10-capable architecture with unified shaders, sharing execution resources between pixel, vertex and media threads. The most recent iteration, the GMA X4500 codenamed Ironlake, was manufactured on 45nm, but integrated into the package of 32nm consumer microprocessors. Internally, Ironlake is referred to as the Gen 5.75 architecture (a significant enhancement of Gen 5), but for brevity’s sake we will simply refer to it as Gen 5. Ironlake relies on programmable (rather than fixed function) hardware to a much greater extent than GPUs from AMD or Nvidia.
Intel’s graphics has always been integrated into the chipset, with a focus on mainstream uses. Additionally, since customers such as HP and Dell were unwilling to pay extra for integrated graphics performance, Intel concentrated on minimizing die area and power cost. As a result, Intel has consistently emphasized multimedia and delivered relatively spotty 3D performance. Coupled with poor driver support, Intel’s graphics were often ridiculed by gamers and enthusiasts. Despite these handicaps, Intel holds over 50% of the graphics market, demonstrating that buyers are often more concerned with price and power consumption than raw graphics performance.
However, the trend towards programmable GPUs has shifted Intel’s focus. It became clear that heterogeneous computing, using the CPU, GPU and fixed function hardware together for highly parallel workloads, is more efficient than a CPU alone. Rather than seeing the GPU as a simple checklist item in their platform, Intel began to view the GPU as the modern-day successor to the x87 floating point coprocessors of the 1980s. To address their deficiencies, they pursued two parallel strategies. The first was Larrabee, a multi-core x86-based discrete GPU targeted for late 2009 that was cancelled as a graphics product but lives on as a scientific computing product. The second was an announcement in 2007 that Intel would increase integrated graphics performance 10X by 2010. The Gen 6 graphics architecture in Sandy Bridge is the culmination of that promise and exceeds the target, with performance gains closer to 25X.
Sandy Bridge’s graphics is a tremendous move for Intel along many dimensions. The new Gen 6 architecture is a radical departure from the earlier Ironlake design (and the philosophy behind Larrabee): unlike Ironlake, it heavily favors fixed function hardware to achieve power and area efficiency. The programmable shaders have been redesigned for higher performance, and Intel has continued its tradition of media excellence. Sandy Bridge is also the first GPU to be integrated into the same silicon as the CPU, using Intel’s cutting edge 32nm process technology. From a consumer standpoint, it is the first Intel integrated graphics to support DX10 at launch, and it even offers performance that is competitive with low-end discrete graphics cards.