The evolution of the modern graphics processor begins with the introduction of the first 3D add-in cards in 1995, followed by the widespread adoption of 32-bit operating systems and the affordable personal computer.

Before then, the graphics industry largely consisted of more prosaic 2D, non-PC architectures, with graphics boards better known by their chips’ alphanumeric naming conventions and their huge price tags. 3D gaming and virtualization PC graphics eventually coalesced from sources as diverse as arcade and console gaming, military, robotics and space simulators, as well as medical imaging.

The early days of 3D consumer graphics were a Wild West of competing ideas, from how to implement the hardware to the choice of rendering techniques and their application and data interfaces, not to mention the persistent naming hyperbole. The early graphics systems featured a fixed function pipeline (FFP), with an architecture following a very rigid processing path and utilizing almost as many graphics APIs as there were 3D chip makers.

While 3D graphics turned a fairly dull PC industry into a light and magic show, they owe their existence to generations of innovative endeavour. Over the next few weeks (this is the first installment in a series of four articles) we’ll be taking an extensive look at the history of the GPU, going from the early days of 3D consumer graphics, to the 3Dfx Voodoo game-changer, the industry’s consolidation at the turn of the century, and today’s modern GPGPU.

1976 – 1995: The Early Days of 3D Consumer Graphics

The path to true 3D graphics started with early display controllers, known as video shifters and video address generators. They acted as a pass-through between the main processor and the display. The incoming data stream was converted into serial bitmapped video output such as luminance, color, and vertical and horizontal composite sync, which kept each line of pixels aligned in the display output and synchronized each successive line along with the blanking interval (the time between ending one scan line and starting the next).
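
To give a sense of the timing these controllers had to maintain, here is a back-of-the-envelope calculation in C using nominal NTSC figures (standard broadcast values, not taken from any particular chip’s datasheet):

    #include <stdio.h>

    /* Nominal NTSC scan line timing, the bookkeeping a video shifter
       had to honor on every line it generated. */
    int main(void)
    {
        const double line_rate_hz   = 15734.0;            /* NTSC horizontal scan rate */
        const double line_period_us = 1e6 / line_rate_hz; /* ~63.6 us per scan line */
        const double h_blank_us     = 10.9;               /* nominal horizontal blanking */

        printf("scan line period : %.2f us\n", line_period_us);
        printf("active video     : %.2f us\n", line_period_us - h_blank_us);
        printf("blanking interval: %.2f us\n", h_blank_us);
        return 0;
    }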

A flurry of designs arrived in the latter half of the 1970s, laying the foundation for 3D graphics as we know them.

Atari 2600 released in September 1977

RCA’s “Pixie” video chip (CDP1861) in 1976, for instance, was capable of outputting an NTSC-compatible video signal at 64×128 resolution, or 64×32 for the ill-fated RCA Studio II console.

The video chip was quickly followed a year later by the Television Interface Adapter (TIA) 1A, which was integrated into the Atari 2600 for generating the screen display, sound effects, and reading input from the controllers. Development of the TIA was led by Jay Miner, who later also led the design of the custom chips for the Commodore Amiga computer.

In 1978, Motorola unveiled the MC6845 video address generator. This became the basis for the IBM PC’s Monochrome and Color Display Adapter (MDA/CGA) cards of 1981, and provided the same functionality for the Apple II. Motorola added the MC6847 video display generator later the same year, which made its way into a number of first generation personal computers, including the Tandy TRS-80.

IBM PC’s Monochrome Display Adapter

A similar solution from Commodore’s MOS Tech subsidiary, the VIC, provided graphics output for 1980-83 vintage Commodore home computers.

In November the following year, LSI’s ANTIC (Alphanumeric Television Interface Controller) and CTIA/GTIA co-processor (Color or Graphics Television Interface Adaptor) debuted in the Atari 400. ANTIC processed 2D display instructions using direct memory access (DMA). Like most video co-processors, it could generate playfield graphics (background, title screens, scoring display), while the CTIA generated colors and moveable objects. Yamaha and Texas Instruments supplied similar ICs to a variety of early home computer vendors.

The next steps in the graphics evolution were primarily in the professional fields.

Intel used their 82720 graphics chip as the basis for the $1000 iSBX 275 Video Graphics Controller Multimode Board. It was capable of displaying eight colors at a resolution of 256×256 (or monochrome at 512×512). Its 32KB of display memory was sufficient to draw lines, arcs, circles, rectangles and character bitmaps. The chip also had provision for zooming, screen partitioning and scrolling.

SGI quickly followed up with their IRIS Graphics for workstations — a GR1.x graphics board with provision for separate add-in (daughter) boards for color options, geometry, Z-buffer and Overlay/Underlay.

Industrial and military 3D virtualization was relatively well developed at the time. IBM, General Electric and Martin Marietta (who were to buy GE’s aerospace division in 1992), along with a slew of military contractors, technology institutes and NASA, ran various projects that required the technology for military and space simulations. The Navy had developed a flight simulator using 3D virtualization with MIT’s Whirlwind computer as early as 1951.

Besides defence contractors, there were companies that straddled military and professional graphics markets.

Evans & Sutherland – who were to provide professional graphics card series such as the Freedom and REALimage – also provided graphics for the CT5 flight simulator, a $20 million package driven by a DEC PDP-11. Ivan Sutherland, the company’s co-founder, developed a computer program in 1961 called Sketchpad, which allowed geometric shapes to be drawn and displayed on a CRT in real time using a light pen.

This was the progenitor of the modern Graphic User Interface (GUI).

In the less esoteric field of personal computing, Chips and Technologies’ 82C43x series of EGA (Enhanced Graphics Adapter) chips provided much needed competition to IBM’s adapters, and could be found installed in many PC/AT clones around 1985. The year was noteworthy for the Commodore Amiga as well, which shipped with the OCS chipset. The chipset comprised three main component chips — Agnus, Denise, and Paula — which allowed a certain amount of graphics and audio computation to be non-CPU dependent.

In August of 1985, three Hong Kong immigrants, Kwok Yuan Ho, Lee Lau and Benny Lau, formed Array Technology Inc in Canada. By the end of the year, the name had changed to ATI Technologies Inc.

ATI got their first product out the following year, the OEM Color Emulation Card. It was used for outputting monochrome green, amber or white phosphor text against a black background to a TTL monitor via a 9-pin DE-9 connector. The card came equipped with a minimum of 16KB of memory and was responsible for a large percentage of ATI’s CAD$10 million in sales in the company’s first year of operation. This was largely done through a contract that supplied around 7000 chips a week to Commodore Computers.

The advent of color monitors and the lack of a standard among the array of competitors ultimately led to the formation of the Video Electronics Standards Association (VESA), of which ATI was a founding member, along with NEC and six other graphics adapter manufacturers.

In 1987, ATI added the Graphics Solution Plus series to its product line for OEMs, which used IBM’s PC/XT ISA 8-bit bus for Intel 8086/8088 based IBM PCs. The chip supported MDA, CGA and EGA graphics modes via DIP switches. It was basically a clone of the Plantronics Colorplus board, but with room for 64KB of memory. Paradise Systems’ PEGA1, 1a, and 2a (256KB), released in 1987, were Plantronics clones as well.

ATI EGA Wonder 800: 16-color VGA emulation, 800×600 support

The EGA Wonder series 1 to 4 arrived in March for $399, featuring 256KB of DRAM as well as compatibility with CGA, EGA and MDA emulation at up to 640×350 in 16 colors. Extended EGA was available for the series 2, 3 and 4.

Filling out the high end was the EGA Wonder 800 with 16-color VGA emulation and 800×600 resolution support, and the VGA Improved Performance (VIP) card, which was basically an EGA Wonder with a digital-to-analog converter (DAC) added to provide limited VGA compatibility. The latter cost $449 plus $99 for the Compaq expansion module.

ATI was far from being alone riding the wave of consumer appetite for personal computing.

Many new companies and products arrived that year. Among them were Trident, SiS, Tamerack, Realtek, Oak Technology, LSI’s G-2 Inc., Hualon, Cornerstone Imaging and Winbond — all formed in 1986-87. Meanwhile, companies such as AMD, Western Digital/Paradise Systems, Intergraph, Cirrus Logic, Texas Instruments, Gemini and Genoa would produce their first graphics products during this timeframe.

ATI’s Wonder series continued to receive prodigious updates over the next few years.

In 1988, the Small Wonder Graphics Solution with game controller port and composite out options became available (for CGA and MDA emulation), as did the EGA Wonder 480 and 800+ with Extended EGA and 16-bit VGA support, and the VGA Wonder and Wonder 16 with added VGA and SVGA support.

A Wonder 16 equipped with 256KB of memory retailed for $499, while a 512KB variant cost $699.

An updated VGA Wonder/Wonder 16 series arrived in 1989, including the reduced-cost VGA Edge 16 (Wonder 1024 series). New features included a bus mouse port and support for the VESA Feature Connector. This was a gold-fingered connector similar to a shortened data bus slot connector, and it linked via a ribbon cable to another video controller to bypass a congested data bus.

The Wonder series updates continued apace in 1991. The Wonder XL card added VESA 32K color compatibility and a Sierra RAMDAC, which boosted maximum display resolution to 640×480 @ 72Hz or 800×600 @ 60Hz. Prices ranged from $249 (256KB) and $349 (512KB) to $399 for the 1MB RAM option. A reduced-cost version called the VGA Charger, based on the previous year’s Basic-16, was also made available.

ATI Graphics Ultra ISA (Mach8 + VGA)

ATI added a variation of the Wonder XL that incorporated a Creative Sound Blaster 1.5 chip on an extended PCB. Known as the VGA Stereo-F/X, it was capable of simulating stereo from Sound Blaster mono files at something approximating FM radio quality.

The Mach series launched with the Mach8 in May of that year. It sold as either a chip or a board and allowed, via an application programming interface (API), the offloading of limited 2D drawing operations such as line draw, color fill and bitmap combination (BitBLT).
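
As a rough illustration of the kind of work being offloaded, here is a minimal software BitBLT in C. The function below is hypothetical and illustrative only; it is not ATI’s actual interface:

    #include <stdint.h>
    #include <stddef.h>

    /* Copies a rectangle from a source bitmap into a destination bitmap,
       combining pixels with a raster operation (XOR here). Accelerators
       like the Mach8 performed this loop in hardware, freeing the CPU. */
    void bitblt_xor(uint8_t *dst, size_t dst_pitch,
                    const uint8_t *src, size_t src_pitch,
                    int width, int height)
    {
        for (int y = 0; y < height; y++) {
            uint8_t *d = dst + (size_t)y * dst_pitch;
            const uint8_t *s = src + (size_t)y * src_pitch;
            for (int x = 0; x < width; x++)
                d[x] ^= s[x];   /* ROP: destination = destination XOR source */
        }
    }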

Graphics boards such as the ATI VGAWonder GT offered a combined option, pairing the Mach8 with the graphics core (28800-2) of the VGA Wonder+ for standard VGA duties, which the Mach8 itself lacked. The Wonder and Mach8 pushed ATI through the CAD$100 million sales milestone for the year, largely on the back of Windows 3.0’s adoption and the increased 2D workloads it brought with it.

S3 Graphics was formed in early 1989 and produced its first 2D accelerator chip and graphics card eighteen months later: the S3 911 (or 86C911). Key specs for the latter included 1MB of VRAM and 16-bit color support.

The S3 911 was superseded by the 924 that same year — basically a revised 911 with 24-bit color — and updated again the following year with the 928, which added 32-bit color, and the 801 and 805 accelerators. The 801 used an ISA interface, while the 805 used VLB. Between the 911’s introduction and the advent of the 3D accelerator, the market was flooded with 2D GUI designs based on S3’s original — notably from Tseng Labs, Cirrus Logic, Trident and IIT, along with ATI’s Mach32 and Matrox’s MAGIC RGB.

In January 1992, Silicon Graphics Inc (SGI) released OpenGL 1.0, a multi-platform, vendor-agnostic application programming interface (API) for both 2D and 3D graphics.

OpenGL evolved from SGI’s proprietary API, called IRIS GL (Integrated Raster Imaging System Graphical Library). It was an initiative to strip the non-graphical functionality from IRIS GL and allow the API to run on non-SGI systems, as rival vendors were starting to loom on the horizon with their own proprietary APIs.

Initially, OpenGL was aimed at the professional UNIX based markets, but with developer-friendly support for extension implementation it was quickly adopted for 3D gaming.
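
For a flavor of what developers were adopting, here is a minimal OpenGL 1.x immediate-mode fragment in C. Window and context creation (via GLX, WGL or a toolkit) is assumed and omitted:

    #include <GL/gl.h>

    /* Draws one shaded triangle through the fixed-function pipeline:
       no shaders, just immediate-mode vertex submission. Assumes a GL
       context has already been made current by the windowing layer. */
    void draw_triangle(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);
            glColor3f(1.0f, 0.0f, 0.0f); glVertex3f(-0.6f, -0.5f, 0.0f);
            glColor3f(0.0f, 1.0f, 0.0f); glVertex3f( 0.6f, -0.5f, 0.0f);
            glColor3f(0.0f, 0.0f, 1.0f); glVertex3f( 0.0f,  0.6f, 0.0f);
        glEnd();
    }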

Microsoft was developing a rival API of their own called Direct3D and didn’t exactly break a sweat making sure OpenGL ran as well as it could under the new Windows operating systems.

Things came to a head a few years later when John Carmack of id Software, whose previously released Doom had revolutionized PC gaming, ported Quake to use OpenGL on Windows and openly criticised Direct3D.

Fast forward: GLQuake released in 1997 versus original Quake

Microsoft’s intransigence increased as they denied licensing of OpenGL’s Mini-Client Driver (MCD) on Windows 95, which would have allowed vendors to choose which features would have access to hardware acceleration. SGI replied by developing the Installable Client Driver (ICD), which not only provided the same ability, but did so even better, since the MCD covered rasterisation only while the ICD added transform and lighting functionality (T&L).

During the rise of OpenGL, which initially gained traction in the workstation arena, Microsoft was busy eyeing the emerging gaming market with designs on their own proprietary API. They acquired RenderMorphics in February 1995, whose Reality Lab API was gaining traction with developers and became the core for Direct3D.

At about the same time, 3dfx’s Brian Hook was writing the Glide API that was to become the dominant API for gaming. This was in part due to Microsoft’s involvement with the Talisman project (a tile based rendering ecosystem), which diluted the resources intended for DirectX.

As D3D became widely available on the back of Windows adoption, proprietary APIs such as S3d (S3), Matrox Simple Interface, Creative Graphics Library, C Interface (ATI), SGL (PowerVR), NVLIB (Nvidia), RRedline (Rendition) and Glide, began to lose favor with developers.

It didn’t help matters that some of these proprietary APIs were allied with board manufacturers under increasing pressure to add to a rapidly expanding feature list. This included higher screen resolutions, increased color depth (from 16-bit to 24 and then 32), and image quality enhancements such as anti-aliasing. All of these features called for increased bandwidth, graphics efficiency and faster product cycles.

The year 1993 ushered in a flurry of new graphics competitors, most notably Nvidia, founded in January of that year by Jen-Hsun Huang, Curtis Priem and Chris Malachowsky. Huang was previously Director of Coreware at LSI, while Priem and Malachowsky both came from Sun Microsystems, where they had developed the SPARC-based GX graphics architecture.

Fellow newcomers Dynamic Pictures, ARK Logic, and Rendition joined Nvidia shortly thereafter.

Market volatility had already forced a number of graphics companies to withdraw from the business, or to be absorbed by competitors. Amongst them were Tamerack, Gemini Technology, Genoa Systems, Hualon, Headland Technology (bought by SPEA), Acer, Motorola and Acumos (bought by Cirrus Logic).

One company that was moving from strength to strength, however, was ATI.

In late November, ATI announced its 68890 PC TV decoder chip, a forerunner of the All-In-Wonder series, which debuted inside the Video-It! card. The chip was able to capture video at 320×240 @ 15 fps, or 160×120 @ 30 fps, as well as compress/decompress in real time thanks to the onboard Intel i750PD VCP (Video Compression Processor). It was also able to communicate with the graphics board via the data bus, thus negating the need for dongles, ports and ribbon cables.

The Video-It! retailed for $399, while a lesser featured model named Video-Basic completed the line-up.

Five months later, in March, ATI belatedly introduced a 64-bit accelerator: the Mach64.

The financial year had not been kind to ATI, with a CAD$2.7 million loss as it slipped in the marketplace amid strong competition. Rival boards included the S3 Vision 968, which was picked up by many board vendors, and the Trio64, which won OEM contracts from Dell (Dimension XPS), Compaq (Presario 7170/7180), AT&T (Globalyst), HP (Vectra VE 4), and DEC (Venturis/Celebris).

Vision 968: S3’s first motion video accelerator

Released in 1995, the Mach64 notched a number of notable firsts. It became the first graphics adapter to be available for PC and Mac computers in the form of the Xclaim ($450 and $650 depending on onboard memory), and, along with S3’s Trio, offered full-motion video playback acceleration.

The Mach64 also ushered in ATI’s first pro graphics cards, the 3D Pro Turbo and 3D Pro Turbo+PC2TV, priced at a cool $599 for the 2MB option and $899 for the 4MB.

ATI Mach64 VT with support for TV tuner

The following month saw a technology start-up called 3DLabs arrive on the scene, born when DuPont’s Pixel graphics division was bought out from its parent company, along with the GLINT 300SX processor capable of OpenGL rendering, fragment processing and rasterisation. Due to their high prices, the company’s cards were initially aimed at the professional market. The Fujitsu Sapphire2SX 4MB retailed for $1600-$2000, while an 8MB ELSA GLoria 8 was $2600-$2850. A version of the 300SX, however, was intended for the gaming market.

The gaming-oriented GLINT 300SX of 1995 featured a much-reduced 2MB of memory. It used 1MB for textures and the Z-buffer and the other 1MB for the frame buffer, but came with an option to increase the VRAM for Direct3D compatibility for another $50 over the $349 base price. The card failed to make headway in an already crowded marketplace, but 3DLabs was already working on a successor, the Permedia series.

S3 seemed to be everywhere at that time. The high-end OEM market was dominated by the company’s Trio64 chipsets, which integrated a DAC, a graphics controller, and a clock synthesiser into a single chip. They also utilized a unified frame buffer and supported hardware video overlay (a dedicated portion of graphics memory for rendering video as the application requires). The Trio64 and its 32-bit memory bus sibling, the Trio32, were available as OEM units and standalone cards from vendors such as Diamond, ELSA, Sparkle, STB, Orchid, Hercules and Number Nine. Diamond Multimedia’s prices ranged from $169 for a ViRGE based card, to $569 for a Trio64+ based Diamond Stealth64 Video with 4MB of VRAM.

The mainstream end of the market also included offerings from Trident, a long-time OEM supplier of no-frills 2D graphics adapters that had recently added the 9680 chip to its line-up. The chip boasted most of the features of the Trio64, and boards based on it were generally priced around the $170-200 mark. They offered acceptable 3D performance in that bracket, with good video playback capability.

Other newcomers in the mainstream market included Weitek’s Power Player 9130, and Alliance Semiconductor’s ProMotion 6410 (usually seen as the Alaris Matinee or FIS’s OptiViewPro). Both offered excellent scaling with CPU speed, while the latter combined the strong scaling engine with antiblocking circuitry to obtain smooth video playback, which was much better than in previous chips such as the ATI Mach64, Matrox MGA 2064W and S3 Vision968.

Nvidia launched their first graphics chip, the NV1, in May. It was the first commercial graphics processor capable of 3D rendering, video acceleration, and integrated GUI acceleration.

They partnered with ST Microelectronics to produce the chip on the latter’s 500nm process, and ST also promoted its own STG2000 version of the chip. Although it was not a huge success, it did represent the first financial return for the company. Unfortunately for Nvidia, just as the first vendor boards started shipping (notably the Diamond Edge 3D) in September, Microsoft finalized and released DirectX 1.0.

The D3D graphics API was confirmed to rely upon rendering triangular polygons, whereas the NV1 used quadratic texture mapping. Limited D3D compatibility was added via a driver that wrapped triangles as quadratic surfaces, but a lack of games tailored for the NV1 doomed the card as a jack of all trades, master of none.
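
To make the mismatch concrete, a triangle-only API such as D3D expects a quad to arrive as two triangles, a conversion that hardware built around quadratic surfaces had to bridge in its driver. A minimal sketch in C with a hypothetical vertex type:

    typedef struct { float x, y, z; } Vertex;   /* hypothetical vertex type */

    /* Splits one quad (v0..v3 in winding order) into the two triangles
       a triangle-only API expects: (v0,v1,v2) and (v0,v2,v3). */
    void quad_to_triangles(const Vertex quad[4], Vertex out_tris[6])
    {
        out_tris[0] = quad[0];
        out_tris[1] = quad[1];
        out_tris[2] = quad[2];
        out_tris[3] = quad[0];
        out_tris[4] = quad[2];
        out_tris[5] = quad[3];
    }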

Most of the games available for it were ported from the Sega Saturn. A 4MB NV1 with integrated Saturn ports (two per expansion bracket, connected to the card via ribbon cable) retailed for around $450 in September 1995.

Microsoft’s late changes and launch of the DirectX SDK left board manufacturers unable to directly access hardware for digital video playback. This meant that virtually all discrete graphics cards had functionality issues under Windows 95. Drivers under Windows 3.1 from a variety of companies were, by contrast, generally faultless.


ATI announced their first 3D accelerator chip, the 3D Rage (also known as the Mach 64 GT), in November 1995.

The first public demonstration of it came at the E3 video game conference held in Los Angeles in May the following year. The card itself became available a month later. The 3D Rage merged the 2D core of the Mach64 with 3D capability.

Late revisions to the DirectX specification meant that the 3D Rage had compatibility problems with many games that used the API — mainly due to a lack of depth buffering. With an on-board 2MB EDO RAM frame buffer, 3D modes were limited to 640x480x16-bit or 400x300x32-bit. Attempting 32-bit color at 640×480 generally resulted in onscreen color corruption, and 2D resolution peaked at 1280×1024. If gaming performance was mediocre, the full screen MPEG playback ability at least went some way toward balancing the feature set.
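
The arithmetic behind those mode limits is straightforward if we assume double buffering plus a 16-bit Z-buffer (an assumption; the exact buffer arrangement isn’t specified above):

    #include <stdio.h>

    /* Frame buffer budget check for a 2MB card: front + back buffer
       plus a 16-bit Z-buffer must all fit in on-board memory. */
    int main(void)
    {
        const long budget = 2L * 1024 * 1024;      /* 2MB EDO RAM */
        long mode16   = 640L * 480 * (2 + 2 + 2);  /* 16-bit front + back + Z */
        long mode32   = 400L * 300 * (4 + 4 + 2);  /* 32-bit front + back + Z */
        long mode32hi = 640L * 480 * (4 + 4 + 2);  /* 32-bit at 640x480 */

        printf("640x480x16: %ld bytes (%s)\n", mode16,   mode16   <= budget ? "fits" : "doesn't fit");
        printf("400x300x32: %ld bytes (%s)\n", mode32,   mode32   <= budget ? "fits" : "doesn't fit");
        printf("640x480x32: %ld bytes (%s)\n", mode32hi, mode32hi <= budget ? "fits" : "doesn't fit");
        return 0;
    }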

ATI reworked the chip, and in September the Rage II launched. It rectified the D3D compatibility issues of the first chip, in addition to adding MPEG2 playback support. Initial cards, however, still shipped with 2MB of memory, hampering performance and causing issues with perspective/geometry transforms. As the series was expanded to include the Rage II+DVD and 3D Xpression+, memory capacity options grew to 8MB.

While ATI was first to market with a 3D graphics solution, it didn’t take too long for other competitors with differing ideas of 3D implementation to arrive on the scene. Namely, 3dfx, Rendition, and VideoLogic.

Screamer 2, released in 1996, running on Windows 95 with 3dfx Voodoo 1 graphics

In the race to release new products into the marketplace, 3Dfx Interactive beat both Rendition and VideoLogic. The performance race, however, was over before it had started, with the 3Dfx Voodoo Graphics effectively annihilating all competition.

This article is the first installment in a series of four. If you enjoyed it, make sure to join us next week as we take a stroll down memory lane to the heyday of 3Dfx, Rendition, Matrox and a young company called Nvidia.

Part 1: (1976 – 1995) The Early Days of 3D Consumer Graphics
Part 2: (1995 – 1999) 3Dfx Voodoo: The Game-changer
Part 3: (2000 – 2005) Down to Two: The Graphics Market Consolidation
Part 4: (2006 – Present) The Modern GPU: Stream processing units a.k.a. GPGPU
