3Dfx Voodoo: The Game-changer

Launched in November 1996, 3Dfx's Voodoo Graphics was a 3D-only card: it required a VGA pass-through cable from a separate 2D card to the Voodoo, which in turn connected to the display.

The cards were sold by a large number of companies. Orchid Technologies was first to market with the $299 Orchid Righteous 3D, a board noted for its mechanical relays that "clicked" when the chipset was in use; later revisions used solid-state relays, in line with the rest of the vendors' boards. It was followed by Diamond Multimedia's Monster 3D, Colormaster's Voodoo Mania, the Canopus Pure3D, boards from Quantum3D, the Miro Hiscore, Skywell's Magic3D, and the 2theMAX Fantasy FX Power 3D.

Voodoo Graphics revolutionized personal computer graphics nearly overnight, rendering many other designs obsolete and dooming a vast swathe of 2D-only graphics producers. The 3D landscape of 1996 favoured S3, with around 50% of the market, but that was soon to change: it was estimated that 3Dfx accounted for 80-85% of the 3D accelerator market during the heyday of Voodoo's reign.


Diamond Multimedia’s Monster 3D (3dfx Voodoo1 4MB PCI)

Around that time, VideoLogic had developed tile-based deferred rendering (TBDR), which eliminated the need for large-scale Z-buffering (the removal of occluded/hidden pixels from the final render) by discarding all but the visible geometry before texturing, shading, and lighting were applied to what remained. The frame was divided into rectangular tiles, and each tile's polygons were rendered and sent to output. Rendering commenced only once the pixels required for the tile had been determined and hidden polygons culled, with Z-buffering occurring only at tile level, so only a bare minimum of calculation was required.
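To make the two-pass flow concrete, here is a minimal Python sketch of tile binning followed by deferred shading. It assumes a toy scene in which every triangle carries a precomputed screen-space bounding box, pixel list, constant depth, and flat color; the names and data layout are illustrative only, not VideoLogic's actual hardware design.

```python
TILE = 32  # tile edge length in pixels

def tiles_covering(bbox, width, height):
    """Yield the (tx, ty) coordinates of tiles overlapped by a bounding box."""
    x0, y0, x1, y1 = bbox
    for ty in range(max(0, y0) // TILE, min(height - 1, y1) // TILE + 1):
        for tx in range(max(0, x0) // TILE, min(width - 1, x1) // TILE + 1):
            yield tx, ty

def render_tbdr(triangles, width, height):
    # Pass 1: bin each triangle into every tile its bounding box touches.
    bins = {}
    for tri in triangles:
        for key in tiles_covering(tri["bbox"], width, height):
            bins.setdefault(key, []).append(tri)

    # Pass 2: per tile, resolve visibility with a tile-local depth test,
    # then texture/shade only the surviving pixels (the "deferred" step).
    framebuffer = {}
    for key, tris in bins.items():
        nearest = {}  # pixel -> (depth, triangle); discarded once the tile is done
        for tri in tris:
            for px in tri["pixels"]:
                if (px[0] // TILE, px[1] // TILE) != key:
                    continue  # pixel falls in another tile
                if px not in nearest or tri["depth"] < nearest[px][0]:
                    nearest[px] = (tri["depth"], tri)
        for px, (_, tri) in nearest.items():
            framebuffer[px] = tri["color"]  # shading work happens only here
    return framebuffer

# Two overlapping one-pixel "triangles": only the nearer one is ever shaded.
tris = [
    {"bbox": (0, 0, 1, 1), "pixels": [(0, 0)], "depth": 5.0, "color": "red"},
    {"bbox": (0, 0, 1, 1), "pixels": [(0, 0)], "depth": 2.0, "color": "blue"},
]
print(render_tbdr(tris, 64, 64))  # {(0, 0): 'blue'}
```

The point of the structure is that the depth test runs against a small per-tile buffer, and the expensive texturing step never touches pixels that lose that test.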

The first two series of VideoLogic's PowerVR chips were built by NEC, while the Series 3 (Kyro) chips were fabricated by ST Micro. The first card, known as the Midas 3, was used exclusively in Compaq Presario PCs (the Midas 1 and 2 had been prototypes for an arcade-based system project). The PCX1 and PCX2 followed as OEM parts.


Series 2 chip production initially went to Sega's Dreamcast console, and by the time the desktop Neon 250 card hit retail in November 1999, it was brutally outclassed at its $169 price point, particularly at higher resolutions with 32-bit color.

Long before the Neon 250 became available, Rendition's Vérité V1000 had become the first card to render 2D + 3D graphics with a programmable core, pairing a MIPS-based RISC processor with the pixel pipelines. The processor was responsible for triangle setup and for organizing the pipelines' workload.

Originally developed towards the end of 1995, the Vérité 1000 was one of the boards Microsoft used to develop Direct3D. Unfortunately, the card required a motherboard chipset that supported direct memory access (DMA), since Rendition used DMA to transfer data across the PCI interface. The V1000 fared well against virtually every other consumer graphics board prior to the arrival of Voodoo Graphics, which offered more than double its 3D performance. The board was relatively cheap and gave the budget gamer a good feature set, including edge anti-aliasing and hardware acceleration of id Software's Quake. Game developers, however, shied away from the DMA transfer model all too soon for Rendition's liking.

Like 1996, 1997 proved to be another busy year in the consumer graphics industry.

ATI moved from strength to strength as they launched the Rage II, followed by the 3D Rage Pro in March. The latter was the first AGP 2x card and the first product to come out of ATI’s 3D Engineering Group formed in 1995.


ATI 3D Rage Pro

In 4MB form the Pro nearly equalled Voodoo Graphics performance, and with 8MB and the AGP interface it outperformed the 3Dfx card. It improved on the Rage II's perspective correction, texturing ability, and trilinear filtering performance thanks to an expanded 4KB cache, and added edge anti-aliasing. An integrated floating-point unit decreased reliance on the CPU, and the chip brought hardware acceleration and display support for DVD.

All in all, the Rage Pro added greatly to ATI's bottom line, helping the company realise a CAD$47.7 million profit on sales exceeding CAD$600 million. Much of this success came from OEM contracts, integration on consumer and server motherboards, and mobile variants. Prices for the card (usually sold as the Xpert@Work and Xpert@Play) ranged from $170 for the 2MB version, to $200-230 for the 4MB model, and $270-300 for 8MB, while the 16MB edition exceeded $400.

ATI bolstered their portfolio in December 1997 by acquiring Tseng Labs' IP for $3 million and taking on forty of the company's engineers. It was a bargain deal, as Tseng's failure to integrate a RAMDAC into its cards had caused a sharp fall in sales, from $12.4 million in 1996 to $1.9 million in 1997.

3DLabs announced the revised Permedia ("Pervasive 3D") series of boards in March 1997, built on Texas Instruments' 350nm process rather than the IBM process used for the original Permedia and the workstation-oriented Permedia NT. Performance was substandard for the former, while the NT model fared somewhat better thanks to an additional Delta chip handling full triangle and anti-aliasing setup, albeit at a $300 price tag. Permedia 2 based cards started shipping towards the end of the year, but rather than going head-to-head with the gaming heavyweights, they were marketed as semi-pro 2D cards with moderate 3D ability.

A month after ATI and 3DLabs refreshed their line-ups, Nvidia replied with the RIVA 128 (Real-time Interactive Video and Animation accelerator), which achieved Direct3D compatibility by rendering triangular polygons.

The company maintained its association with ST Micro, which produced the chip on its new 350nm process and developed the RAMDAC and video converter. While initial drivers were problematic (notably with Unreal), the card showed enough performance in games such as Quake 2 and 3 to top many benchmark review charts.


Diamond Viper V330 PCI (Nvidia RIVA 128)

This proved to be the landmark card Nvidia had been looking for since 1993. It was such a commercial and critical success that Nvidia had to look further afield to maintain supply, signing a manufacturing deal with TSMC to produce Riva 128ZXs alongside ST Micro. Nvidia's 3D graphics market share at the end of 1997 was estimated at 24%, ranking second behind 3Dfx Interactive, largely on the strength of the Riva 128/128ZX.

Nvidia's coffers were also boosted by Sega's funding of the NV2 as a possible graphics chip for the Dreamcast console, even though the final contract was eventually awarded to NEC/VideoLogic.

Rival 3Dfx also collaborated with Sega on the project and was widely believed to be the one providing the console's hardware until Sega terminated the contract. 3Dfx filed a $155 million lawsuit, claiming Sega had misled it into believing the console would use 3dfx hardware and had thereby gained access to confidential materials relating to its graphics IP. The companies settled out of court for $10.5 million a year later.


3Dfx-based Sega Black Belt prototype

The Dreamcast “Black Belt” project was just one facet of a busy year for 3Dfx Interactive.

Quantum3D was spun off from 3Dfx on March 31, 1997, with SGI and Gemini Technology partnering with the company on very high-end enthusiast and professional graphics solutions leveraging 3dfx's new SLI (Scan Line Interleave) technology. SLI involved either a daughter card carrying a second chipset and memory connected via a header, or two or more cards connected via ribbon cable, much as Nvidia's SLI and AMD's CrossFire employ the concept today. Once connected, each card, or logic block in the case of single-board SLI cards, contributed half the scan lines per frame to the display.
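In modern terms the scan-line split looks like the following minimal Python sketch, which assumes two render nodes and a stand-in render_line function; real SLI divided the work in fixed-function hardware rather than software.

```python
# Scan Line Interleave, schematically: node 0 draws the even scan lines
# and node 1 the odd ones, so each chipset fills half of every frame.
def render_frame_sli(height, node_count, render_line):
    frame = [None] * height
    for y in range(height):
        node = y % node_count            # interleave lines across cards
        frame[y] = render_line(node, y)
    return frame

# A dummy render_line that just records which node drew which line.
frame = render_frame_sli(600, 2, lambda node, y: f"line {y} by node {node}")
print(frame[0], "|", frame[1])  # line 0 by node 0 | line 1 by node 1
```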

SLI also increased the maximum screen resolution from 800 x 600 to 1024 x 768 pixels. An Obsidian Pro 100DB-4440 (two single cards, each with an Amethyst daughter card) retailed for $2,500, while single-card SLI solutions like the 100SB-4440 and 4440V required an outlay of $1,895.


In the summer of 1997, 3Dfx announced its initial public offering and launched the Voodoo Rush in an attempt to offer a single card with both 2D and 3D capability. Unable to use the proposed Rampage chip, however, the end product amounted to a cut-down Voodoo. The card's SST-1 chip handled Glide API games, while a substandard Alliance chip (or, even worse, a Macronix chip) handled other 3D games and 2D applications. This resulted in screen artifacting, as the 3dfx chip and memory ran at 50MHz while the Alliance AT25 ran at 72MHz.

Did you know?
TechSpot was originally Julio's personal project (a tech blog, if you will, circa 1998). The site was named "PURE Rendition" as it was dedicated to reporting the latest news on the Rendition Vérité 3D chips. Eventually it evolved into "3D Spotlight" as we moved on to cover the entire 3D graphics scene as well as the popular "3D" soundcards of the time. The TechSpot.com domain was acquired shortly after the '90s dot-com bubble for a handsome $200, as the original owners had no good use for it.

Things got worse: the Voodoo Rush's framebuffer was effectively halved by being shared between the 3D and 2D chips, limiting resolution to around 512×384. Refresh rates took a nosedive as well, since the Alliance and Macronix chips were limited to 175MHz and 160MHz RAMDACs, respectively.
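As a back-of-envelope illustration of why the RAMDAC caps refresh rates: the achievable refresh is roughly the pixel clock divided by the pixels per frame, padded for blanking intervals. The ~35% blanking overhead below is an assumed typical figure, not a measured Voodoo Rush value.

```python
def max_refresh_hz(ramdac_mhz, width, height, blanking_overhead=1.35):
    # Pixel-clock budget divided by total pixel time per frame (incl. blanking).
    return ramdac_mhz * 1e6 / (width * height * blanking_overhead)

print(round(max_refresh_hz(175, 1024, 768)))   # ~165 Hz ceiling for the Alliance chip
print(round(max_refresh_hz(160, 1280, 1024)))  # ~90 Hz ceiling for the Macronix chip
```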

Sunnyvale-based Rendition released the Vérité V2100 and V2200 shortly after the Voodoo Rush launched. The cards still couldn't match the original Voodoo in performance and were barely competitive with the budget-oriented Rush. Rendition's R&D lagged significantly behind its competitors', and with game developers showing little interest in the cards, these turned out to be the company's last commercial graphics products.

Rendition had various other projects in the works, including the addition of a Fujitsu FXG-1 geometry processor to the V2100/V2200, a two-chip approach to what other vendors were working to integrate into a single chip. The FXG-1-powered (and gloriously named) Hercules Thriller Conspiracy card thus remained an uncompleted project, as did the V3300 and 4400E, after Micron acquired the company in September 1998 for $93 million in the hopes of combining its embedded DRAM technology with Rendition's graphics expertise.


Rendition Vérité V2200 reference board

As feature sets and performance increased, so did the prices for graphics cards, and a number of vendors who couldn’t outgun the rising tide of ATI, Nvidia, and 3Dfx rushed in to fill the sub-$200 market.

Matrox released the Mystique (hobbled by its lack of OpenGL support) for $120-150, while S3's ViRGE line started at around $120 for the base model and rose to $150 and $200 for the DX and GX, respectively. S3 diversified the line to ensure a steady stream of sales, adding a mobile part with dynamic power management (ViRGE/MX) and the desktop ViRGE/GX2 with TV-out, S-Video, and assisted DVD playback.


Slotting in below these were Cirrus Logic's Laguna 3D series, Trident's 9750/9850, and the SiS 6326, all fighting for budget gamers' attention. For the Laguna 3D, a bargain $99 price could not make up for weak performance, poor 3D image quality, and consistency issues compared with similarly priced cards such as S3's ViRGE VX.

Cirrus Logic left the graphics industry fairly quickly after the Laguna 3D launched. Prior to that, the company had also supplied a range of budget 16-bit color graphics chips in the $50 bracket, found most notably on Diamond's SpeedStar series and the Orchid Kelvin 64.

Trident also targeted the entry-level bracket with the 3DImage 9750 in May, followed shortly by an updated 9850 featuring AGP 2x bus support. The 9750, a PCI or AGP 1x card, had a variety of graphics quality and rendering issues; the 9850 remedied some of the quirks, but texture filtering remained a hit-or-miss proposition.

SiS added their entry to the budget 3D graphics market in June with the 6326, typically priced in the $40-50 range. The card offered good image quality and outperformed many other budget cards. While never a threat in the performance arena, the 6326 sold to the tune of seven million units in 1998.

A long-running saga that would grow to encompass elements of myth and urban legend was born at the Assembly gaming event in June 1997, when BitBoys announced their Pyramid3D graphics to the world. The much-hyped project was a combined effort by Silicon VLSI Solutions Oy, TriTech, and BitBoys.

But Pyramid3D never saw the light of day, with extensive debugging and revisions delaying the project, and TriTech losing a sound chip patent suit that eventually bankrupted the company.


Demo screenshot showing the realism that Glaze3D cards were supposed to achieve.

BitBoys would go on to announce a second design, the Glaze3D chip, on May 15, 1998, promising class-leading performance and a planned release by the end of 1999. As the time for the grand reveal approached, BitBoys announced a revised design at SIGGRAPH99 that did away with the RAMBUS memory and memory controller in favour of 9MB of embedded DRAM from Infineon.

Once again, bug-hunting and manufacturing problems led to the project's cancellation.

The company was earning a reputation for missing release dates and producing little more than vapourware. Glaze3D was later redesigned under the codename Axe, catching up to the competition with DirectX 8.1 support. The new chip was meant to debut in the Avalanche3D card by the end of 2001, while a third iteration of Glaze3D, codenamed Hammer, was already promising DirectX 9 support.

Prototype Avalanche3D boards were built with the initial run of chips, but everything came to a halt when Infineon stopped producing embedded DRAM in 2001 due to mounting financial losses. Lacking a manufacturing partner, BitBoys finally gave up on desktop graphics and focused on mobile graphics IP instead.

BitBoys' exit and AMD's blunder: In May 2006, ATI acquired BitBoys for $44 million and announced the opening of a European design center. Soon after, ATI and Nokia entered into a long-term strategic partnership. Just a couple of months later, a then-healthy AMD announced it would acquire ATI in a grossly overvalued $5.4 billion deal. The mobile unit that included the BitBoys personnel was renamed Imageon and, in a major lack of management foresight, was sold off to Qualcomm for $65 million in January 2009. Qualcomm continues to produce graphics under the name Adreno (an anagram of Radeon) as an integral component of the hugely popular Snapdragon SoC.

Intel launched its first (and so far last) commercial discrete 3D desktop gaming chip in January 1998. The i740 traced its origins to a General Electric flight simulation project for NASA's Apollo program, a business later sold to Martin Marietta, which merged with Lockheed three years later. Lockheed-Martin repurposed the technology as Real3D for professional graphics products, notably the Real3D/100 and the Real3D/Pro-1000; the Sega Model 3 arcade board featured two of the Pro-1000 graphics systems.

Lockheed-Martin then formed a joint project with Intel and Chips and Technologies, named Project Aurora. Intel bought 20% of Real3D in January 1998, a month before the i740 launched, having already purchased 100% of Chips and Technologies in July 1997.

The i740 combined the resources of the R3D/100's two distinct graphics and texture chips, but was something of an oddity in that Intel implemented AGP texturing, whereby textures were uploaded to system memory (the render buffer could also be stored in RAM). Other designs used the card's local frame buffer to hold textures, swapping to system RAM only if the frame buffer became saturated or a texture was too large for local graphics memory.

To minimize latency, Intel's design used the AGP Direct Memory Execute (DiME) feature, which fetched only those textures required for rasterisation and left the rest in system RAM. Performance and image quality were acceptable, roughly matching the high-end offerings of the previous year. At $119 for the 4MB model and $149 for 8MB, pricing reflected Intel's aggressive marketing. The i740 was sold as Intel-branded cards, as the Real3D StarFighter, and as the Diamond Stealth II G460.
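The contrast between the two texturing models is easier to see schematically. The following Python sketch is purely illustrative: the texture names and sizes are invented, and real AGP execute-mode fetches happen per-texel in hardware rather than per-texture in software.

```python
def place_local_first(textures, local_capacity_mb):
    """Conventional design: fill card memory first, spill the rest to system RAM."""
    local, system, used = [], [], 0
    for name, size_mb in textures:
        if used + size_mb <= local_capacity_mb:
            local.append(name)
            used += size_mb
        else:
            system.append(name)  # swapped across the bus on demand
    return local, system

def place_dime(textures):
    """i740-style AGP DiME: textures stay in system RAM and are read across
    the AGP bus only when rasterisation actually needs them."""
    return [], [name for name, _ in textures]

textures = [("wall", 2), ("floor", 2), ("sky", 4)]       # sizes in MB
print(place_local_first(textures, local_capacity_mb=4))  # (['wall', 'floor'], ['sky'])
print(place_dime(textures))                              # ([], ['wall', 'floor', 'sky'])
```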


Intel740 / i740 AGP graphics board

Intel designed a revised i752 chip, but a lack of interest from OEMs and the gaming community in general caused the company to cancel commercial production. A few boards made it out of the manufacturing plant, but the design, like the i740's, instead found its way into integrated graphics chipsets.

Lockheed-Martin closed down Real3D in October 1999, with the related IP being sold to Intel. Many of the staff subsequently moved over to Intel or ATI.

ATI gave the Rage Pro a makeover in February 1998, which basically consisted of renaming the card the Rage Pro Turbo and delivering a set of drivers highly optimized for synthetic benchmarks. There was little else to it except a price tag bumped to $449, although drivers from beta 2 onwards did improve gaming performance.

ATI followed up in August with the Rage 128 GL and VR, the first of the company's products that former Tseng Labs engineers worked on. Retail supply was less than ideal until into the new year, however, which effectively killed any chance ATI had of stamping its mark on the gaming landscape as it had on the OEM market. Specs included 32MB of on-board RAM (16MB or 32MB on the All-In-Wonder 128 version) and an efficient memory architecture that allowed the card to push past Nvidia's TNT as screen resolution increased and 32-bit color was used. Unfortunately for ATI, many games and much of the installed hardware base at the time were geared for 16-bit color. Image quality was much the same as the mainstream competition from S3 and Nvidia, yet still lagged behind Matrox's.

Nevertheless, it was enough for ATI to become the top graphics supplier of 1998, with 27% of the market and a net income of CAD$168.4 million on sales of CAD$1.15 billion.

ATI announced the acquisition of Chromatic Research in October of that year for $67 million. Chromatic's MPACT media processors had found favour in many PC-TV solutions (notably from Compaq and Gateway), offering very good 2D graphics performance, excellent audio, and MPEG2 playback, but limited 3D gaming performance at a pricey $200 or so. In the end, insurmountable software issues doomed the company to a four-year lifespan.

Two months after the i740 made its small splash in the graphics market, 3Dfx introduced the Voodoo 2. Like its predecessor it was a 3D-only solution, and while impressive, it was a complex one: the boards sported two texturing ICs, the first example of multitexturing on a consumer graphics card, which brought the chip count to three where competing cards combined 2D and 3D capabilities in a single chip.

 

GLQuake shown running on a 225MHz Pentium MMX with 3Dfx Voodoo 2 graphics

Quantum3D's implementations of the Voodoo 2 included the Obsidian2 X-24, a single-card SLI solution that could be paired with a 2D daughter card; the single-slot SLI SB200/200SBi with 24MB of EDO RAM; and the Mercury Heavy Metal, which featured four 200SBi SLI boards connected via a controller board (AAlchemy) that served the function of the SLI bridges found in today's multi-GPU setups.

The latter was a professional graphics solution intended for visual simulators, and as such it carried a hefty $9,999 price tag, while requiring an Intel BX or GX server board with four contiguous PCI slots.


Four 200SBi SLI boards connected via AAlchemy controller board.

The Voodoo Banshee was announced in June 1998 but didn't hit retail for another three months. The card married the 2D portion of the still-AWOL Rampage chipset to a single texture mapping unit (TMU), so while 3dfx could now offer a single chip with both 2D and 3D capability at a much reduced production cost, the Banshee fell substantially behind the Voodoo 2 and its ability to render multi-textured polygons.

The revolution that 3dfx had ushered in three years earlier was now passing it by.

In raw 3D performance, the Voodoo 2 had no equal, but the competition was gaining ground fast. Amid increasing pressure from ATI and Nvidia, 3dfx looked to retain a higher profit margin by marketing and selling boards itself, something previously handled by a lengthy list of board partners. To this end, 3dfx purchased STB Systems on December 15 for $141 million in stock, but the venture proved a giant misstep: the quality and cost of manufacture at the company's Juarez plant could not compete with the Taiwanese foundries used by the competition, TSMC for Nvidia and UMC for ATI.

Many of 3dfx’s former partners formed ties with Nvidia instead.


Compounding 3dfx's mounting pressure in the marketplace, March 23 saw the launch of Nvidia's Riva TNT, the name standing for TwiN Texel rather than the explosive. A second parallel pixel pipeline doubled the pixel fillrate and rendering speed, and the card carried a prodigious (for 1998) 16MB of SDR memory, whereas the Voodoo 2's 8-16MB of RAM was of the slower EDO variety. While a strong contender, the TNT's performance was tamed by its own complexity: an eight-million-transistor chip on TSMC's 350nm process could not run at Nvidia's intended 125MHz core/memory frequency due to heat, so it shipped at 90MHz instead. This sizeable 28% reduction was enough to ensure that the Voodoo 2 barely remained the performance leader, largely thanks to Glide.
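The cost of the downclock is easy to quantify with the usual fill-rate approximation of pipelines multiplied by core clock, assuming (as held for the TNT's design) one pixel per pipeline per clock:

```python
def fill_rate_mpixels(pipelines, core_mhz):
    # One pixel per pipeline per clock -> megapixels per second.
    return pipelines * core_mhz

print(fill_rate_mpixels(2, 125))  # 250 Mpixels/s at the planned clock
print(fill_rate_mpixels(2, 90))   # 180 Mpixels/s as shipped, a 28% cut
```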

Even with the reduced specification, the TNT was an impressive card. Its AGP 2x interface allowed for gaming at 1600 x 1200 and 32-bit color rendering with a 24-bit Z-buffer (image depth representation). This was a huge improvement over the Voodoo 2’s 16-bit color support and 16-bit Z-buffer. The TNT traded blows with the Voodoo 2 and Banshee, offering a better feature set, better scaling with CPU clock speed, excellent AGP texturing and better 2D performance. The card didn’t ship in any meaningful quantities until September.

Not everything was going Nvidia’s way, at least not initially.

SGI filed a lawsuit against Nvidia on April 9, alleging patent infringement over texture mapping. The settlement, reached in July 1999, gave Nvidia access to SGI's professional graphics portfolio, while SGI terminated its own graphics development and turned its low-level graphics team over to Nvidia. This virtual giveaway of IP is generally regarded as one of the main reasons SGI roared into bankruptcy at breakneck speed.

With the major players in the market dominating the media coverage for the first few months of the year, June and July spotlighted two of the fading lights of the industry.

On June 16, Number Nine launched their Revolution IV card.

The card proved unable to match the strides in 3D performance that Nvidia and ATI were making, so the company attempted to reinforce its position in the 2D productivity market instead.

SGI flat panel bundled with the Revolution IV-FP

Number Nine had always favoured 2D performance over allocating resources to 3D technology, and found itself hemmed in on both fronts by gaming cards such as Nvidia's TNT. The company decided to exploit the one real weakness afflicting most dedicated gaming cards: high display resolutions at 32-bit color.

To this end, Number Nine added a 36-pin OpenLDI connector to the Revolution IV-FP that would connect to an SGI flat panel screen bundled with the card. The 17.3” SGI 1600SW (1600×1024) plus Revolution IV-FP package initially retailed for $2795.

This was Number Nine's last home-grown card, as the company went back to selling S3 and Nvidia products. Its assets were acquired by S3 in December 1999 and later sold to engineers from Number Nine's original design team, who formed Silicon Spectrum in 2002.

S3 announced the Savage3D at the 1998 E3 Expo and, unlike the TNT and Voodoo Banshee, brought the card to retail shortly afterwards. The penalty for the speedy introduction was half-baked drivers, however. OpenGL games were particularly affected, to the extent that S3 supplied a mini OpenGL driver solely for Quake titles.

S3's original specification called for a 125MHz clock, but yields and heat output saw the final shipping part clocked at 90-110MHz; many review magazines and websites nevertheless received the higher-clocked 125MHz pre-production samples. A Savage3D Supercharged part at 120MHz was released later, and Hercules and STB sold the Terminator BEAST and Nitro 3200, respectively, clocked at 120/125MHz. Even though OpenGL emulation and DirectX performance were held back by driver support, sub-$100 pricing for the reference boards and acceptable gaming and video playback performance brought in some sales.

Between 1997 and 1998, a growing number of graphics vendors left the industry. Among them were Cirrus Logic, Macronix, Alliance Semiconductor, Dynamic Pictures (sold to 3DLabs), Tseng Labs and Chromatic Research (both bought by ATI), Rendition (sold to Micron), AccelGraphics (bought by Evans & Sutherland), and Chips and Technologies (absorbed by Intel).

The gulf between the haves and have-nots became even more obvious in 1999.

January saw the release of the SiS 300, a budget business-machine graphics card. The SiS 300 offered minimal 3D performance in the context of 1999 (a single pixel pipeline saw to that), and its 2D was no match for most of SiS's retail competitors. Luckily for SiS, OEMs had no qualms, since the card ticked enough feature checkboxes to satisfy them: a 128-bit memory bus (64-bit in the SiS 305 revision), 32-bit color support, DirectX 6.0 (DX7 for the 305), multitexturing, TV-out, and hardware MPEG2 decoding.

The SiS 315 followed in December 2000, which added a 256-bit memory bus, DirectX 8 support, full screen AA, a second pixel pipeline, a transform and lighting engine, DVD video motion compensation and DVI support. Performance was generally in the region of a GeForce 2 MX200. The same 315 chip formed the basis of SiS’s 650 chipset for Socket 478 boards (Pentium 4) in September 2001, and the SiS552 system on a chip in 2003.

 

Gaming on a budget: SiS 315 card running Unreal Tournament 2003

Besides SiS' offerings, budget-minded buyers continued to have a substantial range to choose from. Among them was the Trident Blade 3D (~$65), whose passable 3D performance (if spotty driver support) was generally on par with Intel's i740.

Buoyed by this, Trident went on to release the Blade 3D Turbo, with clocks boosted from 110MHz to 135MHz, which helped it keep pace with Intel's revised i752. Adding to Trident's woes, its integrated graphics partnership with VIA was about to come to an abrupt halt when VIA acquired S3 Graphics in April 2000.

Trident's core graphics business was by then largely dependent upon high-volume, low-priced chips, predominantly in the mobile sector. The Blade 3D Turbo was refined into the Blade T16, T64 (143MHz), and XP (166MHz), but Trident's 3D development moved at a much slower pace than the market in general. So much so, that even a much-delayed budget offering like the SiS 315 handily disposed of the company's new cards. Trident's graphics division was sold to SiS's XGI subsidiary in June 2003.

The S3 Savage4 was a step up in performance from the SiS and Trident offerings. Announced in February and at retail from May, it was priced at $100-130 depending on whether it carried 16 or 32MB of onboard memory. S3's texture compression, introduced with the Savage3D, ensured that even with a limited 64-bit memory bus, textures up to 2048×2048 pixels could be accommodated.
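The arithmetic behind that claim, assuming S3TC's common DXT1-class mode of 4 bits per texel (each 4x4 block packed into 8 bytes) against 16 bits per texel uncompressed:

```python
def texture_mb(width, height, bits_per_texel):
    # Texels times bits per texel, converted to megabytes.
    return width * height * bits_per_texel / 8 / 2**20

print(texture_mb(2048, 2048, 16))  # 8.0 MB uncompressed at 16bpp
print(texture_mb(2048, 2048, 4))   # 2.0 MB with 4bpp S3TC: a 4:1 saving in
                                   # space and in traffic over the 64-bit bus
```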


Diamond Viper II Z200 (S3 Savage4)

The Savage4 became S3's first card to support multitexturing, and the first card to support the AGP 4x interface. But not even improved drivers and a reasonable feature set could offset the fact that it struggled to reach the performance of the previous generation of cards from 3dfx, Nvidia, and ATI. The cycle repeated at the end of the year when the Savage 2000 launched: the card achieved better than parity with the TNT2 and Matrox G400 at 1024×768 and below, but 1280×1024 and 1600×1200 were a different story.

The first of 3dfx's Voodoo3 series arrived in March, backed by an extensive television and print advertising campaign, a new logo (now with a small "d"), and vivid box art. The long-awaited Rampage chipset still hadn't arrived, so the boards sported much the same architecture, somewhat tweaked, in the Avenger chipset. They remained hamstrung by a reliance on 16-bit color, a 256×256 texture size limitation, and a lack of hardware-based transform and lighting (T&L). These factors were becoming crucial to game developers, and 3dfx continued to disappoint by not delivering on its architectural and feature-set promises.

In what seems a time-honoured tradition, 3dfx blamed an earthquake for its failing fortunes, although the same event didn't much affect ATI or Nvidia. In another sign of 3dfx's mounting troubles, the company announced in December that its proprietary Glide graphics API would finally be made open source, at a time when DirectX and OpenGL continued to gain traction with game developers.

March also saw Nvidia release the Riva TNT2, including the first of its Ultra branded boards with faster core and memory speeds, while Matrox unveiled the G400 series.


The TNT2 utilized TSMC's 250nm process and delivered the performance Nvidia had hoped for with the original TNT. It comprehensively outperformed the Voodoo3, the only exceptions being applications that used AMD's 3DNow! CPU instruction extension in conjunction with OpenGL. Keeping up with 3dfx and Matrox, the TNT2 included DVI output for flat panel displays.

Meanwhile, the Matrox G400 managed to outperform both the Voodoo3 and TNT2 for the most part, although its OpenGL support still noticeably lagged behind. At $199-229, the card represented excellent value for money in performance, image quality, and feature set. The ability to drive two monitors via twin display controllers (DualHead, as Matrox called it) started a multi-monitor trend for the company, although the secondary monitor was limited to 1280×1024 resolution.

The G400 also introduced Environment Mapped Bump Mapping (EMBM), which provided better texture representation. For those with slightly deeper pockets, a higher-clocked G400 MAX at $250 ensured the fastest consumer card on the market until GeForce 256 DDR-based boards such as the Creative Labs 3D Blaster Annihilator Pro hit store shelves in early 2000.

Matrox concentrated on the professional market from this point forward, with a brief return to the gaming market in 2002 with the Parhelia. Triple monitor support was not enough to offset inferior gaming performance and the new wave of DirectX 9.0 compatible hardware.

 

Matrox G400 Tech Demo EMBM

By the time the smoke had cleared from the 3dfx, Nvidia, and Matrox launches, 3DLabs slipped in the long-awaited Permedia 3 Create!. The card had been announced months earlier and was targeted at professional users with an interest in gaming. As such, 3DLabs prioritized 2D functionality over 3D, leveraging the professional graphics expertise acquired in July 1998 with Dynamic Pictures, designers of the superlative Oxygen range of workstation cards.

Unfortunately for 3DLabs, what mattered for workstation graphics tended to be complex polygon modelling, usually at the expense of texture fill rate. This was pretty much the reverse of gaming card requirements, where texturing and eye candy take precedence over elaborate wireframe modelling.

Overpriced and outperformed by the TNT2 and Voodoo3 in gaming scenarios, and not far enough ahead in workstation tasks to stand out from the competition, the Permedia 3 represented 3DLabs' last attempt to build a card with gaming in mind. From that point on, 3DLabs concentrated its efforts on the GLINT R3 and R4 based Oxygen cards, ranging from the $299 VX-1 to the $1,499 GVX 420, while the Wildcat range (such as the $2,499 Wildcat II 5110) was still based on the ParaScale graphics processors that came with the acquisition of Intense3D from Intergraph in July 2000. 3DLabs began integrating its own P9 and P10 processors into the Wildcat series from 2002, when Creative Technology bought the company.

3DLabs left the desktop market in 2006 and shifted its focus to media-oriented graphics as the division merged with Creative's SoC group and was renamed ZiiLabs, which was sold to Intel in November 2012.

ATI's strides had been somewhat incremental since the Rage 128's debut. In late 1998 the company added AGP 4x support and a clock boost to the Rage 128 to create the Pro variant, which also featured video capture and TV-out options. The Rage 128 Pro's gaming performance was broadly equal to Nvidia's TNT2, but fell well short of the TNT2 Ultra, something ATI intended to remedy with Project Aurora.


ATI’s Rage Fury MAXX combined two Rage 128 Pro chips on a single board

When it became apparent that ATI would not have a chip capable of winning the performance race, the project changed tack and materialized as the Rage Fury MAXX, which combined two Rage 128 Pro chips on the same PCB. The specification numbers were impressive, with the two chips each rendering alternate frames and thus splitting the gaming workload between them. In practice, while the card bested the previous generation's offerings, it wasn't a match for the S3 Savage 2000 and would consistently fall behind the upcoming GeForce 256 DDR, which was only slightly more expensive at $279 versus ATI's $249.
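The frame-dispatch logic amounts to the sketch below; the chip indices and dictionary layout are illustrative only, not ATI's driver implementation.

```python
def dispatch_alternate_frames(frame_count, chip_count=2):
    # Frame N goes to chip N % chip_count, as on the Rage Fury MAXX.
    schedule = {chip: [] for chip in range(chip_count)}
    for frame in range(frame_count):
        schedule[frame % chip_count].append(frame)
    return schedule

print(dispatch_alternate_frames(6))  # {0: [0, 2, 4], 1: [1, 3, 5]}
```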


Less than two months after the Rage Fury MAXX announcement, Nvidia announced the GeForce 256 SDR on October 1, followed in February 2000 by the DDR version, the first card to use that form of RAM. A 23-million-transistor chip built on TSMC's 220nm process, the GeForce 256 was the first graphics chip to actually be called a GPU (Graphics Processing Unit), on the strength of its added hardware transform and lighting engine (T&L).

This engine allowed the graphics chip to take on the heavily floating-point-intensive calculations of transforming 3D objects and scenes, and their associated lighting, into the 2D representation of the rendered image. Previously this computation was undertaken by the CPU, which could easily bottleneck under the workload and tended to limit the available detail.
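In code terms, the per-vertex work a T&L engine absorbs looks roughly like the sketch below: a 4x4 matrix transform with a perspective divide, plus a Lambertian diffuse term. This is a generic fixed-function-style approximation assuming a single directional light, not Nvidia's actual pipeline.

```python
import math

def transform_and_light(position, normal, mvp, light_dir):
    # Transform: multiply the 4x4 model-view-projection matrix by the
    # homogeneous position, then perspective-divide to screen space.
    x, y, z = position
    v = (x, y, z, 1.0)
    clip = [sum(mvp[r][c] * v[c] for c in range(4)) for r in range(4)]
    sx, sy = clip[0] / clip[3], clip[1] / clip[3]

    # Lighting: clamped dot product of the unit normal and unit light vector.
    def unit(vec):
        length = math.sqrt(sum(e * e for e in vec))
        return [e / length for e in vec]
    diffuse = max(0.0, sum(n * l for n, l in zip(unit(normal), unit(light_dir))))
    return (sx, sy), diffuse

identity = [[float(r == c) for c in range(4)] for r in range(4)]
# A vertex facing the light head-on yields full diffuse intensity.
print(transform_and_light((1.0, 2.0, -5.0), (0, 0, 1), identity, (0, 0, 1)))
# -> ((1.0, 2.0), 1.0)
```

Offloading exactly this kind of arithmetic for every vertex in a scene is what freed the CPU and raised the practical polygon budget.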

 

Nvidia Grass Demo (GeForce 256)

The GeForce 256's status as the first chip to incorporate hardware T&L has long been the subject of debate. That's because a number of other designs also incorporated T&L: at the prototype stage (Rendition Vérité V4400, BitBoys Pyramid3D, 3dfx Rampage), at a level approaching irrelevancy (3DLabs GLINT, Matrox G400's WARP), or through a separate on-board chip (Hercules Thriller Conspiracy).

None of these achieved commercial viability, however. Moreover, by being first to adopt a four-pipeline architecture, Nvidia had a built-in performance lead over the competition, which, combined with the T&L engine, enabled the company to market the GeForce 256 as a professional workstation card as well.

A month after the desktop variant became available, Nvidia announced its first range of professional workstation Quadro cards, the SGI VPro V3 and VR3, based on the GeForce 256. The cards leveraged SGI graphics technology that Nvidia had gained access to through the cross-license agreement signed in July 1999.

Nvidia’s $41 million profit for the year on revenue of $374.5 million comfortably eclipsed 1998’s figures of $4.1 million and $158.2 million, respectively, and represented a huge leap from 1997’s $13.3 million in revenue. Microsoft’s initial payment of $200 million for the NV2A (the graphics core for the Xbox) added to Nvidia’s coffers, as did a $400 million secondary bond and stock offering in April.

These numbers paled next to ATI's $1.2 billion in revenue and $160 million profit for the year, built on a 32% share of the graphics market. ATI, however, was on the verge of losing a significant portion of its OEM business to Intel's 815 series of integrated graphics chipsets.

This article is the second installment in a series of four. Next week we'll get closer to the present day, as the industry takes a turn and major consolidation leaves space for only two players: the beginning of the GeForce vs. Radeon era.
