With large-scale integrated circuits coming into their own in the late 1970s and early 1980s, fueling the PC revolution and several other developments, came a succession of remarkably powerful graphics controllers. NEC introduced the first fully integrated LSI graphics chip, the µPD7220, in 1982, and it was wildly successful, finding its way into graphics terminals and workstations, though not into PCs built by IBM. It did get used quite extensively by aftermarket suppliers.
Hitachi went NEC one better and introduced its HD63484 ACRTC (Advanced CRT Controller) chip in 1984. It could support a resolution of up to 4096 × 4096 in 1-bit mode within a 2 Mbyte display (frame) memory. The ACRTC also proved to be very popular and found a home in dozens of products, from terminals to PC graphics boards. However, these chips, the pioneers of commodity graphics controllers, were just 2D drawing engines with some built-in font generation. That same year IBM introduced its EGA, which, with its many clones, became the standard for mainstream PCs. But companies that wanted high-resolution, bit-mapped graphics chose the Hitachi HD63484.
Figure 1. ISA-16 based ELSA workstation add-in board using Hitachi HD63484
The LSI HD63484 was built in 2 µm CMOS technology and had around 60,000 transistors (a Motorola 68020 of the time had about 190,000). The ACRTC could run at 8 MHz.
The ACRTC offered a screen resolution of up to 4096 × 4096 pixels, about eight times the pixel count of today’s HD (1920 × 1080), although only at 1-bit depth; it also offered a unique (at the time) interleaved access mode for “flashless” displays. If you wanted 16-bit color (which it supported), you had to drop down to 1024 × 1024 resolution, which was still astounding at the time, and only a few monitors could support it. The super-high-resolution monochrome mode, however, was targeted at the emerging bit-mapped desktop publishing market. The chip’s CRT timing signals were fully programmable, so it could drive whatever monitor you hung on it.
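As a quick sanity check on those numbers (a back-of-the-envelope sketch of my own, not anything from Hitachi’s documentation; the small helper function here is purely illustrative), both modes consume the same 2 Mbytes of frame memory mentioned above:

```c
#include <stdio.h>

/* Back-of-the-envelope frame-buffer sizing for the two modes described
   above. The resolutions and bit depths come from the article; the rest
   is plain arithmetic. */
static unsigned long frame_bytes(unsigned long width, unsigned long height,
                                 unsigned long bits_per_pixel)
{
    return (width * height * bits_per_pixel) / 8;  /* total bits -> bytes */
}

int main(void)
{
    /* 4096 x 4096 at 1 bit per pixel: 16 Mbits = 2,097,152 bytes (2 Mbytes) */
    printf("4096 x 4096 x  1-bit: %lu bytes\n", frame_bytes(4096, 4096, 1));

    /* 1024 x 1024 at 16 bits per pixel: also 2,097,152 bytes (2 Mbytes) */
    printf("1024 x 1024 x 16-bit: %lu bytes\n", frame_bytes(1024, 1024, 16));
    return 0;
}
```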
The ACRTC could support up to 2 Mbytes of video RAM and offered an asynchronous DMA bus interface that could be mapped to the PC ISA-16, VME, or P1014 16-bit buses and, according to the company, was optimized for the 68000 MPU family and the 68450 DMAC. With the DMA capability it was possible to provide master or slave synchronization to multiple ACRTCs or other devices.
Figure 2. VME Force Computer add-in board based on Hitachi HD63484 chip
The chip offered a high-level command set, which reduced software development costs; the ACRTC itself converted logical x-y coordinates to physical frame-buffer addresses. It supported 38 commands, including LINE, RECTANGLE, POLYLINE, POLYGON, CIRCLE, ELLIPSE, ARC, ELLIPSE ARC, FILLED RECTANGLE, PAINT, PATTERN, and COPY. An on-chip 32-byte pattern RAM could be used to define drawing patterns. Conditional drawing functions were available for patterned drawing, color mixing, and software windowing, and it supported clipping and hit detection.
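To give a feel for what that coordinate-to-address conversion involves, here is an illustrative sketch of the general technique (not Hitachi’s actual implementation; the field names and the 16-bit word size are assumptions made for the example, not taken from the HD63484 datasheet):

```c
#include <stdint.h>

/* Illustrative sketch of the logical x-y to physical frame-buffer address
   conversion that a drawing engine like the ACRTC performs in hardware.
   The structure fields and the 16-bit word size are assumptions made for
   this example. */
typedef struct {
    uint32_t base;            /* word address of the drawing area's origin */
    uint32_t pitch_words;     /* 16-bit words per scan line                */
    uint32_t bits_per_pixel;  /* 1, 2, 4, 8, or 16 in the modes discussed  */
} framebuffer_t;

/* Return the word address holding pixel (x, y) and, via bit_offset, the
   pixel's bit position within that 16-bit word. */
static uint32_t pixel_address(const framebuffer_t *fb,
                              uint32_t x, uint32_t y, uint32_t *bit_offset)
{
    uint32_t bit_index = x * fb->bits_per_pixel;  /* bit position in line */
    *bit_offset = bit_index % 16;                 /* within the word      */
    return fb->base + y * fb->pitch_words + bit_index / 16;
}
```

Doing this translation (and the packed-pixel read-modify-write that goes with it) on the controller is what spared the host CPU from per-pixel address arithmetic.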
The ACRTC could control four hardware windows, with zooming and smooth scrolling in both the vertical and horizontal directions. It could display up to 256 colors, and its maximum drawing speed of 2 million pixels per second in monochrome and color applications proved useful in the high-performance CAD terminals and workstations of the time.
For those workstation users, there were eight user-definable video attributes that could be set, and the chip also had light-pen detection.
The chip was very popular and got designed into several long-lifetime products. To provide a continued supply, clones of the chip were developed using innovASIC’s MILES (Managed IC Lifetime Extension System) cloning technology.
Figure 3. Block diagram of Hitachi HD63484 graphics controller
This was in the early days of the PC, and IBM had cleverly designed an expansion bus architecture that was only 8 bits wide in the original 1981 version of the PC; by 1984, with the introduction of the PC AT, the bus was extended to 16 bits. With that came a flurry of graphics add-in boards (AIBs), and the first generation of them used the NEC µPD7220. By 1986 there were 88 AIBs, and the Hitachi chip was displacing the µPD7220, appearing in 22% of them. The ACRTC was a breakthrough chip, and by 1988 there were 194 AIBs on offer, 24% of which had adopted the HD63484. By then, however, the ACRTC was being eclipsed by a new, more powerful, and programmable graphics processor, the Texas Instruments TMS34010, which we will discuss in the next installment of the Graphics Chips Hall of Fame.
Hitachi tried to extend the ACRTC design into a 3D chip, but the bandwidth required and other issues proved too complicated, and the effort, though valiant, failed. For reasons known only to Hitachi’s management, the company then abandoned the graphics market, just as it was about to take off.
Benchmarking
The original IBM PC came with an ISA-based AIB called the Monochrome Display Adapter, or MDA, which established a set of conventions for driving a display. Therefore, to replace the MDA one had to build an MDA-compatible board (the terms “card” and “board” were, and still are, used interchangeably). The MDA could only generate monochrome 7 × 9 dot characters.
Right after the PC came out, the first independent graphics AIB supplier, Hercules, appeared. Hercules offered the first bit-mapped AIB with a higher resolution of 720 × 350. Entry-level graphics boards were also being introduced during this period. In 1984, IBM, the standards setter, introduced the EGA (Enhanced Graphics Adapter), which provided low-resolution (640 × 350) 16-color bit-mapped graphics. The EGA chip was cloned by a half dozen suppliers, and in a later installment we will discuss the clones and their evolution.
AutoCAD, a new low-cost, PC-based computer-aided design (CAD) program, was introduced right after the PC. In 1983, Don Strimbu created a detailed single view of a firehose nozzle, which became known as “The Nozzle.”
Figure 4. Don Strimbu’s Nozzle was a 2D drawing benchmark for many years (Source: CAD Nauseam)
The Nozzle was used as a benchmark to see how fast a graphics AIB could render it. It wasn’t a totally fair test, as the PC’s processor and memory were also in the loop and could dramatically influence the result. But it was all we had at the time, and it was an appreciated and well-used benchmark for several years. The iconic image has since been reimagined in 3D, and we’ll show that too in future installments.
Editor’s note: this article is part of a series originally written for IEEE’s Computing Now publication. The series is ongoing and will be continued in JPR’s Tech Watch with stories about new GPU advances as well. To see other stories in the series, search the category GPU History.