Hardware: The New Software



Most people are aware of how the invention of the 'microchip' integrated circuit by Jack Kilby at Texas Instruments in 1958 revolutionized the world. Integrated circuits could perform the functions of many individually packaged transistors, which made digital electronic devices practical for consumers. This revolution culminated in the development of computers which fit entirely on a single silicon chip. The software-controlled microcomputer has become a core element of nearly all modern electronic devices. But now, new developments in chip design may render this approach obsolete and eliminate the boundary between electronic hardware and software. These new chips are known as Field Programmable Gate Arrays, or FPGAs.


Historical Background


For most of the 1960s and early 1970s, digital electronics were built up from standard integrated circuits, or ICs, which performed simple logical functions or stored a single bit of memory. Circuits which perform these basic logic functions are called 'gates'. These chips were sometimes referred to as "small scale integrated circuits", or SSI devices, and contained perhaps a dozen individual transistors and their interconnections. As chip-making technology improved, leading to smaller and smaller transistors, manufacturers could add more and more functionality to their chips, leading to the development of "large scale integrated circuits" (LSI) and "very large scale integrated circuits" (VLSI). The development of LSI and VLSI chips was, and remains, very expensive and time-consuming. LSI and VLSI chips were therefore reserved for critical functions like microprocessors and memory chips, where the high sales volume could recoup the massive investment in developing the component.


If you were to look at a digital circuit board design from the mid-1970s, you would often see several hundred SSI components on a circuit board, and one or two LSI devices. The circuit board's copper traces serve to connect the gates together to perform the desired function. To create a new function, one merely developed a new circuit board to produce different interconnections between the various gates. This led engineers at a small start-up company called MMI to develop microchips that contained a generic set of gates which could be connected in any arrangement by a method known as fuse programming. These devices were known as Programmable Array Logic (PAL) chips. They contained an array of AND gates and an array of OR gates, with a grid of fine metal (or polysilicon) wires connecting the AND array to the OR array. At every point where the wires cross, a tiny electrical fuse is formed. To customize the part, the user places the chip in a special programming device, which runs a pulse of electrical current between the row wire and the column wire, blowing the fuse and removing the undesired connection. When all the undesired connections have been opened, the device is a unique, custom integrated circuit which can perform the function of dozens of individual standard integrated circuits. PALs were critical in allowing the Apple II and Macintosh computers to be developed at very low cost, and earlier were instrumental in the development of the Data General Eagle and DEC VAX computers.
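

To get a feel for what a fuse-programmed PAL actually implements, the sketch below shows the equivalent logic written in Verilog notation. It is a hypothetical example (the module and signal names are invented): each output is simply an OR of a few AND 'product terms', which is exactly the structure the AND/OR array provides once the unwanted fuses have been blown.

    // Hypothetical sketch of PAL-style sum-of-products logic.
    // Each AND expression corresponds to one product term (a row of the AND
    // array); the OR combines whichever product terms were left connected
    // after fuse programming.
    module pal_style_logic (
        input  wire a, b, c, d,
        output wire y0, y1
    );
        assign y0 = (a & ~b & c) | (~a & d);   // two product terms ORed together
        assign y1 = b & c & ~d;                // a single product term
    endmodule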


At about the same time as PALs were being developed, some semiconductor manufacturers started offering much more complex devices called 'Gate Arrays' as low-cost alternatives to fully custom integrated circuits, also known as ASICs. The gate array provides the designer with a regular grid arrangement of gates and latches (one-bit memories). The semiconductor manufacturer would build up the wafers with the underlying basic blocks of logic circuitry intact, but leave off the top two or three layers of "metal" which would later be used for interconnections. The user would then design just the interconnecting layers; the chip supplier would lithographically pattern them on top of the wafers, saw the wafers into individual ICs, package them, and ship them to the customer.


Gate arrays provided significant performance advantages over PALs and much higher densities, and usually contained some memory circuits as well. However, there was still a large cost to develop a gate array for a specific use, as well as a long time delay. If a gate array turned out to have a design defect, it could take weeks for the chip maker to fabricate a new batch of wafers to correct the problem.


The next breakthrough came with the development of "Field Programmable Gate Arrays", or FPGAs. These devices were intended to provide the functionality of gate arrays with the easy programmability of PALs. FPGAs are essentially gate arrays in which the interconnections are made by the user of the part. These interconnections can be formed using a fuse-like method called 'antifuse', using non-volatile electrical storage similar to that used in Flash memory, or simply by storing the interconnect information in an internal RAM within the device and letting an external device (which could be a microcomputer) send the configuration, or programming, data to the FPGA prior to its use. In this latter approach, the logic cells themselves can be simplified by converting them into look-up tables. Unlike the other approaches, this method requires the FPGA to be re-programmed each time it is powered up.
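

The look-up table idea is simpler than it sounds: a small block of configuration SRAM holds the truth table of the desired logic function, and the logic inputs merely select which stored bit appears at the output. The Verilog sketch below is a hypothetical illustration of the principle for a four-input cell (the names and widths are assumptions, not any vendor's actual cell).

    // Hypothetical sketch of a 4-input look-up table (LUT).
    // The 16 configuration bits are the truth table of the logic function,
    // loaded from external memory when the FPGA is configured; the four
    // logic inputs simply select one of the stored bits.
    module lut4 (
        input  wire [3:0]  sel,          // the four logic inputs
        input  wire [15:0] truth_table,  // configuration bits loaded at power-up
        output wire        out
    );
        assign out = truth_table[sel];
    endmodule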


This third approach, referred to as 'SRAM-based FPGAs', turned out to be the most successful in the marketplace. When the circuit is powered up, an external non-volatile memory copies the programming information into the FPGA, which then begins normal operation. SRAM-based FPGAs can be made on the same wafer fabrication lines used for RAM or microprocessor chips, while Flash or antifuse-type FPGAs require special processing. This sharing of fabrication lines meant that the billions of dollars invested in microprocessor and RAM factories could be leveraged to build these new devices, which can now be purchased economically in chips that contain over a million gates. By comparison, an early 8-bit microprocessor from the 1980s contained only a few thousand gates.


The Programmable Logic Revolution


The development of these very high gate-count FPGAs, and the ease of re-programming them, is revolutionizing the way that electronics are designed and used. Most significantly, these devices blur the distinction between what is done in hardware and what is done in software. For example, suppose an electronic device needs to read in ten numeric values very quickly, multiply them together, and then create an output. Until recently, one would use a microprocessor and some software to read in the ten numbers, multiply them together one at a time, and then generate the output value. But with FPGAs, a designer can simply create ten multipliers within the same chip, all operating simultaneously to produce the desired answer in one fell swoop, as the sketch below illustrates. Not only that, but entire processors such as the ARM or PowerPC can now be implemented within the FPGA itself. This allows the designer to start product development while postponing the decisions about which functions are implemented in hardware, which in software, and even which processor to use.
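

Here is a hypothetical Verilog sketch of that example (the module name, signal names, and data widths are all assumptions, not taken from any particular product). The key point is that the 'for' loop is unrolled by the synthesis tool into a tree of real multipliers on the chip, so the whole computation completes in a single clock cycle rather than as one multiplication at a time in software.

    // Hypothetical sketch: ten values multiplied together in one clock cycle.
    // Synthesis unrolls the loop into a tree of hardware multipliers that all
    // work at once, instead of a processor stepping through ten values.
    module multiply_ten (
        input  wire        clk,
        input  wire [79:0] values,   // ten 8-bit inputs, packed side by side
        output reg  [79:0] product   // wide enough for the full product
    );
        integer i;
        reg [79:0] acc;
        always @(posedge clk) begin
            acc = values[7:0];
            for (i = 1; i < 10; i = i + 1)
                acc = acc * values[8*i +: 8];   // unrolled into a multiplier tree
            product <= acc;
        end
    endmodule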


Design of these FPGAs isn't done by drawing schematics. Instead, the design is written out in one of two software-like languages, VHDL or Verilog. This code is 'synthesized', or compiled, into a list of standard logic gates and the connections between them. An automatic routing (maze-running) software tool then maps the generic gates onto a specific FPGA device's gates (or SRAM truth-table entries) and interconnects, to fit the synthesized design into the circuitry available within an actual device. These synthesis tools also know about the internal signal delays within the chip, and can be told that a certain function must be able to operate at a target clock rate; the tool's maze-running function can then repeatedly try alternative connection approaches until one which meets the timing requirements is found.
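

As an illustration of the sort of source code the synthesis tool starts from, here is a minimal, hypothetical Verilog fragment (the module and signal names are invented for this example). The tool would map the adder and the output register onto the device's look-up tables and flip-flops, and the place-and-route step would then keep trying interconnect patterns until the path from the inputs to the register meets the designer's stated clock period.

    // Minimal hypothetical example of synthesizable Verilog.
    // The designer also supplies a target clock period; place-and-route keeps
    // rearranging the interconnect until the a/b-to-sum path meets that timing.
    module registered_adder (
        input  wire       clk,
        input  wire [7:0] a,
        input  wire [7:0] b,
        output reg  [8:0] sum
    );
        always @(posedge clk)
            sum <= a + b;    // one addition per clock cycle
    endmodule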


Why isn't everything an FPGA?


Medium- to high-volume consumer electronics are extremely cost-sensitive, and margins are razor-thin. This usually justifies the development of custom chips for high-volume products. A custom ASIC can achieve as much as 20 times the equivalent gate density of an FPGA, with a corresponding reduction in unit cost, but the development cost can easily be 100 times as large. For smaller markets, and for innovative companies which are "inventing the future", FPGAs are proving to be the answer.


Examples of products being marketed now where FPGAs are key components include Gibson's new "MaGIC" digital guitar, ReplayTV's Personal Video Recorder, the Sensio 3D true 3D imaging system, NASA's Mars Exploration Rover, and Harman-Kardon's CDR2 dual CD recorder/player.


For more information, the websites of the largest FPGA chip makers are an excellent resource. These companies are currently Xilinx, Altera, Lattice Semiconductor, Actel, and Atmel.

