Integrated Circuit Definition

An integrated circuit, or IC, is a small chip that can function as an amplifier, oscillator, timer, microprocessor, or even computer memory. An IC is a small wafer, usually made of silicon, that can hold anywhere from hundreds to millions of transistors, resistors, and capacitors. These extremely small electronics can perform calculations and store data using either digital or analog technology.

Digital ICs use logic gates, which work only with values of ones and zeros. A low signal sent to a component on a digital IC will result in a value of 0, while a high signal creates a value of 1. Digital ICs are the kind you will usually find in computers, networking equipment, and most consumer electronics.
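The sketch below is a simple way to picture that behavior in software (it is an illustration, not a model of any real chip): a signal above an assumed threshold voltage reads as 1, anything below reads as 0, and a logic gate combines only those two values.

```python
# Minimal sketch of digital logic-gate behavior (illustrative only).
HIGH_THRESHOLD = 2.0  # assumed threshold voltage for a "high" signal

def to_logic_level(voltage: float) -> int:
    """Interpret an input voltage as a digital 0 or 1."""
    return 1 if voltage >= HIGH_THRESHOLD else 0

def and_gate(a: int, b: int) -> int:
    """Basic AND gate: outputs 1 only when both inputs are 1."""
    return a & b

# Example: a low signal (0.3) and a high signal (3.3) into an AND gate
print(and_gate(to_logic_level(0.3), to_logic_level(3.3)))  # -> 0
```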

Analog, or linear, ICs work with continuous values. This means a component on a linear IC can take a value of any kind and output another value. The term "linear" is used because the output value is a linear function of the input. For example, a component on a linear IC may multiply an incoming value by a factor of 2.5 and output the result. Linear ICs are typically used in audio and radio frequency amplification.
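Here is a small sketch of that linear relationship, using the 2.5x gain from the example above (the function name and gain constant are illustrative assumptions, not part of any real device's interface):

```python
# Minimal sketch of a linear (analog) response: output = gain * input.
GAIN = 2.5  # amplification factor from the example above

def linear_amplify(input_signal: float) -> float:
    """Return the continuous input scaled by a constant gain."""
    return GAIN * input_signal

# A continuous input of 0.8 (e.g. volts) produces an output of 2.0
print(linear_amplify(0.8))  # -> 2.0
```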
