DAC Definition

Stands for “Digital-to-Analog Converter” and is often pronounced “dac.” Because computers process data digitally, the output they produce is in digital format. However, some output devices accept only analog input, which means a digital-to-analog converter, or DAC, must be used.

The most common use for a DAC is to convert digital audio to an analog signal. This conversion typically takes place in the sound card, which has a built-in DAC. The digital signal, which is basically a stream of ones and zeros, is transformed into an analog signal, typically a continuously varying electrical voltage. This voltage is what most speaker inputs expect, so the converted signal can be output to a speaker system.
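To make the conversion concrete, here is a minimal sketch of an idealized N-bit DAC, which maps each digital code to a fraction of a reference voltage. The function name, bit depth, and 3.3 V reference are illustrative assumptions, not details from any particular sound card.

```python
# Idealized model of an N-bit DAC: each digital code maps linearly
# to a fraction of the reference voltage v_ref.
def dac_output(code: int, bits: int = 8, v_ref: float = 3.3) -> float:
    """Return the analog voltage an ideal DAC would produce for `code`."""
    max_code = (1 << bits) - 1          # largest code an N-bit DAC accepts
    if not 0 <= code <= max_code:
        raise ValueError("code out of range for this bit depth")
    return (code / max_code) * v_ref    # linear mapping: code -> voltage

# A stream of digital sample values becomes a sequence of voltage
# levels that a speaker amplifier can accept.
samples = [0, 64, 128, 192, 255]
voltages = [dac_output(s) for s in samples]
```

Real DACs then smooth these discrete voltage steps with an analog filter to produce a continuous waveform.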

DACs are also used for converting video signals. Historically, most video displays, such as TVs and computer monitors, used analog inputs; digital displays with DVI and HDMI connections became commonplace only relatively recently. Therefore, in order for a computer to output to an analog display, the digital video signal must be converted to an analog signal. This is why all video cards with an analog output (such as a VGA connection) also include a DAC.

Any time a signal is converted from one format to another, there is a potential loss of quality. Therefore, it is important to have a high-quality DAC whether you are converting audio or video signals. The same holds true when performing the opposite conversion, which requires an analog-to-digital converter, or ADC.
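The quality loss mentioned above can be illustrated with a round trip through an idealized ADC and DAC: quantizing a voltage into a finite number of digital codes and converting it back introduces a small error. This is a hedged sketch under assumed parameters (8-bit depth, 3.3 V reference), not a model of any specific converter chip.

```python
def adc_sample(voltage: float, bits: int = 8, v_ref: float = 3.3) -> int:
    """Ideal ADC: quantize an analog voltage into an N-bit code."""
    max_code = (1 << bits) - 1
    voltage = min(max(voltage, 0.0), v_ref)   # clamp to the input range
    return round(voltage / v_ref * max_code)

def dac_output(code: int, bits: int = 8, v_ref: float = 3.3) -> float:
    """Ideal DAC: map an N-bit code back to a voltage."""
    max_code = (1 << bits) - 1
    return (code / max_code) * v_ref

original = 1.0                       # an arbitrary analog voltage
restored = dac_output(adc_sample(original))
error = abs(original - restored)     # quantization error, at most ~half a step
```

With more bits, the steps get finer and the round-trip error shrinks, which is why higher-resolution converters preserve quality better.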
