tags: mus-407 digital-audio audio audio-programming

Digital Delay Line

A digital delay line (DDL) is a type of [audio signal] processing that digitally implements a [delay line]. It stores a sequence of [audio samples] in memory and outputs them after a period of time.

Mixing delayed output with input produces a variety of effects, such as echo, reverb, chorus, and flanging.

These effects are split between [fixed delays] (e.g. echo, reverb) and [variable delays] (e.g. chorus, flanging).

Analog vs Digital Delay Lines

Analog delay: implemented in a physical medium (e.g. tape loops or bucket-brigade devices); the signal degrades with each pass, and the delay time is constrained by the medium

Digital delay: implemented in memory; the delayed signal is a bit-exact copy of the input, and the delay time is limited only by available storage

Signal Flow

Basic DDL (inherently stable):

Basic DDL Signal Flow

Basic DDL with [feedback] (unstable if the feedback gain ≥ 1, since each pass through the loop then grows the delayed signal's [amplitude]):

Basic DDL with feedback
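The feedback structure can be sketched in a few lines. This is a minimal illustrative sketch, assuming a feedback path from the output back into the delay memory; `feedback_delay` and its parameter names are hypothetical, not from the source:

```python
def feedback_delay(x, delay, feedback):
    # Hypothetical sketch of a DDL with feedback.  Each output sample
    # is the input plus the delayed output scaled by a feedback gain;
    # with |feedback| < 1 each repeat decays and the loop is stable,
    # with |feedback| >= 1 the repeats grow without bound.
    buf = [0.0] * delay           # delay memory, initially silent
    y = []
    for n, sample in enumerate(x):
        delayed = buf[n % delay]  # value written `delay` steps ago
        out = sample + feedback * delayed
        buf[n % delay] = out      # feed the output back into the line
        y.append(out)
    return y
```

With a feedback gain of 0.5, an impulse produces repeats at half the amplitude of the previous one; with a gain of 1.0 or more, the same code produces repeats that never decay.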

Delays in Parallel/Series

Parallel: multiple copies are individually sent through multiple delays; outputs are summed

Series: signal is sent through a succession of delays; input/output are usually mixed at each stage

Delays in parallel/series
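The two topologies can be contrasted in code. A minimal sketch, assuming simple sample-offset delays and an even wet/dry mix at each series stage; the function names are illustrative:

```python
def parallel_delays(x, delays):
    # Parallel: each delay gets its own copy of the input;
    # the delayed outputs are summed.
    n = len(x)
    out = [0.0] * n
    for d in delays:
        for i in range(n):
            out[i] += x[i - d] if i >= d else 0.0
    return out

def series_delays(x, delays, mix=0.5):
    # Series: the signal passes through each delay in turn; at each
    # stage the stage input and its delayed copy are mixed.
    sig = list(x)
    for d in delays:
        sig = [mix * s + mix * (sig[i - d] if i >= d else 0.0)
               for i, s in enumerate(sig)]
    return sig
```

In the parallel case an impulse yields one copy per delay tap; in the series case each stage's output becomes the next stage's input, so the mixes compound.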

Implementation

Uses a [circular queue] or circular buffer
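A minimal sketch of the circular-buffer idea (class and method names are illustrative): the write index wraps around a buffer of `delay` samples, so the value read just before each write is the sample stored `delay` steps earlier. Old samples are overwritten in place rather than shifted, making each tick O(1).

```python
class DelayLine:
    """Fixed delay implemented as a circular buffer (illustrative)."""

    def __init__(self, delay):
        self.buf = [0.0] * delay   # one slot per sample of delay
        self.idx = 0               # current read/write position

    def tick(self, sample):
        delayed = self.buf[self.idx]   # sample written `delay` ticks ago
        self.buf[self.idx] = sample    # overwrite the oldest slot
        self.idx = (self.idx + 1) % len(self.buf)  # wrap the index
        return delayed
```

The first `delay` outputs are the buffer's initial silence; after that, each output trails the input by exactly `delay` samples.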

Digital Filters vs. Digital Delays

In terms of design, digital [filters] & delays are essentially indistinguishable.

Filters require delay/mixing in order to cancel/reinforce certain [frequencies].

Consider a simple [lowpass filter]:

$$y[n] = 0.5 \times x[n] + 0.5 \times x[n-1]$$

Performs averaging function on consecutive sample pairs

Creates a slight smoothing effect on [waveform] shape, thus attenuating higher frequencies.

For a waveform at the [Nyquist frequency], the output is completely nullified (consecutive samples are equal and opposite, yielding a zero average)
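The two-tap averaging filter above can be checked directly. A minimal sketch (the function name is illustrative), comparing a Nyquist-frequency signal — samples alternating +1, −1 — against a constant (0 Hz) signal:

```python
def two_tap_average(x):
    # y[n] = 0.5*x[n] + 0.5*x[n-1], treating x[-1] as 0
    return [0.5 * x[n] + 0.5 * (x[n - 1] if n > 0 else 0.0)
            for n in range(len(x))]

nyquist = [1.0, -1.0, 1.0, -1.0]  # fastest representable frequency
dc = [1.0, 1.0, 1.0, 1.0]         # 0 Hz
# Once past the first sample, every Nyquist pair averages to zero,
# while the constant signal passes through unchanged.
```
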

A longer-term averaging function (ten-sample averaging, i.e. nine incremental sample delays in parallel) creates a stronger smoothing effect, lowering the [cutoff frequency]:

$$y[n] = 0.1 \times x[n] + 0.1 \times x[n-1] + ... + 0.1 \times x[n-9]$$

Digital filters and digital delays are closely related in design and result.

General naming distinction: "filter" is typically used when delay times are very short (a few samples) and the goal is shaping the frequency content; "delay" is used when delay times are long enough to be heard as discrete repeats.

Sources