From analog to digital


Two steps

  1. Sampling
  2. Quantization

Sampling Theorem

Let $s\in L^2(\mathbb{R})$ be a band-limited analog signal with frequency content in the band $\nu\in[-\nu_0,\nu_0]$. Then $s$ can be reconstructed (i.e., interpolated) without error from its samples $s(t_n)$ taken at times $t_n=\frac{n}{2\nu_0}$.
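
Explicitly, the error-free reconstruction is given by the Whittaker-Shannon (sinc) interpolation formula:

$$s(t) = \sum_{n\in\mathbb{Z}} s(t_n)\,\mathrm{sinc}(2\nu_0 t - n), \qquad \mathrm{sinc}(x) = \frac{\sin(\pi x)}{\pi x}$$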

  • $\nu_0$ is called the cutoff frequency
  • $\nu_s = 2\nu_0$ is called the sampling frequency (the Nyquist rate)
  • if $\nu_s < 2\nu_0$, reconstruction is not possible and aliasing occurs: frequencies above $\nu_s/2$ fold back onto lower ones
  • if $\nu_s > 2\nu_0$, reconstruction is possible (see the sketch after this list)
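
The reconstruction can be checked numerically. Below is a minimal sketch, assuming NumPy; the test signal (a 3 Hz cosine), the cutoff $\nu_0 = 4$ Hz, and the number of samples are illustrative choices, not values from these notes.

```python
# Sampling followed by truncated sinc (Whittaker-Shannon) reconstruction.
import numpy as np

nu0 = 4.0                # cutoff frequency: signal band is [-nu0, nu0]
nu_s = 2 * nu0           # sampling frequency at the critical rate 2*nu0
T = 1 / nu_s             # sampling period

def s(t):
    # Band-limited test signal: a 3 Hz cosine, inside [-nu0, nu0].
    return np.cos(2 * np.pi * 3.0 * t)

# Samples s(t_n) taken at t_n = n / (2*nu0).
n = np.arange(-200, 201)
t_n = n * T
samples = s(t_n)

def reconstruct(t):
    # s(t) ~= sum_n s(t_n) * sinc((t - t_n) / T), truncated to the samples
    # above; np.sinc is the normalized sinc, sin(pi x) / (pi x).
    return np.sum(samples[:, None] * np.sinc((t - t_n[:, None]) / T), axis=0)

t = np.linspace(-1.0, 1.0, 1001)
print(f"max error: {np.max(np.abs(reconstruct(t) - s(t))):.2e}")
# Small; the residual comes only from truncating the infinite sum.
```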

Quantization

Each value $s(t_n)$ must be mapped from a real number (infinite precision) to a digital value (finite precision).
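
To make the mapping concrete, here is a minimal sketch of a uniform quantizer; the range $[-A, A]$, the bit depth $b$, the mid-point reconstruction rule, and the function name are assumptions for illustration, not part of these notes.

```python
# Uniform quantization of real samples onto 2**b levels over [-A, A].
import numpy as np

def quantize(x, A=1.0, b=8):
    """Map real samples x in [-A, A] onto 2**b equally spaced levels."""
    levels = 2 ** b
    step = 2 * A / levels             # quantization step
    x = np.clip(x, -A, A - step)      # keep the integer codes in range
    idx = np.floor((x + A) / step)    # integer code, 0 .. levels - 1
    return -A + (idx + 0.5) * step    # reconstruction level (cell mid-point)

x = np.linspace(-1, 1, 5)
print(quantize(x, A=1.0, b=3))        # 8 levels, step 0.25, error <= 0.125
```

With this rule the quantization error is bounded by half the step, i.e. $|e| \le \frac{A}{2^b}$, which is why adding one bit halves the error.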