What is Johnson noise?

Johnson noise, roughly defined, is the random variation of voltage due to the thermal agitation of charge carriers in a resistor. Originally described and measured by J. B. Johnson ([1], [2]), a physicist at Bell Laboratories who performed his experiments with a vacuum tube amplifier and thermocouple, the effect was first explained theoretically by H. Nyquist in 1928 [3]. Some of the mathematical derivation that follows is drawn from these original sources, as well as from notes given by Hugh Lippincott as an introduction to Phys504Lb [4].

Let's adopt Nyquist's circuit model in order to understand the behavior of the resistor. Imagine a resistor R at temperature T connected through a transmission line of length L to an identical resistor R at the same temperature. To avoid losses due to radiation from the wires connecting the resistors, one conductor may be enclosed by the other, as in a coaxial cable. The thermally agitated electrons in the first resistor produce a fluctuating voltage signal, which we represent by the source V, as shown in Figure (1.1).

Fig. 1.1

By placing the second resistor at the distant end of the transmission line, a boundary condition has been imposed: the line between $x=0$ and $x=L$ must accommodate voltage waves $V(x,t)$ obeying the periodic boundary condition

\begin{equation} V(0,t) = V(L,t) \end{equation}

where, generally,

\begin{align} V(x,t) = V_0\mathrm{e}^{i(kx-\omega t)}. \end{align}

The voltage waves propagate at the speed of light,

\begin{align} c=\frac{\omega}{|k|}. \end{align}

The boundary condition implies

\begin{align} kL=2\pi n, \quad n \in \mathbb{Z}, \end{align}

or, assuming the transmission line to be very long (so that many modes $n$ are permitted),

\begin{eqnarray} L\; \mathrm{d}k = 2\pi\; \mathrm{d}n \\ \Rightarrow \frac{1}{L}\; \mathrm{d}n = \frac{1}{2\pi}\; \mathrm{d}k. \end{eqnarray}

This is simply the density of modes — the number of modes, or “voltage states”, permitted per unit length of line per unit wavenumber. Notice that both right- and left-propagating voltage waves are admitted in the solution (2), so that both $k$ and its associated frequency $\omega$ range over positive and negative values.
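The counting can be illustrated numerically: with $k = 2\pi n/L$, the number of modes satisfying $|k| \le k_\text{max}$ approaches $(L/2\pi)\cdot 2k_\text{max}$ for a long line. A small sketch (the values of $L$ and $k_\text{max}$ are arbitrary):

```python
# Numerical illustration of the mode density derived above: with the
# periodic boundary condition k = 2*pi*n/L, the number of modes with
# |k| <= k_max approaches (L/(2*pi)) * (2*k_max) for a long line.
# The values of L and k_max below are arbitrary.
import math

def count_modes(L, k_max):
    """Count integers n with |2*pi*n/L| <= k_max (both signs of k, plus n = 0)."""
    n_max = math.floor(k_max * L / (2 * math.pi))
    return 2 * n_max + 1

L, k_max = 1.0e4, 2.0
print(count_modes(L, k_max), L * (2 * k_max) / (2 * math.pi))
# the exact count and the density estimate agree to ~1 part in a few thousand
```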

Propose now that each mode of voltage oscillation is a mode in which the resistor on the right can absorb radiation from the (quantized) electromagnetic field. Since one imagines the resistor to be an idealized, one-dimensional “lumped element”, propose also that it absorbs radiation only along the single dimension of the transmission line. The power — energy per unit time — absorbed by the resistor is then the energy of each mode, multiplied by the mean number of quanta (according to Planck) in that mode, integrated over all modes, and multiplied by the rate $c/L$ at which the quanta are absorbed:

\begin{eqnarray} P_\text{absorbed} &=& \frac{c}{L}\int_{-\infty}^{\infty}{\mathrm{d}k \ \hbar|\omega| \frac{1}{\mathrm{e}^{\frac{\hbar|\omega|}{k_BT}}-1}\frac{L}{2\pi}} \\ &=& \frac{1}{2\pi}\int_{-\infty}^{\infty}{\mathrm{d}\omega \ \frac{\hbar|\omega|}{\mathrm{e}^{\frac{\hbar|\omega|}{k_BT}}-1}}. \end{eqnarray}
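As a sanity check on this expression, the substitution $x = \hbar|\omega|/k_BT$ reduces the absorbed power to $P_\text{absorbed} = \frac{(k_BT)^2}{\pi\hbar}\int_0^\infty \frac{x}{\mathrm{e}^x-1}\,\mathrm{d}x$, and the dimensionless integral is known to equal $\pi^2/6$. A quick numerical sketch using a simple midpoint rule:

```python
# Sanity check on the absorbed-power integral above. The substitution
# x = hbar*|omega|/(k_B*T) gives
#   P_absorbed = ((k_B*T)**2 / (pi*hbar)) * Integral_0^inf x/(e^x - 1) dx,
# and the dimensionless integral is known to equal pi**2/6.
import math

def planck_integral(x_max=50.0, n=200_000):
    """Midpoint-rule estimate of the integral of x/(e^x - 1) from 0 to x_max."""
    dx = x_max / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * dx            # midpoints sidestep the removable point x = 0
        total += x / math.expm1(x) * dx
    return total

print(planck_integral(), math.pi**2 / 6)   # both ~1.6449
```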

The current generated in the circuit by the fluctuating voltage, which drives the two resistances $R$ in series, is simply

\begin{align} I = \frac{V}{2R}, \end{align}

so that the average power dissipated (emitted) by the resistor is

\begin{align} P_\text{emitted} = \frac{\langle V^2 \rangle}{4R}. \end{align}

The notation $\langle ... \rangle$ represents the time average:

\begin{align} \langle V^2 \rangle = \lim_{T \rightarrow \infty} \frac {1}{T} \int_{-T}^{T}{\mathrm{d}t \ V(t)V^{*}(t)}. \end{align}

In Eq. (9) and the expressions that follow, the voltage function $V(t)$ is taken to be that at frequency $\omega$, Eq. (2). Then the power spectrum — the distribution of power, or squared voltage, over frequency — is defined as follows:

\begin{align} S(\omega) = \int_{-\infty}^{\infty}{\mathrm{d}\tau \ R(\tau) \mathrm{e}^{-i\omega \tau}}, \end{align}

where

\begin{align} R(\tau) = \lim_{T \rightarrow \infty} \frac{1}{T} \int_{-T}^{T}{\mathrm{d}t \ V(t)V^*(t-\tau)} \end{align}

is the autocorrelation function of $V(t)$ over the interval $[-T,\ T]$. Then, integrating $S(\omega)$ over all frequencies,

\begin{eqnarray} \int_{-\infty}^{\infty}{\mathrm{d}\omega \ S(\omega)} &=& \lim_{T \rightarrow \infty} \frac{1}{T} \int_{-\infty}^{\infty}\mathrm{d}\omega \int_{-\infty}^{\infty}\mathrm{d}\tau \int_{-T}^{T}\mathrm{d}t \ V(t)V^*(t-\tau)\mathrm{e}^{-i\omega \tau} \\ &=& \lim_{T \rightarrow \infty} \frac{2\pi}{T}\int_{-\infty}^{\infty}\mathrm{d}\tau \int_{-T}^{T}\mathrm{d}t \ V(t)V^*(t-\tau) \delta(\tau) \\ &=& \lim_{T \rightarrow \infty} \frac{2\pi}{T}\int_{-T}^{T}\mathrm{d}t \ V(t)V^*(t) \\ &=& 2\pi \langle V^2 \rangle. \end{eqnarray}
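The result $\int S(\omega)\,\mathrm{d}\omega = 2\pi\langle V^2 \rangle$ has a discrete counterpart that is easy to check numerically: for a sampled signal, Parseval's theorem for the DFT gives $\frac{1}{N}\sum_k S_k = \overline{x^2}$, where $S_k = |X_k|^2/N$ is the periodogram. A small sketch (pure standard library; the values are illustrative):

```python
# Discrete counterpart of the relation just derived: for a sampled signal,
# Parseval's theorem for the DFT gives (1/N) * sum_k S_k = mean(x**2),
# where S_k = |X_k|**2 / N is the periodogram. Values are illustrative.
import cmath
import random

def dft(x):
    """Naive DFT: X_k = sum_n x_n * exp(-2j*pi*k*n/N)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

random.seed(0)
N = 256
x = [random.gauss(0.0, 1.0) for _ in range(N)]   # stand-in "noise voltage" samples
S = [abs(X) ** 2 / N for X in dft(x)]            # periodogram
lhs = sum(S) / N                                 # (1/N) * sum over frequencies
rhs = sum(v * v for v in x) / N                  # mean-square signal
print(lhs, rhs)                                  # equal up to rounding error
```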

Now, equating the power dissipated by the resistor (Eq. (8)) with the power absorbed from the thermal radiation (Eq. (6)), we have

\begin{align} \frac{\langle V^2 \rangle}{4R} = \frac{1}{2\pi}\int_{-\infty}^{\infty}{\mathrm{d}\omega \ \frac{\hbar|\omega|}{\mathrm{e}^{\frac{\hbar|\omega|}{k_BT}}-1}}. \end{align}

By the relationship in Eq. (12),

\begin{align} \frac{1}{4R} \frac{1}{2\pi} \int_{-\infty}^{\infty}{\mathrm{d}\omega \ S(\omega)} = \frac{1}{2\pi}\int_{-\infty}^{\infty}{\mathrm{d}\omega \ \frac{\hbar|\omega|}{\mathrm{e}^{\frac{\hbar|\omega|}{k_BT}}-1}}, \end{align}

so that

\begin{align} S(\omega) = 4R \frac{\hbar |\omega|}{\mathrm{e}^{\frac{\hbar |\omega|}{k_B T}}-1}. \end{align}

(Equating the integrands of two equal definite integrals is not, by itself, mathematically justified. A strict derivation instead appeals to the balance of emitted and absorbed power over an arbitrarily small frequency interval, rather than over the full range; this is the argument Nyquist constructs in [3]. He considers the following thought experiment: suppose that over all frequencies the exchange of power between the two resistors (at equal temperature) balances, but that over some small range of frequency the resistor on the left radiates more power than it absorbs from the one on the right. Now introduce a non-dissipative circuit element between them that impedes the transfer of energy in that frequency range more than in any other, so that net power flows from the right resistor to the left. Since the resistors start at the same temperature, letting the system evolve would heat one body at the expense of an equally warm (and subsequently colder) one, merely by inserting a passive circuit element. So, in accordance with the second law of thermodynamics, there must be a detailed balance of power over each tiny range of frequency.)
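The classical limit invoked just below can itself be checked numerically: the Planck factor $\hbar|\omega|/(\mathrm{e}^{\hbar|\omega|/k_BT}-1)$ tends to $k_BT$ as $\hbar|\omega|/k_BT \to 0$. At room temperature $k_BT/h \approx 6\,$THz, so the classical form holds across any ordinary laboratory bandwidth. A minimal sketch (the sample values of $x = \hbar|\omega|/k_BT$ are arbitrary):

```python
# Check of the classical limit used just below: the Planck factor
# hbar*|omega| / (exp(hbar*|omega|/(k_B*T)) - 1) tends to k_B*T when
# hbar*|omega| << k_B*T. Written in terms of x = hbar*|omega|/(k_B*T),
# the factor divided by k_B*T is x/(e^x - 1). Sample x values are arbitrary.
import math

def planck_factor_over_kT(x):
    """x/(e^x - 1), i.e. the Planck factor in units of k_B*T."""
    return x / math.expm1(x)

for x in (1e-6, 1e-3, 0.1, 1.0):
    print(x, planck_factor_over_kT(x))
# tends to 1 as x -> 0; at x = 1 it has already fallen to ~0.58
```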

Then, in the classical limit, $\hbar \omega \ll k_BT$,

\begin{align} S(\omega) \equiv S_V(\omega) = 4Rk_BT, \end{align}

the Nyquist relation between the voltage power spectrum and temperature. The current power spectrum is closely related,

\begin{align} S_I(\omega) = \frac{1}{R^2}S_V(\omega) = \frac{4k_B T}{R}. \end{align}
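To get a sense of scale, the Nyquist relation gives an rms noise voltage of $\sqrt{4k_BTRB}$ across a resistor measured over a bandwidth $B$ (using the conventional one-sided spectrum in ordinary frequency). A short sketch with illustrative component values:

```python
# Order-of-magnitude estimate from the Nyquist relation: in the usual
# one-sided lab convention, the rms Johnson-noise voltage across a resistor
# R over a measurement bandwidth B is sqrt(4*k_B*T*R*B). The component
# values below are illustrative.
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K (exact in the 2019 SI)

def johnson_vrms(R, T, B):
    """rms noise voltage for resistance R (ohm), temperature T (K), bandwidth B (Hz)."""
    return math.sqrt(4 * K_B * T * R * B)

v = johnson_vrms(R=10e3, T=300.0, B=10e3)   # 10 kOhm at room temperature, 10 kHz
print(v)   # roughly 1.3 microvolts, hence the need for a low-noise amplifier
```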

Since the actual resistor will be situated in a circuit with an amplifier designed to enhance the tiny voltage fluctuations, and since the transmission line will exhibit capacitive effects, a realistic circuit diagram would look like Fig. 1.2:

Fig. 1.2

The parallel combination of current source $i$ and resistor $R$ is the Norton equivalent of $R$ and a small fluctuating voltage source (as seen in Fig. 1.1). The net input voltage is then denoted by $V_\text{in}$, and the additional current and voltage associated with the operation of the amplifier itself are labelled $I_\text{amp}$ and $V_\text{amp}$. The input voltage power spectrum takes the simple form

\begin{align} S_{V_\text{in}} = S_I\,|Z_{R\parallel C}|^2 + S_{I,\text{amp}}\,|Z_{R\parallel C}|^2, \end{align}

where $S_I$ is the current power spectrum due to the resistor (as seen in Eq. (17)), $S_{I,\text{amp}}$ is the current power spectrum due to the amplifier, and $|Z_{R\parallel C}|^2 = R^2/\left(1+(\omega RC)^2\right)$ is the squared magnitude of the impedance of the parallel resistor-capacitor combination. The voltage power spectrum output by the amplifier is simply

\begin{align} S_{V\,\text{out}} = (S_{V_\text{in}} + S_{V,\text{amp}})G^2 \end{align}

where $S_{V,\text{amp}}$ is the voltage power spectrum of the amplifier itself, and $G$ is its gain. Eqs. (16)–(19) relate the measured quantity, $S_{V\,\text{out}}$ (as described in the Experimental Procedure), to temperature.
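The chain of relations above can be composed numerically. A minimal sketch, in which the amplifier noise terms and all component values are hypothetical placeholders, and in which the RC filtering is written as the squared impedance magnitude $R^2/(1+(\omega RC)^2)$:

```python
# Numeric sketch composing the measurement-chain relations above. The
# amplifier noise terms and all component values are hypothetical
# placeholders; the RC filtering is written as the squared impedance
# magnitude |Z|**2 = R**2 / (1 + (w*R*C)**2).
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K

def z_mag2(R, C, w):
    """|Z|**2 of the parallel RC combination at angular frequency w."""
    return R ** 2 / (1 + (w * R * C) ** 2)

def s_v_out(R, C, T, w, S_I_amp, S_V_amp, G):
    """Output voltage power spectrum: ((S_I + S_I_amp)*|Z|**2 + S_V_amp)*G**2."""
    S_I = 4 * K_B * T / R                      # resistor current noise
    return ((S_I + S_I_amp) * z_mag2(R, C, w) + S_V_amp) * G ** 2

# With the amplifier terms set to zero, the output reduces to
# G**2 * 4*k_B*T*R well below the RC roll-off frequency.
out = s_v_out(R=10e3, C=100e-12, T=300.0, w=2 * math.pi * 100.0,
              S_I_amp=0.0, S_V_amp=0.0, G=1000.0)
print(out, 1000.0 ** 2 * 4 * K_B * 300.0 * 10e3)   # nearly equal at 100 Hz
```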

One important feature of Nyquist's model is that Johnson noise is independent of the material of the resistor. This feature is exploited in precision thermometry, since it allows one to measure absolute temperatures without worrying that the particular choice of sensor material contaminates the result. See [5] for a review of such techniques. Quite surprisingly, Johnson noise also finds application in classical cryptography [6]. In our experimental study we restrict ourselves to testing the validity of the Nyquist relation (16) and to measuring $k_B$.

1. J. B. Johnson, Nature 119, 50 (1927).
2. J. B. Johnson, Phys. Rev. 32, 97 (1928).
3. H. Nyquist, Phys. Rev. 32, 110 (1928).
4. Hugh Lippincott, Notes on Johnson Noise, given 19 February 2007.
5. D. R. White et al., Metrologia 33, 325 (1996).
6. L. B. Kish, Phys. Lett. A 352, 178 (2006), arXiv:physics/0509136; A. Cho, Science 309, 2148 (2005).