Gaussian Integrals

Lots of integrals in life look like Gaussians.

\[ \int_{\mathbb{R}^n} \frac{e^{-\frac{|y|^2}{2}}}{(2\pi)^{n/2}} \mathrm{d}y = 1. \] In the heat kernel, we often have some rescaling $y \to \sqrt{\lambda} y$, scaling the integral to \[ \int_{\mathbb{R}^n} \frac{e^{-\frac{|y|^2}{2\lambda}}}{(2\pi \lambda)^{n/2}} \mathrm{d}y = 1, \] where perhaps we might take $\lambda = \frac{1}{2\pi}$ to get the familiar \[ \int_{\mathbb{R}^n} e^{-\pi |y|^2} \mathrm{d}y = 1. \]

More generally, we might change coordinates by an invertible linear transformation $y = Sx$, whence $|y|^2 = |Sx|^2$, $\mathrm{d}y = |\det(S)|\, \mathrm{d}x$, and \[ \int_{\mathbb{R}^n} \frac{e^{-\frac{1}{2} |Sx|^2}}{(2\pi)^{n/2}}\,|\det(S)|\, \mathrm{d}x = 1; \] that is, \[ \int_{\mathbb{R}^n} \frac{e^{-\frac{1}{2} |Sx|^2}}{|\det(\sqrt{2\pi}\, S^{-1})|} \mathrm{d}x = 1. \] Note that $|Sx|^2 = x^T A x$ with $A = S^T S$, and $\det(A) = \det(S)^2$; thus, for any symmetric positive-definite $A$, \[ \int_{\mathbb{R}^n} \frac{e^{-\frac{1}{2} x^T A x}}{\sqrt{\det(2\pi A^{-1})}} \mathrm{d}x = 1; \] taking $A = \lambda^{-1} I$ recovers our earlier formulae.
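As a sanity check of the last normalization (an illustrative sketch, not part of the original page; the matrix is an arbitrary positive-definite choice), one can compare it against direct numerical quadrature in two dimensions:

<code python>
# Compare the unnormalized integral of exp(-x^T A x / 2) over a large box in
# R^2 against the closed form sqrt(det(2*pi*A^{-1})).
import numpy as np
from scipy.integrate import dblquad

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])           # symmetric positive-definite

def integrand(y, x):                  # dblquad passes the inner variable first
    v = np.array([x, y])
    return np.exp(-0.5 * v @ A @ v)

numeric, _ = dblquad(integrand, -10, 10, lambda x: -10, lambda x: 10)
exact = np.sqrt(np.linalg.det(2 * np.pi * np.linalg.inv(A)))
print(numeric, exact)                 # both approximately 4.7496
</code>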

Each of these integrands defines a probability measure on $\mathbb{R}^n$. Let $\mu_A$ denote the measure in this last formulation; a Gaussian integral is an integral $\int f(x) \,\mathrm{d}\mu_A(x)$ with respect to such a Gaussian measure. Let $\langle f \rangle_A$ denote this integral, or just $\langle f \rangle$ if $A$ is understood. Let $\mu_{\lambda}$ denote $\mu_{\lambda I}$, and let $\mu$ denote $\mu_1$.
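Since $\mu_A$ is the centered normal distribution with covariance matrix $A^{-1}$, the bracket $\langle f \rangle_A$ can be estimated by averaging $f$ over samples. A minimal Monte Carlo sketch (the function name is ours, chosen for illustration):

<code python>
# Estimate <f>_A by sampling from N(0, A^{-1}), the distribution with density
# proportional to exp(-x^T A x / 2).
import numpy as np

def gaussian_expectation(f, A, n_samples=200_000, seed=0):
    rng = np.random.default_rng(seed)
    cov = np.linalg.inv(A)
    samples = rng.multivariate_normal(np.zeros(A.shape[0]), cov, size=n_samples)
    return np.mean([f(x) for x in samples])

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
# Should approach (A^{-1})_{01} = -2/7, the identity derived below.
print(gaussian_expectation(lambda x: x[0] * x[1], A))
</code>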

The Wick formula

Gaussian integrals are particularly nice to evaluate. There's the usual Calculus 1 problem \[ \langle x \rangle = 0, \] since $x e^{-\frac{x^2}{2}} = -\frac{\mathrm{d}}{\mathrm{d}x} e^{-\frac{x^2}{2}}$ is a total derivative. Integrating by parts, we likewise compute \begin{align*} \langle x^2 \rangle &= -\int_{-\infty}^\infty x \frac{\mathrm{d}}{\mathrm{d}x} e^{-\frac{x^2}{2}} \frac{\mathrm{d} x}{\sqrt{2\pi}}\\ &= \int_{-\infty}^\infty e^{-\frac{x^2}{2}} \frac{\mathrm{d} x}{\sqrt{2\pi}} = \langle 1 \rangle = 1. \end{align*} Continuing this computation, for all $n \ge 1$ we have $\langle x^{2n-1} \rangle = 0$ and \[ \langle x^{2n} \rangle = (2n-1) \langle x^{2n-2} \rangle = (2n-1)!!, \] a double factorial.
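A quick numerical check of the double-factorial pattern (again a verification sketch, not part of the original page):

<code python>
# Compare <x^(2n)> computed by quadrature with (2n-1)!!.
import numpy as np
from scipy.integrate import quad

def moment(k):
    val, _ = quad(lambda x: x**k * np.exp(-x**2 / 2) / np.sqrt(2 * np.pi),
                  -np.inf, np.inf)
    return val

def double_factorial(m):
    out = 1
    while m > 1:
        out *= m
        m -= 2
    return out

for n in range(1, 6):
    print(2 * n, round(moment(2 * n), 6), double_factorial(2 * n - 1))
# degrees 2, 4, 6, 8, 10 give 1, 3, 15, 105, 945
</code>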

The goal is to extend this formula to the multivariate setting, and thus to find a rule that evaluates $\langle x_{i_1} \cdots x_{i_k}\rangle$ for any monomial. Certainly if the degree of the monomial is odd, its Gaussian integral is zero: the measure $\mu_A$ is invariant under $x \mapsto -x$, while an odd-degree monomial changes sign, so the integral vanishes. Moreover, integrating by parts gives $\langle x_i x_j \rangle_A = (A^{-1})_{ij}$.
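To spell out the integration by parts: since $x_i e^{-\frac{1}{2} x^T A x}$ vanishes at infinity, \[ 0 = \int_{\mathbb{R}^n} \partial_{x_j}\!\left( x_i \, \frac{e^{-\frac{1}{2} x^T A x}}{\sqrt{\det(2\pi A^{-1})}} \right) \mathrm{d}x = \delta_{ij} - \sum_k A_{jk} \langle x_i x_k \rangle_A, \] so the matrix of second moments is precisely $A^{-1}$.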

Theorem (Wick). Let $f_1, \ldots, f_{2n}$ be linear forms (homogeneous linear polynomials). Then \[ \langle f_1 \cdots f_{2n} \rangle_A = \frac{1}{2^n n!} \sum \langle f_{p_1} f_{q_1} \rangle_A \cdots \langle f_{p_n} f_{q_n} \rangle_A, \] where the sum runs over all permutations $(p_1, q_1, \ldots, p_n, q_n)$ of $(1, \ldots, 2n)$.
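For example, with four factors the three pairings give \[ \langle f_1 f_2 f_3 f_4 \rangle_A = \langle f_1 f_2 \rangle_A \langle f_3 f_4 \rangle_A + \langle f_1 f_3 \rangle_A \langle f_2 f_4 \rangle_A + \langle f_1 f_4 \rangle_A \langle f_2 f_3 \rangle_A; \] in particular, in one variable $\langle x^4 \rangle = 3 \langle x^2 \rangle^2 = 3 = 3!!$, matching the double factorial computed above.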

Note that $\frac{(2n)!}{2^n n!} = (2n-1)!!$, as we would expect. The formula can be made more computationally efficient by summing just over those permutations for which the $p_i$ are increasing and $p_j < q_j$ for each $j$; there are $(2n-1)!!$ such pairings, so no normalization is needed. This is probably only really useful in degree $4$ or $6$, after which the number of terms already explodes.
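The pairing form of the theorem is easy to implement directly. A sketch (the helper names are ours; $\langle x_i x_j \rangle_A = (A^{-1})_{ij}$ supplies the pair integrals):

<code python>
# Wick's formula for <x_{i_1} ... x_{i_{2n}}>_A: sum over the (2n-1)!! perfect
# pairings of the factors, multiplying the corresponding entries of A^{-1}.
import numpy as np

def pairings(indices):
    """Yield every perfect pairing of a list with an even number of entries."""
    if not indices:
        yield []
        return
    first, rest = indices[0], indices[1:]
    for k in range(len(rest)):
        for tail in pairings(rest[:k] + rest[k + 1:]):
            yield [(first, rest[k])] + tail

def wick_moment(A, indices):
    """Gaussian moment of the monomial x_{i_1} ... x_{i_k} (zero for odd k)."""
    if len(indices) % 2:
        return 0.0
    C = np.linalg.inv(A)              # <x_i x_j>_A = (A^{-1})_{ij}
    return sum(np.prod([C[i, j] for i, j in p]) for p in pairings(list(indices)))

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
print(wick_moment(A, (0, 0, 1, 1)))           # <x_0^2 x_1^2>_A = 40/49
print(wick_moment(np.eye(1), (0, 0, 0, 0)))   # <x^4> = 3!! = 3
</code>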

A consequence of Wick's formula is that the Gaussian integral of a monomial is a sum of products of entries of $A^{-1}$; Gaussian integration against a monomial is thus a permanent-type operation applied to $A^{-1}$, and Cramer's rule implies that Gaussian integrals of polynomials are always rational functions in the entries of $A$.
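A small symbolic check of the rationality claim (a sympy sketch with a generic $2 \times 2$ matrix; the example is ours):

<code python>
# Compute <x_0^2 x_1^2>_A for a generic symmetric 2x2 matrix A via Wick's
# formula; the result is a rational function of the entries of A.
import sympy as sp

a, b, c = sp.symbols('a b c')
A = sp.Matrix([[a, b], [b, c]])
C = A.inv()                                   # <x_i x_j>_A = (A^{-1})_{ij}

# Wick: <x_0^2 x_1^2> = C_00 C_11 + 2 C_01^2
moment = sp.simplify(C[0, 0] * C[1, 1] + 2 * C[0, 1] ** 2)
print(moment)        # equal to (a*c + 2*b**2)/(a*c - b**2)**2
</code>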

The proof of the Wick formula is simple: after an orthogonal change of coordinates, assume that $A$ is diagonal, and by linearity assume that the $f_i$ are coordinate functions. Then integrate by parts until you're done.
