Moments as tensors

We discussed the second multivariate moment a bit haphazardly in the last article. What we'd really like is a nice way of expressing the general moment (i.e. the multivariate cross-moment).

Let $X=(X^1,\ldots,X^n)$ be a vector of random variables, and consider their $p$th-order moments -- these form a rank-$p$ tensor of dimension $n$, the moment tensor, given by:

$$M_p[X]^{j_1\ldots j_p}=\mathrm{E}\left(X^{j_1}\ldots X^{j_p}\right)$$
(e.g. $p=1$ gives you the mean vector, and $p=2$ gives you the badly named "autocorrelation" matrix.) The central moments form a similar tensor, the central moment tensor, given by:

$$m_p[X]^{j_1\ldots j_p}=\mathrm{E}\left((X^{j_1}-\mathrm{E}X^{j_1})\ldots (X^{j_p}-\mathrm{E}X^{j_p})\right)$$
(e.g. $p=1$ gives you zero, annoyingly, but $p=2$ gives you the covariance matrix, aka the autocovariance matrix.)

But, well, each random variable $X^i$ can also be understood as a vector, remember? Let's write $X^i=(X^i_\alpha)$, where $\alpha$ is a pseudo-index representing the idea that $X^i$ is a vector (I guess this is really Penrose (abstract index) notation rather than Einstein notation).
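(To make this concrete, here's a minimal NumPy sketch of estimating both tensors from samples, with the expectation replaced by a sample average -- the sample axis plays, loosely, the role of the pseudo-index $\alpha$. The function names are mine, not standard.)

```python
import numpy as np

def moment_tensor(samples, p):
    """Empirical M_p: `samples` has shape (N, n), i.e. N draws of an
    n-dimensional random vector X; the sample average stands in for E."""
    letters = "ijklmnop"[:p]                          # one output index per factor
    subscripts = ",".join("a" + c for c in letters) + "->" + letters
    # e.g. p = 2 gives einsum("ai,aj->ij", X, X) / N
    return np.einsum(subscripts, *([samples] * p)) / len(samples)

def central_moment_tensor(samples, p):
    """Empirical m_p: the same thing after centering each coordinate."""
    return moment_tensor(samples - samples.mean(axis=0), p)

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 3))     # 100k draws of a 3-vector
M1 = moment_tensor(X, 1)              # ~ the mean vector, shape (3,)
m2 = central_moment_tensor(X, 2)      # ~ the covariance matrix, shape (3, 3)
m3 = central_moment_tensor(X, 3)      # rank-3 tensor; ~ 0 here, since X is Gaussian
```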

Actually, let's also make the following extension to tensor notation: every Greek index is summed over, regardless of whether, how many times, or where it's repeated -- and we take the expectation instead of the sum (expectation being a normalized sum, or some sort of a trace). So we write:

$$M_p[X]^{j_1\ldots j_p}=X^{j_1}_\alpha\ldots X^{j_p}_\alpha$$
$$m_p[X]^{j_1\ldots j_p}=(X^{j_1}_\alpha-X^{j_1}_{\alpha_1})\ldots (X^{j_p}_\alpha-X^{j_p}_{\alpha_p})$$
where we use distinct dummy indices $\alpha_1,\ldots,\alpha_p$ to indicate that each one is summed over (i.e. expected over) within its own factor, since it isn't repeated anywhere else in the expression. These changes to index notation are all an artifact of the fact that random variables are not really "fundamentally quadratic", but rather "fundamentally $p$-normed".
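(Sanity check -- my own unwinding of the convention for $p=2$: the lone $\alpha$ is the outer expectation and $\alpha_1,\alpha_2$ are the inner ones, so)

$$m_2[X]^{jk}=(X^j_\alpha-X^j_{\alpha_1})(X^k_\alpha-X^k_{\alpha_2})=\mathrm{E}\left((X^j-\mathrm{E}X^j)(X^k-\mathrm{E}X^k)\right),$$

which is the covariance matrix again, as it should be.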

OK -- so that's the cross-moment of univariate random variables. It can also be considered a multivariate moment, the moment of the random vector $X$: its mean is the mean vector, its variance is the covariance matrix, etc. What about cross-moments between random vectors? And you can imagine that once we have that, we'll call it the moment of a random rank-2 tensor, and so on.

What we're really looking for, then, is the moment of a random tensor. This is a rank-$pq$ tensor, where $p$ is the order of the moment and $q$ is the rank of the random tensor. For example, when $p=2$ and $q=2$, one gets a rank-4 tensor whose slices are cross-covariance (and autocovariance) matrices.
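(Here's a quick NumPy sketch of the $p=2$, $q=2$ case, with made-up shapes and variable names: a random rank-2 tensor is just an array-valued random variable, and fixing two of the four indices of its second central moment slices out a cross-covariance matrix.)

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
T = rng.normal(size=(N, 4, 5))       # N draws of a random rank-2 tensor T^{ij}

Tc = T - T.mean(axis=0)              # center: T^{ij} - E T^{ij}
# m_2[T]^{ijkl} = E[(T^{ij} - E T^{ij})(T^{kl} - E T^{kl})], a rank-4 tensor
m2 = np.einsum("aij,akl->ijkl", Tc, Tc) / N

# Fixing j = 0 and l = 3 slices out the cross-covariance matrix between
# the random 4-vectors sitting in columns 0 and 3 of T:
cross_cov = m2[:, 0, :, 3]           # shape (4, 4)
```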

Note that this is not at all some unnecessary generalisation -- measuring the correlation between random vectors is a thing with very significant practical implications.

For example, a time series is a random vector -- its covariance matrix represents its internal correlations (how well its current value predicts a future value), but often we're interested in correlations between time series -- how does the price of gold correlate with the level of the S&P 500, etc.? This cross-covariance matrix is then a bivariate function of $(t_1,t_2)$, called the cross-correlation function.
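(And a hedged sketch of estimating that in NumPy, on synthetic data where gold and the index would go: here I average over $N$ independent realizations of the pair of series; in practice you'd usually have a single realization and lean on a stationarity assumption instead.)

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50_000, 50
noise = rng.normal(size=(N, T + 2))
X = noise[:, :T]                     # N realizations of series X, length T
Y = noise[:, 2:]                     # Y is X shifted by two time steps

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
# C[t1, t2] ~ E[(X_{t1} - E X_{t1})(Y_{t2} - E Y_{t2})], a function of (t1, t2)
C = np.einsum("as,at->st", Xc, Yc) / N

# The built-in lag shows up as a ridge along t2 = t1 - 2:
print(C[10, 8], C[10, 10])           # ~ 1.0 vs ~ 0.0
```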
