Random Matrices in Three Short Stories

This post is a slightly modified version of a student talk I gave for the IISER Pune Science Club earlier this year. All three stories (this one and the two to follow) will be accessible to anyone with some exposure to high-school physics and mathematics. Readers with formal training in advanced physics and/or mathematics have every right to criticize the author for an over-simplistic presentation.


The first story begins with my personal hero, Freeman John Dyson. Growing up as a kid in England in the period between the two most disastrous wars this planet has seen, Dyson developed a strong interest in everything to do with numbers. This interest, quite naturally, evolved into a passion for physics and mathematics. When the eighteen-year-old Dyson arrived at Cambridge in 1941 as a student, there were few physicists around – a constant feature of the universities during the war years; physicists were perhaps the most suitable people to be sent away on war-related responsibilities.

As it happened, the greatest influence on Dyson, while at Cambridge, was the famous mathematician duo of Hardy and Littlewood [1]. After working for a few years on number theory problems (he published a couple of influential papers in this period), Dyson moved to the United States, where he was appointed a professor at Cornell University; he didn’t have a Ph.D., though (and never got one).

With the brightest of physicists around (Richard Feynman and Eugene Wigner to name just two from a pretty illustrious list), Dyson’s focus shifted towards problems from quantum physics. (Number theory, however, was to appear in his life again, albeit for a small period, as we’ll see in the second part of this series of stories.)

Quantum mechanics, one of the two greatest triumphs of twentieth-century physics (General Relativity being the other one, of course), reformulates the study of physical systems in the language of the Hamiltonian. If you were reading a chapter from a textbook on Quantum Physics (which this post is not), you’d be told that this Hamiltonian is a Hermitian operator. Now, a Hermitian operator, to put it in rather simple words, is a matrix with some special properties. And a matrix is nothing more (?) than an array of numbers. But what has a matrix got to do with a physical system?

As it turns out, the way a system evolves can be mathematically expressed in terms of the product of certain matrices – and the Hamiltonian of a system, which itself is a matrix, determines what matrices you should be multiplying. Say, you want to study a Hydrogen atom. You’ll have to begin with its Hamiltonian and see what all you can say about the energy of its components. And then you can check how well you did your job by comparing your results with a Hydrogen spectrum. (A quick Google Images search for a Hydrogen spectrum at this point is strongly recommended for those who haven’t seen one.)
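The "product of certain matrices" determined by the Hamiltonian can be made concrete in a few lines. Below is a minimal sketch in Python with NumPy (the two-level toy Hamiltonian and the chosen time are illustrative assumptions, not from the post): because H is Hermitian, it diagonalises with real eigenvalues E, and the evolution over a time t is the matrix product U(t) = V exp(−iEt) V†, applied to the state vector (in units with ħ = 1).

```python
import numpy as np

# A toy two-level Hamiltonian (an assumption for illustration):
# a Hermitian 2 x 2 matrix with real entries.
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Hermitian H diagonalises with real eigenvalues E (the energies)
# and a unitary matrix V whose columns are the eigenvectors.
E, V = np.linalg.eigh(H)

# Time evolution is a product of matrices built from H:
# U(t) = V @ diag(exp(-i E t)) @ V†  (units with hbar = 1).
t = 1.0
U = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T

psi0 = np.array([1.0, 0.0])   # start in the first basis state
psi_t = U @ psi0              # the state after time t

# U is unitary, so total probability is conserved: |psi_t| = 1.
print(np.linalg.norm(psi_t))
```

The same recipe works for any Hermitian H; for a real atom the matrix is just enormously larger.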

As seems intuitive, the Hamiltonian of a Hydrogen atom should be a lot simpler than that of a more complicated atom. A more complicated atom is also a heavier atom; consider, for example, a Uranium atom, which is over 200 times heavier than a Hydrogen atom. But with increasing weight comes increasing complexity (sic); heavier atoms have a larger number of interacting components. This essentially makes it impossible to write down a Hamiltonian that you can use to predict the spectrum of your complex atom. Poof! All the powers quantum mechanics bestowed upon you go awry.

Not really. What Freeman Dyson and Eugene Wigner showed is that you can make a pretty smart guess about the Hamiltonian of such a complex, heavy atom using a Random Matrix – a matrix which contains random numbers. It’s like having your usual matrix, except that the entries are drawn randomly from a set of numbers.

H = \begin{pmatrix} H_{11} & H_{12} \\ H_{21} & H_{22} \end{pmatrix}

H, which can describe the Hamiltonian of a system, is a matrix with elements H_{11}, H_{12}, H_{21} and H_{22}. You can call H a Random Matrix if its entries are random variables. Here, H has four elements – in practice, you’ll have to take much larger matrices for your computations.
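A matrix like the H above can be sampled in a few lines. Here is a minimal sketch in Python with NumPy (the Gaussian entries and the 4 × 4 size are illustrative assumptions, not from the post): fill an array with random numbers, symmetrise it so that it is Hermitian like a genuine Hamiltonian, and diagonalise it to read off its "energy levels".

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw a small random real symmetric (hence Hermitian) matrix:
# independent Gaussian entries, then symmetrised.
n = 4                             # in practice n would be much larger
a = rng.normal(size=(n, n))
H = (a + a.T) / 2                 # H is now real symmetric

# For a Hermitian matrix the eigenvalues are guaranteed to be real;
# in the quantum-mechanical picture they are the energy levels.
energies = np.linalg.eigvalsh(H)  # returned in ascending order
print(energies)
```

Every run with a different random seed gives a different H; the point of the theory is that the *statistics* of those energy levels, across many such draws, are predictable.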

Now, the essential idea here is to realize that to make predictions about the system you are trying to model, you need to consider an ensemble of all the random matrices which would give rise to the properties you’re expecting your system to have. These properties are manifested in terms of certain symmetries of the system under consideration. (The notion of symmetry is both utterly important and extremely fascinating in the physical sciences. If you’re ever in need of spotting a theoretical physicist in a large crowd, a passing mention of symmetry will do the job.)

The next step in your analysis of complex atoms will then be to study the statistical properties of the matrix ensemble with the appropriate symmetries. And that’s pretty much all. There is an entire machinery coming from the theory of random matrices that gives you the freedom to treat the complex atom as a black box with a very large number of interacting components and to still be able to extract the essential information with great accuracy [2].
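The "statistical properties of the matrix ensemble" can be probed numerically. The sketch below (an illustration in Python with NumPy, not from the original post) samples many matrices from the Gaussian Orthogonal Ensemble – real symmetric matrices, appropriate for systems with time-reversal symmetry – and collects the normalised gaps between neighbouring eigenvalues. The per-matrix mean normalisation is a crude stand-in for the "unfolding" step used in careful analyses. Wigner's surmise predicts that the gap density vanishes at zero ("level repulsion"), so very small gaps should be rare.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_goe(n, rng):
    """Sample one real symmetric matrix from the Gaussian
    Orthogonal Ensemble (GOE)."""
    a = rng.normal(size=(n, n))
    return (a + a.T) / np.sqrt(2)

# Collect nearest-neighbour eigenvalue spacings near the centre
# of the spectrum, over many matrices of the ensemble.
n, trials = 50, 200
spacings = []
for _ in range(trials):
    ev = np.linalg.eigvalsh(sample_goe(n, rng))
    mid = ev[n // 4 : 3 * n // 4]     # avoid the spectrum's edges
    s = np.diff(mid)
    spacings.extend(s / s.mean())     # normalise to unit mean spacing

spacings = np.array(spacings)

# Wigner's surmise for GOE, p(s) = (pi/2) s exp(-pi s^2 / 4),
# vanishes at s = 0: neighbouring levels "repel" each other,
# so only a small fraction of the gaps should be tiny.
print(np.mean(spacings < 0.1))
```

Repeating the experiment with independent (Poisson) levels instead of GOE eigenvalues gives roughly ten times as many tiny gaps – that contrast is the fingerprint random-matrix theory looks for in real spectra.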

A point that must not go without a mention here is the practical importance of studying heavy atoms. Heavy atoms are the source of nuclear energy, a foremost candidate for meeting our energy needs in the future.

If you’re feeling bewildered and fascinated by the fact that such an apparently unrelated notion of random matrices can help you predict the spectra of complex atoms, you’re not only in the company of some of the greatest minds working in this area but also stand a chance of being part of the wonderful discoveries yet to be made. Our understanding is very limited at present – we know that things work but have very little idea why. There appears to be some statistical law of large numbers working behind the scenes. Interestingly, it was in mathematical statistics where random matrices had first made an appearance, in the late 1920s, with the work of the agricultural statistician John Wishart.

Talking of statistics and numbers, the theory of random matrices has also led to great insights into the solutions of challenging problems in number theory, which will be the theme of the second of our stories. As you’ll see, Random Matrix Theory has made its way far beyond statistics and atomic physics. A comprehensive and insightful reference for those interested is the classic book Random Matrices by Madan Lal Mehta. (Mehta, who had a very fruitful collaboration with Dyson and Wigner, was one of the leading contributors to the subject.) You’ll need some background in linear algebra to follow the text, but it is surely worth the effort.


[1] Freeman Dyson, Selected Papers of Freeman Dyson with Commentary, American Mathematical Society, 1996.
[2] Madan Lal Mehta, Random Matrices, Vol. 142, Pure and Applied Mathematics, Academic Press, Ed. 3, 2004.

Thanks are due to Divya Singh and Ramesh Chandra for providing essential feedback for this series of posts on Random Matrices. You can find the second and the third posts here and here.
