## Random Matrices in Three Short Stories – II

#### This is the second part of the series – the first one can be found here.

Two

If you ever happen to be in a conversation with someone with a devout love for number theory, it won’t be long before the conversation evolves into one about prime numbers – about how fascinating these elementary yet mystical creatures are, and about how, despite their apparent randomness, there is an intriguing rhythm, a captivating music, in their distribution.

Prime numbers are the prime ingredients of natural numbers (and of interesting conversations, too). Take any natural number (greater than one) – you can always write it as a product of certain primes. And these primes themselves cannot be expressed as products of smaller numbers. Simple as they sound from this definition, prime numbers have a very surprising and inexplicable manner of showing up on the number line. Much folklore and mathematical literature alike have their origins in this mystery surrounding primes.

But hold on, why are we talking about primes in a story about random matrices?

We wouldn’t have been, had it not been for a chance teatime conversation between physicist Freeman Dyson and mathematician Hugh Montgomery (interesting conversations, remember?) [1].

In the spring of 1972, Montgomery was visiting the Institute for Advanced Study at Princeton, New Jersey, to discuss his recent work on the zeros of the Riemann zeta function with fellow mathematician Atle Selberg. Selberg happened to be a leading figure of the time on the Riemann zeta function and the much-fabled Riemann Hypothesis.

First formulated by Georg Friedrich Bernhard Riemann in his 1859 paper, the Riemann Hypothesis is a hugely famous and celebrated conjecture, yet to be proved (or disproved). (And guess what – this was the only number theory paper the mathematician extraordinaire Riemann wrote in his entire lifetime!)

Bernhard Riemann made an important observation that the distribution of primes was intricately related to the properties of a function that now bears his name (it was first introduced by the Swiss mathematician Leonhard Euler, though). Now, as it turns out (and as you’ll see if you happen to delve more into abstract mathematics), mathematical functions and objects have personalities of their own. The zeta function, as Riemann observed, appeared to have an interesting one.

##### Behold the mighty Riemann zeta function, ζ(s) = 1 + 1/2^s + 1/3^s + …! The Riemann Hypothesis states that all the interesting values of s for which ζ(s) = 0 lie on a straight line in the complex plane.

Numbers, as you probably know, can have two parts – a real part and an imaginary one (and you’ll probably realize later that the imaginary part is not so imaginary after all).

##### A complex number has two parts – real and imaginary (here, a and b respectively). The tiny i hanging alongside the imaginary part b is what makes it the imaginary part. (i is the imaginary unit, the square root of −1.)

The Riemann zeta function has some non-interesting zeros (values of s for which ζ(s) = 0) at s = −2, −4, −6 and so on. What Riemann conjectured was that all the other zeros of the function have their real parts equal to 1/2 – and hence, when you plot them on the complex plane, they’ll all lie on a single vertical line [2].

Now, over a century and a half later, all we know is that this hypothesis appears to be true. We know it holds for the first 10^13 (!) zeros found so far, but have no idea whether it holds true in general or not. (For those of you who refuse to take abstract concepts arising in pure mathematics seriously: the zeta function will keep appearing in your life even if you restrict yourself to more concrete (?) areas of applied mathematics and/or physics.)
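You don’t have to take the numerics on faith – you can poke at the critical line yourself. Here is a minimal sketch of my own (nothing to do with the actual large-scale verifications), using the alternating-series trick ζ(s) = η(s) / (1 − 2^(1−s)), valid for Re(s) > 0, to watch ζ nearly vanish at the first non-trivial zero:

```python
import cmath

def zeta(s, terms=500_000):
    """Approximate the Riemann zeta function for Re(s) > 0 via the
    alternating Dirichlet eta series: zeta(s) = eta(s) / (1 - 2**(1 - s)).
    A crude partial sum -- fine for illustration, not for serious work."""
    eta = sum((-1) ** (n - 1) / n ** s for n in range(1, terms + 1))
    return eta / (1 - 2 ** (1 - s))

# Sanity check: zeta(2) = pi^2 / 6 (Euler's Basel problem).
print(abs(zeta(2) - cmath.pi ** 2 / 6))

# The first non-trivial zero is known to sit near s = 1/2 + 14.134725 i;
# the crude partial sum already makes |zeta| small there.
print(abs(zeta(0.5 + 14.134725j)))
```

Serious computations use far cleverer evaluation schemes (the Riemann–Siegel formula, for one), but the idea – evaluate ζ along the line Re(s) = 1/2 and hunt for sign changes – is the same.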

As has been the norm with challenging problems since the beginning of the previous century, the Riemann Hypothesis, along with six other open problems, is listed among the Millennium Prize Problems by the Clay Mathematics Institute – each with a bounty of a million dollars.

But this story is not about the million dollars – mathematicians don’t care much about it anyway (or so I am guessing). In the early 70s, number theorist Hugh Montgomery was working on the statistical distribution of the interesting zeros of the Riemann zeta function on the critical line – the vertical line on which all of them are conjectured to lie.

When, during their teatime conversation at the Institute for Advanced Study, Montgomery mentioned his recent results to Freeman Dyson, both of them were in for a surprise. Dyson realized that the statistical distribution of the Riemann zeros that Montgomery had worked out bore a striking resemblance to the statistical properties of a certain class of random matrices Dyson had looked at earlier while working on the physics of heavy atoms. More importantly, the theory of random matrices had, by then, well-established results that could be applied to Montgomery’s problem [1].
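For the curious, the statistic at the heart of the surprise can actually be written down. Montgomery conjectured that the correlation between pairs of (suitably rescaled) zeros is governed by

```latex
R_2(u) \;=\; 1 - \left( \frac{\sin \pi u}{\pi u} \right)^{2}
```

and Dyson recognized this as exactly the pair correlation of the eigenvalues of random Hermitian matrices from the Gaussian Unitary Ensemble – the class of random matrices he had studied for heavy nuclei.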

Dyson then wrote a letter to Selberg, pointing to Madan Lal Mehta’s book on random matrices for the results Montgomery needed. (You must read this article published in the IAS Spring 2013 newsletter; it even has a scanned image of Dyson’s handwritten note to Selberg!)

This striking similarity between the statistics of the Riemann zeros and the spectra of heavy atoms points towards a universality in the underlying structures. Stronger results coming from probability theory and mathematical statistics appear to give a clearer picture, though much of it still looks like an outright miracle. More on this notion of universality in the last part of the series, where I’ll narrate the story of one of the greatest quests in present-day science – the quest for a theory of quantum gravity.

#### References

[1] Kelly Devine Thomas, From Prime Numbers to Nuclear Physics and Beyond, The Institute Letter Spring 2013.
[2] Peter Sarnak, Problems of the Millennium: The Riemann Hypothesis, 2005.

## Random Matrices in Three Short Stories

#### This post is a slightly modified version of a student talk I gave for the IISER Pune Science Club earlier this year. All three stories (this one and the two to follow) should be accessible to anyone with some exposure to high school physics and mathematics. Readers with formal training in advanced physics and/or mathematics have every right to criticize the author for an over-simplistic presentation.

One

The first story begins with that of my personal hero, Freeman John Dyson. Growing up in England in the period between two of the most disastrous wars this planet has seen, Dyson developed a strong interest in everything to do with numbers. This interest, quite naturally, evolved into a passion for physics and mathematics. When the eighteen-year-old Dyson arrived at Cambridge in 1941 as a student, there were few physicists around – a constant phenomenon at universities during the war years; physicists were perhaps the most suitable people to be sent away on war-related responsibilities.

As it happened, the greatest influence on Dyson, while at Cambridge, was the famous mathematician duo of Hardy and Littlewood [1]. After working for a few years on number theory problems (he published a couple of influential papers in this period), Dyson moved to the United States, where he was appointed a professor at Cornell University – he didn’t have a Ph.D., though (and never got one).

With the brightest of physicists around (Richard Feynman and Eugene Wigner to name just two from a pretty illustrious list), Dyson’s focus shifted towards problems from quantum physics. (Number theory, however, was to appear in his life again, albeit for a small period, as we’ll see in the second part of this series of stories.)

Quantum mechanics, one of the two greatest triumphs of twentieth-century physics (General Relativity being the other one, of course), reformulates the study of physical systems in the language of the Hamiltonian. If you were reading a chapter from a textbook on quantum physics (which this post is not), you’d be told that this Hamiltonian is a Hermitian operator. Now, a Hermitian operator, to put it in rather simple words, is a matrix with some special properties. And a matrix is nothing more (?) than an array of numbers. But what has a matrix got to do with a physical system?

As it turns out, the way a system evolves can be mathematically expressed in terms of products of certain matrices – and the Hamiltonian of a system, itself a matrix, determines which matrices you should be multiplying. Say you want to study a Hydrogen atom. You’ll have to begin with its Hamiltonian and see what you can say about the energies of its components. And then you can check how well you did your job by comparing your results with a Hydrogen spectrum. (A quick Google Images search for a Hydrogen spectrum at this point is strongly recommended for those who haven’t seen one.)
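To make “comparing with a spectrum” concrete: the eigenvalues of the Hamiltonian are the allowed energies, and differences between them show up as the lines in the spectrum. A tiny illustrative sketch of my own (using the ready-made Rydberg formula rather than any matrix machinery) that reproduces the visible Balmer lines of Hydrogen:

```python
# Rydberg constant for hydrogen, in 1/m (approximate value).
R_H = 1.0968e7

def balmer_wavelength_nm(n):
    """Wavelength (in nm) of the hydrogen line for the transition n -> 2,
    from the Rydberg formula: 1/lambda = R_H * (1/2**2 - 1/n**2)."""
    inv_wavelength = R_H * (1.0 / 4.0 - 1.0 / n ** 2)  # in 1/m
    return 1e9 / inv_wavelength                        # convert m to nm

# The first few Balmer lines; n = 3 gives the famous red H-alpha line
# near 656 nm, n = 4 the blue-green H-beta line near 486 nm.
for n in (3, 4, 5, 6):
    print(n, round(balmer_wavelength_nm(n), 1))
```

For Hydrogen, the Hamiltonian can be diagonalized exactly and this formula drops out. For a heavy atom, as the story goes on to explain, no such luck – which is where the random matrices come in.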

As seems intuitive, the Hamiltonian of a Hydrogen atom should be a lot simpler than that of a more complicated atom. A more complicated atom is also a heavier atom; consider, for example, a Uranium atom, which is over 200 times heavier than a Hydrogen atom. But with increasing weight comes increasing complexity (sic); heavier atoms have a larger number of interacting components. This essentially makes it impossible to write down a Hamiltonian you could use to predict the spectrum of your complex atom. Poof! All the powers quantum mechanics bestowed upon you go awry.

Not really. What Freeman Dyson and Eugene Wigner showed is that you can make a pretty smart guess about the Hamiltonian of such a complex, heavy atom using a random matrix – a matrix whose entries are random numbers. It’s like having your usual matrix, except that the entries are drawn randomly from a set of numbers.

##### H, which can describe the Hamiltonian of a system, is a matrix with elements H11, H12, H21 and H22. You can call H a random matrix if its entries are random variables. Here, H has four elements – in practice, you’ll have to take much larger matrices for your computations.

Now, the essential idea here is to realize that to make predictions about the system you are trying to model, you need to consider an ensemble of all the random matrices which would give rise to the properties you’re expecting your system to have. These properties are manifested in certain symmetries of the system under consideration. (The notion of symmetry is both utterly important and extremely fascinating in the physical sciences. If you’re ever in need of spotting a theoretical physicist in a large crowd, a passing mention of symmetry will do the job.)

The next step in your analysis of complex atoms is then to study the statistical properties of the matrix ensemble with the appropriate symmetries. And that’s pretty much all. There is an entire machinery coming from the theory of random matrices that gives you the freedom to treat the complex atom as a black box with a very large number of interacting components and still extract the essential information with great accuracy [2].
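As a toy illustration of what such an ensemble buys you (my own sketch, not the original calculations): sample many small random Hermitian matrices and look at the gap between their two eigenvalues. The gaps show “level repulsion” – very small spacings are rare – which is the statistical fingerprint actually observed in heavy-atom spectra:

```python
import math
import random

random.seed(0)  # reproducible toy experiment

def gue2_spacing():
    """Eigenvalue spacing of a random 2x2 Hermitian matrix
    [[a, b + i*c], [b - i*c, d]] with Gaussian entries
    (diagonal variance 1, off-diagonal real/imag variance 1/2) --
    the smallest member of the Gaussian Unitary Ensemble."""
    a, d = random.gauss(0, 1), random.gauss(0, 1)
    b, c = random.gauss(0, math.sqrt(0.5)), random.gauss(0, math.sqrt(0.5))
    # Eigenvalues are (a + d)/2 +/- sqrt(((a - d)/2)**2 + b**2 + c**2),
    # so their spacing is twice the square root below.
    return 2 * math.sqrt(((a - d) / 2) ** 2 + b ** 2 + c ** 2)

spacings = [gue2_spacing() for _ in range(50_000)]
mean = sum(spacings) / len(spacings)
normalized = [s / mean for s in spacings]  # rescale to mean spacing 1

# Level repulsion: for independent (Poisson) levels, about 10% of the
# normalized spacings would fall below 0.1; here almost none do.
frac_small = sum(s < 0.1 for s in normalized) / len(normalized)
print(frac_small)
```

For real predictions one takes much larger matrices, as the caption above warns, but even this 2×2 toy reproduces the famous Wigner-surmise shape of the spacing distribution.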

A point that must not go unmentioned here is the practical importance of studying heavy atoms: they are the source of nuclear energy, a leading candidate for meeting our energy needs in the future.

If you’re feeling bewildered and fascinated that such an apparently unrelated notion as random matrices can help you predict the spectra of complex atoms, you’re not only in the company of some of the greatest minds working in this area but also stand a chance of being part of the wonderful discoveries yet to be made. Our understanding is very limited at present – we know that things work but have very little idea why. There appears to be some statistical law of large numbers working behind the scenes. Interestingly, it was in mathematical statistics that random matrices first appeared, in the 1930s, with the work of the agricultural statistician John Wishart.

Talking of statistics and numbers, the theory of random matrices has also yielded great insights into challenging problems in number theory, which will be the theme of the second of our stories. As you’ll see, Random Matrix Theory has made its way far beyond statistics and atomic physics. A comprehensive and insightful reference for those interested is the classic book Random Matrices by Madan Lal Mehta. (Mehta, who had a very fruitful collaboration with Dyson and Wigner, was one of the leading contributors to the subject.) You’ll need some background in linear algebra to follow the text, but it is surely worth the effort.

#### References

[1] Freeman Dyson, Selected Papers of Freeman Dyson with Commentary, American Mathematical Society, 1996.
[2] Madan Lal Mehta, Random Matrices, Vol. 142, Pure and Applied Mathematics, Academic Press, Ed. 3, 2004.

## Mimamsa: Stories from Behind the Scenes

The past weekend saw a gruelling but very exhilarating contest for the coveted winner’s trophy of what we at IISER Pune love to call the toughest undergraduate Science quiz in India. Mimamsa, in its ninth edition in 2017, had teams from IISc Bengaluru, NISER Bhubaneswar, IIT Bombay and IIT Madras in the finals, selected after a preliminary round held earlier this year.

Undergraduates from across the country who have participated in Mimamsa over the years have called it “intellectually stimulating” and “enjoyable” – quite aptly so, given the ideology behind its conceptualisation in 2009 by Dr. Sutirth Dey.

However, this post is not about the Mimamsa presented to the participants, but the one students at IISER Pune spend time creating – and I’d argue why these two are not the same. But of course, the arguments I present here are (almost) entirely based on my personal experiences, and I don’t expect everyone to agree with them.

Perhaps the most significant element that makes Mimamsa unique is the flavour of its questions. The questions are non-trivial to begin with, and in fact take a form that makes them seem impenetrable – until, of course, one gets to know the solution. Then the participants feel awestruck if they were not able to solve a question, or elated to have grabbed some essential points to add to their tally in the contest. That’s it, right? No.

Remember when I said that the Mimamsa presented to the participants is not the Mimamsa students at IISER Pune spend time creating? The questions (on most occasions) evolve from raw ideas into the final forms they’re presented in. And in the course of this evolution, the students involved in the making of these questions evolve too, learning a lot in the process – new ideas, new methods of enquiry, ways to come up with smart solutions, the ability to gauge the difficulty of problems, the intricacies of posing questions. The process is long and tiring, and like any other venture, the students make numerous mistakes along the way, but then they get to learn from these mistakes, too. Exactly the things we expect ourselves to become extremely good at as students of science (and mathematics).

On the front end, what appears to be a nice, sophisticated quizzing event is, behind the scenes, a very dense, sometimes exhausting but almost always rewarding process.

Mimamsa is surely about the spirit of quizzing and about motivating enquiry, but it is also about the enormous efforts put in by the students on all fronts, and it goes without saying that the teams involved in the organisational aspects over the years must be given equal credit for what Mimamsa has come to be today.

It might be a bit too early to call Mimamsa a phenomenon – it certainly appears to have the potential to become one in the time to come – but it has already impacted the lives of many of us who have been associated with it, and this does make Mimamsa a phenomenon in our lives.

## Do you like humans?

I have lately been playing around with Allo, an instant messaging mobile app by Google. Among other interesting features, the app comes with a virtual assistant, which, when asked its name, tells the user (rather plainly) to call it his or her Google Assistant.

To begin with, the Google Assistant is meant to assist you in almost all your online and offline tasks. But what really got me interested in it were the smart (and sometimes, witty) replies it comes up with. Take for example, the following string of queries and replies:

How smart are you?
It might seem like I’m smart
But I’m just good at searching

It takes a village to raise a virtual assistant

I’d want to be a super-memory

How many roads must a man walk down?
The answer, my friend, is blowin’ in the wind
But I’ll spare you my harmonica solo

Do you like humans?
I love humanity
You have the best questions
And the best dreams

Now, my objective here is surely not to review the app or the virtual assistant (or worse, to add to the pile of Siri vs Cortana sort of articles). And anyway, to many, the above string may appear rather uninteresting. What actually prompted me to write this post was the reply to the last question in the string.

The reply just came out of a large chunk of code written by the developers at Google (probably with some inputs from some super-cool machine learning algorithms), and there’s nothing much special about it from the AI-development perspective. But the idea behind the reply is probably important. Humanity does have wonderful questions and dreams. It is this innate human curiosity that has resulted in the technology at our disposal today (and that includes virtual personal assistants like Allo’s, too).

This belief (in the importance of questioning), however, has an interesting aspect to it: the conundrum of which questions are important. For instance, is the question of what precursors there are to an earthquake more important than that of what lies behind the event horizon of a black hole? Or is finding cures for cancer more important than finding a proof of the celebrated Riemann Hypothesis? Are some questions really more important than others?

I do not know the answer. It does appear that finding answers to some questions has a more immediate impact and usefulness. But what about the other, seemingly less useful ones? There have been multiple instances when fundamental research, apparently undertaken only to answer not-so-important questions, has led to enormous developments in fields that directly cater to humanity. But even when it has not, how appropriate would it be to call it less important?

As I said, I do not have an answer.


## On Strings and Perspectives

This week has been really interesting. IISER Pune is hosting the eighth edition of the biennial Indian Strings Meeting. For the uninitiated, string theory is one of the many different approaches to a theory of quantum gravity (a theory which can consistently describe quantum mechanics and classical gravity within a common framework). A slightly technical introduction to string theory is this very well-written piece What every physicist should know about string theory by Edward Witten.

From a general point of view, (perhaps) more fascinating are the dialogues that have been established between physics and mathematics as a result of research in string theory (and quantum gravity in general) over the last five to six decades. A number of problems in both areas have seen significant contributions from the other side, motivating new insights and interesting solutions. These dialogues have largely been possible because of the different perspectives each of these subjects has offered the other.

Talking of perspectives, I recently chanced upon this TED talk by Roger Antonsen where he very beautifully describes how taking different perspectives helps us improve our understanding and why this ability has an inherent universality.

Ed Witten, who stands tall among the giants of string theory, would undoubtedly count as one of the finest interpreters to have facilitated the dialogue between physics and mathematics. He is the only physicist to date to have been awarded the Fields Medal, the most coveted prize in mathematics. His work, apart from opening new directions in string theory, has also made an enormous impact on pure mathematics.

Though string theory has its share of proponents and adversaries, even its detractors would agree that these different perspectives and the resulting conversations have helped reinforce the idea that the whole of science is but one single adventure.

The presentation slides from the talks at the Indian Strings Meeting 2016 are available online and can be accessed here.

## Knots (and a lot many things)

Prof. Louis H. Kauffman is here at IISER Pune for a few days. For a man of his age (he’ll turn 72 this coming February), he hardly shows any signs of frailty – he has given three (pretty long) talks in the last two days. His breadth of knowledge is enthralling – the topics he has spoken on range from knot theory and mathematical logic to quantum computing and statistical mechanics.

Prof. Kauffman is one of the leading figures in knot theory today. While knot theory deals with the topological features of knots, the extent to which these ideas have been found to be related to physics and other sciences is enormous. World Scientific has a Series on Knots and Everything, with Prof. Kauffman as the series editor; he has also authored a book in the series titled Knots and Physics.

If you’re interested in reading more about knots, you can find a number of interesting articles on the subject (and on many other things) on Prof. Kauffman’s UIC webpage.