Monday, April 12, 2010

Enter the matrix: the deep law that shapes our reality ~ by Mark Buchanan


http://bit.ly/amx3od

* 07 April 2010 by Mark Buchanan

A numbers game (Image: Cey Adams/Corbis)

SUPPOSE we had a theory that could explain everything. Not just atoms and quarks but aspects of our everyday lives too. Sound impossible? Perhaps not.

It's all part of the recent explosion of work in an area of physics known as random matrix theory. Originally developed more than 50 years ago to describe the energy levels of atomic nuclei, the theory is turning up in everything from inflation rates to the behaviour of solids. So much so that many researchers believe that it points to some kind of deep pattern in nature that we don't yet understand. "It really does feel like the ideas of random matrix theory are somehow buried deep in the heart of nature," says electrical engineer Raj Nadakuditi of the University of Michigan, Ann Arbor.

All of this, oddly enough, emerged from an effort to turn physicists' ignorance into an advantage. In 1956, when we knew very little about the internal workings of large, complex atomic nuclei, such as uranium, the Hungarian-born physicist Eugene Wigner suggested simply guessing.

Quantum theory tells us that atomic nuclei have many discrete energy levels, like unevenly spaced rungs on a ladder. To calculate the spacing between each of the rungs, you would need to know the myriad possible ways the nucleus can hop from one to another, and the probabilities for those events to happen. Wigner didn't know, so instead he picked numbers at random for the probabilities and arranged them in a square array called a matrix.

The matrix was a neat way to express the many connections between the different rungs. It also allowed Wigner to exploit the powerful mathematics of matrices in order to make predictions about the energy levels.

Bizarrely, he found this simple approach enabled him to work out the likelihood that any one level would have others nearby, in the absence of any real knowledge. Wigner's results, worked out in a few lines of algebra, were far more useful than anyone could have expected, and experiments over the next few years showed a remarkably close fit to his predictions. Why they work, though, remains a mystery even today.
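
To get a feel for how little input the trick needs, here is a minimal sketch in Python (my own illustration, assuming only the numpy library, not Wigner's actual calculation): fill a large matrix with random numbers, symmetrise it, treat its eigenvalues as energy levels, and compare the spacings between neighbouring levels with the distribution Wigner wrote down, the so-called Wigner surmise.

```python
# Illustration only: Wigner's guess in miniature.
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Fill a matrix with independent random numbers and symmetrise it,
# in the spirit of Wigner's guess.
a = rng.normal(size=(n, n))
h = (a + a.T) / 2

# The eigenvalues play the role of the nuclear "energy levels".
levels = np.linalg.eigvalsh(h)

# Use the middle of the spectrum, where the density of levels is roughly
# constant, and normalise the nearest-neighbour spacings to unit mean.
bulk = levels[n // 4 : 3 * n // 4]
spacings = np.diff(bulk)
s = spacings / spacings.mean()

# Compare the observed spacing histogram with the Wigner surmise
# P(s) = (pi*s/2) * exp(-pi*s^2/4).
hist, edges = np.histogram(s, bins=40, range=(0.0, 4.0), density=True)
centres = (edges[:-1] + edges[1:]) / 2
surmise = (np.pi * centres / 2) * np.exp(-np.pi * centres**2 / 4)

for c, observed, predicted in zip(centres[::5], hist[::5], surmise[::5]):
    print(f"s = {c:4.2f}   observed {observed:5.3f}   predicted {predicted:5.3f}")
```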

What is most remarkable, though, is how Wigner's idea has been used since then. It can be applied to a host of problems involving many interlinked variables whose connections can be represented as a random matrix.

The first discovery of a link between Wigner's idea and something completely unrelated to nuclear physics came about after a chance meeting in the early 1970s between British physicist Freeman Dyson and American mathematician Hugh Montgomery.

Montgomery had been exploring one of the most famous functions in mathematics, the Riemann zeta function, which holds the key to finding prime numbers. These are numbers, like 2, 3, 5 and 7, that are only divisible by themselves and 1. They hold a special place in mathematics because every integer greater than 1 can be built from them.

In 1859, a German mathematician called Bernhard Riemann had conjectured a simple rule about where the zeros of the zeta function should lie. The zeros are closely linked to the distribution of prime numbers.

Mathematicians have never been able to prove Riemann's hypothesis. Montgomery couldn't either, but he had worked out a formula for the likelihood of finding a zero, if you already knew the location of another one nearby. When Montgomery told Dyson of this formula, the physicist immediately recognised it as the very same one that Wigner had devised for nuclear energy levels.
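
The shared formula is worth writing down (it is not spelled out in the article). With the zeros, or the eigenvalues, rescaled so that their average spacing is 1, the density of pairs separated by a distance u takes the same form in Montgomery's conjecture and in Dyson's random-matrix result:

```latex
% Pair correlation of the rescaled Riemann zeros (Montgomery's conjecture)
% and of eigenvalues of large random Hermitian matrices (Dyson): the same
% function of the rescaled separation u.
R_2(u) = 1 - \left( \frac{\sin \pi u}{\pi u} \right)^{2}
```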

To this day, no one knows why prime numbers should have anything to do with Wigner's random matrices, let alone the nuclear energy levels. But the link is unmistakable. Mathematician Andrew Odlyzko of the University of Minnesota in Minneapolis has computed the locations of as many as 10²³ zeros of the Riemann zeta function and found a near-perfect agreement with random matrix theory.

The strange descriptive power of random matrix theory doesn't stop there. In the last decade, it has proved itself particularly good at describing a wide range of messy physical systems.


Universal law?

Recently, for example, physicist Ferdinand Kuemmeth and colleagues at Harvard University used it to predict the energy levels of electrons in the gold nanoparticles they had constructed.

Traditional theories suggest that such energy levels should be influenced by a bewildering range of factors, including the precise shape and size of the nanoparticle and the relative position of the atoms, which is considered to be more or less random. Nevertheless, Kuemmeth's team found that random matrix theory described the measured levels very accurately (arxiv.org/abs/0809.0670).

A team of physicists led by Jack Kuipers of the University of Regensburg in Germany found equally strong agreement in the peculiar behaviour of electrons bouncing around chaotically inside a quantum dot - essentially a tiny box able to trap and hold single quantum particles (Physical Review Letters, vol 104, p 027001).

The list has grown to incredible proportions, ranging from quantum gravity and quantum chromodynamics to the elastic properties of crystals. "The laws emerging from random matrix theory lay claim to universal validity for almost all quantum systems. This is an amazing fact," says physicist Thomas Guhr of the Lund Institute of Technology in Sweden.

Random matrix theory has got mathematicians like Percy Deift of New York University imagining that there might be more general patterns there too. "This kind of thinking isn't common in mathematics," he notes. "Mathematicians tend to think that each of their problems has its own special, distinguishing features. But in recent years we have begun to see that problems from diverse areas, often with no discernible connections, all behave in a very similar way."

In a paper from 2006, for example, he showed how random matrix theory applies very naturally to the mathematics of certain games of solitaire, to the way buses clump together in cities, and to the path traced by molecules bouncing around in a gas, among other problems.

The most important question, perhaps, is whether there is some deep theory behind both physics and mathematics that explains why random matrices seem to capture essential truths about reality. "There must be some reason, but we don't yet know what it is," admits Nadakuditi. In the meantime, random matrix theory is already changing how we look at random systems and try to understand their behaviour. It may possibly offer a new tool, for example, in detecting small changes in global climate.

Back in 1991, an international scientific collaboration conducted what came to be known as the Heard Island Feasibility Test. Spurred by the idea that the transmission of sound through the world's oceans might provide a sensitive test of rising temperatures, they transmitted a loud humming sound near Heard Island in the Indian Ocean and used an array of sensors around the world to pick it up.

Repeating the experiment 20 years later could yield valuable information on climate change. But concerns over the detrimental effects of loud sounds on local marine life mean that experiments today have to be carried out with signals that are too weak to be detected by ordinary means. That's where random matrix theory comes in.

Over the past few years, Nadakuditi, working with Alan Edelman and others at the Massachusetts Institute of Technology, has developed a theory of signal detection based on random matrices. It is specifically attuned to the operation of a large array of sensors deployed globally. "We have found that you can in principle use extremely weak sounds and still hope to detect the signal," says Nadakuditi.
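
Their precise detector is not described here, but the flavour of eigenvalue-based detection can be sketched in a few lines of Python (a simplified illustration, not the MIT group's actual algorithm): pool the recordings from many sensors into a sample covariance matrix and ask whether its largest eigenvalue pokes above the level that, according to random matrix theory, pure noise alone would produce.

```python
# Illustration only: detecting a weak common signal with many sensors.
import numpy as np

rng = np.random.default_rng(1)
p, n = 100, 500          # number of sensors, number of time samples

def largest_eigenvalue(recordings):
    cov = recordings @ recordings.T / recordings.shape[1]   # p x p sample covariance
    return np.linalg.eigvalsh(cov)[-1]

# What unit-variance noise alone would give, according to random matrix
# theory: the largest eigenvalue sits near (1 + sqrt(p/n))^2.
noise_ceiling = (1 + np.sqrt(p / n)) ** 2

# Case 1: every sensor records nothing but noise.
noise = rng.normal(size=(p, n))
print("noise only :", round(largest_eigenvalue(noise), 2),
      "  noise ceiling:", round(noise_ceiling, 2))

# Case 2: the same noise plus a common tone that is buried well below the
# noise at every individual sensor (per-sensor amplitude of roughly 0.2
# against noise of standard deviation 1).
tone = 2.0 * np.sin(2 * np.pi * 0.05 * np.arange(n))
gains = rng.normal(size=(p, 1)) / np.sqrt(p)
recordings = noise + gains @ tone[None, :]
print("with signal:", round(largest_eigenvalue(recordings), 2),
      "  noise ceiling:", round(noise_ceiling, 2))
```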

Others are using random matrix theory to do surprising things, such as enabling light to pass through apparently impenetrable, opaque materials. Last year, physicist Allard Mosk of the University of Twente in the Netherlands and colleagues used it to describe the statistical connections between light that falls on an object and light that is scattered away. For an opaque object that scatters light very well, he notes, these connections can be described by a totally random matrix.

Some strange possibilities emerge that other analyses had not suggested. The matrices revealed that there should be what Mosk calls "open channels" - specific kinds of waves that, instead of being reflected, would somehow pass right through the material. Indeed, when Mosk's team shone light with a carefully constructed wavefront through a thick, opaque layer of zinc oxide paint, they saw a sharp increase in the transmission of light.
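
A toy model gives a feel for where such open channels come from (my own illustration, assuming an idealised lossless scatterer rather than Mosk's real paint layer): draw a random scattering matrix that merely conserves energy, take the block describing transmission, and look at its transmission eigenvalues. They pile up at the two extremes, so alongside nearly blocked channels a few are almost completely open.

```python
# Illustration only: an idealised lossless scatterer modelled by a
# Haar-random unitary scattering matrix S of size 2N x 2N. The N x N
# transmission block t has eigenvalues of t^dagger t between 0 and 1,
# one per channel; they cluster near 0 and near 1, so some channels
# are almost completely open.
import numpy as np

rng = np.random.default_rng(2)
N = 200

# Haar-random unitary via QR decomposition of a complex Gaussian matrix.
z = rng.normal(size=(2 * N, 2 * N)) + 1j * rng.normal(size=(2 * N, 2 * N))
q, r = np.linalg.qr(z)
s_matrix = q * (np.diag(r) / np.abs(np.diag(r)))   # fix the column phases

t = s_matrix[:N, N:]                               # transmission block
transmission = np.linalg.eigvalsh(t.conj().T @ t)  # values between 0 and 1

print("mean transmission        :", round(transmission.mean(), 3))
print("five most open channels  :", np.round(np.sort(transmission)[-5:], 4))
print("five most closed channels:", np.round(np.sort(transmission)[:5], 4))
```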

Still, the most dramatic applications of random matrix theory may be yet to come. "Some of the main results have been around for decades," says physicist Jean-Philippe Bouchaud of the École Polytechnique in Paris, France, "but they have suddenly become a lot more important with the handling of humungous data sets in so many areas of science."

In everything from particle physics and astronomy to ecology and economics, collecting and processing enormous volumes of data has become commonplace. An economist may sift through hundreds of data sets looking for something to explain changes in inflation - perhaps oil futures, interest rates or industrial inventories. Businesses such as Amazon.com rely on similar techniques to spot patterns in buyer behaviour and help direct their advertising.

While random matrix theory suggests that this is a promising approach, it also points to hidden dangers. As more and more complex data is collected, the number of variables being studied grows, and the number of apparent correlations between them grows even faster. With enough variables to test, it becomes almost certain that you will detect correlations that look significant, even if they aren't.

Curse of dimensionality

Suppose you have many years' worth of figures on a large number of economic indices, including inflation, employment and stock market prices. You look for cause-and-effect relationships between them. Bouchaud and his colleagues have shown that even if these variables are all fluctuating randomly, the largest observed correlation will be large enough to seem significant.

This is known as the "curse of dimensionality". It means that while a large amount of information makes it easy to study everything, it also makes it easy to find meaningless patterns. That's where the random-matrix approach comes in, to separate what is meaningful from what is nonsense.

In the late 1960s, Ukrainian mathematicians Vladimir Marcenko and Leonid Pastur derived a fundamental mathematical result describing the key properties of very large, random matrices. Their result allows you to calculate how much correlation between data sets you should expect to find simply by chance. This makes it possible to distinguish truly special cases from chance accidents. The strengths of these correlations are the equivalent of the nuclear energy levels in Wigner's original work.
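
A minimal sketch in Python (an illustration, not Bouchaud's actual analysis) shows what that buys you: even for completely independent random "indicators", the eigenvalues of the correlation matrix spread out over a wide band, and the largest pairwise correlation looks impressively significant by naive standards. The Marcenko and Pastur result tells you how wide that chance-only band is, so anything inside it can be discarded.

```python
# Illustration only: spurious correlations among purely random indicators,
# and the Marcenko-Pastur band that says how big they should be by chance.
import numpy as np

rng = np.random.default_rng(3)
p, n = 50, 300            # 50 "indicators", 300 monthly observations

# Pure noise: the indicators have no real relationship to one another.
data = rng.normal(size=(p, n))
data = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)
corr = data @ data.T / n                       # sample correlation matrix

# Eigenvalues of chance alone fall (roughly) inside the Marcenko-Pastur
# band [(1 - sqrt(q))^2, (1 + sqrt(q))^2] with q = p/n.
eig = np.linalg.eigvalsh(corr)
q = p / n
lower, upper = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

print("Marcenko-Pastur band          :", (round(lower, 2), round(upper, 2)))
print("smallest / largest eigenvalue :", round(eig[0], 2), "/", round(eig[-1], 2))
# The largest pairwise correlation would pass a naive significance test,
# yet it is pure chance.
print("largest pairwise correlation  :", round(np.abs(corr - np.eye(p)).max(), 2))
```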

Bouchaud's team has now shown how this idea throws doubt on the trustworthiness of many economic predictions, especially those claiming to look many months ahead. Such predictions are, of course, the bread and butter of economic institutions. But can we believe them?

To find out, Bouchaud and his colleagues looked at how well US inflation rates could be explained by a wide range of economic indicators, such as industrial production, retail sales, consumer and producer confidence, interest rates and oil prices.

Using figures from 1983 to 2005, they first calculated all the possible correlations among the data. They found what seem to be significant results - apparent patterns showing how changes in economic indicators at one moment lead to changes in inflation the next. To the unwary observer, this makes it look as if inflation can be predicted with confidence.

But when Bouchaud's team applied Marcenko and Pastur's mathematics, they got a surprise. They found that only a few of these apparent correlations can be considered real, in the sense that they really stand out from what would be expected by chance alone. Their results show that inflation is predictable only one month in advance. Look ahead two months and the mathematics shows no predictability at all. "Adding more data just doesn't lead to more predictability as some economists would hope," says Bouchaud.

In recent years, some economists have begun to express doubts over predictions made from huge volumes of data, but they are in the minority. Most embrace the idea that more measurements mean better predictive abilities. That might be an illusion, and random matrix theory could be the tool to separate what is real and what is not.

Wigner might be surprised by how far his idea about nuclear energy levels has come, and the strange directions in which it is going, from universal patterns in physics and mathematics to practical tools in social science. The guess he made out of ignorance has clearly turned out to be anything but simplistic.

Mark Buchanan is a writer based in the UK. His latest book is The Social Atom (Bloomsbury)


http://www.newscientist.com/article/mg20627550.200-enter-the-matrix-the-deep-law-that-shapes-our-reality.html?full=true

++++++++++++++++++++++++++++++++++++
HumanE-Liberation-Party Blog
http://help-matrix.blogspot.com/
++++++++++++++++++++++++++++++++++++
