
… A French mathematician has just won the Abel Prize for his decades of work developing a set of tools now widely used for taming random processes…
Random processes take place all around us. It rains one day but not the next; stocks and bonds gain and lose value; traffic jams coalesce and disappear. Because they’re governed by numerous factors that interact with one another in complicated ways, it’s impossible to predict the exact behavior of such systems. Instead, we think about them in terms of probabilities, characterizing outcomes as likely or rare…
… the French probability theorist Michel Talagrand was awarded the Abel Prize, one of the highest honors in mathematics, for developing a deep and sophisticated understanding of such processes. The prize, presented by the king of Norway, is modeled on the Nobel and comes with 7.5 million Norwegian kroner (about $700,000). When he was told he had won, “my mind went blank,” Talagrand said. “The type of mathematics I do was not fashionable at all when I started. It was considered inferior mathematics. The fact that I was given this award is absolute proof this is not the case.”
Other mathematicians agree. Talagrand’s work “changed the way I view the world,” said Assaf Naor of Princeton University. Today, added Helge Holden, the chair of the Abel Prize committee, “it is becoming very popular to describe and model real-world events by random processes. Talagrand’s toolbox comes up immediately.”
…
A random process is a collection of events whose outcomes vary according to chance in a way that can be modeled — like a sequence of coin flips, or the trajectories of atoms in a gas, or daily rainfall totals. Mathematicians want to understand the relationship between individual outcomes and aggregate behavior. How many times do you have to flip a coin to figure out whether it’s fair? Will a river overflow its banks?
Talagrand focused on processes whose outcomes are distributed according to a bell-shaped curve called a Gaussian. Such distributions are common in nature and have a number of desirable mathematical properties. He wanted to know what can be said with certainty about extreme outcomes in these situations. So he proved a set of inequalities that put tight upper and lower bounds on possible outcomes. “To obtain a good inequality is a piece of art,” Holden said. That art is useful: Talagrand’s methods can give an optimal estimate of, say, the highest level a river might rise to in the next 10 years, or the magnitude of the strongest potential earthquake…
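One concrete, classical instance of such a bound (a standard textbook fact about Gaussians, not one of Talagrand’s own inequalities): the expected maximum of n independent standard Gaussian draws is at most sqrt(2 ln n). A quick simulation, with arbitrary choices of n and trial count, shows the empirical average maximum sitting just under that bound:

```python
import numpy as np

rng = np.random.default_rng(7)
n, trials = 10_000, 500  # sample size and number of repetitions (arbitrary)

# Maximum of n independent standard Gaussian draws, repeated over many trials.
maxima = rng.normal(size=(trials, n)).max(axis=1)

empirical_mean_max = maxima.mean()
classical_bound = np.sqrt(2 * np.log(n))  # sqrt(2 ln n) upper bound on E[max]

# The empirical mean maximum should fall a little below the bound.
print(round(empirical_mean_max, 2), "<=", round(classical_bound, 2))
```

Talagrand’s chaining machinery generalizes this kind of estimate from a single batch of independent draws to far more complicated, correlated Gaussian processes.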
Say you want to assess the risk of a river flooding — which will depend on factors like rainfall, wind and temperature. You can model the river’s height as a random process. Talagrand spent 15 years developing a technique called generic chaining that allowed him to create a high-dimensional geometric space related to such a random process. His method “gives you a way to read the maximum from the geometry,” Naor said.
The technique is very general and therefore widely applicable. Say you want to analyze a massive, high-dimensional data set that depends on thousands of parameters. To draw a meaningful conclusion, you want to preserve the data set’s most important features while characterizing it in terms of just a few parameters. (For example, this is one way to analyze and compare the complicated structures of different proteins.) Many state-of-the-art methods achieve this simplification by applying a random operation that maps the high-dimensional data to a lower-dimensional space. Mathematicians can use Talagrand’s generic chaining method to determine the maximal amount of error that this process introduces — allowing them to determine the chances that some important feature isn’t preserved in the simplified data set.
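As an illustration of this kind of random dimension reduction (the random operation itself, not Talagrand’s chaining analysis of its error), here is a minimal Python sketch of a Gaussian random projection in the Johnson–Lindenstrauss style; the dimensions and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_points, high_dim, low_dim = 50, 1000, 200  # illustrative sizes

# A synthetic high-dimensional data set (rows are points).
X = rng.normal(size=(n_points, high_dim))

# Random Gaussian projection, scaled so squared distances are preserved on average.
P = rng.normal(size=(high_dim, low_dim)) / np.sqrt(low_dim)
Y = X @ P

# Compare one pairwise distance before and after projection.
d_high = np.linalg.norm(X[0] - X[1])
d_low = np.linalg.norm(Y[0] - Y[1])
distortion = d_low / d_high
print(round(distortion, 2))  # close to 1 with high probability
```

Chaining-style bounds are what let mathematicians say, uniformly over all pairs of points, how far that distortion ratio can stray from 1.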
Talagrand’s work wasn’t just limited to analyzing the best and worst possible outcomes of a random process. He also studied what happens in the average case.
In many processes, random individual events can, in aggregate, lead to highly deterministic outcomes. If measurements are independent, then the totals become very predictable, even if each individual event is impossible to predict. For instance, flip a fair coin. You can’t say anything in advance about what will happen. Flip it 10 times, and you’ll get four, five or six heads — close to the expected value of five heads — about 66% of the time. But flip the coin 1,000 times, and you’ll get between 450 and 550 heads 99.7% of the time, a result that’s even more concentrated around the expected value of 500. “It is exceptionally sharp around the mean,” Holden said.
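The coin-flip figures above are easy to check empirically. A quick Monte Carlo sketch in Python (the trial count is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(42)
trials = 20_000  # number of simulated experiments (arbitrary)

# 10 flips per experiment: fraction landing in the 4..6 heads window.
heads10 = rng.binomial(10, 0.5, size=trials)
p10 = np.mean((heads10 >= 4) & (heads10 <= 6))

# 1000 flips per experiment: fraction landing in the 450..550 heads window.
heads1000 = rng.binomial(1000, 0.5, size=trials)
p1000 = np.mean((heads1000 >= 450) & (heads1000 <= 550))

print(p10, p1000)  # approximately 0.66 and 0.997
```

Even though the window around the mean is proportionally narrower in the second case (within 10% of 500, versus within 20% of 5), the probability of landing in it is far higher: the totals concentrate around the expected value as the number of flips grows.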
“Even though something has so much randomness, the randomness cancels itself out,” Naor said. “What initially seemed like a horrible mess is actually organized.”…
“Michel Talagrand Wins Abel Prize for Work Wrangling Randomness,” from @QuantaMagazine.
* James Gleick, The Information
###
As we comprehend the constructs in chance, we might spare a thought for Caspar Wessel; he died on this date in 1818. A mathematician, he was the first person to describe the geometrical interpretation of complex numbers as points and vectors in the complex plane.
Not coincidentally, Wessel was also a surveyor and cartographer, who contributed to the Royal Danish Academy of Sciences and Letters’ topographical survey of Denmark.
