Cowan discovers analogy between behavior of chemical reaction networks, neural networks of brain
By Steve Koppes
Jack Cowan’s quest to understand the brain’s workings using mathematical methods spans more than four decades. When his research career began in 1962 as a graduate student in electrical engineering, he worked with the founders of neural network theory at the Massachusetts Institute of Technology, including Norbert Wiener, who had a clear idea about how they should proceed.
Wiener died in 1964, before he and Cowan could work jointly on the problem that Cowan continues to address. “I didn’t really understand what he was saying to me until I worked it out myself. He was one of the great mathematicians of the 20th century,” said Cowan, Professor in Mathematics, Neurology and the College.
But now Cowan realizes that the activity patterns of neurons in the human brain follow the same rules of physics that govern molecules as they condense from gas to liquid, or from liquid to solid. Cowan presented an outline of these ideas in February at the 2008 annual meeting of the American Association for the Advancement of Science.
“Structures built from a very large number of units can exhibit sharp transitions from one state to another state, which physicists call ‘phase transitions,’ ” Cowan said. “Strange and interesting things happen in the neighborhood of a phase transition.”
When liquids undergo phase transitions, they evaporate into gas or freeze into ice. When the brain undergoes a phase transition, it moves from random to patterned activity. “The brain at rest produces random activity,” said Cowan, referring to what physicists call “Brownian motion.”
Although the bulk of his work involves deriving equations, Cowan’s findings mesh well with laboratory data from studies of the cerebral cortex and from electroencephalograms.
His latest findings show that the same mathematical tools physicists use to describe the behavior of subatomic particles and the dynamics of liquids and solids can now be applied to understanding how the brain generates its various rhythms.
These include delta waves generated during sleep, alpha waves of the visual brain and gamma waves, which were discovered during the last decade and seem related to information processing. “The resting state of brain activity seems to have a statistical structure that’s characteristic of a certain kind of phase transition,” Cowan said. “The brain likes to rest in such a state because that’s the state in which information processing is optimized.”
At this stage of his research, Cowan said it would be premature and speculative for him to suggest how phase transitions in the brain might relate to neurological conditions or states of human consciousness. “That’s for the future,” he said.
Another component of his latest research is the close relationship between spontaneous pattern formation in brain circuits and in chemical reaction networks. In this research, he shows how mathematics can help explain visual hallucinations and how the visual cortex obtained its stripes, which are visible to the naked eye when the tissue is removed from cadavers.
“This line of research on pattern formation can be traced back to Alan Turing, who also founded the modern science of computation,” said Terrence Sejnowski of the Salk Institute for Biological Studies in La Jolla, Calif., who is a leading specialist in computational neurobiology.
Some of Cowan’s findings extend the work of the late Heinrich Klüver, the former Sewell Avery Distinguished Service Professor Emeritus in Biological Psychology, who died in 1979.
Cowan joined the Chicago faculty in 1967, the same year Klüver retired. Much to Cowan’s dismay, he was never able to discuss Klüver’s research on the brain and visual hallucinations with him because of Klüver’s poor health in his final years.
Cowan continued his research, meanwhile, in a series of collaborations with Ph.D. students and colleagues in physics, mathematics, biology and neuroscience.
In 1972, he and postdoctoral fellow Hugh Wilson, now of Canada’s York University, formulated a set of equations that could describe the dynamics of neural networks. Now called “Wilson-Cowan equations,” they became a mainstay of neural network research. “But I always knew that these equations were only part of the story, so I kept thinking about them,” Cowan said.
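The Wilson-Cowan equations describe the average activity of coupled excitatory and inhibitory neural populations over time. As a rough illustration of the kind of dynamics they capture, the sketch below integrates a standard textbook form of the equations with a simple forward-Euler loop; the specific coupling weights and sigmoid constants here are illustrative choices in the spirit of the 1972 formulation, not values taken from the article.

```python
import math

def sigmoid(x, a=1.2, theta=2.8):
    # Logistic response function, shifted so that sigmoid(0) == 0.
    # a sets the slope, theta the firing threshold (illustrative values).
    return 1.0 / (1.0 + math.exp(-a * (x - theta))) - 1.0 / (1.0 + math.exp(a * theta))

def wilson_cowan(E, I, P=1.25, Q=0.0,
                 w_ee=16.0, w_ei=12.0, w_ie=15.0, w_ii=3.0,
                 tau_e=1.0, tau_i=1.0):
    # Time derivatives of excitatory (E) and inhibitory (I) population activity.
    # Each population decays toward zero and is driven through a saturating
    # sigmoid of its net input; (1 - E) and (1 - I) model refractory saturation.
    dE = (-E + (1.0 - E) * sigmoid(w_ee * E - w_ei * I + P)) / tau_e
    dI = (-I + (1.0 - I) * sigmoid(w_ie * E - w_ii * I + Q)) / tau_i
    return dE, dI

# Forward-Euler integration of the coupled equations.
dt, steps = 0.01, 5000
E, I = 0.1, 0.05
trace = []
for _ in range(steps):
    dE, dI = wilson_cowan(E, I)
    E += dt * dE
    I += dt * dI
    trace.append((E, I))
```

With parameters in this range, the two populations settle into sustained oscillations of the sort used to model cortical rhythms; the activities remain bounded because the saturation terms prevent runaway excitation.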
Then in 1985, he ran across an article in a Japanese journal that described a statistical physics approach to chemical reaction networks. “It took me years to understand how to use these tools for biological networks,” he said. “It so happens that there is an analogy between the behavior of chemical reaction networks and neural networks.”
Working with Michael Buice, a former graduate student in Physics who is now a postdoctoral fellow at NIH, Cowan recently was able to demonstrate this analogy and solve many of the problems on which he had worked for nearly 46 years.