
Battle Of The 'Functionals'

New tools fix many problems with density functional theory, but which one is best?
FOR CHEMISTS, the quick and simple density functional theory (DFT), which readily generates electronic structure predictions for ground-state atoms or molecules, seemed to be one of the best computational deals around. DFT predicts electronic properties of atoms or molecules from the densities of their electron clouds, rather than from the solution of the Schrödinger equation for each individual electron's motion. This greatly simplifies what would otherwise be an intractable, prohibitively expensive computational task. Using not much more than modern desktop computers, chemists routinely include DFT-based predictions in their papers to corroborate their experimental results.
[Photo caption, courtesy of Don Truhlar and Yan Zhao/U of Minnesota: Theory fixing. New functionals more accurately predict properties of compounds like this hydrocarbon nanoring (top) and this "buckyball catcher," or "buckybowl," a concave, supramolecular structure, shown with a buckyball overhead.]

But in the past few years, it has become evident that despite its ubiquity, DFT has some serious shortcomings. For example, DFT-calculated C–C bond-breaking energies for larger hydrocarbons are often inaccurate, as are DFT-based reaction-barrier heights. DFT also has trouble describing systems that involve radical cations, charge transfer, and long-range interactions. And although DFT often computes geometries well, even n-alkane bond separation energies can have large errors.

Theorists say they've been aware of these and other problems with DFT for a while now, and in response they've been churning out new, vamped-up varieties of the method. But the continuing reliance on well-entrenched, older versions of DFT could have serious consequences for the many chemists who use the method but aren't hardcore theorists. These users may have been lulled into a false sense of DFT's accuracy, warns University of Georgia chemistry professor Paul von Ragué Schleyer. "The happy days of 'black box' DFT computations are over," he says.

Now that scientists have come to expect accurate, sophisticated DFT computations, the pressure is on to improve DFT's predictive powers. Not surprisingly, the theory is experiencing a new wave of growing pains along the way. A profusion of new DFT methods solves one problem or another, but no clear winner has emerged.

The idea that quantum states can be described mathematically has been recognized since 1926, when Erwin Schrödinger derived his famous equation. The descriptions of individual electrons and their interactions with one another, however, have remained relatively difficult for quantum theory to calculate.
Even with today's most powerful supercomputers, solving the Schrödinger equation for any system more complex than a very simple molecule would take eons, to say nothing of being colossally expensive. It wasn't until the 1960s that Walter Kohn, now emeritus physics professor at the University of California, Santa Barbara, with his colleague Pierre Hohenberg and then-postdoc Lu Jeu Sham, derived the Kohn-Sham equations. These equations replaced the many variables required to describe all the electrons in the Schrödinger equation with terms for only one variable, the electron density, and led to DFT. For his development of DFT, Kohn shared the 1998 Nobel Prize in Chemistry with another theorist, the late John Pople, who developed computational methods for quantum chemistry.

DFT was originally intended to help condensed-matter physicists deal with their systems, which are awash in electrons. The theory's potential use for chemists soon became apparent as well. But while DFT was extremely fast and simple, it still couldn't match the accuracy of so-called wave function methods based on the Schrödinger equation.

Although most of the terms in Kohn's equations are easily calculated, one term presents a great deal of difficulty: the "exchange-correlation energy," which accounts for the effects of electrons' tendencies to get out of each other's way. Most of the energy theorists expend on DFT involves inventing clever ways to deal with this difficult exchange-correlation term. The different equations devised to handle exchange correlation, bearing numerous acronyms such as LDA, GGA, and PW91, are known as "functionals." There exists, in theory, a "perfect" functional that would accurately describe any system. But scientists have no way to systematically search for such a functional.
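In standard notation (not spelled out in this article, but consistent with the Kohn-Sham formulation it describes), the total energy is written as a functional of a single quantity, the electron density ρ(r), and every term is known in closed form except the last one, the exchange-correlation energy that functionals like LDA and GGA try to approximate:

```latex
E[\rho] \;=\; T_s[\rho]
\;+\; \int v_{\mathrm{ext}}(\mathbf{r})\,\rho(\mathbf{r})\,d\mathbf{r}
\;+\; J[\rho]
\;+\; E_{xc}[\rho]
```

Here \(T_s[\rho]\) is the kinetic energy of a fictitious system of non-interacting electrons, the integral is the interaction with the external (nuclear) potential, \(J[\rho]\) is the classical Coulomb repulsion of the density with itself, and \(E_{xc}[\rho]\) is the exchange-correlation term the article discusses, the one piece with no exact, practical expression.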
DFT TOOK OFF in 1993, when Axel Becke, now a chemistry professor at Dalhousie University in Halifax, Nova Scotia, injected a small amount of the wave-function-based Hartree-Fock method into the exchange-correlation term, resulting in a hybrid functional known as B3LYP, which had unparalleled accuracy. Since then, B3LYP has been wildly popular; by some estimates, it is used in more than 80% of DFT calculations.

But even B3LYP has shortcomings that have become more obvious as the atomic and molecular systems chemists examine with theory have become larger and more complex. "The challenge is to find improvements that do not sacrifice DFT's appeal in the first place: namely, its simplicity and computational efficiency," Becke says. That's a difficult balance to achieve. The more wave function aspects that get added into a functional, the more expensive and slow the calculation becomes. "There's no free lunch," says Jeng-Da Chai, a postdoc who, with UC Berkeley chemistry professor Martin Head-Gordon, has developed functionals that accurately treat dispersion forces (weak intermolecular attractions) and systems with large charge separations.

NEVERTHELESS, a number of labs boast new functionals that they say have done away with many of DFT's troubles. "A lot of the problems have been solved already," says Donald Truhlar, a chemistry professor at the University of Minnesota, who, with his postdoc Yan Zhao, has developed functionals that accurately predict once-problematic chemical phenomena, including noncovalent interactions and π-π stacking. The biggest issue now, Truhlar believes, is one of inertia: it will take time to shift the popular mentality to the new functionals, including the Minnesota group's, that are now being included in computational chemistry software programs.

Like Becke and Head-Gordon, Stefan Grimme, a chemistry professor at Germany's University of Münster, prefers to include more aspects of wave function theory in his functionals.
The advantage of this strategy, he says, is that such functionals are general. "Our functional should work for everything," he says.

Work from the lab of Troy Van Voorhis, a chemistry professor at Massachusetts Institute of Technology, now accurately treats mixed-valence compounds. If a complex has two metal centers, for example, one with a 2+ charge and the other with a 3+ charge, DFT had tended to split the charge evenly, giving each metal atom a charge of 2.5+. Van Voorhis' work avoids that problem.

Although the new functionals are proliferating, at some point the community will have to settle on which are best, Van Voorhis says. "If you have 18 different functionals, and you get 18 answers, you can probably find one that will reproduce what you want it to say," he says. "You lose the rigor of being able to say, 'This is wrong.' "

Still, many theorists see DFT as the computational theory of the future, especially for large problems like biological systems. "Any wave function theory that would be remotely accurate would be totally unaffordable," Truhlar says. "That's why DFT has taken over everybody's imagination."
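The hybrid-functional idea at the center of this debate can be made concrete. B3LYP combines a fraction of exact Hartree-Fock exchange with density-functional exchange and correlation using Becke's three standard empirical parameters (a0 = 0.20, ax = 0.72, ac = 0.81). A minimal arithmetic sketch of that mixing follows; the component energies fed in at the bottom are illustrative placeholders, not values from any real calculation:

```python
# B3LYP's three-parameter hybrid mixing of exchange-correlation energy:
#   E_xc = (1 - a0)*E_x^LSDA + a0*E_x^HF + ax*dE_x^B88
#        + (1 - ac)*E_c^VWN  + ac*E_c^LYP
# a0 controls the fraction of exact Hartree-Fock exchange; ax scales the
# B88 gradient correction to exchange; ac mixes LYP into VWN correlation.
def b3lyp_xc(e_x_lsda, e_x_hf, de_x_b88, e_c_vwn, e_c_lyp,
             a0=0.20, ax=0.72, ac=0.81):
    exchange = (1.0 - a0) * e_x_lsda + a0 * e_x_hf + ax * de_x_b88
    correlation = (1.0 - ac) * e_c_vwn + ac * e_c_lyp
    return exchange + correlation

# Illustrative placeholder component energies (in hartrees), not real data:
e_xc = b3lyp_xc(e_x_lsda=-1.00, e_x_hf=-1.10, de_x_b88=-0.05,
                e_c_vwn=-0.20, e_c_lyp=-0.25)
print(round(e_xc, 4))  # prints -1.2965
```

In practice, users never assemble this sum by hand; quantum chemistry packages expose the choice of functional as a single keyword, which is part of why switching away from an entrenched default like B3LYP is as much a matter of habit as of science.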