For more information about item submission and attendance, see About the Technical Calendar.
Wednesday, October 10
Statistical Methods in Materials Science: An Outlook into the Present and Future
Fernando Reboredo, Materials Science and Technology Division, ORNL
Materials and Chemistry Seminar Series
10:00 AM — 11:00 AM, Building 4500-N, Weinberg Auditorium
Contact: Athena S. Sefat (email@example.com), 865.574.5495
Abstract: While any expert could write something like "LDA does not provide a good description of material X," or "GW fails to predict that material Y is an insulator," or "PBE+U provides a very good description in some cases but not always," no expert would ever write or even say "Quantum Mechanics fails." It is the approximation that we think fails; Quantum Mechanics is widely believed to be able to describe any material, if only we could do the calculation. Despite this broad consensus, there is still intense debate about the interpretation of Quantum Mechanics. What happens, for example, when a small system is connected to the large number of bodies that constitute any experimental probe is still a matter of controversy. This debate has been fueled by the difficulty of performing accurate quantum many-body calculations in large systems.

Historically, the discovery of an accurate experimental probe, such as scanning tunneling microscopy, or of a growth method, such as molecular beam epitaxy, has opened entire new areas of research and changed the accepted views of what is interesting and what is not. Likewise, improvements in computing power have opened new avenues for theoretical efforts and could redefine which materials are interesting. Many materials can have unusual properties, but only those that are understood are useful in practice. Furthermore, we know that the approximations are supposed to fail, sooner or later; only a failure of Quantum Mechanics would be really interesting. If we could do the calculation, and if Quantum Mechanics works as expected, we could control that material to an unprecedented degree. Therefore, materials that can be calculated accurately are interesting, whether we fail or succeed in the prediction.

In the last decade, supercomputers have increased in power only by becoming vastly parallel machines.
In the near future, any research computer in the world will consist of tens of thousands, and sometimes millions, of computing cores. A numerical solution to a theoretical problem, to be practical, must adapt to and take advantage of this new reality. Statistical methods appear to be an obvious route to follow because they are easily distributed across large parallel machines and do not suffer from the communication bottlenecks of current approaches. Recent advances in new algorithms promise not only to take advantage of this computational resource but also to deliver results of unprecedented accuracy for real materials of interest to basic energy sciences, with almost no approximations.
In this talk I will describe some of our efforts in ab initio statistical methods: (1) Quantum Monte Carlo (QMC) for many-body calculations with minimal approximations in real materials, and (2) the Wang-Landau approach as an avenue to incorporate finite temperature within a DFT framework to study magnetic phenomena in complex systems. I will cover some common aspects of statistical methods and some fundamental issues of the QMC approach that have recently been brought under control (such as the infamous sign problem). I will also indicate which materials we expect to be able to treat essentially without significant approximations in the near future.
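To give a flavor of the Wang-Landau approach named above: it estimates the density of states g(E) directly, via a random walk in energy whose acceptance rule flattens the energy histogram. The sketch below is purely illustrative and is not the speaker's implementation; the 4x4 ferromagnetic Ising lattice, the 0.8 flatness criterion, and the stopping threshold are all assumptions chosen so the example runs in seconds.

```python
import math
import random

L = 4        # illustrative lattice size (4x4 periodic Ising model)
N = L * L

def energy(spins):
    """Total Ising energy, counting each nearest-neighbor bond once."""
    e = 0
    for i in range(L):
        for j in range(L):
            e -= spins[i][j] * (spins[(i + 1) % L][j] + spins[i][(j + 1) % L])
    return e

def wang_landau(flatness=0.8, ln_f_final=1e-3, seed=1):
    """Return a dict mapping energy -> estimated ln g(E) (up to a constant)."""
    rng = random.Random(seed)
    spins = [[rng.choice([-1, 1]) for _ in range(L)] for _ in range(L)]
    e = energy(spins)
    ln_g = {}        # running estimate of ln g(E)
    hist = {}        # visit histogram for the current refinement stage
    ln_f = 1.0       # modification factor, refined toward 0
    while ln_f > ln_f_final:
        for _ in range(10000):
            i, j = rng.randrange(L), rng.randrange(L)
            s = spins[i][j]
            de = 2 * s * (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                          + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            e_new = e + de
            # accept with probability min(1, g(E_old)/g(E_new))
            d = ln_g.get(e, 0.0) - ln_g.get(e_new, 0.0)
            if d >= 0 or rng.random() < math.exp(d):
                spins[i][j] = -s
                e = e_new
            # every step, penalize the current energy and record the visit
            ln_g[e] = ln_g.get(e, 0.0) + ln_f
            hist[e] = hist.get(e, 0) + 1
        # when the histogram is roughly flat, refine f -> sqrt(f)
        if min(hist.values()) > flatness * (sum(hist.values()) / len(hist)):
            ln_f /= 2
            hist = {}
    return ln_g
```

Because the walk is biased toward rarely visited energies, it reaches both ground states (E = -2N) and the highest-energy checkerboard states (E = +2N), and the resulting ln g(E) peaks near E = 0 where most configurations lie. The same flat-histogram idea scales to the DFT-based magnetic models mentioned in the abstract, with each energy window sampled on an independent processor group.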
The Materials and Chemistry Seminar Series is hosted jointly by the Materials Science and Technology Division and the Chemical Sciences Division.