By Marge d’Wylde
The development of quantum mechanics as we know it today began about 100 years ago. Physicist Niels Bohr made foundational contributions to the understanding of atomic structure and quantum theory in the 1920s.
Physicist Erwin Schrödinger published his famous “Schrödinger’s cat” thought experiment, highlighting the paradox of quantum superposition, in 1935. Chemist G.N. Lewis, a renowned researcher and dean of the College of Chemistry from 1912 to 1941, experimented extensively in relativity and quantum physics and, in 1926, coined the term “photon” for the smallest unit of radiant energy.
Early work in quantum chemistry was pioneered by physicists Walter Heitler and Fritz London, who published a study on the covalent bond of the hydrogen molecule in 1927. Quantum chemistry was subsequently developed by a large group of researchers, including theoretical chemist Linus Pauling at Caltech and physicist John C. Slater at Harvard, who introduced theories such as molecular orbital theory.
Today, many of our College of Chemistry scientists work across the disciplines of chemistry and physics. In fact, the College is renowned for its pioneering work in this area. Physicist and Nobel Laureate Ernest Lawrence arrived at the University of California’s Berkeley campus, having been wooed from a faculty position at Yale University by promises that included working with chemists in the College of Chemistry. His research relationships with G.N. Lewis, Glenn Seaborg, and Melvin Calvin helped lead to groundbreaking work in heavy water processing, the discovery of plutonium, and the elucidation of carbon dioxide assimilation in plants.
The first theoretical chemist at the College of Chemistry was the renowned Kenneth Pitzer who joined the faculty in 1937 after completing his Ph.D. in just two years. Pitzer was the founder of modern theoretical chemistry at Berkeley. He not only used quantum and statistical mechanics to explain the thermodynamic and conformational properties of molecules, but also pioneered quantum scattering theory for describing chemical reactions at the most fundamental level. He also made contributions to relativistic effects in chemical bonding and the theory of fluids and electrolyte solutions.
Robert Harris was next, arriving in 1963, followed by William Miller and Henry Schaefer in 1969. William Lester joined in 1981, and David Chandler became a faculty member in 1986. Interestingly, Schaefer’s story about becoming a theoretical chemist echoes tales from some of our current scientists in the theoretical chemistry group. Schaefer’s true scientific calling was revealed during an organic chemistry course when he almost demolished a fume hood with a failed aniline experiment. According to his professor, “The damage here is far in excess of the lab fee you paid. Have you ever considered theoretical chemistry?”
Today there are seven active faculty in the College’s theoretical group, working on some large-scale problems. Quantum computers, quantum biology, and quantum dots are taking up some of the faculty’s time. Others are working on theoretical questions about the structure of aqueous solutions and interfaces, nanostructures, cell signaling, and electronic structure.
Simulating chemical reactions is a complicated process. Today it takes powerful computing to model the increasingly complex molecular structures being researched. In tandem with the computers, theoretical chemists develop ever more sophisticated algorithms and mathematical analyses as chemistry research moves deeper into the quantum realm.
U.S. government agencies are also relying on theoretical chemists’ research in groundbreaking efforts to build a fully realized quantum computer and network, hoping to be the first to establish a quantum internet. In a new “space race”, other quantum network efforts are underway in Japan, the U.K., the Netherlands, and China. There is much to be gained from getting there first.
One possible future for theoretical chemistry is the day when chemists discover new chemical reactions with a computer and then verify them in the lab. In 2019, Google researchers announced they had used a quantum computer to simulate a chemical reaction for the first time. Although a simple reaction, this event marked a step towards verifying the use of quantum computers in chemistry.
According to Ryan Babbush who worked on the project at Google, “We’re doing quantum computations of chemistry at a fundamentally different scale now. Years ago, quantum calculations were done with a pencil and paper by hand. While this reaction may be relatively basic, and it isn’t necessary to have a quantum computer to simulate it, this work is still a big step forward for quantum computing. Scaling this algorithm up to simulate more complex reactions should be fairly easy.”
Here, we catch up with four members of our theoretical group to see what they are working on: Martin Head-Gordon, Director of the Pitzer Center; Birgitta Whaley; Phillip Geissler; and Eran Rabani.
Martin Head-Gordon: Discovering the frontiers of electronic structure theory
- Kenneth S. Pitzer Distinguished Professor of Chemistry
- Director of the Pitzer Center for Theoretical Chemistry at UC Berkeley
- Fellow of the Royal Society
- Member of the National Academy of Sciences and American Academy of Arts and Sciences
Martin Head-Gordon’s theoretical chemistry research focuses on the frontiers of electronic structure calculations through the development of novel theories and algorithms. His investigations center on the development and application of electronic structure theories to analyze problems that are currently beyond the reach of standard methods. Since this information is crucial to understanding and controlling the chemistry of molecules, applications of electronic structure theory play an important and growing role in many areas of chemistry. Realizing this goal generally requires coupling fundamental quantum mechanics with large-scale scientific computing.
Head-Gordon says of his research, “There is this concept that really began with Nobel Laureate John Pople about the fact that a good quantum chemical model should require no input other than what atoms are involved, where are the nuclei, and then the rest should be calculated by the model. You can then try and validate the model on some chemistry where you know the answers pretty well. If the results come out to your satisfaction, you can begin to computationally predict answers to other research questions with the model.”
Head-Gordon is Director of the Pitzer Center for Theoretical Chemistry. The Center was established in 1999, through an endowment from the Pitzer Family Foundation, to provide a home for excellence in theoretical chemistry by enhancing the education and research of students at Berkeley. It honors Professor Kenneth Pitzer (Ph.D. ’37, Chem) and his wife Jean Mosher Pitzer. The Center is housed in sections of historic Gilman Hall, which once held the labs of Nobel Laureates William Giauque and Glenn Seaborg and is now home to seven active theoretical chemists and their labs.
Building better theoretical computations
What kinds of problems are you currently working on?
MHG — In terms of my group’s perspective, we are mostly about trying to exploit breakthroughs in computing hardware by making our own software, making new theories, and ultimately new algorithms. It is kind of like a food chain: from formal theory, to algorithms, to software, and then finally to groups that can use our software to solve problems.
Some members of the group are working on making better density functionals. Others are bridging between numerical experiments, which is what we do in quantum chemistry, and the research of experimental chemists. It is a form of energy decomposition analysis (EDA). For example, our EDAs are being used to understand the origin of hydrogen-bonding, to design better force fields (with Teresa Head-Gordon’s group), and to understand the role of non-bonded interactions in catalysis (with John Hartwig’s group).
Because our computational methods complement experiments, we often collaborate with experimental groups. One productive area of collaboration that I already briefly alluded to is catalyst design and function for areas such as new energy (converting photons to fuels), chemical transformations, and polymer upcycling.
What is Q-Chem software?
MHG — I have been involved in the Q-Chem project since the early 1990s. It was started by two members of John Pople’s research group in 1992. He was my Ph.D. advisor at Carnegie Mellon. I joined as the third founder in 1993. We describe it as open team software because there are 100 to 200 active developers around the world who submit source code to the project. The software is licensed worldwide. We’re lucky that we’ve been able to build up a significant community around Q-Chem with contributions to areas that include density functional theory, electron correlation, excited states, molecular interactions, and more. John Pople joined us as a director and code developer in 1999.
We think there are something like 50,000 copies of our code being used by chemists, material scientists, and others around the world. I think it’s a way to have a larger impact in chemistry research. You can learn more about the Q-Chem project at https://www.q-chem.com/.
How are you solving chemistry problems using a simulated world?
MHG — There is the real world and then there’s the simulated world, and they should match. Quantum mechanics is thought to be essentially exact for chemistry, just like Newton’s equations are exact for predicting satellite orbits around the Earth. But while we can solve Newton’s equations as accurately as we like and then build rockets and launch satellites on that basis, the equations of quantum mechanics are very difficult to solve. So, while Schrödinger’s equation itself is exact, we can’t solve it exactly. And so that’s what keeps my research area busy. We are trying to improve our approximations and make them more accurate, so they can be more useful for predicting chemistry. That’s what drives electronic structure theory.
What is your current interest in quantum computing?
MHG — Currently, I am collaborating with Birgitta Whaley on some research because quantum chemistry is thought to be one of the possible early applications for quantum computers. I published a paper in Science in 2005 which simulated quantum chemistry on a quantum computer. Quantum computing research is a focus of intense interest right now because of exciting advances in hardware, with the prospect of more to come.
Nothing can simulate a quantum system better than another quantum system. For that reason, quantum chemistry is a promising application on a quantum computer. It will be really fascinating to see where that goes.
Solving future problems
What is a big problem that can benefit from a quantum chemistry focus?
MHG — The Haber-Bosch process (which takes nitrogen from the air and converts it into ammonia to make fertilizers) is one of the most energy intensive chemical processes in the world. If we could have a green alternative to that, it would be wonderful.
The ideal solution would be to use solar energy to produce ammonia. Imagine a catalyst that could take atmospheric nitrogen and water and, with a little electrocatalytic magic and some green energy, produce ammonia and oxygen. In fact, there are already some catalysts that will do this. But they are not yet active or efficient enough to be industrially useful.
Quantum chemistry can help by unraveling the mechanism of an existing catalyst whose turnover frequency is too slow or whose energy cost is too high. Once we understand the origin of its limitations, we might be able to improve it or gain inspiration for a better catalyst.
Birgitta Whaley: The quantum realm
- Professor of Chemical Physics
- Director of the UC Berkeley Quantum Information and Computation Center
- Co-Director of the NSF Challenge Institute for Quantum Computing
- Executive board member, Center for Quantum Coherent Science
- Fellow, American Physical Society and Member, American Academy of Arts and Sciences
- Emeritus Member, President’s Council of Advisors on Science and Technology
Birgitta Whaley is a pioneer in the field of quantum mechanics. She is a foremost expert in the fields of quantum information, quantum physics, molecular quantum mechanics, and quantum biology. She is investigating the role of quantum mechanics in functional biological systems. Another research focus is on quantum control and quantum information. Her theoretical work in quantum computation is currently centered on creating algorithms, applications, and error correction for near-term noisy quantum computers.
Whaley states, “When it comes to really understanding the quantum world, we are still at an early stage. Quantum mechanics poses fundamental questions for our understanding of the world in which we live — from the behavior of the smallest particles to that of black holes, from the behavior of simple physical systems to complex chemical and biological phenomena that drive life.”
In a recent talk, Whaley discussed new quantum biology insights, “Current research is showing mounting evidence for the existence of dynamical phenomena in biological systems that involve coherent quantum motion in unexpected situations, requiring us to revise our long-standing view of quantum effects in biology. New quantum research into plant and bacterial photosynthesis, bird navigation, and other biological processes is more than just cataloging the quantum chemical properties of the biological systems. What we are looking at, and what makes it very challenging, is that we want to understand the relationship of the biophysics/chemistry bridge to biological function. We are now looking at integrating microscopic, mesoscopic, and behavioral studies into our research and that is very challenging.”
On quantum biology
Why is the discovery of quantum mechanics in biology more recent than in chemistry?
BW — The underlying influence of quantum mechanics in biology via its role in determining molecular energy levels and reaction barriers was acknowledged in the 1930s, but things got more interesting for studying quantum biology after the advent of the laser in the 1960s. To get to the timescales where non-trivial dynamical quantum effects are relevant for biological systems, the timescale of electronic motions over femtoseconds had to be accessed. And that did not happen until methods of ultrafast spectroscopy were pioneered in the 1970s and 1980s. Previously, biologists realized that molecules had discrete energy levels but had no way to access the associated quantum dynamics of key molecular processes.
What are scientists looking for in quantum biology research?
BW — The line between the quantum and classical realms in physics and chemistry is porous. This is part of the reason why this is such an exciting area.
The overarching goal of research into quantum biology is to understand biological function across all time and size scales. This means two things. The first is to develop tools that can go all the way down from the macroscopic to sub-atomic particles to probe the structure and dynamics of biological systems. The second is to travel from the smallest to the largest component in biology, asking the question, “Can quantum coherence be relevant for a biological function?”
Function is like the jewel in the crown at the top of our understanding of biological systems. You shouldn’t speculate about function before you actually know what the structure really is. And after structure, then you should start thinking about dynamics. How does this work? What are the mechanisms? What are the timescales? And what are the quantum features that you might see in measurements? And only when all this is understood can you start to think about function.
What is next for quantum biology research?
BW — Recent research has shown mounting evidence for the existence of dynamical phenomena in biological systems that involve coherent quantum motion in unexpected situations, requiring scientists to revise their long-standing view of quantum effects in biology as being restricted to understanding molecular energetics, stability, and kinetics.
These revelations have led to a new generation of studies bringing the tools of advanced quantum optics and quantum information science and technologies to probe complex biological phenomena such as photosynthesis and bird navigation, raising our understanding of the role of quantum mechanics in functional biological systems.
On quantum computers
Are there commercial applications for quantum computers?
BW — In late 2018, Congress passed the National Quantum Initiative which directed the Office of the President to establish goals and priorities for a 10-year plan to accelerate the development of quantum information science and technology applications. I was invited to join the President’s Council of Advisors on Science and Technology to create recommendations in the areas of Industries of the Future, including artificial intelligence, quantum information science, and advanced manufacturing and biotech. The quantum recommendations included building infrastructure at scale, utilizing Federal investment in quantum computing user facilities and a quantum internet and intranet. A number of federal programs have since been set up to explore scientific and information processing applications of quantum computers. Many of these will have commercial applications, e.g., in communications, the chemical and pharmaceutical industries, or in the financial sector.
What is currently happening in quantum computer research?
BW — Quantum computers are not going to be sitting on our desktops any time soon, but we are now building them. They have the potential to execute complex algorithms billions of times faster than classical computers. Currently we are at “stage one”, with quantum machines having on the order of 50–70 quantum bits (qubits). What is not clear is when and how we will go from stage one to stage two, with more than 100 qubits and some error correction. In the current iteration, we don’t have enough qubits to explore the full power of quantum algorithms.
Performing the simulations that many people undertake in chemistry on classical computers is very time consuming when done at scale. For instance, when Phillip Geissler is simulating the chemical dynamics of multiple solutes in water, he needs to make a simulation that is as large as possible. But the time for that computation can scale exponentially with the number of particles, which puts a constraint on how much information it can yield.
However, if you can do the simulation on a quantum computer with an algorithm whose time scales only polynomially with the number of particles, there is a significant gain. It’s a difference in the scaling of compute time as you go to larger numbers of particles or degrees of freedom.
Phillip Geissler: Statistical mechanics of complex materials and biological systems
- Aldo De Benedictis Distinguished Professor of Chemistry
- Member, American Chemical Society
- Awarded the Donald Sterling Noyce Prize for Excellence in Undergraduate Teaching and the UC Berkeley Distinguished Teaching Award
- Alfred P. Sloan Fellowship and a Kavli Frontiers Fellow
Phillip Geissler uses theory and computer simulations to study molecular phenomena at the frontiers of physical chemistry. His work spans a variety of systems ranging from aqueous solutions and interfaces to biomolecular assemblies, to nanoscale materials. These systems share essential motifs of disorder and heterogeneity that are amenable to the tools of modern statistical mechanics. Using those tools, he explores emergent physical principles which govern complex molecular systems.
“A characteristic feature of the microscopic world is that nothing stands still. For instance, if you look at a glass of water on a macroscopic scale, it will appear that nothing is happening,” Geissler states. “It looks completely placid. But it is universally true that if you zoom in and focus on what’s going on at the scale of nanometers or angstroms the molecules are in constant motion.”
His work with longtime collaborator Richard Saykally, a physical chemist in the department, has clarified many aspects of these molecular motions in liquid water. For example, their combined experimental and theoretical approach resolved a forty-year-old question about how water molecules arrange themselves in a liquid drop. They demonstrated that nearby molecules in water adopt not just a few intermolecular structures, but instead a whole continuum of arrangements, from strong hydrogen bonds to highly distorted geometries. They later showed how dissolved salts can stick to the interface between liquid water and air, controlled in part by microscopic undulation of the liquid’s surface.
On creating theoretical solutions
What are examples of your research in statistical mechanics?
PG – In a very recent example, we showed that active particles – self-propelled particles that move in different ways than typical molecules – can crystallize much as common substances do. Like bacteria and flocks of birds, these particles consume energy to power their motion and thus do not follow familiar laws of equilibrium thermodynamics. And yet they can condense and crystallize in close analogy to water. This tantalizing result suggests that we can eventually understand living, nonequilibrium systems with principles and concepts similar to those we currently teach in undergraduate physical chemistry. My young theory colleagues David Limmer and Kranthi Mandadapu are doing important research to discover such new laws.
In another example, we worked with Berkeley Lab material scientist Thomas Russell to develop a way to print 3-D structures composed entirely of liquids. The approach uses a modified 3-D printer to inject threads of water into silicone oil, sculpting tubes made of one liquid into another liquid. The key is a special nanoparticle-derived surfactant that locks the water in place, in essence a special soap that prevents the tubes from breaking up into droplets. Our theoretical contribution helped establish a physical basis for the locking mechanism, which gives clues on how to improve these nanoparticles.
What kinds of theoretical problems interest you?
PG — What fascinates me is how the variability of microscopic structure can dictate the way things work. The kinds of problems that appeal to me are problems where the answer is not a single structure, or not just an average, but the answer really lies in understanding what underpins those variations and what combinations of molecular motions are important to how a particular phenomenon works.
For over a hundred years, we have known that macroscopic properties of materials (their conductivities, heat capacities, and catalytic behaviors) are controlled by fluctuating arrangements of atoms and molecules. But in many areas of chemistry, experimental tools have emerged only in the last couple of decades to witness in vivid ways the existence and the importance of those variations. The growing, detailed recognition that microscopic variability is essential to how proteins function and assemble and how nanocrystals form and react provides a wealth of exciting problems for us to work on.
Nano-science has been a revolution in that regard, both in developing the ability to image what goes on at microscopic scales, but also in synthesizing miniature materials that demand an accounting of fluctuations. So now systems of interest have moved from things that look placid to things that are fundamentally variable. I like to say this is the age of statistical mechanics.
How do theorists and experimentalists work together?
PG — Historically, there was not a clear separation between experiment and theory. That has been a product of the last 50 years or so. I think it is important to recognize how the scientific process plays out in chemistry. When we say we understand something, it’s because we have a theoretical model for how the world works and we can compare that model with what happens in reality. Traditionally, those two things were inseparable parts of the research being done by chemists. They developed the model descriptions, mathematical descriptions of how the universe might work at the molecular scale and performed measurements in the laboratory to hold up against those models.
Science is much more specialized today. Laboratory skills and innovations have become technical in a way that really occupies most of one’s expertise. The same thing has happened on the theoretical side. The kinds of mathematical analysis and computing skills required to build complex molecular models have become a specialized endeavor. But the process of discovery still requires combining these capabilities. It also benefits profoundly from combining experimental and theoretical perspectives. I have come to realize how strongly the way you think about a problem is shaped by the tools that you have.
On creative teaching
You are a highly regarded teacher at the College. One of your known specialties is playing the guitar and singing in Chem 1A classes. How did you get started with that?
PG — My father and mother are accomplished musicians. I grew up in a very musical household and from a young age music really appealed to me. It took a while for me to find my instrument the same way it took a while for me to find theoretical chemistry as my discipline. After several years of failed piano lessons, I found the guitar and fell in love. (Finding theory as my love required several years of growing bacteria and breaking glassware.)
The songs I use in the course were introduced to me by my high school biology teacher. She and I even performed one at a high school talent show. The songs were written by Michael Offutt, who is still composing and performing today. They involve very basic chemistry and turned out to be perfect for Chem 1A.
I was surprised to see how often my musical performances were mentioned in students’ course evaluations. Many students wrote that the songs helped them study and remember important concepts. And for some of them, it was just entertaining. I think any time someone who is passionate about teaching tries a new way of communicating, something good comes out of it. Some students learn in ways that you never really thought about. The songs really resonated with many of them.
Eran Rabani: Theoretical and computational nanoscience
- Glenn T. Seaborg Chair in Physical Chemistry, UC Berkeley
- Professor of Chemistry, Tel Aviv University
- Founder and Director of The Sackler Center for Computational Molecular and Materials Science, Tel Aviv University
- Some of his many awards include Visiting Miller Professorship, The Bruno Memorial Award, and the Israel Chemical Society award
Eran Rabani is a pioneer in developing theoretical and computational tools to investigate fundamental properties of nanostructures. He currently investigates the structural, electronic, and optical properties of nanocrystals, doping of nanoparticles, exciton and multiexciton dynamics at the nanoscale, and transport in correlated nano-junctions. Much of this relies on the development of stochastic electronic structure techniques to describe the ground and excited state properties in large-scale nanostructures.
He is also a pioneer in describing real-time dynamics of many-body interacting systems. In 2011, he was inspired by a challenge from Nobel Laureate Philip W. Anderson who wrote that the understanding of classical glasses was one of the biggest unsolved problems in condensed matter physics. Rabani and his colleagues at Columbia University were intrigued and asked the question, “If we looked at the material at the quantum level, would we still see the hallmarks of a classical glass?”
The researchers demonstrated that under very special conditions, namely a few degrees above absolute zero, glass could melt.  “It all has to do with how molecules in materials are ordered,” Rabani explained. “At some point in the cooling phase, a material can become glass and then liquid if the right conditions exist.”
On recent theoretical findings in quantum dot research
You are part of a research team that recently announced exciting new findings about the behavior of quantum dots.
ER – Bright semiconductor nanocrystals known as quantum dots give QLED TV screens their vibrant colors. But attempts to increase the intensity of that light generate heat instead, reducing the dots’ light-producing efficiency. Results from our recent study have broad implications for developing future quantum and photonics technologies.
Members of my lab worked with other labs at Stanford and Berkeley to study the behavior of quantum dots as they were hit with various wavelengths and intensities of laser light. The scientists used a high-speed “electron camera” to watch dots turn incoming high-energy laser light into their own glowing light emissions. Several members of my lab, including graduate students Dipti Jasrasaria and John Philbin, worked to calculate and understand the resulting interplay of electronic and atomic motions from a theoretical standpoint. 
We met with the experimenters regularly. Ideas were flowing back and forth between the team members, but it was all seeded from the experiments, which were a big breakthrough in being able to measure what happens to the quantum dots’ atomic lattice when it’s intensely excited.
The theoretical question we looked at specifically was the mechanism for this outcome. That is what my group explained. Through atomistic simulations and coupling many different ideas together, we ended up with a core picture which we were able to sync with the observable outcomes from the experiments. I thought that was impressive.
How does this research enhance our understanding of photonics?
ER — This research is part of an ongoing DOE Energy Frontier Research Center grant on photonics at thermodynamic limits managed by Jennifer Dionne who is the Senior Associate Vice Provost for Research Platforms at Stanford. The center includes four principal investigators from Berkeley including myself, Paul Alivisatos, Naomi Ginsberg, and Eli Yablonovitch who is from Electrical Engineering and Computer Science.
The center’s research goal is to demonstrate photonic processes, such as light absorption and emission, at the limits of what thermodynamics allows. This could bring about new technologies, including refrigeration, heating, and cooling powered entirely by light.
In this project, using femtosecond electron diffraction measurements corroborated by atomistic simulations, we uncovered transient lattice deformations accompanying radiationless electronic processes in colloidal semiconductor nanocrystals. Investigation of the excitation energy dependence in core/shell systems showed that hot carriers created by a photon energy considerably larger than the bandgap induce structural distortions at nanocrystal surfaces on few-picosecond timescales, associated with the localization of trapped holes. Elucidation of the structural deformations associated with the surface trapping of hot holes provides atomic-scale insights into the mechanisms that deteriorate optoelectronic performance and a pathway towards minimizing losses in nanocrystal-based devices.
On the future of theoretical chemistry at UC Berkeley
If you could predict the future of theoretical chemistry, what would it look like?
ER — I think it’s very hard to predict the future, even though I’m a theorist and that’s my job. I know how to do it for molecules, but to predict what human beings are going to discover is harder to do. What’s unique about our department’s members is that their approach is broad so there is a chance of discovering things in many different directions, not just in one. All of them are equally important.
What is significant about the theoretical chemistry research at Berkeley?
ER — What is unique about Berkeley, and different from the other places I have visited, is the very collaborative nature of the organization. I collaborate with many experimentalists from our department (Paul Alivisatos, Peidong Yang) and from other departments, and also interact very closely with other theorists, like David Limmer and Phill Geissler.
This is very important because it allows our students to be exposed to many different ideas and ways of working theoretically. The department offers a unique environment in that regard. The faculty are all working on distinctive, exciting research. Also, we have a big theory group here at Berkeley with tremendous quality in the research.
Where do you think the Pitzer Center theoretical chemistry group is headed?
ER — Theoretical chemistry is a modern discipline within the field of chemistry. Historically, a department would hire only one or two theoretical chemists. In recent years, the need for theoretical and computational support and the complexity of questions asked in chemistry, physics, and materials science have required closer collaboration between theory and experiment. Today, we have seven theoretical chemists in the department working in diverse fields.
I have seen significant progress in the last decade in our understanding of the behavior of more complex chemical systems, and in our ability to describe complex phenomena on more diverse length and time scales. A lot of progress has been made in fields including quantum information, biology, and materials science. Berkeley’s theoretical group has been central to many of these recent developments.
We also have an exceptionally talented younger generation of faculty working on extremely challenging problems. Together with the more senior scientists I see a brilliant future for theoretical chemistry at Berkeley.