A Summer of IBOs

Hello everyone! This is P.J. Robinson and I’m currently a fourth-year undergraduate physics major at UCLA. Over this past summer, I was a John Stauffer VURP fellow in the Chan group. I’m excited to share a bit about my time with the group in this blog post.

I had a great summer with the group and at the institute in general. Caltech was an amazing place to work, and I was continually impressed by the delightfully nerdy happenings around campus. Two that particularly stood out were a giant eclipse celebration on the Beckman lawn and a space-themed extravaganza for the Cassini spacecraft’s descent into Saturn’s atmosphere.

Now, on to what I actually did. My project was to extend and explore the capabilities of Intrinsic Bond Orbitals (IBOs) and Intrinsic Atomic Orbitals (IAOs) in the solid state. IBOs and IAOs were originally developed as a bridge between quantum chemical calculations and chemical intuition. Chemical bonds and atomic charges are central to chemistry; however, neither is uniquely defined, and over the past half-century numerous methods have been developed to recover chemical intuition from computed wavefunctions. IBOs and IAOs are a recent addition: they are simple to construct and have been shown to produce the ‘correct’ bonding picture even in unconventional situations. This makes them an ideal candidate for application to the solid state. With Professor Chan, I worked on adapting the IBO/IAO method for solids in PySCF. We also explored using the IAO projections to build better model Hamiltonians for complex solids.
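To make the non-uniqueness of atomic charges concrete, here is a toy numpy sketch (not the IAO construction itself, and not the group's code) of the classic Mulliken partitioning for a two-orbital "diatomic"; the overlap value and orbital are made up purely for illustration:

```python
import numpy as np

# Overlap and density matrix for two overlapping 1s-like AOs, one per atom,
# with a doubly occupied bonding orbital. Numbers are invented for illustration.
S = np.array([[1.0, 0.4],
              [0.4, 1.0]])

# Bonding MO c = (phi_A + phi_B)/sqrt(2 + 2*S_AB), occupied by 2 electrons.
c = np.array([1.0, 1.0]) / np.sqrt(2 + 2 * 0.4)
D = 2.0 * np.outer(c, c)            # density matrix in the AO basis

# Mulliken population: q_A = sum over AOs on atom A of (D S)_{mu mu}.
# Other partitionings (Loewdin, IAO, ...) give different charges from the
# same wavefunction -- that is exactly the non-uniqueness discussed above.
pops = np.diag(D @ S)
print(pops)                          # each atom gets 1 electron, by symmetry
assert np.isclose(pops.sum(), 2.0)   # total electron count is recovered
```

Any sensible partitioning must at least recover the total electron count, as the assertion checks; the per-atom split is where schemes differ.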

In addition to my project, I had the opportunity to learn about the group’s cutting-edge methods from group meetings and seminars. The frequent discussions I had around the group were some of the highlights of my summer.

High-Temperature Superconductivity via Quantum Chemistry

Garnet’s first blog post revisited the long-standing effort in the group to do wavefunction-based correlation calculations in crystals. As he pointed out, many of our research projects try to push the boundaries of quantum chemistry and take years to complete (or not). In this post, I’ll write about our ongoing quest to understand high-temperature superconductivity, which has been an exciting journey since the beginning.

Garnet’s interest in strongly correlated materials dates back many years, but he started seriously working in this field in 2010 with Dominika Zgid, then a postdoc in the group. They initially explored dynamical mean-field theory (DMFT), which, among other things, is a framework into which one can easily plug quantum chemistry methods. Their work increased the popularity of quantum chemistry impurity solvers, e.g., truncated CI solvers, in the DMFT community [1]. But more importantly, problems such as the infinite expansion of the bath and the complexity of working with Green’s functions became the motivation to find an alternative “DMFT” method based on wavefunctions.

This led to the invention of density matrix embedding theory (DMET) in the paper by Gerald Knizia and Garnet [2]. The new approach, while embracing the essential idea of embedding from DMFT, improved the overall numerical tractability by switching the primary variable of interest from the Green’s function to the ground-state wavefunction.

With this powerful new tool, it became possible to study the ground states of various interesting model systems. The group worked on both applications of the method, such as the honeycomb Hubbard model [3] and transition metal oxides, and methodological extensions, such as the treatment of spectral properties [4], electron-phonon coupling [5], and molecular calculations [6].

I joined the group during this period. As an initial project, I worked with Qiaoni Chen and Barbara Sandhoefer, both postdocs in our group, to study transition metal oxides, hoping to find the metal-insulator transition boundary. Unfortunately, we found that it was difficult to precisely characterize the boundary due to various convergence problems, and the work was never finished. However, it was a great exploration into various numerical aspects of DMET and was a very useful experience for our future work in tackling these problems.

At the later stage of that project, Garnet and I were also thinking about extending the kinds of systems that could be studied using DMET. After a long and painful search, we realized that DMET could easily incorporate broken particle-number-symmetry wavefunctions, which meant we could study superconductors directly. Though the physics was clear, it took quite a long time to work out the actual math. At first glance, the number of bath orbitals seemed to double with BCS wavefunctions. Only after detailed analysis did I find that the Schmidt decomposition of a normal-state Slater determinant is equivalent to finding its single-particle and single-hole spectra relevant to the impurity. For BCS wavefunctions, these excitations are simply mixed together. This leads to a way to construct the impurity model with a BCS wavefunction that gives the correct number of bath orbitals.
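For the normal-state case, the bath construction referred to above can be sketched in a few lines of numpy. This is a hedged illustration of the standard Schmidt-decomposition recipe on a toy tight-binding chain, not the group's production DMET code; the lattice size and impurity choice are arbitrary:

```python
import numpy as np

L, n_imp = 10, 2                    # lattice sites, impurity size (toy values)
t = np.zeros((L, L))
for i in range(L - 1):
    t[i, i + 1] = t[i + 1, i] = -1.0
t[0, -1] = t[-1, 0] = -1.0          # periodic boundary conditions

e, C = np.linalg.eigh(t)
C_occ = C[:, :L // 2]               # occupied orbitals (half filling)

# Split the occupied orbitals into impurity rows and environment rows.
C_imp, C_env = C_occ[:n_imp], C_occ[n_imp:]

# SVD of the impurity block; right singular vectors with nonzero singular
# value pick out the occupied orbitals entangled with the impurity.
U, s, Vt = np.linalg.svd(C_imp, full_matrices=False)

# Rotate the environment part into that basis and normalize: these are the
# bath orbitals. Their number is at most n_imp -- not doubled.
entangled = s > 1e-10
bath = C_env @ Vt[entangled].T
bath /= np.linalg.norm(bath, axis=0)

print(bath.shape)                   # (L - n_imp, n_bath) with n_bath <= n_imp
assert bath.shape[1] <= n_imp
assert np.allclose(bath.T @ bath, np.eye(bath.shape[1]), atol=1e-8)
```

The BCS generalization described in the post mixes particle and hole excitations in this decomposition; the point of the analysis was that the bath count stays at n_imp rather than doubling.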

As I mentioned, studying high-temperature superconductivity is one of the goals Garnet was particularly interested in, and now we could actually model it!

The implementation was quite a bit of work. I wrote a new DMET code to fix the issues discovered in our transition metal project and to provide a more flexible interface — one that could treat normal and BCS, and spin-restricted and spin-unrestricted, wavefunctions in the DMET calculation. One key theoretical improvement was to introduce a chemical potential to reduce the discrepancy between the impurity and mean-field electron numbers. In addition, I started to use the Block DMRG code (extended to particle-number-nonconserving wavefunctions) as the impurity solver, making it possible to deal with larger impurities. Later on, AFQMC and CASSCF solvers were also added.
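The chemical-potential idea can be illustrated with a minimal sketch: tune mu until an electron count matches a target filling. In actual DMET the quantity being matched is the impurity electron number; here, purely for illustration, we bisect on a Fermi-Dirac count over a toy band (the band, effective temperature, and tolerances are all assumptions):

```python
import numpy as np

def electron_count(mu, energies, beta=50.0):
    """Total (spinless, for simplicity) electron number at chemical potential mu.

    Uses 1/(1+e^x) = (1 - tanh(x/2))/2, which is numerically stable.
    """
    return np.sum(0.5 * (1.0 - np.tanh(0.5 * beta * (energies - mu))))

def find_mu(energies, n_target, lo=-10.0, hi=10.0, tol=1e-10):
    """Bisect on mu until the electron count hits n_target (count is monotone in mu)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if electron_count(mid, energies) < n_target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy band: 1D tight-binding dispersion sampled on 20 k-points.
k = 2 * np.pi * np.arange(20) / 20
energies = -2.0 * np.cos(k)

mu = find_mu(energies, n_target=10.0)   # half filling
print(mu)                               # close to 0 by particle-hole symmetry
assert abs(electron_count(mu, energies) - 10.0) < 1e-6
```

The DMET fitting loop has the same one-dimensional structure, just with a much more expensive function evaluation (an impurity-solver calculation) inside.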

Finally, in mid-2014, we obtained the first numbers for the 2D Hubbard model with a minimal (2×2) impurity. The results were both encouraging (we could see superconducting domes and antiferromagnetic order) and frustrating (the antiferromagnetic order was far too strong). While we decided to try larger impurities, we also presented the results at conferences and workshops, hoping to get input from people more familiar with the condensed matter world.
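For readers less familiar with the model: the 2D Hubbard Hamiltonian is H = -t Σ_{&lt;ij&gt;,σ} c†_{iσ} c_{jσ} + U Σ_i n_{i↑} n_{i↓}. Below is a minimal sketch of its one-body (hopping) part on a periodic square lattice; this is textbook material, not the DMET code itself, and the interactions are of course what the impurity solver handles:

```python
import numpy as np

def hubbard_hopping(Lx, Ly, t=1.0):
    """Nearest-neighbor hopping matrix on an Lx x Ly periodic square lattice."""
    n = Lx * Ly
    h = np.zeros((n, n))
    for x in range(Lx):
        for y in range(Ly):
            i = x * Ly + y
            for dx, dy in ((1, 0), (0, 1)):   # right and up bonds, once each
                j = ((x + dx) % Lx) * Ly + (y + dy) % Ly
                h[i, j] = h[j, i] = -t
    return h

h = hubbard_hopping(4, 4)
e = np.linalg.eigvalsh(h)
# Dispersion is -2t(cos kx + cos ky); the band bottom is -4t at k = (0, 0).
print(e.min())
assert np.isclose(e.min(), -4.0)
assert np.allclose(h, h.T)
```

A 2×2 impurity in DMET carves four of these sites out of the lattice and treats U exactly within the resulting impurity-plus-bath problem.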

It was through the Simons Collaboration on the Many-Electron Problem that we got the most help. During meetings with the scientists there, we received positive feedback and many useful suggestions. We decided to run 4×4 impurity calculations on the Hubbard model — hard even with DMRG, as the impurity model has 32 orbitals without definite particle number. We had to develop a series of extrapolation techniques, and finally, when everything was combined into a single number — the energy — it “magically” agreed with the best available data! At the same time, the phase diagram looked much more reasonable, and we found strong evidence for the long-suspected stripe phases as well. The results told us that DMET’s convergence to the exact result with increasing impurity size is not merely a formal statement: even small-cluster calculations can give good estimates of physical quantities in the thermodynamic limit! Honestly, I was surprised by the results [7].

After the initial success, we returned to the 2D Hubbard model when we noticed the Hubbard stripe paper by Philippe Corboz. His iPEPS calculations indicated a low-energy stripe state of wavelength 5 in the underdoped region of the phase diagram. Our DMET calculations confirmed this state and, in addition, found even lower-energy stripes with wavelengths 6 to 8. In the process, we had frequent communications with Philippe, Steve White, Shiwei Zhang, Georg Ehlers, and Reinhard Noack, leading to a large collaborative project. In the end, all the different ground-state methods independently simulated stripes of various wavelengths and obtained consistent energy landscapes. For the first time, we were able to determine the ground state at a traditionally interesting point of the 2D Hubbard model with great confidence. (This work has been posted on arXiv [8].)

The success of DMET in treating the 2D Hubbard model was both surprising and reassuring, but studying real materials will be a bigger and more exciting challenge. As an intermediate step, Ushnish Ray, a postdoc in the group, and I are working on the three-band Hubbard model, and the periodic infrastructure Garnet described in the last post will contribute greatly to the further study of realistic cuprates. We may see the first DMET calculation of cuprates very soon!

References:
[1] D. Zgid, E. Gull, G. K.-L. Chan, Phys. Rev. B 86, 165128 (2012).
[2] G. Knizia, G. K.-L. Chan, Phys. Rev. Lett. 109, 186404 (2012).
[3] Q. Chen, G. Booth, S. Sharma, G. Knizia, G. K.-L. Chan, Phys. Rev. B 89, 165134 (2014).
[4] G. Booth, G. K.-L. Chan, Phys. Rev. B 91, 155107 (2015).
[5] B. Sandhoefer, G. K.-L. Chan, Phys. Rev. B 94, 085115 (2016).
[6] Q. Sun, G. K.-L. Chan, J. Chem. Theory Comput. 10, 3784 (2014).
[7] B.-X. Zheng, G. K.-L. Chan, Phys. Rev. B 93, 035126 (2016); Simons Collaboration on the Many-Electron Problem, Phys. Rev. X 5, 041041 (2015).
[8] B.-X. Zheng, C.-M. Chung, P. Corboz, G. Ehlers, M.-P. Qin, R. M. Noack, H. Shi, S. R. White, S. Zhang, G. K.-L. Chan, arXiv:1701.00054 (2017).

6 Tips for the First Year of Grad School

The first year of graduate school can feel like an expedition into uncharted territory. There is an abundance of excitement and wonder, some confusion and mistakes, and plenty of personal growth and development. Having recently begun my second year of grad school here at Caltech, I thought I would share a few pointers, by no means comprehensive, that either were crucial to my first year or, in hindsight, might have had a significant impact. My six tips are organized into two sections: the first focuses on items that can be implemented and measured in a clear, practical way, while the second centers on less tangible concepts.

Practical Tips

1. Start a Research Library

When you start research in a new area, a huge fraction of your time goes to reading and struggling to understand the key concepts in various noteworthy scientific papers. These readings form the foundation of knowledge you will build on throughout your years as a researcher, and they are regularly the starting points for new research endeavors. Because these articles will be useful for years to come, I’d advise developing a way of organizing what you have read and your thoughts on each article. The method you choose will be unique to you and can range from a filing cabinet to note cards to the plethora of reference-management programs (EndNote, Mendeley, Zotero, etc.).

2. Apply for Grants

The eligibility guidelines for many PhD-student grants limit applicants to first- or second-year graduate students; as such, this should be at the forefront of your mind during your first year. Having external funding obviously provides financial benefits, though these are often modest, since students are normally guaranteed funding at their chosen institution. But there are a number of other benefits to applying for and being selected for a grant. First, just by applying, students develop the scientific writing skills that are essential for the rest of their careers and learn to frame their research interests in a way that is understandable to a general scientific audience. The benefits of being selected can be immense, ranging from increased academic freedom to otherwise unavailable research opportunities (GRIP and GROW for NSF GRFP fellows) to annual travel stipends and mentorship opportunities (DOE CSGF). There are many grants available, but I would recommend starting by looking into the NSF GRFP, the DOE CSGF, and the Hertz Fellowship. Additionally, there are many institute-specific fellowships that might be relevant to your work and merit your attention.

3. Use Rotations

You have likely heard many times about the importance of choosing an advisor whose research piques your interest and whose advising style will promote your success. Accordingly, selecting an advisor is frequently emphasized as the single most important decision of the first year of a PhD program. My advice would be to take a test drive when offered.

To justify the importance of this, I’ll share my own experience selecting a research group. I had elected to attend Caltech intending to work with a specific advisor, and upon my arrival I began my first rotation in that group. I enjoyed the work, the advisor, and the group, and intended to do my second rotation there before officially joining at the end of my first year. On the advice of a senior graduate student, though, I decided to branch out, try something entirely new, and do my second rotation in the Chan group studying quantum chemistry. I’d always had an interest in this area but was completely inexperienced in the research, and I thought it would simply be a challenging experiment before returning to the group I intended to join. Surprisingly, I found myself becoming more and more interested in the research and, perhaps more importantly, found Prof. Chan’s advising style to match my learning style. I looked at the research projects available in each group and eventually decided, contrary to my initial intentions, to join the Chan group. Because of this, and because of benefits I’ve heard other students express, I recommend taking advantage of rotation opportunities in your program.

Intangible Tips

1. Create a Balance for Yourself

Yesterday, I had a conversation with a student in my cohort in the Chemical Engineering program here, in which we discussed the intense drive we both feel to work constantly. When you begin your first semester as a grad student, you will likely feel like you are being pulled in every direction. You might begin by focusing on excelling in rigorous coursework, then realize you need to put in more time as a TA because some students are beginning to struggle. Suddenly, you hit a roadblock in your research and spend nights scouring papers or staying late in the lab trying to flex your creativity — and then you realize the NSF GRFP proposal is due and you still have revisions to make. The point is that, with the gargantuan amount of work to be done, balancing research, coursework, teaching, and your personal and social life becomes paramount.

Though these problems seem universal, and apparently don’t diminish upon completion of a PhD, there is clearly no universal solution. So I’ll share a few examples from people I’ve met. The first I heard at orientation, in a panel on work-life balance, where a student said he would work long hours when a storm was coming so he could spend the best surfing days at the beach without feeling his research was neglected. Alternatively, a neighbor recently told me that he initially treated grad school as a 9-to-5 job but found he spent too much time waiting for lab equipment and wasn’t able to make the progress he had hoped for. He now works unusual hours to avoid waiting while others use the equipment and has become more efficient at completing his work. Personally, I find it useful to keep a regular schedule but allow flexibility for weeks with extra demands, such as before a presentation or exam, or when I am grading assignments for a class I am TAing. In general, though, this is something to consider throughout your life, because maintaining an effective balance improves everything else.

2. Learn From More Experienced Researchers

As you arrive at graduate school and join a research group, it is important to recognize the huge pool of knowledge available to outgoing and inquisitive students. Senior researchers, whether more experienced graduate students, postdocs, or faculty members, are key resources on the way to becoming an expert in your field. With coursework, I found it important to develop a good relationship with a few TAs. By attending their office hours, discussing questions with them, and emailing them when needed, I was able to get the help on assignments and exam preparation that, I regret to say, I neglected during my undergraduate education. Graduate courses often have lower enrollment as well, which means each TA can dedicate more time to helping individual students.

Within your research group, it is crucial to begin developing working relationships with more experienced members. After joining the Chan group, I was encouraged to branch out, ask questions, and attend group activities to get to know the other members. I also found it extremely useful, before I joined the group, to talk individually with some members and ask them about their experiences and for any advice they might have for incoming students. And I have learned that I can occasionally spend hours stuck on a problem (such as getting a program to compile) when, by simply asking an experienced colleague, the solution can be found in minutes and I can move on to more important tasks.

3. Remember You Are Only Beginning

My last tip can be one of the most difficult to implement. Leaving your undergraduate program, you will likely feel as though you have learned a great deal and are proficient in your area of study. As you begin advanced coursework and research, the limits of your knowledge become rapidly apparent. I quickly realized that I had much to learn in some areas and that there was a significant learning curve before I could start innovating in my field. It is easy to become caught up in all of the concepts and theories to be mastered and grow concerned with the mountain of learning left to climb. So I’d advise students to enjoy the journey. Learning is a long-term endeavor and will be a constant throughout your career as a researcher. There is a huge amount of satisfaction and happiness to be found in developing new skills, and one of my favorite parts of my first year here at Caltech has been mastering a new skill or achieving a goal and basking for a few moments in the thrill of success before jumping to the next project.

Waiting for PBC

Recently, we published a paper called “Gaussian-based coupled-cluster theory for the ground state and band structure of solids” (J. Chem. Theory Comput. 13, 1209 (2017)). There we describe how one can implement and apply, to materials spectra, the same systematically improvable framework for electron correlation that we have long enjoyed (and taken for granted) in molecular problems. What might not be apparent from the publication, however, is that it represents the culmination of a long-standing dream to create our own quantum chemistry software that works with periodic boundary conditions (PBC). In this blog post, I’ll write a little about the background behind this work and the multi-year effort required to get here.

The first time I felt we needed PBC quantum chemistry software was when I was working with Dominika Zgid on the implementation of dynamical mean-field theory for the ab initio treatment of solids, sometime around 2010. Back then, we had a copy of the CRYSTAL program, which implements a Gaussian-based treatment of periodic systems. However, as the program was provided only as a binary, the only way to extract intermediate quantities was either through the output file or through an additional cryptic text file that contained some information on quantities such as the orbitals. This was not really enough for us to build on, and in fact it is one of the reasons we do not report DMFT energies in that paper (J. Chem. Phys. 134, 094115 (2011)): we had no access to the periodic integrals!

Those of us who develop molecular quantum chemistry methods take for granted that we will have easy access to the quantities we need to model electron correlation. For example, I have always assumed that we can extract atomic and molecular integrals and molecular orbitals from some open-source library or a quantum chemistry program of our choice. However, this is not something to take for granted in the PBC setting, and it’s not only a problem of software and source code. There are many open-source PBC programs (and closed-source PBC programs provided with source code), but they usually do not expose the quantities one needs for electron correlation methods. For example, most plane-wave codes cannot compute the molecular orbital integrals, as these are not needed at the DFT level.
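To spell out what “molecular orbital integrals” means here: the AO two-electron integrals must be contracted with the MO coefficients, (pq|rs) = Σ C_μp C_νq C_λr C_σs (μν|λσ). Here is a toy numpy sketch on random, symmetrized stand-in “integrals” (the sizes and data are made up; real codes perform the same quarter-wise transformation to keep the cost at O(N^5)):

```python
import numpy as np

n = 6
rng = np.random.default_rng(0)
eri_ao = rng.random((n, n, n, n))
# Impose (mn|ls) = (nm|ls) = (mn|sl) permutational symmetry, for realism.
eri_ao = eri_ao + eri_ao.transpose(1, 0, 2, 3)
eri_ao = eri_ao + eri_ao.transpose(0, 1, 3, 2)

C = np.linalg.qr(rng.random((n, n)))[0]   # toy orthogonal "MO coefficients"

# Four quarter-transformations, each O(N^5), instead of one O(N^8) contraction.
tmp = np.einsum('mp,mnls->pnls', C, eri_ao)
tmp = np.einsum('nq,pnls->pqls', C, tmp)
tmp = np.einsum('lr,pqls->pqrs', C, tmp)
eri_mo = np.einsum('st,pqrs->pqrt', C, tmp)

# Sanity check against the direct (much more expensive) contraction.
direct = np.einsum('mp,nq,lr,st,mnls->pqrt', C, C, C, C, eri_ao)
assert np.allclose(eri_mo, direct)
```

A plane-wave DFT code simply never needs to form this object, which is why extracting it is the sticking point for correlated methods.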

Based on our initial experience, I decided that the group should try its hand at a PBC quantum chemistry implementation — and one that, to boot, would support the basis functions that we know and love from molecular quantum chemistry, namely Gaussians. When I moved to Princeton in 2012, one of the first events I held at the new house (still completely devoid of furniture) was a hackathon involving the whole group, to try to build a periodic quantum chemistry code in two days. Trying to write a single new code, starting with a group of 12 people who knew nothing about what they should be implementing, is not actually a good idea if you expect a real product! But in hindsight it was a useful exercise. The great thing about an event like a hackathon is that it forces you to overcome your fears about learning and trying something new, simply through peer pressure. If you come from a molecular theory setting, there is plenty to learn before you understand periodic implementations: for example, how to deal with divergent Coulomb contributions, reciprocal space, and Brillouin zone sampling. For me personally, while I certainly didn’t learn everything I needed from that single event, at least I learnt what it was I had to learn, and that in itself was productive.
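As an example of the kind of thing one has to learn, here is a minimal sketch of uniform Brillouin-zone sampling: a Gamma-centered k-point mesh built from the reciprocal lattice vectors b = 2π·inv(a)ᵀ. This is textbook material with assumed toy lattice vectors, not the PySCF implementation:

```python
import numpy as np

def kpoint_mesh(a, nk):
    """Gamma-centered nk x nk x nk k-point mesh for lattice vectors a (rows)."""
    # Reciprocal lattice vectors (rows) satisfying a_i . b_j = 2*pi*delta_ij.
    b = 2 * np.pi * np.linalg.inv(a).T
    frac = np.array([[i, j, k] for i in range(nk)
                     for j in range(nk)
                     for k in range(nk)]) / nk
    return frac @ b                 # Cartesian k-points

a = 3.0 * np.eye(3)                 # toy simple-cubic lattice, length 3
kpts = kpoint_mesh(a, 4)
print(kpts.shape)                   # (64, 3): a 4x4x4 grid
assert kpts.shape == (64, 3)
assert np.allclose(kpts[0], 0.0)    # the Gamma point is included
```

Periodic mean-field and correlated calculations then loop over (and couple) these k-points, which is where much of the extra bookkeeping relative to molecules comes from.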

After 2012, our dream of a periodic code languished for a while. This was partly because we were very busy with other projects, but also because the idea, as is so often the case, needed time to mature. It did not mean, however, that we had given up. In 2014 I had the chance to go to a meeting in Vienna (the German theoretical chemistry conference). The best thing about going to science meetings is that you (hopefully) get to see all your friends. In this case, I had the chance to meet again with George Booth, who, as a postdoc with me, had organized the whole PBC hackathon. We had been corresponding for a few months about the possibility of reviving the PBC project, and the Vienna meeting was a great place to start, because we could also meet with Andreas Grueneis, George’s long-standing collaborator and the author of the pioneering correlated quantum chemistry capabilities in VASP. I had been pushing the idea that we should implement a Gaussian-based periodic code *inside* VASP, where the Gaussians could be represented internally in the projector augmented-wave (PAW) framework. This might seem like just adding a layer of complication, but the reason for supporting a Gaussian representation is that Gaussians are remarkably compact if all one wants is modest accuracy. Thus one could expect correlated calculations (at modest accuracy) with Gaussian basis sets to be much more affordable than ones working in the PAW basis directly. The presence of the “augmented” part of the PAW basis made the implementation in VASP a little complicated, but we managed to resolve these issues at the Vienna meeting, with some helpful discussions with Georg Kresse (the original author of VASP). After this point I returned to Princeton; everything further was done by George, Andreas, and his student Theodoros, and resulted in the nice comparison between the power of Gaussian bases and PAW representations in correlated PBC MP2 calculations described in J. Chem. Phys. 145, 084111 (2016).

Nonetheless, the complexity of working with the VASP codebase was still unsatisfactory for my group, non-experts in the subtleties of VASP. As the PySCF project, started by Qiming Sun, began to mature, and the ease of programming with it became apparent, it became clear that we should simply implement periodic functionality within PySCF itself. And so it was that one weekend in the late summer of 2015, Tim Berkelbach and I sat down to program a Gaussian-based periodic code in PySCF — as is apparent from the above, really the third attempt to do so in our group! I sat down with the book by Dominik Marx and Juerg Hutter (Tim had a copy), which has a nice description of the organization of a periodic code. I started programming based on Chapter 3 (some variable names in the PySCF PBC implementation, such as “ngs”, come from this book), while Tim worked out all the pesky details of implementing pseudopotentials, along with some of the finer technicalities, with James. After four weekends, much of which Tim spent valiantly correcting errors in the pseudopotentials reported in the literature, we had a working prototype. It could compute, very slowly, the DFT energy of the helium solid. That’s not much, but that’s how things start!

We were then joined in our efforts by James McClain and, of course, Qiming, to whom you can give any function in a program and have it transformed into something that runs 10 times as fast. James, Qiming, and Tim worked throughout the following year to lay down more of the theoretical groundwork needed for non-trivial systems and to develop a production-level code with correlated quantum chemistry functionality — all the way from efficient integrals and Hartree-Fock to deriving and implementing parallelized k-point-sampled equation-of-motion coupled cluster! Once that framework was in place, Qiming was able to go in and completely revamp the code, removing many of the bottlenecks through more efficient coding and the development of novel algorithms. Arguably everything was almost complete about nine months after the initial prototype, but, as is always the case, crossing the t’s and dotting the i’s in the last stretch takes a very long time. I personally think it’s important, when developing methods, not just to carry out a toy calculation but to show that you can carry out calculations of meaningful quality — otherwise no one will appreciate the method. James, without complaining and with great efficiency, parallelized and optimized the EOM-CCSD code and patiently carried out the calculations on our modest computational resources. (We had only 3 TB of disk space, which he had to ration carefully!) This was the first time EOM-CCSD had been implemented for three-dimensional periodic systems. Of course, in the summer we also had to say goodbye to Tim, and the move to Caltech further delayed the work.
But in the end, I think we managed a calculation we could all be happy with: an EOM-CCSD calculation of the silicon band structure, with a modest but not laughable basis (TZVP) and sufficient k-point sampling (4×4×4) — all in all, more than 2000 correlated orbitals — from which we could roughly estimate the remaining basis set error and extrapolate the band gap to the thermodynamic limit.
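The extrapolation step can be sketched schematically: fit finite-size gaps linearly in the inverse number of k-points and read off the intercept. The numbers below are synthetic (constructed to lie exactly on a line), not the silicon data from the paper:

```python
import numpy as np

nks = np.array([2**3, 3**3, 4**3])   # total k-points: 2x2x2, 3x3x3, 4x4x4 meshes
gap_inf, c = 1.2, 3.5                # assumed "true" gap and finite-size slope
gaps = gap_inf + c / nks             # synthetic finite-size band gaps

# Linear fit in 1/N_k; the intercept is the N_k -> infinity estimate.
slope, intercept = np.polyfit(1.0 / nks, gaps, 1)
print(intercept)                     # recovers the assumed gap_inf = 1.2
assert np.isclose(intercept, gap_inf)
assert np.isclose(slope, c)
```

Real data of course scatter around the fit, and the choice of the leading finite-size power is itself a modeling decision.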

And so, that is the multi-year story behind this single paper in JCTC! Of course, this is not the end of the story; it is just one chapter in a much longer effort to transform materials simulation through high-accuracy quantum chemistry. As you may have gathered, in our group we take a long-term perspective on things, and a similar story could probably be told for most of the papers we write. Those stories will have to wait for another post. In the meantime, all our code is available through our PySCF repository, so check it out yourself, and have fun doing periodic quantum chemistry!