Machine Learning for Quantum Many-body Physics

For each poster contribution there will be one poster wall (width: 97 cm, height: 250 cm) available. Please do not feel obliged to fill the whole space. Posters can be put up for the full duration of the event.

Excited eigenstates and swarm intelligence algorithms

Bagrov, Andrey

Accessing excited states of a many-body quantum Hamiltonian can be a challenging task, even more complicated than obtaining its ground state. The recently suggested RBM variational ansatz for many-body wavefunctions appears to be fairly universal (modulo some limitations), and one might hope that it can serve as a good approximation for excited eigenstates of a variety of quantum systems. To make it converge to a state in the middle of the spectrum, we suggest employing a particle swarm optimization (PSO) algorithm instead of gradient descent-based schemes. Each particle in the swarm represents a neural network, and the fitness function to be optimized can be the energy variance, higher moments of the Hamiltonian, or a certain function of them known to be minimized on eigenstates. PSO algorithms are easy to parallelize and do not require numerically heavy operations; the only limitation is due to the intrinsic stochasticity of the neural network approach, which makes it difficult to resolve individual eigenstates.
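As a minimal illustration of the scheme described above, the sketch below runs a basic particle swarm over a one-parameter ansatz and minimizes the energy variance, which vanishes exactly on eigenstates. The 2x2 Hamiltonian, single-angle ansatz, and swarm parameters are toy stand-ins (an RBM would replace the ansatz in practice), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2x2 Hamiltonian standing in for a many-body H
H = np.array([[1.0, 0.3], [0.3, -1.0]])
H2 = H @ H

def variance(theta):
    """Energy variance <H^2> - <H>^2 of the ansatz psi(t) = (cos t, sin t).
    It is zero exactly on eigenstates, for ground and excited states alike."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return psi @ H2 @ psi - (psi @ H @ psi) ** 2

# Minimal particle swarm: each particle is one set of variational parameters
n_particles, n_steps = 20, 200
pos = rng.uniform(0.0, np.pi, n_particles)
vel = np.zeros(n_particles)
best_pos = pos.copy()
best_fit = np.array([variance(p) for p in pos])
g_best = best_pos[np.argmin(best_fit)]

for _ in range(n_steps):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    # inertia plus pull towards personal and global bests (standard PSO update)
    vel = 0.7 * vel + 1.5 * r1 * (best_pos - pos) + 1.5 * r2 * (g_best - pos)
    pos = pos + vel
    fit = np.array([variance(p) for p in pos])
    improved = fit < best_fit
    best_pos[improved], best_fit[improved] = pos[improved], fit[improved]
    g_best = best_pos[np.argmin(best_fit)]
```

Note that no gradients of the fitness are ever required, which is what makes swarm-type optimizers easy to parallelize over particles.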

Quantum Error Correction with Recurrent Neural Networks

Baireuther, Paul

In quantum computation one of the key challenges is to build fault-tolerant logical qubits. A logical qubit consists of several physical qubits. In stabilizer codes, a popular class of quantum error correction schemes, a part of the system of physical qubits is measured repeatedly, without measuring (and collapsing by the Born rule) the state of the encoded logical qubit. These repetitive measurements are called syndrome measurements, and must be interpreted by a classical decoder in order to determine what errors occurred on the underlying physical system. The decoding of these space- and time-correlated syndromes is a highly non-trivial task, and efficient decoding algorithms are known only for a few stabilizer codes. In our research we design and train decoders based on recurrent neural networks. The training is done using only experimentally accessible data. A key requirement for an efficient decoder is that it can decode an arbitrary and unspecified number of error correction cycles. To achieve this we use so-called long short-term memory cells [1]. These recurrent neural network building blocks have an internal memory. During training, the decoder learns how to update and utilize its internal memory in order to detect errors on the logical qubit. The trained decoder is therefore a round-based algorithm, rather than a rigid pattern recognition scheme. It can process the syndrome information in real time, without having to wait for the quantum computation to be completed. In our recent work [2] we have focused on one type of stabilizer code, the surface code, which is currently being implemented by several experimental groups [3-5]. We have trained and tested the neural network decoder on both a simple circuit model and a density matrix simulator with experimental parameters. In the presence of correlated bit-flip and phase-flip errors the neural network decoder outperforms the popular minimum-weight perfect matching decoder.
However, our neural network decoder is not tailored to the specifics of the surface code, and should also be applicable to other stabilizer codes, such as the color code [6]. References [1] S. Hochreiter and J. Schmidhuber, Neural Computation 9, 1735 (1997). [2] P. Baireuther, T. E. O’Brien, B. Tarasinski, and C. W. J. Beenakker, arXiv:1705.07855. [3] J. Kelly et al., Nature 519, 66 (2015). [4] M. Takita et al., Phys. Rev. Lett. 117, 210505 (2016). [5] R. Versluis et al., Phys. Rev. Applied 8, 034021 (2017). [6] H. Bombin and M. A. Martin-Delgado, Phys. Rev. Lett. 97, 180501 (2006).
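To make the role of the internal memory concrete, here is a single numpy LSTM cell applied to a sequence of syndrome vectors; the sizes, random weights, and read-out head are illustrative stand-ins, not the trained decoder of Ref. [2]:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    """One LSTM step: gates decide how the internal memory c is rewritten.
    W stacks the input, forget, output and candidate weights row-wise."""
    z = W @ np.concatenate([x, h]) + b
    n = h.size
    i, f, o = sigmoid(z[:n]), sigmoid(z[n:2 * n]), sigmoid(z[2 * n:3 * n])
    g = np.tanh(z[3 * n:])
    c = f * c + i * g          # forget part of the old memory, write new content
    h = o * np.tanh(c)         # gated read-out of the memory
    return h, c

rng = np.random.default_rng(1)
n_in, n_hidden, n_cycles = 8, 16, 5        # 8 syndrome bits per cycle (illustrative)
W = rng.normal(0.0, 0.1, (4 * n_hidden, n_in + n_hidden))
b = np.zeros(4 * n_hidden)
w_out = rng.normal(0.0, 0.1, n_hidden)     # toy read-out head

h, c = np.zeros(n_hidden), np.zeros(n_hidden)
syndromes = rng.integers(0, 2, (n_cycles, n_in)).astype(float)
for s in syndromes:                        # any number of cycles, same weights
    h, c = lstm_step(s, h, c, W, b)
p_flip = sigmoid(w_out @ h)                # probability of a logical bit flip
```

Because the same cell is applied at every cycle, the decoder naturally handles an arbitrary number of error correction rounds.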

Learning Hamiltonians from local data

Bairey, Eyal

Recent works have succeeded in recovering the local Hamiltonian of an isolated quantum system from a single eigenstate. However, these methods require the measurement of long-range correlations throughout the entire system, as well as the system being in a pure state. Here we extend these methods to any mixed state which commutes with the Hamiltonian; in particular, our methods apply to thermal states. In addition, when the underlying Hamiltonian is defined on a lattice with short-range interactions, our method allows us to recover the Hamiltonian of a subsystem by performing measurements restricted to that subsystem and its boundary. When restricted to thermal states, our method can be viewed as a quantum generalization of the well-known problem of learning graphical models, such as Boltzmann machines. Surprisingly, whereas the sampling complexity of the classical problem is exponential in the local degree of the underlying graph, we show that under reasonable assumptions our algorithm is polynomial. Finally, we show how to adapt our method so that it can learn the underlying Hamiltonian from sampling the time dynamics of the system.
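A toy version of the underlying idea can be sketched in a few lines: for a state $\rho$ commuting with $H = \sum_a c_a h_a$, the constraints $\sum_a \mathrm{tr}(\rho\, i[h_a, A_m])\, c_a = 0$ are linear in the couplings, which can therefore be read off from a null space. The two-qubit term basis, coupling values, and thermal state below are illustrative choices, not the setting of the poster:

```python
import numpy as np
from itertools import product

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Y = np.array([[0.0, -1j], [1j, 0.0]])
Z = np.diag([1.0, -1.0])

# Candidate local terms h_a and "unknown" couplings (illustrative two-qubit model)
terms = [np.kron(X, X), np.kron(Z, Z), np.kron(Z, I2), np.kron(X, I2)]
c_true = np.array([1.0, 0.5, -0.7, 0.3])
H = sum(c * h for c, h in zip(c_true, terms))

# A mixed state commuting with H: here a thermal state at beta = 1
w, v = np.linalg.eigh(H)
p = np.exp(-w)
p /= p.sum()
rho = (v * p) @ v.T

# Each observable A_m yields one linear constraint sum_a tr(rho i[h_a, A_m]) c_a = 0
paulis = [I2, X, Y, Z]
As = [np.kron(P, Q) for P, Q in product(paulis, repeat=2)]
M = np.array([[np.trace(rho @ (1j * (h @ A - A @ h))).real for h in terms]
              for A in As])

# The couplings span the null space of the constraint matrix
_, s, Vt = np.linalg.svd(M)
c_rec = Vt[-1]
c_rec = c_rec * np.sign(c_rec[0]) * np.linalg.norm(c_true)  # scale/sign is a gauge
```

The recovered vector agrees with the true couplings up to the overall normalization, which no commutation constraint can fix.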

Locating spin-liquid transitions in three-dimensional quantum magnets

Buessen, Finn Lasse

Quantum magnetism and the formation of quantum spin liquids remain among the most intriguing aspects of contemporary solid state physics, driving intense research activity among experimentalists and theorists alike. To substantiate experimental findings with appropriate theoretical understanding, an efficient methodological framework is often vital to study frustrated quantum spin models in three dimensions -- a challenging regime that is inaccessible to many conventional (both numerical and analytical) methods. Utilizing a recently developed pseudo-fermion functional renormalization group (pf-FRG) approach, we demonstrate its capability to qualitatively capture the interplay between spin liquid phases and magnetic order even at finite temperature. Quantitatively pinpointing the precise location of phase boundaries based on transition signatures in physical observables, however, can be challenging. Therefore, we started to investigate machine learning as an alternative approach to search for signatures of phase transitions in the vast amount of data on single- and two-particle vertex functions that we generate in pf-FRG calculations.

Progressive lifting of the ground-state degeneracy of the long-range kagome Ising antiferromagnet

Colbois, Jeanne

The nearest-neighbour antiferromagnetic Ising model on the kagome lattice is well known to be highly frustrated, and in particular to have a very large macroscopic ground-state degeneracy [1][2]. Recently, a candidate ground state for the model with dipolar couplings has been proposed [3]. In order to study the degeneracy lifting that leads to the ground state of the dipolar model, we implement a rejection-free dual worm algorithm [4] and use it to study the antiferromagnetic Ising model on the kagome lattice with up to fourth-neighbour interactions. For the model with up to third-neighbour interactions, we show that the ground state exhibits five different phases as a function of the ratio J3/J2, some of which still have a non-zero residual entropy. Surprisingly, for the model with dipolar couplings truncated at fourth neighbours, we find a ground state which is neither one of those of the J2-J3 model, nor the one proposed for the full dipolar model [3]. This new state, however, is not the ground state for the model with full dipolar couplings, leading to the conclusion that further neighbours beyond the fourth one play an important role in the selection of the ground state of the dipolar model. References [1] K. Kanô and S. Naya, Prog. Theor. Phys. 10, 158 (1953) [2] A. Sütö, Z. Phys. B 44, 121 (1981) [3] I.A. Chioar, N. Rougemaille and B. Canals, Phys. Rev. B 93, 214410 (2016) [4] G. Rakala and K. Damle, Phys. Rev. E 96, 023304 (2017)

Artificial Neural Network Representation of Spin Systems in a Quantum Critical Regime

Czischek, Stefanie

We use the newly developed artificial-neural-network (ANN) representation of quantum spin-1/2 states based on restricted Boltzmann machines to study the dynamical build-up of correlations after sudden quenches in the transverse-field Ising model with and without a longitudinal field. We calculate correlation lengths and study their time evolution after sudden quenches from a large initial transverse field to different final fields. By comparison with exact numerical solutions given by exact diagonalization or tDMRG, we show that in regimes of large correlation lengths and volume-law entanglement, large network sizes are also necessary to capture the exact dynamics. On the other hand, we show a high accuracy of the network representation for quenches into regimes of smaller correlation lengths, even for small network sizes scaling linearly with the system size. In these regimes the ANN representation shows promising results which suggest that the method may be efficiently used for more complex systems in one or higher dimensions.

Many-body localization in large quantum chains

Doggen, Elmer

We study quench dynamics of the Heisenberg spin chain with a random on-site magnetic field, using a combination of the time-dependent variational principle, machine learning, and exact diagonalization, with a focus on chains up to 100 spins in length. Around the regime where previous studies have reported the existence of the many-body localization transition, we instead find a wide range of disorder strengths with slow but finite transport. A lower bound for a true many-body localization transition, higher than previous estimates, is presented.

Machine learning of quantum phase transitions

Dong, Xiaoyu

Machine learning algorithms provide a new perspective on the study of physical phenomena. In this paper, we explore the nature of quantum phase transitions using multi-color convolutional neural networks (CNNs) in combination with quantum Monte Carlo simulations. We propose a method that compresses (d+1)-dimensional space-time configurations to a manageable size and then uses them as the input for a CNN. We test our approach on two models and show that both continuous and discontinuous quantum phase transitions can be well detected and characterized. Moreover, we show that intermediate phases, which were not included in the training, can also be identified using our approach.

Reinforcement learning with neural networks for quantum feedback: a new approach to quantum memory

Fösel, Thomas

The past few years have seen dramatic demonstrations of the power of neural networks in challenging real-world applications across many domains. In the search for optimal control sequences, where success can only be judged with some time delay, reinforcement learning is the method of choice. We have explored how a neural-network-based agent can be trained, using reinforcement learning, to generate optimal control sequences for quantum feedback as it interacts with a quantum system. We apply this to the problem of stabilizing quantum memories based on few-qubit systems, where the qubit layout and the available set of gates are specified by the user.

Investigating ultrafast quantum spin dynamics with machine learning

Giammarco, Fabiani

G. Fabiani, Th. Rasing, J.H. Mentink (Radboud University, Institute for Molecules and Materials, Heyendaalseweg 135, 6525 AJ, Nijmegen, the Netherlands)

The use of femtosecond laser pulses offers the exciting possibility to control the exchange interaction, the strongest interaction between spins in magnetic materials, on ultrashort time scales [1]. Moreover, it was recently shown that even weak perturbations of the exchange interaction can trigger genuine quantum spin dynamics in magnetic materials [2], characterized by femtosecond longitudinal oscillations of the magnetic order parameter [3]. This suggests intriguing possibilities to enhance quantum effects in the short-time dynamics of magnetic materials. However, so far this dynamics has been studied only in the linear response regime, and it is unclear how to access and exploit strongly nonlinear quantum dynamics. To investigate these issues, we apply the machine-learning-inspired approach recently developed by Carleo and Troyer [4], which efficiently captures both the ground state and the time evolution of quantum spin models in one and two dimensions. Here we present our results for the ground state energy and staggered magnetization of the two-dimensional Heisenberg model on a square lattice, the simplest model that captures the spin dynamics of antiferromagnetic Mott-Hubbard insulators. By extrapolating our results to the thermodynamic limit, we find good agreement with ground state calculations performed with other techniques. For the same system, we also show that this method is able to capture the ultrafast spin dynamics triggered by an ultrashort perturbation of the exchange interaction. Our results pave the way to studying strongly nonlinear quantum dynamics of macroscopic magnetic materials, with potential predictive power for experiments based on ultrashort laser pulses in the optical and THz regimes. References [1] J.H. Mentink, J. Phys.: Condens. Matter 29, 453001 (2017) [2] J. Zhao, A. V. Bragas, D. J. Lockwood, and R. Merlin, Phys. Rev. Lett. 93, 107203 (2004) [3] Bossini et al., arXiv:1710.03143 [4] G. Carleo and M. Troyer, Science 355, 602 (2017)

Neural-Network and Tensor-Network duality with applications to chiral topological states and supervised learning

Glasser, Ivan

Neural-network quantum states have recently been introduced as an Ansatz for describing the wave function of quantum many-body systems. Particularly promising results have been obtained with Boltzmann machines, a kind of probabilistic graphical model. These models can also be seen as tensor networks in particular geometries, and we show how to exploit these strong connections with the example of restricted Boltzmann machines: short-range restricted Boltzmann machines are entangled plaquette states, while fully connected restricted Boltzmann machines are string-bond states with a nonlocal geometry and low bond dimension. String-bond states also provide a generic way of enhancing the power of neural-network quantum states and a natural generalization to systems with larger local Hilbert space. We compare the advantages and drawbacks of these different classes of states and present a method to combine them. This allows us to combine the entanglement structure of tensor networks with the efficiency of neural-network quantum states in a single Ansatz capable of targeting the wave function of strongly correlated systems. While it remains a challenge to describe states with chiral topological order using traditional tensor networks, we show that, because of their nonlocal geometry, neural-network quantum states and their string-bond-state extension can describe a lattice fractional quantum Hall state exactly. In addition, we provide numerical evidence that neural-network quantum states can approximate a chiral spin liquid with better accuracy than entangled plaquette states and local string-bond states. Furthermore, we demonstrate that the connection between string-bond states and restricted Boltzmann machines can also be used in traditional machine-learning applications. We provide an algorithm for optimizing string-bond states in a supervised learning setting and discuss improvements over other tensor-network algorithms.
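The correspondence used here is easy to state concretely: after tracing out the hidden units, an RBM wavefunction is a product of one factor per hidden unit, each acting on all spins, which is exactly a string-bond-type product structure. A minimal numpy sketch with random parameters and illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(2)
n_visible, n_hidden = 4, 3                    # illustrative sizes

a = rng.normal(0.0, 0.1, n_visible)           # visible biases
b = rng.normal(0.0, 0.1, n_hidden)            # hidden biases
W = rng.normal(0.0, 0.1, (n_hidden, n_visible))

def rbm_amplitude(s):
    """psi(s) after tracing out the hidden units:
    exp(a.s) * prod_j 2 cosh(b_j + W_j . s)."""
    return np.exp(a @ s) * np.prod(2.0 * np.cosh(b + W @ s))

def rbm_as_product(s):
    """Same amplitude written as a product of one factor per hidden unit,
    each acting on all spins: the string-bond-state picture."""
    factors = [2.0 * np.cosh(b[j] + W[j] @ s) for j in range(n_hidden)]
    return np.exp(a @ s) * np.prod(factors)

spins = np.array([1.0, -1.0, 1.0, 1.0])
```

Restricting each row of W to a contiguous patch of spins turns the same expression into an entangled-plaquette (short-range RBM) state.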

Gaussian Process Wave Functions

Glielmo, Aldo

Neural networks (NNs) and Gaussian processes (GPs) are arguably the most widespread algorithms for the regression of complex functions, each possessing relative strengths and weaknesses. However, while NNs have recently received attention as efficient many-body wavefunction ansatzes [Carleo and Troyer, Science (2017)], the representational power of GPs in the same context has not yet been explored. I will present a first attempt in this direction. The GP ansatz we propose can be understood in terms of the GP kernel function, which implicitly models the interactions taken into account. Inspired by the success of correlated wavefunctions like the Slater-Jastrow ansatz or the correlator product state, we devise a kernel which can be thought of as a generalisation of the above two, encompassing interactions of any order at any distance (up to the full system size). Tests on Hubbard systems in one and two spatial dimensions reveal the competitiveness of the described model. Notably, regularised training on exact data from small (affordable) systems yields a faithful representation also in large systems. Although computationally inexpensive (training takes minutes on a laptop computer), this approach suffers from inevitable residual finite-size effects, which can however be tackled by a variational optimisation of the small-database entries.
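The basic regression step behind such an ansatz can be sketched as follows: fit a GP to the log-amplitudes of training configurations and predict new amplitudes through the kernel. The synthetic Jastrow-like target, squared-exponential kernel, and sizes below are illustrative assumptions, not the kernel proposed on the poster:

```python
import numpy as np

rng = np.random.default_rng(3)
n_spins, n_train = 6, 40

# All 2^6 spin configurations; train on a random subset of "exact" data
configs = np.array([[1.0 if (i >> k) & 1 else -1.0 for k in range(n_spins)]
                    for i in range(2 ** n_spins)])
idx = rng.choice(len(configs), size=n_train, replace=False)
X = configs[idx]

# Stand-in log-amplitudes of a target state (a Jastrow-like function; in
# practice these would come from small-system exact diagonalization)
w = rng.normal(0.0, 0.5, n_spins)
y = np.log(np.cosh(X @ w))

def kernel(A, B, ell=2.0):
    """Squared-exponential kernel: the kernel choice fixes which
    correlations the ansatz can express."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * ell ** 2))

K = kernel(X, X) + 1e-6 * np.eye(n_train)   # jitter acts as regularisation
alpha = np.linalg.solve(K, y)

def log_psi(x):
    """GP prediction of the log-amplitude for an arbitrary configuration."""
    return kernel(x[None, :], X)[0] @ alpha
```

Training amounts to one linear solve, which is why fitting on small-system data is cheap enough to run on a laptop.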

Supervised Learning of Exotic Tensor Spin Order

Greitemann, Jonas

We apply supervised machine learning to the identification of exotic spin phases with high-rank tensor order parameters. Following Ponte and Melko (PRB 96, 205146 (2017)), we find that the decision function of a support vector machine (SVM) produces the scalar order parameter. Moreover, despite our reliance on supervision, we show that in a situation where the critical temperature is not accurately known, we can still obtain the order parameter and may even be able to infer the physical critical temperature in a learning-by-confusion scheme. In addition to reproducing the order parameter curve, we are able to infer the analytic form of the tensor order parameter for a variety of symmetries with tensor order parameters of rank up to 6. This may prove useful in the exploration of exotic magnetic orders in spin liquid candidates.

Playing the Ice Game

Kao, Ying-Jer

In classical ice models, the ice rule imposes a strong constraint on the spin configurations and makes the conventional single-spin-flip Monte Carlo update inefficient. On the other hand, by proposing global updates in the form of loops, the loop algorithm provides an efficient update scheme for ice systems. In general, finding global updates is problem dependent and requires sophisticated algorithm design. Reinforcement learning is a fast-growing research field due to its outstanding exploration capability in dynamic environments. In this work, we apply a reinforcement learning method that parametrizes the transition operator with neural networks. By promoting the Markov chain to a Markov decision process, the algorithm can adaptively search for a global update policy by interacting with the physical model. We observe the emergence of several global update patterns on the ice manifold discovered by the agent. This may serve as a more general framework for searching for update schemes in more complicated models.

A practical guide to training neural networks of quantum many-body systems

Lang, Thomas

Encoding the representation of the electronic wave function of even a minuscule fragment of a crystal is a nearly impossible task. Machine learning promises to cut through this complexity and to allow for the efficient encoding of a vastly complex system in a limited number of degrees of freedom by identifying the subtle, yet relevant signatures of phases of matter. We assess the efficiency and practical limits of the representational power of basic neural networks for the many-body wave functions of quantum spin systems. We identify the types of wave functions, bases and network topologies which are favorable, and investigate what features the neural networks learn and how to exploit them in scaling up the network. Finally, we comment on the predictive power and entanglement properties of neural networks trained on small portions of the full phase space.

Multigrid Renormalization

Lubasch, Michael

On this poster, I present our article [1] in which we use tensor networks to solve partial differential equations. More precisely, we combine the multigrid (MG) method with state-of-the-art concepts from the variational formulation of the numerical renormalization group. The resulting MG renormalization (MGR) method is a natural generalization of the MG method for solving partial differential equations. When the solution on a grid of N points is sought, our MGR method has a computational cost scaling as $\mathcal{O}(\log(N))$, as opposed to $\mathcal{O}(N)$ for the best standard MG method. Therefore, MGR can exponentially speed up standard MG computations. To illustrate our method, we develop a novel algorithm for the ground state computation of the nonlinear Schrödinger equation. Our algorithm acts variationally on tensor products and updates the tensors one after another by solving a local nonlinear optimization problem. We compare several different methods for the nonlinear tensor update and find that the Newton method is the most efficient as well as the most precise. The combination of MGR with our nonlinear ground state algorithm produces accurate results for the nonlinear Schrödinger equation on $N = 10^{18}$ grid points in three spatial dimensions. [1] M. Lubasch, P. Moinier, and D. Jaksch, arXiv:1802.07259 (2018).

Systematic construction of density functionals based on matrix product state computations

Lubasch, Michael

On this poster, I present the article [1] in which we use simple machine learning concepts in the context of density functional theory. More precisely, we propose a systematic procedure for the approximation of density functionals in density functional theory that consists of two parts. First, for the efficient approximation of a general density functional, we introduce an efficient ansatz whose non-locality can be improved systematically. Second, we present a fitting strategy that is based on systematically increasing a reasonably chosen set of training densities. We investigate our procedure in the context of strongly correlated fermions on a one-dimensional lattice in which we compute accurate training densities with the help of matrix product states. Focusing on the exchange-correlation energy, we demonstrate how an efficient approximation can be found that includes and systematically improves beyond the local density approximation. Importantly, this systematic improvement is shown for target densities that are quite different from the training densities. [1] M. Lubasch, J. I. Fuks, H. Appel, A. Rubio, J. I. Cirac, and M.-C. Bañuls, New Journal of Physics 18, 083039 (2016).

Machine Learning Competing Orders

Matty, Michael

The entanglement spectrum is expected to provide a characterization of topologically ordered systems beyond traditional order parameters. Nevertheless, so far attempts at accessing this information have relied on the presence of translational symmetry. Here we introduce a framework for using a simple artificial neural network (ANN) to detect defining features of a fractional quantum Hall state, a charge density wave state and a localized state from entanglement spectra, even in the presence of disorder. We then successfully obtain a phase diagram for Coulomb-interacting electrons at fractional filling $\nu = 1/3$, perturbed by modified interactions and disorder. Our results benchmark well against existing measures in parts of the phase space where such measures are available. Hence we explicitly establish a finite region of robust topological order. Moreover, we establish that the ANN can indeed access and learn defining traits of topological as well as broken-symmetry phases using only the entanglement spectra of ground states as input.

Supervised learning magnetic skyrmion phases

Mazurenko, Vladimir

The experimental discovery of magnetic skyrmions [Science 323, 915 (2009)] has initiated a new scientific race aiming at the development of ultradense memory storage technologies and logical gates. From a technological point of view, skyrmions are of great research interest because of their stability (the topological nature of a skyrmion prevents its transformation into a different magnetic configuration) and because of the possibility to manipulate them with electric currents of very low density. One of the difficult tasks when studying materials that host skyrmion excitations is the construction of temperature/magnetic-field phase diagrams. From the theoretical side, this requires the calculation of different correlation functions (spin-spin correlation functions, specific heat), the determination of the topological charge, and the visualisation of numerous magnetic configurations. Motivated by the recent results reported in [Nature Physics 13, 431 (2017)], we used a similar neural-network-based approach for the classification of complex non-collinear magnetic configurations such as magnetic skyrmions and spin spirals. We construct the phase diagram for a ferromagnet with Dzyaloshinskii-Moriya interaction and quantitatively describe the transitional areas between different phases by using the values of the output neurons. To analyse the learning process, the arguments of the hidden-layer neurons were visualised. We found that the network learns the magnetisation of a particular magnetic configuration. We defined an optimal number of hidden neurons needed for the classification of the topological magnetic phases. Our approach can be used for the analysis of experimental data obtained with the spin-polarised scanning tunnelling microscopy technique.

Self-learning Monte Carlo simulations of classical and quantum many-body systems

Meinerz, Kai

The application of machine learning approaches has seen a dramatic surge across a diverse range of fields that aim to benefit from their unmatched core abilities of dimensional reduction and feature extraction. In the field of computational many-body physics machine learning approaches bear the potential to further improve one of the stalwarts in the field — Monte Carlo sampling techniques. Here we explore the capability of “self-learning” Monte Carlo approaches to dramatically improve the update quality in Markov chain Monte Carlo simulations. Such a self-learning approach employs reinforcement learning techniques to learn the distribution of accepted updates and is then used to suggest updates that are almost always accepted, thereby dramatically reducing autocorrelation effects. It can, in principle, be applied to all existing Monte Carlo flavors and is tested here for both classical and quantum Monte Carlo techniques applied to a variety of many-body problems.
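The essential loop of a self-learning update can be sketched for a toy model: fit a cheap effective Hamiltonian to sampled configurations, propose long moves by simulating it, then accept or reject globally so that the exact distribution is still sampled. The Ising ring, couplings, and linear fit below are illustrative choices, not a specific study from the poster:

```python
import numpy as np

rng = np.random.default_rng(4)
N, beta = 10, 0.5
J1, J2 = 1.0, 0.2            # toy ring; the effective model only carries a J1-type term

def E_true(s):
    return -J1 * np.sum(s * np.roll(s, 1)) - J2 * np.sum(s * np.roll(s, 2))

def nn_corr(s):
    return np.sum(s * np.roll(s, 1))

def metropolis(s, energy, n_steps):
    """Plain single-flip Metropolis for an arbitrary energy function."""
    s = s.copy()
    for _ in range(n_steps):
        i = rng.integers(N)
        s2 = s.copy()
        s2[i] = -s2[i]
        if rng.random() < np.exp(-beta * (energy(s2) - energy(s))):
            s = s2
    return s

# 1) "Learn" an effective model: fit E_true by a nearest-neighbour-only form
s = np.ones(N)
feats, energies = [], []
for _ in range(400):
    s = metropolis(s, E_true, 5)
    feats.append(nn_corr(s))
    energies.append(E_true(s))
design = np.column_stack([-np.array(feats), np.ones(len(feats))])
(J_eff, c0), *_ = np.linalg.lstsq(design, np.array(energies), rcond=None)

E_eff = lambda s: -J_eff * nn_corr(s)

# 2) Self-learning updates: evolve the cheap effective model for many steps,
#    then accept/reject so that detailed balance w.r.t. E_true is restored
n_acc = 0
for _ in range(200):
    s_prop = metropolis(s, E_eff, 20)
    dE = (E_true(s_prop) - E_true(s)) - (E_eff(s_prop) - E_eff(s))
    if rng.random() < np.exp(-beta * dE):
        s, n_acc = s_prop, n_acc + 1
acc_rate = n_acc / 200
```

Because the effective model tracks the true one closely, the global proposals are accepted most of the time, which is what suppresses autocorrelations.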

Exact construction of deep-Boltzmann-machine network to represent ground states of many-body Hamiltonians

Nomura, Yusuke

We show a deterministic approach to generating a deep-Boltzmann-machine (DBM) network that represents ground states of many-body lattice Hamiltonians. The approach reproduces the exact imaginary-time Hamiltonian evolution by dynamically modifying the DBM structure. The number of neurons grows only linearly with the system size and the imaginary time. Once the network is constructed, physical quantities can be measured by sampling both the visible and hidden variables. The present construction of a classical DBM network provides a novel framework for quantum-to-classical mappings (in special cases, it becomes equivalent to the path-integral formalism). Reference: G. Carleo, Y. Nomura, and M. Imada, arXiv:1802.09558

Versatile machine learning solver using restricted Boltzmann machine

Nomura, Yusuke

A variational wave function written in terms of a restricted Boltzmann machine (RBM) has been shown to be powerful in representing ground states of spin Hamiltonians. In the present study, we further improve the form of the variational wave function by combining the RBM with conventional wave functions used in physics. The combined wave function can be applied not only to bosonic models but also to fermionic models. The combined method substantially improves the accuracy beyond that achieved by the RBM and conventional wave-function methods separately, thus proving its power as an accurate solver. Reference: Y. Nomura, A. S. Darmawan, Y. Yamaji, and M. Imada, Phys. Rev. B 96, 205152 (2017)

Identification of the Berezinskii--Kosterlitz--Thouless transition in quantum and classical models with machine learning algorithms

Richter, Monika

In recent years machine learning has attracted great attention, also among physicists. It turned out that the algorithms, while having vast applications in decision making (for example in games such as chess or Go), pattern recognition, medical diagnosis and finance, can also be applied in the area of condensed matter physics. The study of many-body systems is a challenging task, as the size of the Hilbert space, and consequently the amount of data to analyze, grows exponentially with the size of the system. Therefore, such systems seem to be a perfect 'material' for machine learning algorithms, which are especially well suited to deal with big and complex sets of data. Such an approach has already been successfully used, e.g., for the Ising model. By analyzing Monte Carlo-generated samples, \textsl{supervised} as well as \textsl{unsupervised} learning methods (see e.g. Refs. \cite{key-1,key-2}) were able to correctly identify the transition temperature. The problem appears when one tries to deal with a non-conventional, more subtle type of phase transition, which occurs for example in the two-dimensional classical and quantum XY models. Due to the Mermin--Wagner theorem, systems described by these models cannot undergo the regular ferromagnet--paramagnet phase transition related to spontaneous symmetry breaking. However, this theorem does not exclude the Berezinskii--Kosterlitz--Thouless (BKT) transition, at which vortex--antivortex pairs unbind above a critical temperature $T_{KT}$. In our study, we demonstrate how one can handle this type of phase transition. We make use of two different approaches. The first is based on the idea already applied to the Ising model with the use of a feed-forward deep neural network \cite{key-1}. However, due to the continuous spin configurations which occur in models with the BKT transition, we do not train our neural network on the raw spin configurations. Instead, we transform the original data using trigonometric functions. This procedure makes the learning process more efficient and results in more accurate predictions. In the second approach we use the confusion scheme, which combines the \textsl{supervised} and \textsl{unsupervised learning} \cite{key-3} algorithms. The neural network is trained many times. Each time a different 'fictitious' critical temperature $T^*$ is assumed and all configurations generated at temperatures $T
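The trigonometric preprocessing described above can be illustrated in a few lines: feeding $(\cos\theta_i, \sin\theta_i)$ pairs makes the $2\pi$ periodicity of the configuration explicit, and simple products of these features already reproduce nearest-neighbour terms such as $\cos(\theta_i - \theta_j)$. The lattice size and random configuration in this numpy sketch are placeholders:

```python
import numpy as np

rng = np.random.default_rng(5)
L = 8
theta = rng.uniform(0.0, 2.0 * np.pi, (L, L))   # raw XY configuration (angles)

# Network input: (cos, sin) pairs instead of raw angles, so that the
# 2*pi periodicity of the configuration is built into the data
features = np.stack([np.cos(theta), np.sin(theta)], axis=-1)

# Simple products of these features already encode the bond variables
# cos(theta_i - theta_j) = cos t_i cos t_j + sin t_i sin t_j
right = np.roll(features, -1, axis=1)
bond_x = features[..., 0] * right[..., 0] + features[..., 1] * right[..., 1]
```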

Predicting correlation functions with neural networks

Schindler, Frank

In essence, the goal of any physical theory is to relate existing experimental observations to accurate predictions of further such observations. In particular in many-body theory, we aim to build models that allow the prediction of all possible correlation functions from the knowledge of a few such correlation functions. In this talk, I want to show to what extent the standard approach of solving this problem, which involves writing down a Hamiltonian and diagonalizing it, can be circumvented by the use of machine learning techniques. I will focus on spin chains and present results obtained with simple neural network architectures.

Effect of Electric Field on Breathing Pyrochlores

Sriluckshmy, PV

The coupling between conventional (Maxwell) and emergent electrodynamics in quantum spin ice has been studied by Lantagne-Hurtubise et al. (Phys. Rev. B 96, 125145), who find that a uniform electric field can be used to tune the properties of both the ground state and the excitations of the spin liquid. Extending the study to the case of breathing pyrochlores, we find that a sufficiently strong electric field triggers a quantum phase transition into new U(1) quantum spin liquid phases along a direction that does not show a phase transition in the isotropic limit. We also analyse the phase diagram of breathing pyrochlores in the presence of an electric field using gauge mean-field theory. Finally, we discuss experimental aspects of our results.

Finite-size scaling of the MBL transition with deep neural networks

Théveniaut, Hugo

The many-body localization (MBL) transition separates an ergodic and a localized phase in a disordered interacting quantum system. It continues to defy full theoretical understanding, in particular regarding its universality class. We investigate the MBL transition using deep neural networks fed directly with wavefunctions, and study in detail the influence of the neural network structure as well as finite-size effects using large systems. The input data is preprocessed in such a way that one neural network can handle different system sizes on an equal footing, paving the way to an estimate of the critical exponents of the MBL transition.
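The abstract does not specify the preprocessing, but one simple size-independent representation, shown here purely as an illustration of the idea (not necessarily the authors' scheme), is a fixed-length histogram of the wavefunction weights $|\psi_i|^2$:

```python
import numpy as np

def size_independent_features(psi, n_bins=32):
    """Histogram of wavefunction weights |psi_i|^2 on a log scale.

    Returns a fixed-length feature vector regardless of the Hilbert-space
    dimension, so one network can be fed wavefunctions from different
    system sizes. (Illustrative choice only.)
    """
    w = np.abs(psi) ** 2
    w = w[w > 1e-16]
    hist, _ = np.histogram(np.log10(w), bins=n_bins, range=(-16, 0))
    return hist / hist.sum()

# Wavefunctions from different system sizes map to the same feature length
psi_small = np.random.default_rng(2).normal(size=2**8)
psi_large = np.random.default_rng(3).normal(size=2**12)
psi_small /= np.linalg.norm(psi_small)
psi_large /= np.linalg.norm(psi_large)
# both feature vectors have shape (32,)
```

Localized and extended states have very different weight distributions, so such a histogram retains the physically relevant information while discarding the system-size dependence.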

Machine learning the interacting ground state electron density

Vandermause, Jonathan

There is great interest in using machine learning methods to reduce the computational cost of materials simulation. Recent work has shown that it is possible to learn the mapping from external potential to electron density for small molecular systems using kernel ridge regression, improving the accuracy of machine learning models trained on density functional theory calculations [1]. In this talk, we show how this potential-to-density mapping can be made more accurate by training on quantum Monte Carlo calculations and discuss extensions of this technique to solid-state systems. [1] Brockherde et al. Bypassing the Kohn-Sham equations with machine learning. Nature Communications 8, 872 (2017).
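A minimal sketch of the kernel-ridge step on synthetic data. The "density" below is a Boltzmann-weight stand-in, not a DFT or QMC result, and the grid, kernel width, and regularization are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)

def toy_density(v, T=0.1):
    # Toy stand-in for the true potential -> density map (a Boltzmann
    # weight); in the real application these pairs come from DFT/QMC data.
    w = np.exp(-(v - v.min()) / T)
    return w / w.sum()

# Training set: random Gaussian-well potentials and their toy densities
centers = rng.uniform(0.2, 0.8, size=40)
V = np.stack([-np.exp(-(x - c) ** 2 / 0.01) for c in centers])
N = np.stack([toy_density(v) for v in V])

# Kernel ridge regression with a Gaussian kernel on the potentials
def gram(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

lam = 1e-6
K = gram(V, V)
alpha = np.linalg.solve(K + lam * np.eye(len(V)), N)  # one weight set per grid point

# Predict the density of an unseen potential
v_test = -np.exp(-(x - 0.5) ** 2 / 0.01)
n_pred = gram(v_test[None, :], V) @ alpha
```

The predicted density for the unseen well is a kernel-weighted combination of the training densities, peaked near the test-well position.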

Adaptive population Monte Carlo simulations

Weigel, Martin

Population annealing is a sequential Monte Carlo scheme that is potentially able to make use of highly parallel computational resources. Additionally, it promises to allow for the accelerated simulation of systems with complex free-energy landscapes, much like the better-known replica-exchange or parallel-tempering approach. We equip this method with self-adaptive and machine-learning schemes for choosing the algorithmic parameters, including the temperature and sweep protocols as well as the population size. The resulting method is significantly more efficient for simulating systems with complex free-energy landscapes than some more traditional approaches, and it is particularly well suited for massively parallel computing environments such as (clusters of) GPUs.
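A minimal sketch of population annealing with one self-adaptive ingredient: the inverse-temperature step is chosen so that the effective sample size of the reweighting stays high. The 2D Ising model, population size, and thresholds below are our own illustrative choices, not the parameters of the poster:

```python
import numpy as np

rng = np.random.default_rng(0)
L, N_pop = 8, 100

def energy(s):
    # Nearest-neighbour 2D Ising energy with periodic boundaries
    return -(np.sum(s * np.roll(s, 1, 0)) + np.sum(s * np.roll(s, 1, 1)))

def sweep(s, beta):
    # One Metropolis sweep over the lattice
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        dE = 2 * s[i, j] * (s[(i+1) % L, j] + s[(i-1) % L, j]
                            + s[i, (j+1) % L] + s[i, (j-1) % L])
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i, j] *= -1

pop = [rng.choice([-1, 1], size=(L, L)) for _ in range(N_pop)]
beta, beta_max = 0.0, 0.6
while beta < beta_max:
    E = np.array([energy(s) for s in pop])
    # Self-adaptive step: largest d_beta keeping the effective sample size high
    d_beta = 0.1
    while True:
        w = np.exp(-d_beta * (E - E.min()))
        ess = w.sum() ** 2 / (w ** 2).sum()
        if ess > 0.8 * N_pop or d_beta < 1e-3:
            break
        d_beta /= 2
    beta = min(beta + d_beta, beta_max)
    # Resample the population proportionally to the weights, then equilibrate
    idx = rng.choice(N_pop, size=N_pop, p=w / w.sum())
    pop = [pop[k].copy() for k in idx]
    for s in pop:
        sweep(s, beta)

mean_E = np.mean([energy(s) for s in pop]) / (L * L)
```

In a production version the sweep count and population size would be adapted as well; here only the temperature protocol is self-tuning.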

Interpretable Neural Networks for Learning Phase Diagrams

Wetzel, Sebastian

In a very short time, artificial neural networks have achieved impressive results when tasked with calculating phase diagrams. These algorithms are, however, mostly treated as black boxes, and hence we cannot blindly trust their results. Here, we discuss how to interpret neural networks when they are tasked with classifying phases in a supervised manner. Further, we employ an unsupervised neural network, the so-called (variational) autoencoder, which can be interpreted naturally. It turns out that these algorithms have the potential to reveal the nature of the ordered phase.
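As a small demonstration of the interpretability claim, the linear limit of an autoencoder (equivalent to PCA) applied to toy Ising-like data recovers the magnetization as its leading latent variable. The data generation and sizes are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, L = 200, 8

# Toy Ising-like samples: half "ordered" (mostly aligned, random global
# sign), half "disordered" (independent random spins)
ordered = np.sign(rng.normal(0.9, 1.0, size=(n // 2, L * L))
                  * rng.choice([-1, 1], size=(n // 2, 1)))
disordered = rng.choice([-1, 1], size=(n // 2, L * L))
X = np.vstack([ordered, disordered]).astype(float)

# Linear autoencoder = PCA: the leading latent component
Xc = X - X.mean(0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
latent = Xc @ Vt[0]

# The learned latent variable tracks the magnetization (order parameter)
m = X.mean(1)
corr = abs(np.corrcoef(latent, m)[0, 1])
```

The correlation between the latent variable and the magnetization is close to one, which is the sense in which the autoencoder "reveals" the order parameter.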

Density Matrix Renormalization Group Study of One Dimensional Models beyond the Born-Oppenheimer Approximation

Yang, Mingru

We study one-dimensional models of molecules and solids in which both the electrons and the nuclei are treated as quantum particles, going beyond the usual Born-Oppenheimer approximation. The continuous system is approximated by a grid which computationally resembles a ladder, with the electrons living on one leg and the nuclei on the other. To simulate this system efficiently with DMRG, a three-site algorithm has been implemented. We also use a compression method to treat the long-range interactions between charged particles. We find that 1D diatomic molecules ("H2") with spin-1/2 nuclei in the spin-triplet state unbind when the nuclear mass is reduced to only a few times the electron mass. The molecule with nuclei in the singlet state always binds. The case of spin-0 bosonic nuclei is investigated as well.

Lattice model constructions for gapless domain walls between topological phases

Yang, Shuo

Lattice models of gapless domain walls between twisted and untwisted gauge theories of a finite group G are constructed systematically. As simple examples, we numerically studied the gapless domain walls between twisted and untwisted Z_N (with N<6) gauge models in 2+1D using the state-of-the-art loop optimization of the tensor network renormalization algorithm. We also studied the physical mechanism behind these gapless domain walls and obtained quantum field theory descriptions that agree perfectly with our numerical results. By taking advantage of the systematic classification and construction of twisted gauge models using group cohomology theory, we construct general lattice models that realize gapless domain walls for an arbitrary finite symmetry group G. Such constructions can be generalized to arbitrary dimensions and might provide a systematic way to study gapless domain walls and topological quantum phase transitions. Refs: Chenfeng Bao, Shuo Yang, Chenjie Wang, Zheng-Cheng Gu, arXiv:1801.00719. Shuo Yang, Zheng-Cheng Gu, Xiao-Gang Wen, Phys. Rev. Lett. 118, 110504 (2017).

Cluster updating classical spin systems by equivalent Boltzmann machines

Yoshioka, Nobuyuki

Undoubtedly, the construction of global update methods is key to accelerating sampling, and thus to an accurate understanding of the physics, in Monte Carlo simulations. The significant slow-down in the vicinity of a critical point, for instance, can be overcome by such an algorithm. Here, we focus on classical Ising systems with p-body interactions that can be mapped exactly to Boltzmann machines. We find equivalent expressions in an extended space such that well-established cluster update methods can be applied. In our presentation, we discuss the significant reduction in the autocorrelation time of the Markov chains achieved by the newly proposed rejection-free algorithm.
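The Boltzmann-machine mapping itself is not reproduced here, but for plain two-body Ising couplings the cluster update it enables reduces to the standard Swendsen-Wang algorithm, sketched below in NumPy (lattice size, temperature, and the union-find bookkeeping are our own illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
L, beta = 16, 0.3
spins = rng.choice([-1, 1], size=(L, L))

def sw_update(s, beta):
    """One Swendsen-Wang cluster update for the 2D Ising model."""
    p_bond = 1 - np.exp(-2 * beta)   # Fortuin-Kasteleyn bond probability
    parent = np.arange(L * L)        # union-find forest over lattice sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Activate bonds between aligned neighbours with probability p_bond
    for i in range(L):
        for j in range(L):
            for di, dj in ((1, 0), (0, 1)):
                ni, nj = (i + di) % L, (j + dj) % L
                if s[i, j] == s[ni, nj] and rng.random() < p_bond:
                    union(i * L + j, ni * L + nj)

    # Flip each cluster independently with probability 1/2
    flip = {}
    for i in range(L):
        for j in range(L):
            r = find(i * L + j)
            if r not in flip:
                flip[r] = rng.random() < 0.5
            if flip[r]:
                s[i, j] *= -1

for _ in range(20):
    sw_update(spins, beta)
```

Because entire clusters are flipped at once, the autocorrelation time near criticality is drastically shorter than for single-spin Metropolis updates; the mapping in the poster extends this idea to p-body interactions.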