Research: |
|
|
In this work, Exact Decoding of Quantum Error-Correcting Codes, we derive the first exact maximum-likelihood decoder for the repetition code under circuit-level noise.
The decoder, called PLANAR, first maps the optimal decoding problem to the computation of the partition function of a spin-glass model, then solves it on planar graphs in polynomial time.
The approach extends to any quantum error-correcting code whose maximum-likelihood decoding maps to a planar spin-glass problem, including the surface code under independent code-capacity noise.
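Schematically (notation illustrative, not drawn verbatim from the paper), maximum-likelihood decoding picks the logical class whose total probability over all errors consistent with the syndrome is largest, and it is this coset sum that plays the role of a spin-glass partition function:

    \hat{L} \;=\; \operatorname*{arg\,max}_{L}\, P(L \mid s)
            \;=\; \operatorname*{arg\,max}_{L} \sum_{S \in \mathcal{S}} P\bigl(E_0(s)\, S\, L\bigr)
            \;\equiv\; \operatorname*{arg\,max}_{L}\, Z_L ,

where s is the measured syndrome, E_0(s) a fixed reference error consistent with it, and the sum over the stabilizer group defines the partition function Z_L that PLANAR evaluates exactly when the interaction graph is planar.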
|
|
In arXiv:2111.03011 (published here), we solved the sampling problem of Google's Sycamore quantum computer, reducing the simulation time of Google's Sycamore quantum circuits from 10,000 years to dozens of seconds using our newly proposed tensor network methods. Press: Science News, ITPCAS
|
|
In arXiv:2103.03074 (published here), we computed, for the first time, exact bitstring amplitudes and probabilities for Google's Sycamore quantum supremacy circuits using our proposed big-batch tensor network method, and obtained one million correlated bitstring samples with a linear cross-entropy benchmark fidelity (XEB) of 0.739, higher than that of Google's quantum computer. The approach thus passes Google's XEB test and works as a spoof of Google's quantum computer.
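For reference, here is a minimal sketch of the standard linear XEB estimator (not the paper's tensor-network pipeline; the ideal_probs array stands for the ideal circuit probabilities of the sampled bitstrings):

    import numpy as np

    def linear_xeb(ideal_probs, n_qubits):
        # linear cross-entropy benchmark fidelity:
        # F_XEB = 2^n * <P(x_i)> - 1, averaged over sampled bitstrings x_i,
        # where P(x_i) is the ideal output probability of x_i
        return 2 ** n_qubits * float(np.mean(ideal_probs)) - 1.0

    # sanity check: exactly uniform probabilities 1/2^n give F_XEB = 0
    print(linear_xeb(np.full(10 ** 6, 2.0 ** -53), n_qubits=53))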
|
|
In this paper, we present a unified exact approach to compute the ground-state energy, identify the optimal configuration, and count the number of solutions for spin glasses, by introducing tropical algebra, defined on a semiring, into tensor networks. The approach brings together concepts from graphical models, tensor networks, differentiable programming, and quantum circuit simulation, and easily utilizes the computational power of graphics processing units (GPUs). As applications, we compute the exact ground-state energy of Ising spin glasses on the square lattice with up to 1024 spins, on the cubic lattice with up to 216 spins, and on 3-regular random graphs with up to 220 spins, on a single GPU; we also obtain the exact ground-state energy of the Ising spin glass on the chimera graph of the D-Wave quantum annealer with 512 qubits in less than 100 seconds.
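As a minimal illustration of the tropical idea (a toy 1D Ising chain contracted with (min, +) matrix products; not the paper's GPU tensor-network code):

    import numpy as np

    def tropical_matmul(A, B):
        # (min, +) semiring product: C[i, k] = min_j (A[i, j] + B[j, k]);
        # replacing (sum, product) by (min, +) turns a partition-function
        # contraction into a ground-state energy computation
        return np.min(A[:, :, None] + B[None, :, :], axis=1)

    # toy example: 1D Ising chain E = -sum_i J_i s_i s_{i+1}, s_i in {-1, +1}
    rng = np.random.default_rng(0)
    J = rng.choice([-1.0, 1.0], size=9)       # 10-spin chain, random couplings
    spins = np.array([-1.0, 1.0])
    bonds = [np.array([[-Jk * s * sp for sp in spins] for s in spins]) for Jk in J]

    M = bonds[0]
    for T in bonds[1:]:
        M = tropical_matmul(M, T)
    print("exact ground-state energy:", M.min())   # minimize over boundary spins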
|
|
Many hard problems in physics, computer science, and machine learning can in principle be solved using tensor networks. However, the contraction of tensor networks on irregular graphs is considered intractable. In this PRL paper we introduce a general method for approximately contracting arbitrary tensor networks, based on matrix product states and the density-matrix renormalization group method. Our method overcomes the computational intractability of contracting tensor networks for long-range interacting systems, and gives excellent performance on graphical models and the simulation of quantum circuits.
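A minimal sketch of the truncated-SVD compression primitive that MPS-based contraction relies on (the sweeping and contraction-ordering strategy of the actual method is not shown):

    import numpy as np

    def compress_bond(theta, chi_max):
        # split a two-site tensor (reshaped into a matrix) with a truncated SVD,
        # keeping at most chi_max singular values; this bounds the bond dimension
        U, S, Vh = np.linalg.svd(theta, full_matrices=False)
        chi = min(chi_max, int(np.count_nonzero(S > 1e-12)))
        return U[:, :chi] * S[None, :chi], Vh[:chi, :]

    # toy usage: compress a random 16x16 matrix down to bond dimension 4
    A, B = compress_bond(np.random.rand(16, 16), chi_max=4)
    print(A.shape, B.shape)   # (16, 4) (4, 16); A @ B approximates the original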
|
|
Computing free energies, estimating physical quantities, and generating uncorrelated samples are fundamental problems in statistical mechanics. In this work we propose a new framework for solving statistical mechanics problems for finite-size systems. The approach extends the celebrated variational mean-field approaches using autoregressive networks, a neural network model that supports direct sampling and exact calculation of the normalized probability of configurations. Training of the network employs the policy-gradient approach from reinforcement learning, which gives an unbiased estimate of the gradient with respect to the variational parameters. We have successfully applied our approach to several classic systems, including 2D Ising models, the Hopfield model, Sherrington--Kirkpatrick spin glasses, and the inverse Ising model. The paper is published in Physical Review Letters and was selected as an Editors' Suggestion.
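A minimal PyTorch-style sketch of one training step, assuming a hypothetical autoregressive model exposing sample and log_prob methods (names are illustrative, not the paper's code):

    import torch

    def variational_step(model, energy_fn, beta, batch=1024):
        # score-function (REINFORCE) estimate of the gradient of the variational
        # free energy F = E_q[ E(s) + (1/beta) log q(s) ]
        with torch.no_grad():
            s = model.sample(batch)                # direct, uncorrelated samples
        log_q = model.log_prob(s)                  # exact normalized log q(s)
        with torch.no_grad():
            f = beta * energy_fn(s) + log_q        # per-sample beta * "free energy"
            f = f - f.mean()                       # baseline for variance reduction
        loss = torch.mean(f * log_q)               # gradient matches d(beta F)/d theta
        loss.backward()
        return loss.item()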
|
|
Many readers are probably familiar with the Boltzmann machine, which models the joint probability distribution of data using the Boltzmann distribution. The Boltzmann machine is a great contribution from statistical physics to machine learning. In this Physical Review X paper, we propose a fresh unsupervised machine learning model borrowed from quantum physics, which models the joint distribution of data using Born's rule. Thus we call it the Born machine. This model connects tensor networks and generative modeling. You can also find a tutorial on tensor networks, matrix product states, and generative learning.
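A toy sketch of the Born rule with a matrix product state (tensor shapes are illustrative; Z, the squared norm of the MPS, is assumed to be computed separately):

    import numpy as np

    def mps_amplitude(tensors, bits):
        # psi(x): contract one slice per site, selected by the bit value;
        # tensors[k] has shape (D_left, 2, D_right), with D = 1 at the boundaries
        v = tensors[0][:, bits[0], :]
        for A, b in zip(tensors[1:], bits[1:]):
            v = v @ A[:, b, :]
        return float(v.squeeze())

    def born_prob(tensors, bits, Z):
        # Born's rule: p(x) = |psi(x)|^2 / Z
        return mps_amplitude(tensors, bits) ** 2 / Z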
|
|
Spectral methods are popular for detecting global structures in data that can be represented as a matrix. However, when the data matrix is sparse or noisy, classic spectral methods usually fail, due to localization of eigenvectors (or singular vectors) induced by the sparsity or noise.
In this paper (NIPS 2016) we propose a general method to solve the localization problem by learning a regularization matrix from the localized eigenvectors. Here is a demo of the algorithm, "X-Laplacian".
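As a simpler generic diagnostic (not the X-Laplacian algorithm itself), localization of an eigenvector can be quantified by its inverse participation ratio:

    import numpy as np

    def inverse_participation_ratio(v):
        # IPR of a unit-norm vector: about 1/n for a delocalized vector,
        # of order 1 when the mass concentrates on a few entries
        v = v / np.linalg.norm(v)
        return float(np.sum(v ** 4))

    # toy usage on the leading eigenvectors of a symmetric data matrix A
    A = np.random.rand(200, 200); A = (A + A.T) / 2
    w, V = np.linalg.eigh(A)
    print([round(inverse_participation_ratio(V[:, i]), 4) for i in range(-3, 0)])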
|
|
Many real-world networks are dynamic, with nodes changing their connections and affiliations over time in complicated ways. This situation makes community detection more challenging, but correlations across time provide a means to circumvent this issue.
In this Physical Review X paper, we derive a precise mathematical limit on our ability to recover the underlying community structure in a dynamic network, which depends only on the strength of the hidden communities and the rate at which nodes change their community membership.
|
|
Maximizing modularity is the most popular method for detecting communities in networks. However, it is prone to overfitting. In this PNAS paper with Cris Moore, we proposed to solve this overfitting problem using ideas from statistical physics and developed an efficient algorithm for detecting communities and hierarchies in large networks.
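For reference, the quantity being maximized is the standard Newman–Girvan modularity

    Q \;=\; \frac{1}{2m} \sum_{ij} \left( A_{ij} - \frac{k_i k_j}{2m} \right) \delta(c_i, c_j),

where A is the adjacency matrix, k_i the degree of node i, m the number of edges, and c_i the community label of node i; because even random graphs admit partitions with sizable Q, maximizing it blindly can report communities that are not statistically meaningful.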
|
|
Spectral algorithms are widely used for clustering, yet they often break down in sparse networks due to the emergence of localized eigenvectors.
In a PNAS paper with collaborators, we introduced a novel spectral algorithm built on the non-backtracking operator that is immune to this pathology and performs remarkably well on large sparse networks.
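A minimal sketch of the non-backtracking operator on the directed edges of a small graph (construction only; the clustering step, which uses its leading eigenvectors, is omitted):

    import numpy as np

    def nonbacktracking_matrix(edges):
        # B[(u->v), (w->x)] = 1 iff v == w and x != u (the walk never backtracks)
        darts = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
        index = {d: i for i, d in enumerate(darts)}
        B = np.zeros((len(darts), len(darts)))
        for (u, v) in darts:
            for (w, x) in darts:
                if v == w and x != u:
                    B[index[(u, v)], index[(w, x)]] = 1.0
        return B, darts

    # toy usage: a triangle plus a pendant edge
    B, darts = nonbacktracking_matrix([(0, 1), (1, 2), (2, 0), (2, 3)])
    print(B.shape, np.linalg.eigvals(B).real.max().round(3))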
|
|