Research: 


Many hard problems in physics, computer science, and machine learning can in principle be solved using tensor networks. However, contracting tensor networks on irregular graphs is considered intractable. In this PRL paper we introduce a general method for approximately contracting arbitrary tensor networks, based on matrix product states and the density matrix renormalization group (DMRG) method. Our method overcomes the computational intractability of contracting tensor networks in long-range interacting systems, and delivers excellent performance on graphical models and the simulation of quantum circuits.
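The workhorse behind such contraction schemes is compressing a big tensor into a matrix product state by sequential truncated SVDs. The sketch below is not the paper's algorithm, just this core primitive with toy shapes and an illustrative `max_bond` cutoff; with a large enough bond dimension the reconstruction is exact.

```python
import numpy as np

def tensor_to_mps(T, max_bond):
    """Split an n-index tensor into a matrix product state by
    sequential SVDs, truncating each bond to at most max_bond."""
    dims = T.shape
    n = len(dims)
    mps = []
    M = T.reshape(dims[0], -1)
    bond = 1
    for k in range(n - 1):
        M = M.reshape(bond * dims[k], -1)
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        chi = min(max_bond, len(s))          # truncate the bond dimension
        U, s, Vh = U[:, :chi], s[:chi], Vh[:chi]
        mps.append(U.reshape(bond, dims[k], chi))
        M = s[:, None] * Vh                  # push the weights to the right
        bond = chi
    mps.append(M.reshape(bond, dims[-1], 1))
    return mps

def mps_to_tensor(mps):
    """Contract an MPS back into a dense tensor."""
    out = mps[0]                             # shape (1, d0, chi)
    for A in mps[1:]:
        out = np.tensordot(out, A, axes=([-1], [0]))
    return out.reshape([core.shape[1] for core in mps])
```

Approximate contraction methods apply exactly this truncation repeatedly, so the error is controlled by the discarded singular values.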


Computing free energies, estimating physical quantities, and generating uncorrelated samples are fundamental problems in statistical mechanics. In this work we propose a new framework for solving statistical mechanics problems for systems of finite size. The approach extends the celebrated variational mean-field methods using autoregressive networks, a neural-network model that supports direct sampling and exact computation of the normalized probability of configurations. Training of the network employs the policy-gradient method from reinforcement learning, which gives an unbiased estimate of the gradient with respect to the variational parameters. We have successfully applied our approach to several classic systems, including 2D Ising models, the Hopfield model, Sherrington-Kirkpatrick spin glasses, and the inverse Ising problem. The paper is published in Physical Review Letters and was selected as an Editors' Suggestion.
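To make the idea concrete, here is a minimal numpy sketch on a toy 4-spin Ising chain. The linear autoregressive parameterization and all hyperparameters are illustrative choices, not the paper's architecture: we sample configurations directly from the model, evaluate their exact log-probability, and descend the REINFORCE (policy-gradient) estimate of the gradient of the variational free energy F = ⟨E + T log q⟩, which upper-bounds the true free energy.

```python
import numpy as np

rng = np.random.default_rng(0)
N, J, T = 4, 1.0, 2.0            # chain length, coupling, temperature (toy values)

def energy(s):
    # 1D Ising chain with open boundaries: E = -J * sum_i s_i s_{i+1}
    return -J * np.sum(s[:, :-1] * s[:, 1:], axis=1)

# autoregressive model: p(s_i = +1 | s_<i) = sigmoid(W[i, :i] . s_<i + b[i])
W = np.zeros((N, N))
b = np.zeros(N)

def sample_and_logprob(batch):
    s = np.zeros((batch, N))
    logq = np.zeros(batch)
    for i in range(N):
        p = 1.0 / (1.0 + np.exp(-(s[:, :i] @ W[i, :i] + b[i])))
        s[:, i] = np.where(rng.random(batch) < p, 1.0, -1.0)
        logq += np.where(s[:, i] > 0, np.log(p + 1e-12), np.log(1 - p + 1e-12))
    return s, logq

lr, batch = 0.1, 256
for step in range(2000):
    s, logq = sample_and_logprob(batch)
    R = energy(s) + T * logq          # per-sample free-energy estimator
    A = R - R.mean()                  # batch-mean baseline reduces variance
    gW, gb = np.zeros_like(W), np.zeros_like(b)
    for i in range(N):
        p = 1.0 / (1.0 + np.exp(-(s[:, :i] @ W[i, :i] + b[i])))
        dh = (np.where(s[:, i] > 0, 1.0, 0.0) - p) * A   # d log q / d h_i, reweighted
        gW[i, :i] = dh @ s[:, :i] / batch
        gb[i] = dh.mean()
    W -= lr * gW
    b -= lr * gb

s, logq = sample_and_logprob(4096)
F_var = np.mean(energy(s) + T * logq)   # variational free energy after training

# exact free energy by enumerating all 2^N states
states = np.array([[1 if (k >> i) & 1 else -1 for i in range(N)]
                   for k in range(2 ** N)], float)
F_exact = -T * np.log(np.sum(np.exp(-energy(states) / T)))
```

Because E_q[∇ log q] = 0, the extra T log q term in the reward does not bias the gradient, which is the key point of the policy-gradient estimator.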


Many readers are probably familiar with the Boltzmann machine, which models the joint probability distribution of data using the Boltzmann distribution, and is a great contribution from statistical physics to machine learning. In this Physical Review X paper we propose a fresh unsupervised machine-learning model borrowed from quantum physics, which models the joint distribution of data using Born's rule; we therefore call it the Born machine. The model connects tensor networks and generative modeling. You can find a tutorial on the topics of tensor networks, matrix product states, and generative learning.
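A minimal sketch of the idea, assuming an MPS parameterization of the wavefunction with toy sizes: probabilities are squared amplitudes, p(x) = ψ(x)²/Z, and the normalization Z is obtained exactly by contracting transfer matrices at a cost polynomial in the bond dimension, rather than by summing over all configurations.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(1)
n, d, chi = 4, 2, 3   # sites, local dimension, bond dimension (toy values)

# random real MPS wavefunction: psi(x) = A_1[x_1] A_2[x_2] ... A_n[x_n]
mps = [rng.normal(size=(1 if i == 0 else chi, d, chi if i < n - 1 else 1))
       for i in range(n)]

def amplitude(x):
    """Contract the MPS chain for one configuration x."""
    M = np.eye(1)
    for A, xi in zip(mps, x):
        M = M @ A[:, xi, :]
    return M[0, 0]

# normalization Z = sum_x psi(x)^2 via transfer matrices
E = np.ones((1, 1))
for A in mps:
    E = np.einsum('ab,aic,bid->cd', E, A, A)
Z = E[0, 0]

def prob(x):
    """Born's rule: probability is the squared amplitude."""
    return amplitude(x) ** 2 / Z
```

The same contraction structure also supports exact sequential sampling, which is what makes the model attractive for generative learning.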


Spectral methods are popular for detecting global structure in data that can be represented as a matrix. However, when the data matrix is sparse or noisy, classic spectral methods usually fail, due to localization of eigenvectors (or singular vectors) induced by the sparsity or noise.
In this NIPS 2016 paper we propose a general method that solves the localization problem by learning a regularization matrix from the localized eigenvectors. Here is a demo of the algorithm, "X-Laplacian".
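As a quick illustration of the failure mode (this is not the X-Laplacian algorithm itself, just the localization it is designed to cure, on a hypothetical hub-planted graph): the leading eigenvector of a sparse adjacency matrix concentrates on a high-degree hub, which is detected by its inverse participation ratio staying O(1) instead of decaying like 1/n.

```python
import numpy as np

rng = np.random.default_rng(2)
n, c, hub_deg = 300, 3.0, 60   # graph size, mean degree, planted hub degree (toy values)

# sparse Erdos-Renyi graph with average degree c
A = (rng.random((n, n)) < c / n).astype(float)
A = np.triu(A, 1)
A = A + A.T

# plant a hub: node 0 gets hub_deg extra neighbors
nbrs = rng.choice(np.arange(1, n), size=hub_deg, replace=False)
A[0, nbrs] = A[nbrs, 0] = 1.0

w, V = np.linalg.eigh(A)
v = V[:, -1]                   # leading eigenvector
ipr = np.sum(v ** 4)           # inverse participation ratio:
                               # ~1/n if delocalized, O(1) if localized
```

Such a localized eigenvector carries no information about global structure, which is why an unregularized spectral method breaks down here.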


Many real-world networks are dynamic, with nodes changing their connections and affiliations over time in complicated ways. This makes community detection more challenging, but correlations across time provide a means to circumvent the issue.
In this Physical Review X paper, we derive a precise mathematical limit on our ability to recover the underlying community structure in a dynamic network, which depends only on the strength of the hidden communities and the rate at which nodes change their community membership.


Maximizing modularity is the most popular method of detecting communities in networks. However, it is prone to overfitting. In this PNAS paper, written with Cris Moore, we propose to solve this overfitting problem using ideas from statistical physics, and develop an efficient algorithm for detecting communities and hierarchies in large networks.



Spectral algorithms are a popular method of clustering data. However, they often fail on sparse networks because of the existence of localized eigenvectors.
In a paper published in PNAS, with collaborators we introduced a new spectral algorithm based on the non-backtracking operator, which is immune to this disease and works very well on large sparse networks.
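A compact sketch of the construction (dense and brute-force for readability; all parameters are toy values, not the paper's experiments): build the non-backtracking matrix B on the directed edges of a planted two-group network, then read the communities off the signs of its second eigenvector projected back onto nodes.

```python
import numpy as np

rng = np.random.default_rng(3)
n, cin, cout = 120, 12.0, 2.0            # nodes, in/out-group mean degrees
truth = np.array([0] * (n // 2) + [1] * (n // 2))

# planted-partition (stochastic block model) graph
P = np.where(truth[:, None] == truth[None, :], cin / n, cout / n)
A = (rng.random((n, n)) < P)
A = np.triu(A, 1)
A = (A + A.T).astype(int)

# directed-edge list and non-backtracking matrix:
# B[(u->v), (v->w)] = 1 iff w != u (walks may not immediately backtrack)
edges = [(u, v) for u in range(n) for v in range(n) if A[u, v]]
idx = {e: i for i, e in enumerate(edges)}
m = len(edges)
B = np.zeros((m, m))
for (u, v), i in idx.items():
    for w in np.nonzero(A[v])[0]:
        if w != u:
            B[i, idx[(v, w)]] = 1.0

# the second-largest real eigenvalue carries the community structure
w_, V = np.linalg.eig(B)
order = np.argsort(-w_.real)
vec = V[:, order[1]].real

# project the edge vector onto nodes (sum over incoming edges), take signs
score = np.zeros(n)
for (u, v), i in idx.items():
    score[v] += vec[i]
pred = (score > 0).astype(int)
acc = max(np.mean(pred == truth), np.mean(pred != truth))
```

Because B is defined on directed edges it never localizes on high-degree hubs, and its informative eigenvalues separate cleanly from a bulk of radius about the square root of the mean degree.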



