Finding community structure in very large networks
The discovery and analysis of community structure in networks is a topic of considerable recent interest within the physics community, but most methods proposed so far are unsui...
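The fast greedy modularity-maximization approach this abstract describes is available in NetworkX as `greedy_modularity_communities`; a minimal sketch on a toy graph (the two-triangle graph and its expected split are illustrative choices, not from the paper):

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Two tight triangles joined by a single bridge edge: an obvious
# two-community structure that modularity maximization should find.
G = nx.Graph()
G.add_edges_from([(0, 1), (1, 2), (0, 2),   # triangle A
                  (3, 4), (4, 5), (3, 5),   # triangle B
                  (2, 3)])                   # bridge

communities = [frozenset(c) for c in greedy_modularity_communities(G)]
print([sorted(c) for c in communities])
```

On this graph the greedy merging recovers the two triangles as separate communities.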
This paper considers a natural error correcting problem with real valued input/output. We wish to recover an input vector f ∈ ℝⁿ from corrupted measurements y = Af + e....
Abstract Suppose we wish to recover a vector x₀ ∈ ℝᵐ (e.g., a digital signal or image) from incomplete and contaminated observations y = Ax₀ + e; A is an n × m matrix wi...
Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, pr...
The time-frequency and time-scale communities have recently developed a large number of overcomplete waveform dictionaries --- stationary wavelets, wavelet packets, cosine packe...
Suppose we are given a vector f in a class F ⊂ ℝᴺ, e.g., a class of digital signals or digital images. How many linear measurements do we need to make about f to be able to reco...
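The recovery procedure behind these abstracts is ℓ₁ minimization: find the vector of smallest ℓ₁ norm consistent with the linear measurements. A small sketch using `scipy.optimize.linprog` via the standard split x = u − v (the dimensions, sparsity level, and seed are arbitrary illustrative choices):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 30, 100                          # 30 measurements of a length-100 signal
A = rng.standard_normal((n, m))
x0 = np.zeros(m)
x0[[5, 37, 80]] = [2.0, -1.5, 3.0]      # 3-sparse ground truth
y = A @ x0

# min ||x||_1  s.t.  Ax = y, as a linear program over x = u - v with u, v >= 0
c = np.ones(2 * m)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * m))
x_hat = res.x[:m] - res.x[m:]
err = np.max(np.abs(x_hat - x0))
print(f"max recovery error: {err:.2e}")
```

With far fewer random measurements than unknowns (30 vs. 100), the sparse signal is recovered essentially exactly, which is the phenomenon these papers quantify.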
Abstract The sparsity which is implicit in MR images is exploited to significantly undersample k-space. Some MR images such as angiograms are already sparse in the pixel repres...
This article is about a curious phenomenon. Suppose we have a data matrix, which is the superposition of a low-rank component and a sparse component. Can we recover each compone...
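The low-rank-plus-sparse decomposition described above (Principal Component Pursuit) is commonly solved by alternating singular-value thresholding and soft thresholding; the inexact-ALM sketch below uses parameter choices that are standard practice, not necessarily the article's exact algorithm:

```python
import numpy as np

def rpca(M, max_iter=500, tol=1e-7):
    """Decompose M ~ L + S (L low-rank, S sparse) via an inexact-ALM sketch
    of Principal Component Pursuit. Step sizes are conventional defaults."""
    m, n = M.shape
    lam = 1.0 / np.sqrt(max(m, n))            # standard PCP weight
    norm_M = np.linalg.norm(M)
    mu = 1.25 / np.linalg.norm(M, 2)
    rho, mu_bar = 1.5, mu * 1e7
    Y = np.zeros_like(M); L = np.zeros_like(M); S = np.zeros_like(M)
    for _ in range(max_iter):
        # Singular-value thresholding updates the low-rank component
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * np.maximum(sig - 1.0 / mu, 0)) @ Vt
        # Elementwise soft thresholding updates the sparse component
        T = M - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0)
        Z = M - L - S
        Y += mu * Z
        mu = min(rho * mu, mu_bar)
        if np.linalg.norm(Z) / norm_M < tol:
            break
    return L, S

# Synthetic check: rank-3 matrix plus ~5% large sparse corruptions
rng = np.random.default_rng(0)
L0 = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 50))
S0 = np.zeros((50, 50))
mask = rng.random((50, 50)) < 0.05
S0[mask] = 10 * rng.standard_normal(mask.sum())
L_hat, S_hat = rpca(L0 + S0)
err = np.linalg.norm(L_hat - L0) / np.linalg.norm(L0)
print(f"relative error in L: {err:.2e}")
```

In the incoherent, low-rank, sparsely-corrupted regime the two components separate almost exactly, which is the article's central claim.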
LIBLINEAR is an open source library for large-scale linear classification. It supports logistic regression and linear support vector machines. We provide easy-to-use command-lin...
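Besides its own command-line tools, LIBLINEAR is the solver behind scikit-learn's linear classifiers; a quick sketch of both supported model types on synthetic data (the dataset parameters are arbitrary illustrative choices):

```python
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression

# Synthetic binary classification problem
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, class_sep=2.0, random_state=0)

# Both models below are backed by the LIBLINEAR library:
svm = LinearSVC().fit(X, y)                              # linear SVM
logreg = LogisticRegression(solver="liblinear").fit(X, y)  # logistic regression

svm_acc, lr_acc = svm.score(X, y), logreg.score(X, y)
print(f"SVM: {svm_acc:.3f}  LogReg: {lr_acc:.3f}")
```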
Abstract We consider the problem of estimating sparse graphs by a lasso penalty applied to the inverse covariance matrix. Using a coordinate descent procedure for the lasso, we ...
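The lasso-penalized inverse-covariance estimator described above is implemented in scikit-learn as `GraphicalLasso`; a small sketch on data drawn from a known sparse precision matrix (the chain-graph example, sample size, and penalty `alpha` are illustrative assumptions):

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Ground-truth sparse precision matrix: a chain graph on 4 variables
prec = np.array([[2., 1., 0., 0.],
                 [1., 2., 1., 0.],
                 [0., 1., 2., 1.],
                 [0., 0., 1., 2.]])
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(prec), size=2000)

est = GraphicalLasso(alpha=0.05).fit(X).precision_
print(np.round(est, 2))
```

Entries corresponding to absent edges (e.g., between variables 0 and 3) are shrunk toward zero, while the true chain edges remain large, recovering the graph structure.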