The scale of many real-life networks renders traditional computational models, such as polynomial-time algorithms, impractical for meeting computational demands. Thus algorithms that can be run in parallel based on local information (local algorithms) have recently gained prominence as a viable alternative computational model. In this talk we discuss the power and limitations of local algorithms for solving optimization and inference problems on large networks.
On the positive side, we provide examples of local, even sublinear-time, algorithms capable of solving certain hard optimization problems on large graphs when the underlying objective function is stochastic. This is surprising, since the deterministic counterparts of these problems are NP-hard. We further present computational results that demonstrate the practicality of the proposed algorithms and make them promising computational tools for real-life networks.
On the negative side, we establish a fundamental barrier to the power of local algorithms for solving optimization problems on random graphs, despite several conjectures put forward in the past. In particular, we refute a conjecture of Lovász et al. regarding the power of local algorithms to find nearly largest independent sets in random regular graphs. Similarly, we refute a conjecture put forward by statistical physicists regarding the power of the so-called Belief Propagation and Survey Propagation algorithms to solve randomly generated constraint satisfaction problems. Our negative results exploit the fascinating geometry of optimal solutions in random graphs, first predicted heuristically by physicists and recently confirmed by rigorous methods.
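To make the computational model concrete, here is a minimal illustrative sketch (ours, not taken from the talk) of a classic one-round local algorithm for independent sets: every vertex draws an independent uniform random priority and joins the set exactly when its priority is smaller than all of its neighbors'. Each vertex decides from its 1-hop neighborhood alone, so all decisions can run in parallel; on a d-regular graph this rule selects n/(d+1) vertices in expectation, far from the nearly largest sets that are the subject of the negative results.

```python
import random

def local_independent_set(adj):
    """One-round local algorithm (illustrative sketch): each vertex draws a
    uniform random priority and joins the independent set iff its priority
    beats every neighbor's. Decisions use only 1-hop information, so they
    can all be made in parallel."""
    priority = {v: random.random() for v in adj}
    return {v for v in adj if all(priority[v] < priority[u] for u in adj[v])}

# Toy 3-regular graph: the complete bipartite graph K_{3,3}.
adj = {
    0: [3, 4, 5], 1: [3, 4, 5], 2: [3, 4, 5],
    3: [0, 1, 2], 4: [0, 1, 2], 5: [0, 1, 2],
}
ind = local_independent_set(adj)
# The output is always a (nonempty) independent set: the vertex with the
# globally smallest priority always joins, and two neighbors cannot both
# be local minima of the priorities.
```

Note that correctness is local as well: a vertex and its neighbor can never both win, so independence never needs a global check.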
The talk will be completely self-contained, requiring little to no background in combinatorial optimization or statistical physics.
Joint work with David Goldberg, Theophane Weber, Dmitriy Katz, and Madhu Sudan.