The Future of Bayes 

John Salvatier has a blog post on the future of MCMC algorithms, focusing on differential methods, which use derivatives of the posterior to inform where the algorithm should move next. This allows for longer steps, faster convergence, and better handling of multimodal posteriors. Gelman agrees with the direction. There has been some recent work on implementing automatic differentiation in R, which is the cornerstone of the algorithms Salvatier discusses. Perhaps we will see this moving into some of the more popular MCMC packages soon.
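To make the idea concrete: the simplest gradient-informed sampler is the Metropolis-adjusted Langevin algorithm (MALA), where each proposal drifts along the gradient of the log posterior before a Metropolis step corrects the bias. Here's a minimal numpy sketch on a toy standard-normal posterior (the example is illustrative, not taken from Salvatier's post):

```python
import numpy as np

def mala(log_post, grad_log_post, x0, eps=0.5, n_samples=20000, seed=0):
    """Metropolis-adjusted Langevin algorithm: proposals drift along the
    gradient of the log posterior, then a Metropolis step corrects the bias."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Langevin proposal: gradient drift plus Gaussian noise
        mean_fwd = x + 0.5 * eps**2 * grad_log_post(x)
        prop = mean_fwd + eps * rng.standard_normal()
        mean_bwd = prop + 0.5 * eps**2 * grad_log_post(prop)
        # Gaussian log proposal densities, needed for the MH correction
        log_q_fwd = -((prop - mean_fwd) ** 2) / (2 * eps**2)
        log_q_bwd = -((x - mean_bwd) ** 2) / (2 * eps**2)
        log_alpha = log_post(prop) - log_post(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

# Toy target: standard normal, so log p(x) = -x^2/2 and its gradient is -x
samples = mala(lambda x: -0.5 * x**2, lambda x: -x, x0=0.0)
```

The gradient drift is what lets the chain take longer, better-aimed steps than a random-walk proposal; this is the piece that requires automatic differentiation once the posterior is no longer a toy.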

On a slightly different Bayes front, SSS-pal and former blogger Justin Grimmer has a paper on variational approximation, a method for deterministically approximating posteriors. This approach is often useful when MCMC is extremely slow or infeasible: the variational optimization converges quickly and deterministically, though to an approximation of the posterior rather than the exact posterior.
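As a toy illustration of what the deterministic optimization looks like (not drawn from Grimmer's paper), here are the classic mean-field coordinate-ascent updates for a normal model with unknown mean and precision, where each factor has a closed-form update and the iteration converges in a handful of steps:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=50)  # toy data: true mean 2, true precision 1
N, ybar = len(y), y.mean()

# Priors: mu ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

# Mean-field factorization q(mu, tau) = q(mu) q(tau), with
# q(mu) = N(mu_N, 1/lam_N) and q(tau) = Gamma(a_N, b_N)
mu_N = (lam0 * mu0 + N * ybar) / (lam0 + N)  # available in closed form
a_N = a0 + (N + 1) / 2                       # also fixed across iterations
E_tau = a0 / b0                              # initial guess for E[tau]
for _ in range(20):  # coordinate ascent: alternate the two updates
    lam_N = (lam0 + N) * E_tau
    b_N = b0 + 0.5 * (np.sum((y - mu_N) ** 2) + N / lam_N
                      + lam0 * ((mu_N - mu0) ** 2 + 1 / lam_N))
    E_tau = a_N / b_N
```

Unlike an MCMC run, there is no randomness once the data are fixed: the same inputs always give the same approximate posterior, which is why convergence can be monitored deterministically.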