Call for Papers

The workshop seeks high-quality, original, and unpublished work on algorithms, theory, and systems related to divergence methods for probabilistic inference.

The workshop is non-archival; authors retain copyright to their submissions and are free to resubmit and publish the work at other venues.

Topics of interest include:

  • Choice of divergence function: Analysis of the properties of different divergence functions for probabilistic inference. Discovery of representer theorems, when they exist. Discovery of conjugate priors and other default methods that may improve the efficiency of inference.
  • Analysis of estimated posterior: Properties of divergence-based posterior estimates, such as consistency, sample complexity, and risk. Comparison to standard Bayesian methods to determine when the two approaches coincide.
  • Novel constraint structure: Development of novel models and approaches for incorporating constraints, such as online constraint generation, human-in-the-loop learning, and other feedback. The application of support / domain constraints and other constraint structures beyond expectation constraints. The application of complex structured regularizers and graph regularizers, such as WordNet for semantic relationships between named entities.
  • Scalable inference for big data and complex data: Novel scalable inference methods for modern big-data applications, including parallel, distributed, and streaming architectures for large-scale inference. Novel inference procedures with guarantees for large data. Novel models and inference methods for multi-relational and multi-modal data with complex interactions in domains such as marketing, healthcare, bioinformatics, and other large multimodal data domains.
  • Applications: Novel applications of divergence-based methods, particularly for problems where Bayesian inference may be inappropriate or inefficient. Applications to problem domains such as constrained clustering, natural language processing, and scientific applications. Detailed empirical comparison of divergence-based inference and Bayesian inference to determine the cases in which each approach is best applied.

© 2013-2014 ICML | International Conference on Machine Learning