Projects of Dr. Bogdan Savchynskyy


Diversity and Uncertainty for Discrete Graphical Models

Inference for discrete graphical models (aka MRF/CRF/WCSP/...)

Solving MAP-inference exactly if its LP relaxation is good

Off-the-shelf combinatorial solvers (such as CPLEX) can solve very large energy minimization instances, if you apply them only to the part of the model where their power is actually needed: the part where the relaxed solution is not integer.
  • a number of benchmark problems solved exactly;
  • beating specialized ILP solvers (e.g. best results on Protein Folding and very good on color segmentation);
  • coupling of arbitrary ILP and (approximate) LP solvers (e.g. TRW-S);
  • generalizable to arbitrary models and arbitrary LP relaxation.
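The idea can be sketched in a few lines of toy Python (all names here are illustrative, not the OpenGM API): take the relaxed solution, fix every variable whose pseudo-marginal is integral, and run an exact combinatorial search only over the remaining fractional part. Exhaustive enumeration stands in for the ILP solver used in the paper.

```python
import itertools
import numpy as np

def combilp_sketch(unary, pairwise, edges, relaxed, tol=1e-6):
    """Toy CombiLP-style split (illustration only).

    unary:    dict node -> vector of label costs
    pairwise: dict edge -> 2-D cost array
    edges:    list of (u, v)
    relaxed:  dict node -> pseudo-marginal vector from an LP relaxation
    """
    # Variables with (numerically) integral relaxed marginals are fixed.
    fixed, free = {}, []
    for v, mu in relaxed.items():
        if np.max(mu) > 1.0 - tol:
            fixed[v] = int(np.argmax(mu))
        else:
            free.append(v)

    def energy(lab):
        e = sum(unary[v][lab[v]] for v in lab)
        e += sum(pairwise[(u, v)][lab[u], lab[v]] for (u, v) in edges)
        return e

    # Exhaustive search stands in for the combinatorial (ILP) solver,
    # applied only to the small fractional part of the model.
    n_labels = {v: len(unary[v]) for v in free}
    best, best_e = None, float("inf")
    for combo in itertools.product(*(range(n_labels[v]) for v in free)):
        lab = dict(fixed)
        lab.update(dict(zip(free, combo)))
        e = energy(lab)
        if e < best_e:
            best, best_e = lab, e
    return best, best_e
```

In practice the combinatorial search area is grown iteratively until the combined solution is provably globally optimal; the sketch shows only a single split.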
B. Savchynskyy, J. Kappes, P. Swoboda, C. Schnörr
Global MAP-Optimality by Shrinking the Combinatorial Search Area with Convex Relaxation
NIPS 2013 [bib] [pdf] [presentation-pdf] [poster-pdf] The code is available in the OpenGM library (inference class CombiLP)

See also results of our solver CombiLP in the OpenGM benchmark.

Work in progress.

Partial optimality for MAP-inference

Partial optimality result
A meta-algorithm that turns (approximate) inference solvers into methods providing a part of a globally optimal (non-relaxed!) solution of the MAP-inference problem.
  •  Overall polynomial runtime.
  •  Works with any solver delivering a lower bound, e.g. TRW-S.
  •  Arbitrary model order.
  •  Arbitrary LP relaxation.
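The criteria in the papers below are considerably stronger, but the classical dead-end elimination rule (Goldstein's criterion) gives a minimal, self-contained illustration of what partial optimality means: a label is pruned once some other label is provably better under every assignment of the neighbors, and a node left with a single label is thereby solved to global optimality. The sketch below is illustrative only and does not implement the papers' method.

```python
import numpy as np

def dee_prune(unary, pairwise, neighbors):
    """Goldstein dead-end elimination: a simple partial-optimality check.

    unary:     dict node -> label cost vector
    pairwise:  dict (u, v) -> cost matrix, rows index u's labels (u < v)
    neighbors: dict node -> list of adjacent nodes
    Returns dict node -> set of labels that may still be optimal.
    """
    def pw(v, w):
        # Pairwise costs seen from v's side.
        return pairwise[(v, w)] if (v, w) in pairwise else pairwise[(w, v)].T

    alive = {v: set(range(len(c))) for v, c in unary.items()}
    changed = True
    while changed:
        changed = False
        for v in unary:
            for s in list(alive[v]):
                for t in alive[v]:
                    if t == s:
                        continue
                    # t dominates s if it is strictly better for every
                    # combination of the neighbours' surviving labels.
                    gap = unary[v][t] - unary[v][s]
                    gap += sum(
                        max(pw(v, w)[t, l] - pw(v, w)[s, l] for l in alive[w])
                        for w in neighbors[v]
                    )
                    if gap < 0:
                        alive[v].discard(s)
                        changed = True
                        break
    return alive
```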
P. Swoboda, A. Shekhovtsov, J. Kappes, C. Schnörr, B. Savchynskyy
Partial Optimality by Pruning for MAP-inference with General Graphical Models
arXiv:1410.6641 [bib] [pdf] - Best Student Paper Award at CVPR 2014!

A. Shekhovtsov, P. Swoboda,  B. Savchynskyy
Maximum Persistency via Iterative Relaxed Inference with Graphical Models
CVPR 2015 [extended abstract] [bib][pdf with supplementary material]

See also the [External project page] for the public code!

Work in progress.

Local polytope (LP) relaxation of the MAP-inference problem

Feasible primal bound for the relaxed MAP-inference
Efficient estimation of feasible primal solutions for the relaxed MAP-inference problem. Benefits:
  • vanishing duality gap;
  • applicability within adaptive algorithms (subgradient, bundle, smoothing, ...);
  • uniform sound stopping criteria for a number of different inference algorithms.
The technique is generalizable to other large-scale sparse convex problems.
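A trivial instance of the idea, purely for illustration (the construction in the paper is more refined): any nonnegative unary pseudo-marginals can be normalized and completed to a feasible point of the local polytope by taking product pairwise marginals, and evaluating the relaxed energy at that point yields a feasible primal bound.

```python
import numpy as np

def feasible_primal_bound(unary, pairwise, edges, pseudo):
    """Turn arbitrary nonnegative unary pseudo-marginals into a feasible
    point of the local polytope and evaluate the relaxed energy there.

    Illustrative sketch only; not the method of the paper.
    """
    mu = {v: p / p.sum() for v, p in pseudo.items()}   # normalize unaries
    bound = sum(unary[v] @ mu[v] for v in unary)
    for (u, v) in edges:
        # Product marginals satisfy the marginalization constraints
        # sum_t mu_uv(s, t) = mu_u(s), hence the point is feasible.
        mu_uv = np.outer(mu[u], mu[v])
        bound += float(np.sum(pairwise[(u, v)] * mu_uv))
    return bound
```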

B. Savchynskyy, S. Schmidt Getting Feasible Variable Estimates From Infeasible Ones: MRF Local Polytope Study.
In Advanced Structured Prediction, MIT Press, 2014 [bib] [pdf] [poster-pdf] The code is available in the OpenGM library (inference class PrimalLPBound)

Diminishing smoothing strategy

Smoothing technique for solving the LP relaxation of the MAP-inference problem. Benefits:
  • guaranteed convergence to the optimum of the relaxed problem;
  • guaranteed convergence rate O(1/t) for the accelerated gradient (Nesterov) algorithm;
  • guaranteed convergence to the optimum of LP for fast dual block-coordinate ascent (smoothed TRW-S).
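The kind of smoothing in question replaces the non-smooth min operation (appearing in the dual) by its log-sum-exp surrogate with a temperature T; a minimal, numerically stable sketch:

```python
import numpy as np

def softmin(x, T):
    """Smooth approximation of min(x): -T * log(sum(exp(-x / T))).

    Differentiable for T > 0 and satisfies
    min(x) - T * log(len(x)) <= softmin(x, T) <= min(x),
    so a diminishing temperature T drives it to the true minimum.
    """
    x = np.asarray(x, dtype=float)
    m = x.min()                       # shift for numerical stability
    return m - T * np.log(np.sum(np.exp(-(x - m) / T)))
```

Decreasing T over the iterations, as in the adaptive diminishing smoothing strategy above, drives the smooth approximation, and with it the iterates of a gradient or block-coordinate ascent scheme, to the optimum of the original non-smooth problem.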
B. Savchynskyy, J. H. Kappes, S. Schmidt, C. Schnörr
A Study of Nesterov's Scheme for Lagrangian Decomposition and MAP Labeling
In CVPR 2011 - oral presentation
[bib][PDF with supplementary material][Presentation (pdf) at the Graphical Model Workshop in Kiev, Sep.-Oct.2010 and at CVPR2011]
The code is available in OpenGM library (inference class NesterovAcceleratedGradient)

B. Savchynskyy, S. Schmidt, J. H. Kappes, C. Schnörr
Efficient MRF Energy Minimization via Adaptive Diminishing Smoothing
In UAI, 2012, pp. 746-755. [bib] [PDF (revised version with appendix)][UAI Poster with additional comparison to ADLP algorithm]
The code is available in OpenGM 2.1.0 library (inference class ADSal)

See also results of our dual block-coordinate ascent algorithm ADSal in the OpenGM benchmark.

Journal paper coming soon.

There are also newer projects underway, as well as a number of others in which I am less involved. See my publications for details.

Projects from my previous life...