Jeremias Berg defends PhD thesis

28.5.2018
On May 25, 2018, Berg successfully defended his doctoral thesis on Solving Optimization Problems via Maximum Satisfiability.

M.Sc. Jeremias Berg successfully defended his doctoral dissertation Solving Optimization Problems via Maximum Satisfiability: Encodings and Re-Encodings on May 25, 2018. The thesis work was conducted in the Constraint Reasoning and Optimization group of the Department of Computer Science at the University of Helsinki, with Associate Professor Matti Järvisalo as the main PhD supervisor. Associate Professor Inês Lynce (Universidade de Lisboa, Portugal) acted as the opponent, and Professor Lakhdar Saïs (University of Artois) and Professor Peter Stuckey (University of Melbourne) as pre-examiners.

Solving Optimization Problems via Maximum Satisfiability: Encodings and Re-Encodings

NP-hard combinatorial optimization problems are commonly encountered in numerous different domains. As such, efficient methods for solving instances of such problems can save time, money, and other resources in several different applications. This thesis investigates exact declarative approaches to combinatorial optimization within the maximum satisfiability (MaxSAT) paradigm, using propositional logic as the constraint language of choice. Specifically, we contribute to both MaxSAT solving and encoding techniques.
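To make the paradigm concrete: a (weighted partial) MaxSAT instance consists of hard clauses that any solution must satisfy, plus weighted soft clauses whose total satisfied weight is to be maximized. The following is a minimal sketch on a toy instance invented for illustration; real solvers use far more sophisticated algorithms than this brute-force enumeration.

```python
from itertools import product

# Toy weighted partial MaxSAT instance (invented for illustration).
# Variables are 1..3; a literal is +v (variable true) or -v (variable false).
hard = [[1, 2], [-1, 3]]                  # clauses that MUST be satisfied
soft = [([-2], 2), ([-3], 1), ([1], 1)]   # (clause, weight) pairs to maximize

def satisfied(clause, assignment):
    # A clause holds if at least one of its literals holds.
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

def brute_force_maxsat(hard, soft, n_vars):
    best = None
    for bits in product([False, True], repeat=n_vars):
        a = {v + 1: bits[v] for v in range(n_vars)}
        if not all(satisfied(c, a) for c in hard):
            continue  # hard clauses are inviolable
        weight = sum(w for c, w in soft if satisfied(c, a))
        if best is None or weight > best[0]:
            best = (weight, a)
    return best

weight, model = brute_force_maxsat(hard, soft, 3)
# Optimal: x1=True, x2=False, x3=True with satisfied soft weight 3.
```

Exhaustive enumeration is exponential in the number of variables; the point of the declarative approach is that, once a problem is encoded this way, highly engineered MaxSAT solvers do the search.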

In the first part of the thesis we contribute to MaxSAT solving technology by developing solver-independent MaxSAT preprocessing techniques that re-encode MaxSAT instances into other instances. In order for preprocessing to be effective, the total time spent re-encoding the original instance and solving the new instance should be lower than the time required to directly solve the original instance. We show how the recently proposed label-based framework for MaxSAT preprocessing can be efficiently integrated with state-of-the-art MaxSAT solvers in a way that improves the empirical performance of those solvers. We also investigate the theoretical effect that label-based preprocessing has on the number of iterations needed by MaxSAT solvers in order to solve instances. We show that preprocessing does not improve the best-case performance (in the number of iterations) of MaxSAT solvers, but can improve the worst-case performance. Going beyond previously proposed preprocessing rules, we also propose and evaluate a MaxSAT-specific preprocessing technique called subsumed label elimination (SLE). We show that SLE is theoretically different from previously proposed MaxSAT preprocessing rules and that using SLE in conjunction with other preprocessing rules improves the empirical performance of several MaxSAT solvers.
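The label-based framework builds on a standard transformation: each soft clause C with weight w is replaced by a hard clause C ∨ l and a soft unit clause (¬l) with weight w, where l is a fresh "label" variable. A minimal sketch of that transformation (the variable numbering and the example instance are invented; the thesis's actual preprocessing rules operate on top of this representation and are not shown here):

```python
def add_labels(hard, soft, n_vars):
    """Re-encode so that every soft clause is a unit clause over a fresh label.

    hard: list of clauses (lists of int literals)
    soft: list of (clause, weight) pairs
    n_vars: highest variable index used so far
    """
    new_hard = [list(c) for c in hard]
    new_soft = []
    next_var = n_vars + 1
    for clause, weight in soft:
        label = next_var          # fresh label variable for this soft clause
        next_var += 1
        new_hard.append(list(clause) + [label])  # hard: C OR l
        new_soft.append(([-label], weight))      # soft unit: NOT l, weight w
    return new_hard, new_soft, next_var - 1

# Toy instance over variables 1..3 (invented for illustration):
hard = [[1, 2]]
soft = [([-1, 3], 2), ([2], 1)]
h, s, n = add_labels(hard, soft, 3)
# h == [[1, 2], [-1, 3, 4], [2, 5]], s == [([-4], 2), ([-5], 1)], n == 5
```

After this re-encoding, the cost of a solution is exactly the weight of the labels set to true, which is what lets preprocessing rules manipulate labels without changing the instance's optimal cost.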

In the second part of the thesis we propose and evaluate new MaxSAT encodings for two important data analysis tasks: correlation clustering and bounded treewidth Bayesian network learning. For both problems we empirically compare the resulting MaxSAT-based solution approach with other exact algorithms for the problems. We show that, on many benchmarks, the MaxSAT-based approach is faster and more memory efficient than other exact approaches. For correlation clustering, we also show that the quality of solutions obtained using MaxSAT is often significantly higher than the quality of solutions obtained by approximate (inexact) algorithms. The thesis ends with a discussion highlighting possible further research directions.
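To illustrate the flavor of such an encoding (this is not the thesis's actual encoding, just a common textbook-style formulation on an invented toy instance): correlation clustering can be cast with one Boolean "same-cluster" variable per pair of items, hard transitivity constraints making those variables consistent with some partition, and soft constraints rewarding agreement with the given similarity labels. Brute force stands in for a MaxSAT solver here.

```python
from itertools import combinations, product

# Invented toy instance: items and weighted pairwise similarity labels,
# +1 meaning "should be clustered together", -1 "should be separated".
items = ["a", "b", "c", "d"]
pairs = [("a", "b", +1, 3), ("b", "c", +1, 2), ("a", "c", -1, 2),
         ("c", "d", -1, 1), ("a", "d", +1, 1)]

keys = list(combinations(items, 2))  # one S[i,j] variable per item pair

def transitive(assign):
    # Hard constraints: for every triple, any two of S[i,j], S[j,k], S[i,k]
    # being true forces the third, so S describes a genuine partition.
    for i, j, k in combinations(items, 3):
        ij, jk, ik = assign[(i, j)], assign[(j, k)], assign[(i, k)]
        if (ij and jk and not ik) or (ij and ik and not jk) or (jk and ik and not ij):
            return False
    return True

def best_clustering():
    best = None
    for bits in product([False, True], repeat=len(keys)):
        assign = dict(zip(keys, bits))
        if not transitive(assign):
            continue
        # Soft side: total weight of similarity labels the partition agrees with.
        w = sum(wt for (i, j, lab, wt) in pairs if assign[(i, j)] == (lab > 0))
        if best is None or w > best[0]:
            best = (w, assign)
    return best

weight, assign = best_clustering()
# Optimum groups a, b, d together and leaves c alone, with agreement weight 7.
```

The transitivity constraints grow cubically in the number of items, which hints at why careful encodings, and re-encodings, matter for scaling exact approaches to larger instances.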

More about the subject: Data Science