The MaxSAT Evaluation website has been moved to https://maxsat-evaluations.github.io. Please check the new location for updates!

NEWS

- November 22: MSE17 solver and benchmark descriptions are available.
- November 8: Solver sources are available for download.
- September 21: Slides of the MSE17 presentation at SAT'17 are available.
- September 9: Benchmarks used in the evaluations are available.
- September 9: Results of the evaluations are available.
- June 9: Submission deadline for both solvers and benchmarks extended to June 30.
- June 9: Details on sending a SIGTERM signal with runsolver added.
- June 9: Details on using a checker to test the solution added.
- June 6: "No restrictions" track announced.
- June 6: Rules updated with clarifications (only one processor core may be used, incomplete ranking, no-restrictions track).
- June 6: Details on the incomplete tracks added.

About MSE 2017

The 2017 MaxSAT Evaluation (MSE 2017) is the 12th edition of the MaxSAT evaluations, the primary competition-style event focused on evaluating MaxSAT solvers, organized yearly since 2006.

The main goals of MSE 2017 are

- to assess the state of the art in the field of MaxSAT solvers,
- to collect and re-distribute a heterogeneous MaxSAT benchmark set for further scientific evaluations, and
- to promote MaxSAT as a viable option for solving instances of a wide range of NP-hard optimization problems.

MSE 2017 welcomes two types of contributions from the community at large:

- new MaxSAT benchmarks encoding instances of interesting NP-hard optimization problems, and
- implementations of MaxSAT solvers that will be evaluated within MSE 2017 on a heterogeneous collection of benchmarks.

As a new development for 2017, the evaluation is organized as a collaboration between the University of Helsinki (Finland), the University of Lleida (Spain), the University of Texas at Austin (USA), and the University of Toronto (Canada).