IJCAI 2018, fourth day
IJCAI 2018 is being held in Stockholm from July the 13th to July the 19th as a part of the Federated AI Meeting. In this and other posts I will give some short comments on interesting things I’ve heard. If anything is wrong or misrepresented, please don’t hesitate to get in touch with me.
On the seventh day of the Federated AI Meeting, the main IJCAI conference concluded. The conference has many parallel tracks; the comments here are from the sessions I thought would be interesting to me.
Constraints, Satisfiability and Search
A Framework for Constraint Based Local Search using Essence
The idea here is to keep the abstract types defined in Essence models in order to build a better local search solver, using the Essence specification to generate reasonable neighborhoods. Consider a small example:
find S : set (size 3) of int(1..5)
Since the set has fixed cardinality, a model using three integer variables and an alldifferent constraint can be used to represent the set, as compared to a set of Booleans, one for each possible value. The neighborhood for the integer model has smaller moves, changing just one variable instead of swapping two variables to get the same change.
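To make the difference concrete, here is a small Python sketch (my own illustration, not from the paper) of the two representations of a size-3 subset of 1..5 and their smallest moves:

```python
import random

DOMAIN = range(1, 6)  # the values 1..5

# Integer model: three distinct variables, kept alldifferent.
int_state = [1, 2, 3]

def int_move(state):
    """Smallest move: reassign one variable to a currently unused value."""
    i = random.randrange(len(state))
    free = [v for v in DOMAIN if v not in state]
    new_state = list(state)
    new_state[i] = random.choice(free)
    return new_state

# Boolean model: one 0/1 variable per possible value.
bool_state = {v: (v <= 3) for v in DOMAIN}

def bool_move(state):
    """Smallest cardinality-preserving move: flip one value out and one in."""
    out = random.choice([v for v, b in state.items() if b])
    inn = random.choice([v for v, b in state.items() if not b])
    new_state = dict(state)
    new_state[out], new_state[inn] = False, True
    return new_state
```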
A neighborhood over such a set can itself be expressed in Essence, here (as I understood it) requiring the new set to keep all but s elements of the incumbent set:
given active: set (size m) of int(1..n)
find S: set (size m) of int(1..n)
such that
| active /\ S | = m - s
Neighborhoods of a type T can be lifted to neighborhoods of types constructed from T elements. The different possible neighborhoods are also tried during solving, and learning is used to find which ones perform better.
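The talk did not go into the details of the learning mechanism; a simple way to realize “try neighborhoods and learn which perform better” is a bandit-style selector along the lines of this sketch (my own illustration, with made-up neighborhood names):

```python
import random

class NeighborhoodSelector:
    """Epsilon-greedy selection among neighborhood operators,
    scored by their average observed improvement."""

    def __init__(self, names, epsilon=0.1):
        self.epsilon = epsilon
        self.stats = {name: [0, 0.0] for name in names}  # uses, total improvement

    def pick(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))
        return max(self.stats,
                   key=lambda n: self.stats[n][1] / max(1, self.stats[n][0]))

    def report(self, name, improvement):
        uses, total = self.stats[name]
        self.stats[name] = [uses + 1, total + improvement]

# Hypothetical usage inside a local-search loop:
selector = NeighborhoodSelector(["change_one", "swap_two"])
# name = selector.pick(); ...apply a move from that neighborhood...
# selector.report(name, observed_improvement)
```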
As an example, a local search algorithm was derived for a problem containing a multiset of sets. Compared with other solutions (Chuffed, OscaR CBLS, …), Essence LS was faster.
Stratification for Constraint-Based Multi-Objective Combinatorial Optimization
Similar to other talks on multi-objective optimization, Virtual Machine Consolidation is used as an example. The problem is about relocating VMs among servers, balancing the gain from consolidation against the cost of migration.
A recent approach uses Minimal Correction Subsets (MCSs), which are effective for finding solutions but not very good at optimizing. The idea is to use MCSs for finding the soft constraints to correct to get a valid solution. There is a theoretical result that, given all k MCSs, all p Pareto-optimal solutions are among them (where p <= k).
However, this ignores the coefficients of the objectives. The idea is to use stratification, separating the soft constraints belonging to the different objectives into layers. MCSs from the layers are then mixed in a random order so as not to bias the search towards a particular objective.
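As a rough sketch of how such stratification could look (my interpretation from the talk, not the paper's algorithm), soft constraints can be grouped per objective, ordered by weight within each group, and then interleaved in a random order:

```python
import random
from collections import defaultdict

def stratified_order(soft_constraints):
    """Group soft constraints per objective, order each group by decreasing
    weight, and interleave the groups randomly so no objective is
    systematically preferred (a sketch, not the paper's exact scheme).

    soft_constraints: iterable of (constraint_id, objective_id, weight)."""
    per_objective = defaultdict(list)
    for cid, obj, weight in soft_constraints:
        per_objective[obj].append((weight, cid))
    queues = [sorted(group, key=lambda t: t[0], reverse=True)
              for group in per_objective.values()]
    order = []
    while any(queues):
        random.shuffle(queues)                 # random interleaving of objectives
        for queue in queues:
            if queue:
                order.append(queue.pop(0)[1])  # heaviest remaining constraint
    return order
```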
A Scalable Scheme for Counting Linear Extensions
A linear extension (LE) of a partial order (PO) is an extension of the order so that it becomes a linear order, still respecting the original partial order. Computing the number of LEs is a #P-complete problem.
Counting LEs for a PO can be used in partial order planning, preference reasoning, learning Bayesian networks, and convex rank tests.
There exists an exact dynamic programming algorithm taking O(2^n * n) time, which can handle around 64 elements in practice. There exist two different approximation schemes, which both have fairly high complexity (the better one is around O(n^5)).
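For reference, I believe the exact algorithm referred to is the standard dynamic program over the downsets (order ideals) of the partial order; a straightforward, unoptimized bitmask version looks roughly like this:

```python
from functools import lru_cache

def count_linear_extensions(n, relations):
    """Count linear extensions of a partial order on the elements 0..n-1.

    relations: iterable of pairs (a, b) meaning a must come before b.
    DP over downsets: a set of already-placed elements can be extended by
    any element whose predecessors have all been placed."""
    preds = [0] * n
    for a, b in relations:
        preds[b] |= 1 << a

    @lru_cache(maxsize=None)
    def extensions(placed):  # 'placed' is a bitmask of already-ordered elements
        if placed == (1 << n) - 1:
            return 1
        total = 0
        for x in range(n):
            bit = 1 << x
            if not placed & bit and preds[x] & ~placed == 0:
                total += extensions(placed | bit)
        return total

    return extensions(0)

# Example: the chain 0 < 1 plus an incomparable element 2 has 3 linear extensions.
assert count_linear_extensions(3, [(0, 1)]) == 3
```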
The paper presents an algorithm that moves from the discrete problem into a continuous space, where every ordering constraint in the partial order gives an ordering constraint on continuous variables, which can be estimated using quite a lot of math. The resulting algorithm runs faster than the previous solutions for reasonably large sets (the cross-over is between 128 and 256 elements).
It seems to me like it would be easy to encode the problem as a SAT instance and then use standard model counting solutions or approximations. I wonder if that is a reasonable approach.
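For what it's worth, a naive version of that encoding could use one Boolean variable per unordered pair together with transitivity clauses, so that the models of the CNF correspond exactly to the linear extensions; a rough sketch of a generator (my own, untested against an actual #SAT solver):

```python
from itertools import combinations, permutations

def linear_extension_cnf(n, relations):
    """Encode the linear extensions of a partial order on 0..n-1 as a CNF
    whose models are exactly the linear extensions.

    One variable per pair (i, j) with i < j, true iff i comes before j;
    transitivity clauses force the relation to be a total order."""
    var = {pair: k + 1 for k, pair in enumerate(combinations(range(n), 2))}

    def before(i, j):  # literal for "i before j"
        return var[(i, j)] if i < j else -var[(j, i)]

    clauses = []
    for i, j, k in permutations(range(n), 3):
        # (i before j) and (j before k) implies (i before k)
        clauses.append([-before(i, j), -before(j, k), before(i, k)])
    for a, b in relations:
        clauses.append([before(a, b)])  # respect the given partial order
    return len(var), clauses

# The model count of this CNF (say, via a #SAT solver on a DIMACS dump of
# the clauses) should equal the number of linear extensions.
```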
Socially Motivated Partial Cooperation in Multi-agent Local Search
The two normal scenarios for multi-agent systems are self-interested agents and fully cooperative agents. The paper discusses cooperation based on a common social good instead. The local search introduced uses social taboos, with votes for goals.
A fun side-note is that the (very slick) slides, which used graphics for road travel, had a very different cultural outlook. They included a comparison between an agent that wanted to get home fast (using an SUV) and an agent that wanted to be climate responsible. The latter did not use a bike as I would have expected, but a dromedary.
Solving (Weighted) Partial MaxSAT by Dynamic Local Search for SAT
A Partial MaxSAT (PMS) problem is a MaxSAT problem where some clauses are hard constraints. Previous local search work for PMS apparently used weighting schemes that gave very high static weights to the hard clauses. Here an alternative scheme is used that dynamically adjusts the weights of the clauses during search.
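The talk did not spell out the exact update rule; generic dynamic clause weighting in SAT local search usually looks something like the following sketch (bumping falsified clauses and occasionally smoothing), which is roughly how I picture the idea:

```python
def update_weights(weights, falsified, hard, bump=1.0, smooth=0.3, limit=100.0):
    """Generic dynamic clause weighting for SAT local search (a sketch,
    not the paper's exact scheme).

    weights:   dict clause_id -> current weight
    falsified: set of clause ids falsified by the current assignment
    hard:      set of clause ids that are hard constraints"""
    for c in falsified:
        weights[c] += 2 * bump if c in hard else bump  # bump hard clauses harder
    if max(weights.values()) > limit:
        # smooth all weights back towards 1 to keep them bounded
        for c in weights:
            weights[c] = smooth * weights[c] + (1 - smooth) * 1.0
    return weights
```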
Linear Satisfiability Preserving Assignments
A satisfiability preserving assignment (SPA) is an assignment that, when made, will not make a satisfiable formula unsatisfiable. Note that it does not require preserving all models of the original formula, just the existence of a model. There has been a lot of research into finding SPAs; setting a pure literal to true is a classic example.
The paper gives relationships between various classes of SPAs and also a polyhedral, linear way to find SPAs. The idea is to find SPAs for each constraint, transformed into (non-convex) sets of real-valued vectors; these are approximated by convex sets, and solutions are found using the intersection of the approximations.
Heuristic Search (K2)
Distributed Pareto Optimization for Subset Selection
Maximum coverage subset selection is a problem where the goal is to find the k sets whose union gives the best result. Other problems can be formulated in a similar way by substituting other objective functions.
Two important properties are monotonicity (choosing more elements never gives a worse result) and submodularity (the more sets have already been added, the less each new set contributes; diminishing returns).
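As a concrete baseline (my own example, not the paper's distributed Pareto algorithm), the classic greedy algorithm exploits exactly these two properties and gives a (1 - 1/e) approximation for maximum coverage:

```python
def greedy_max_coverage(sets, k):
    """Classic greedy for maximum coverage: repeatedly pick the set that
    covers the most not-yet-covered elements.

    sets: list of Python sets; k: number of sets to choose."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        if not sets[best] - covered:
            break  # no remaining set adds anything new
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered

# Example: choosing 2 of these 3 sets covers all 6 elements.
example = [{1, 2, 3}, {3, 4}, {4, 5, 6}]
print(greedy_max_coverage(example, 2))  # -> ([0, 2], {1, 2, 3, 4, 5, 6})
```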
The paper presents a distributed solver for general subset selection problems.
Best-Case and Worst-Case Behavior of Greedy Best-First Search
This paper continues the use of high-water marks and benches for analyzing how tie-breaking influences the search behavior.
Implementations of the algorithms are made that extract the best- and worst-case behavior on benchmark problems given different tie-breaking strategies. The experiments show that there is quite a difference between problems that have “craters” in their structure and those that do not. For crater-free problems standard tie-breaking is close to optimal, while for problems with craters in the structure there is quite a lot of performance difference between the standard tie-breakings and the best possible.
Anytime Focal Search with Applications
Focal search is a heuristic search algorithm in the same family as A* and GBFS that can make use of more types of heuristics: a secondary heuristic is used to choose among the open nodes whose primary value is close to the current best.
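As a reminder of the basic mechanism (standard focal search, not this paper's anytime variant): the focal list contains the open nodes within a factor w of the best f-value, and a second heuristic picks among them. A minimal sketch:

```python
def select_node(open_list, f, h_focal, w=1.5):
    """Standard focal-search node selection (a sketch).

    open_list: iterable of search nodes
    f:         primary evaluation function (e.g. g + h)
    h_focal:   secondary heuristic used to choose inside the focal list
    w:         suboptimality factor defining the focal list"""
    f_min = min(f(n) for n in open_list)
    focal = [n for n in open_list if f(n) <= w * f_min]
    return min(focal, key=h_focal)
```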
Understanding Subgoal Graphs by Augmenting Contraction Hierarchies
For path planning, pre-processing is very useful. The fastest methods need very much memory, while contraction hierarchies (CH) use much less memory. Subgoal graphs (SGG), on the other hand, are slightly slower again, but can be applied more generally.
This paper creates a set of variants between SGG and CH. In particular, this makes it possible to use some graph properties for CH-style preprocessing. The result allows using the good properties of both algorithms together. Seems very promising. Also interesting to contrast with Forward Search in Contraction Hierarchies from SoCS 2018.
The experiments show that it works very well on game maps, but not on random maps. This shows that it is useful to exploit existing structure.
Improving Local Search for Minimum Weight Vertex Cover by Dynamic Strategies
Minimum weight vertex cover is a very nice problem that can be used to model very many interesting problems. The most common way to solve these problems is using heuristics, most commonly local search.
One problem when trying to improve the state of the art is a lack of real and interesting real-world benchmarks. The paper creates a set of such benchmarks based on a map labeling problem.
The algorithm uses two different scoring functions and chooses between them dynamically. Interesting to compare with the local search for Partial MaxSAT above, which adjusts the clause weights dynamically.
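The exact switching rule was not clear to me from the talk; a generic way to “choose dynamically between two scoring functions” is to switch whenever the current one stops paying off, roughly as in this sketch (my own, not the paper's rule):

```python
def make_dynamic_chooser(score_a, score_b, patience=50):
    """Switch between two vertex scoring functions when the current one
    has not produced an improving move for `patience` steps (a sketch,
    not the paper's actual strategy)."""
    state = {"current": score_a, "stall": 0}

    def choose(vertex):
        return state["current"](vertex)

    def report(improved):
        state["stall"] = 0 if improved else state["stall"] + 1
        if state["stall"] > patience:
            state["current"] = score_b if state["current"] is score_a else score_a
            state["stall"] = 0

    return choose, report
```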