Nonuniform Graph Partitioning
with Unrelated Weights
Abstract
We give a bicriteria approximation algorithm for the Minimum Nonuniform Partitioning problem, recently introduced by Krauthgamer, Naor, Schwartz and Talwar (2014). In this problem, we are given a graph $G=(V,E)$ on $n$ vertices and $k$ numbers $\rho_1,\dots,\rho_k$. The goal is to partition the graph into $k$ disjoint sets $P_1,\dots,P_k$ satisfying $|P_i|\leq \rho_i n$ so as to minimize the number of edges cut by the partition. Our algorithm has an approximation ratio of $O(\sqrt{\log n \log k})$ for general graphs, and an $O(1)$ approximation for graphs with excluded minors. This is an improvement upon the $O(\log n)$ algorithm of Krauthgamer, Naor, Schwartz and Talwar (2014). Our approximation ratio matches the best known ratio for the Minimum (Uniform) $k$-Partitioning problem.
We extend our results to the case of “unrelated weights” and to the case of “unrelated $d$-dimensional weights”. In the former case, different vertices may have different weights, and the weight of a vertex may depend on the set $P_i$ the vertex is assigned to. In the latter case, each vertex $u$ has a $d$-dimensional weight $r(u,i) = (r_1(u,i),\dots,r_d(u,i))$ if $u$ is assigned to $P_i$. Each set $P_i$ has a $d$-dimensional capacity $c(i) = (c_1(i),\dots,c_d(i))$. The goal is to find a partition such that $\sum_{u\in P_i} r(u,i) \leq c(i)$ coordinatewise.
1 Introduction
We study the Minimum Nonuniform Partitioning problem, which was recently proposed by Krauthgamer, Naor, Schwartz and Talwar (2014). We are given a graph $G=(V,E)$, a parameter $k$, and $k$ numbers (capacities) $\rho_1,\dots,\rho_k$. Our goal is to partition the graph into $k$ pieces (bins) $P_1,\dots,P_k$ satisfying the capacity constraints $|P_i|\leq \rho_i n$ so as to minimize the number of cut edges. The problem is a generalization of the Minimum $k$-Partitioning problem studied by Krauthgamer, Naor, and Schwartz (2009), in which all bins have equal capacity $\rho_i = 1/k$.
The problem has many applications (see Krauthgamer et al. 2014). Consider an example in cloud computing: Imagine that we need to distribute computational tasks – vertices of the graph – among $k$ machines, where machine $i$ has capacity $\rho_i n$. Different tasks communicate with each other; the amount of communication between tasks $u$ and $v$ equals the weight of the edge between the corresponding vertices $u$ and $v$. Our goal is to distribute the tasks among the machines subject to the capacity constraints so as to minimize the total amount of communication between machines. (In this example, we need to solve a variant of the problem with edge weights.)
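To make the objective in this example concrete, here is a minimal sketch in Python; the function name, data, and assignment below are ours and purely illustrative. The cost of assigning tasks to machines is the total weight of edges whose endpoints land on different machines:

```python
# Toy illustration of the cloud-computing objective: tasks are vertices of a
# weighted graph, and the cost of an assignment equals the total weight of
# edges cut by the induced partition. All data here is made up.

def communication_cost(weighted_edges, machine):
    """weighted_edges: list of (u, v, w) triples; machine[u] is the machine
    that task u is assigned to. Returns the total weight of cut edges."""
    return sum(w for u, v, w in weighted_edges if machine[u] != machine[v])

edges = [(0, 1, 3.0), (1, 2, 1.0), (2, 3, 2.0), (3, 0, 1.0)]
machine = {0: 0, 1: 0, 2: 1, 3: 1}   # tasks 0,1 on machine 0; tasks 2,3 on machine 1
print(communication_cost(edges, machine))  # 2.0: edges (1,2) and (3,0) are cut
```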
The problem is quite challenging. Krauthgamer et al. (2014) note that many existing techniques do not work for this problem. In particular, it is not clear how to solve this problem on trees (our algorithm gives a constant factor bicriteria approximation for trees) and, consequently, how to use Räcke’s (2008) tree decomposition technique. Krauthgamer et al. (2014) give an $O(\log n)$ bicriteria approximation algorithm for the problem: the algorithm finds a partition such that $|P_i| \leq O(\rho_i n)$ for every $i$ and the number of cut edges is at most $O(\log n)\cdot OPT$. The algorithm first solves a configuration linear program and then uses a new sophisticated method to round the fractional solution.
In this paper, we present a rather simple SDP-based $O(\sqrt{\log n\log k})$ bicriteria approximation algorithm for the problem. We note that our approximation guarantee matches that of the algorithm of Krauthgamer, Naor, and Schwartz (2009) for the Minimum $k$-Partitioning problem (which is a special case of Minimum Nonuniform Partitioning, see above). Our algorithm uses the technique of “orthogonal separators” developed by Chlamtac, Makarychev, and Makarychev (2006) and later used by Bansal, Feige, Krauthgamer, Makarychev, Nagarajan, Naor, and Schwartz (2011) for the Small Set Expansion problem. Using orthogonal separators, it is relatively easy to get a distribution over partitions $P_1,\dots,P_k$ such that $\mathbb{E}[|P_i|]\leq O(\rho_i n)$ for all $i$ and the expected number of cut edges is at most $D\cdot OPT$, where $D=O(\sqrt{\log n\log k})$. The problem is that, for some $i$, $|P_i|$ may be much larger than its expected size. The algorithm of Krauthgamer et al. (2014) solves a similar problem by first simplifying the instance and then grouping parts into “megabuckets”. We propose a simpler fix: roughly speaking, if a set $P_i$ contains too many vertices, we remove some of these vertices and repartition the removed vertices into $k$ pieces again. Thus we ensure that all capacity constraints are (approximately) satisfied. It turns out that every vertex gets removed a constant number of times in expectation. Hence, the repartitioning step increases the number of cut edges only by a constant factor. Another problem is that the distortion $D$ may be much larger than $O(\sqrt{\log n\log k})$ when some of the capacities $\rho_i$ are very small. To deal with this problem, we transform the SDP solution (eliminating “short” vectors) and redefine the thresholds $\rho_i$ so that $D$ becomes $O(\sqrt{\log n\log k})$.
Our technique is quite robust and allows us to solve more general versions of the problem: Nonuniform Graph Partitioning with unrelated weights and Nonuniform Graph Partitioning with unrelated $d$-dimensional weights.
Minimum Nonuniform Graph Partitioning with unrelated weights captures the variant of the problem where we assign vertices (tasks/jobs) to unrelated machines and the weight of a vertex (the size of the task/job) depends on the machine it is assigned to.
Definition 1.1 (Minimum Nonuniform Graph Partitioning with unrelated weights).
We are given a graph $G=(V,E)$ on $n$ vertices and a natural number $k$. Additionally, we are given $k$ normalized measures $\mu_1,\dots,\mu_k$ on $V$ (satisfying $\mu_i(V)=1$) and $k$ numbers $\rho_1,\dots,\rho_k$. Our goal is to partition the graph into $k$ pieces (bins) $P_1$, …, $P_k$ such that $\mu_i(P_i)\leq\rho_i$ so as to minimize the number of cut edges. Some pieces $P_i$ may be empty.
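The capacity constraints of Definition 1.1 are easy to state in code. The following sketch checks whether a given partition is feasible for unrelated weights; the function name and the toy instance are ours, not from the paper:

```python
# Feasibility check for Definition 1.1: a partition P_1, ..., P_k is feasible
# if mu_i(P_i) <= rho_i for every bin i, where the weight of a vertex depends
# on the bin it is assigned to ("unrelated weights").

def is_feasible(partition, mu, rho):
    """partition[i]: set of vertices placed in bin i; mu[i][u]: weight of
    vertex u when it is placed in bin i; rho[i]: capacity of bin i."""
    return all(
        sum(mu[i][u] for u in part) <= rho[i]
        for i, part in enumerate(partition)
    )

# Two bins over vertices {0, 1, 2}; each mu[i] is normalized (sums to 1).
mu = [
    {0: 0.5, 1: 0.25, 2: 0.25},   # weights if assigned to bin 0
    {0: 0.2, 1: 0.4, 2: 0.4},     # weights if assigned to bin 1
]
rho = [0.6, 0.5]
print(is_feasible([{0}, {1, 2}], mu, rho))   # False: bin 1 gets weight 0.8 > 0.5
print(is_feasible([{1, 2}, {0}], mu, rho))   # True: weights are 0.5 and 0.2
```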
We will only consider instances of Minimum Nonuniform Graph Partitioning that have a feasible solution. We give an $O(\sqrt{\log n\log k})$ bicriteria approximation algorithm for the problem.
Theorem 1.2.
For every $\varepsilon>0$, there exists a randomized polynomial-time algorithm that, given an instance of Minimum Nonuniform Graph Partitioning with unrelated weights, finds a partition $P_1,\dots,P_k$ satisfying $\mu_i(P_i)\leq(5+\varepsilon)\rho_i$. The expected cost of the solution is at most $D\cdot OPT$, where $OPT$ is the optimal value, and $D=O_\varepsilon(\sqrt{\log n\log k})$. For graphs with excluded minors, $D=O_\varepsilon(1)$.
Nonuniform Graph Partitioning with unrelated $d$-dimensional weights further generalizes the problem. In this variant of the problem, we assume that we have $d$ resources (e.g., CPU speed, random access memory, disk space, network bandwidth). Each piece $P_i$ has $c_j(i)$ units of resource $j$, and each vertex $u$ requires $r_j(u,i)$ units of resource $j$ when it is assigned to piece $P_i$. We need to partition the graph so that the capacity constraints for all resources are satisfied. The $d$-dimensional version of Minimum (Uniform) $k$-Partitioning was previously studied by Amir et al. (2014). In their problem, all capacities $c_j(i)$ are the same, and the weights $r_j(u,i)$ do not depend on $i$.
Definition 1.3 (Minimum Nonuniform Graph Partitioning with unrelated dimensional weights).
We are given a graph $G=(V,E)$ on $n$ vertices. Additionally, we are given nonnegative numbers $c_j(i)$ and $r_j(u,i)$ for $i\in\{1,\dots,k\}$, $j\in\{1,\dots,d\}$, $u\in V$. Our goal is to find a partition of $V$ into $P_1,\dots,P_k$ subject to the capacity constraints
$$\sum_{u\in P_i} r_j(u,i)\leq c_j(i)\quad\text{for all }i\in\{1,\dots,k\},\ j\in\{1,\dots,d\},$$
so as to minimize the number of cut edges.
We present a bicriteria approximation algorithm for this problem.
Theorem 1.4.
For every $\varepsilon>0$, there exists a randomized polynomial-time algorithm that, given an instance of Minimum Nonuniform Graph Partitioning with unrelated $d$-dimensional weights, finds a partition $P_1,\dots,P_k$ satisfying
$$\sum_{u\in P_i} r_j(u,i)\leq (5+\varepsilon)\,d\,c_j(i)\quad\text{for every }i\in\{1,\dots,k\},\ j\in\{1,\dots,d\}.$$
The expected cost of the solution is at most $D\cdot OPT$, where $OPT$ is the optimal value and $D=O_\varepsilon(\sqrt{\log n\log k})$. For graphs with excluded minors, $D=O_\varepsilon(1)$.
We note that this result is a simple corollary of Theorem 1.2: we let $\mu_i(u)=\frac{1}{d}\sum_{j=1}^{d}\frac{r_j(u,i)}{c_j(i)}$ (appropriately normalized) and then apply our result to the measures $\mu_1,\dots,\mu_k$ (we describe the details in Appendix C).
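To illustrate the flavor of this reduction (the paper's exact construction is in Appendix C and may differ; the aggregation below and all names are our own), one can collapse the $d$ resource coordinates of each vertex into a single per-bin weight by normalizing by the capacities. Here we use the maximum ratio, so a bin whose collapsed weights sum to at most 1 automatically satisfies every coordinatewise constraint:

```python
# Illustrative collapse of d-dimensional demands into one weight per bin.
# Since max_j r_j(u,i)/c_j(i) >= r_j(u,i)/c_j(i) for every j, a bin whose
# collapsed weights sum to at most 1 satisfies sum_u r_j(u,i) <= c_j(i)
# for every resource j.

def collapse_weights(r, c):
    """r[u][i][j]: demand of vertex u for resource j if assigned to bin i;
    c[i][j]: capacity of bin i for resource j.
    Returns w with w[u][i] = max_j r[u][i][j] / c[i][j]."""
    k, d = len(c), len(c[0])
    return [[max(r[u][i][j] / c[i][j] for j in range(d)) for i in range(k)]
            for u in range(len(r))]

r = [[[2.0, 1.0]], [[1.0, 3.0]]]   # two vertices, one bin, two resources
c = [[4.0, 8.0]]                   # bin 0 has 4 units of resource 0, 8 of resource 1
w = collapse_weights(r, c)
print(w)  # [[0.5], [0.375]]; their sum 0.875 <= 1, and indeed 2+1 <= 4, 1+3 <= 8
```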
We remark that our algorithms work if edges in the graph have arbitrary positive weights. However, for simplicity of exposition, we describe the algorithms for the setting where all edge weights are equal to one. To deal with arbitrary edge weights, we only need to change the SDP objective function.
Our paper strengthens the result of Krauthgamer et al. (2014) in two ways. First, it improves the approximation factor from $O(\log n)$ to $O(\sqrt{\log n\log k})$. Second, it studies considerably more general variants of the problem, Minimum Nonuniform Partitioning with unrelated weights and Minimum Nonuniform Partitioning with unrelated $d$-dimensional weights. We believe that these variants are very natural. Indeed, one of the main motivations for the Minimum Nonuniform Partitioning problem is its applications to scheduling and load balancing: in these applications, the goal is to assign tasks to machines so as to minimize the total amount of communication between different machines, subject to capacity constraints. The constraints that we study in this paper are very general and analogous to those that are often considered in the scheduling literature. We note that the method developed in Krauthgamer et al. (2014) does not handle these more general variants of the problem.
2 Algorithm
SDP Relaxation. Our relaxation for the problem is based on the SDP relaxation for the Small Set Expansion (SSE) problem of Bansal et al. (2011). We write the SSE relaxation for every cluster $P_i$ and then add consistency constraints similar to the constraints used in Unique Games. For every vertex $u$ and index $i$, we introduce a vector $\bar u_i$. In the integral solution, this vector is simply the indicator variable for the event “$u\in P_i$”. It is easy to see that in the integral case, the number of cut edges equals (1). Indeed, if $u$ and $v$ lie in the same $P_j$, then $\bar u_i=\bar v_i$ for all $i$; if $u$ lies in $P_j$ and $v$ lies in $P_{j'}$ (for $j\neq j'$), then $\|\bar u_i-\bar v_i\|^2=1$ for $i\in\{j,j'\}$ and $\|\bar u_i-\bar v_i\|^2=0$ for $i\notin\{j,j'\}$. The SDP objective is to minimize (1).
We add constraint (2) saying that $\sum_{u\in V}\mu_i(u)\|\bar u_i\|^2\leq\rho_i$, the SDP analog of the capacity constraint $\mu_i(P_i)\leq\rho_i$. We further add spreading constraints (4) from Bansal et al. (2011) (see also Louis and Makarychev (2014)). The spreading constraints are satisfied in the integral solution: if $u\notin P_i$, then $\bar u_i=0$ and both sides of the inequality equal 0; if $u\in P_i$, then the left hand side equals $\mu_i(P_i)$, and the right hand side equals $\rho_i$.
We write the standard $\ell_2^2$ triangle inequalities (6) and (7). Finally, we add consistency constraints: every vertex $u$ must be assigned to one and only one $P_i$, hence constraint (5) holds in the integral solution. We obtain the following SDP relaxation.
SDP Relaxation
(1) $\frac{1}{2}\sum_{(u,v)\in E}\sum_{i=1}^{k}\|\bar u_i-\bar v_i\|^2$
subject to
(2) $\sum_{u\in V}\mu_i(u)\,\|\bar u_i\|^2\leq\rho_i$ for all $i\in\{1,\dots,k\}$
(3) $\langle\bar u_i,\bar v_i\rangle\geq 0$ for all $u,v\in V$, $i\in\{1,\dots,k\}$
(4) $\sum_{v\in V}\mu_i(v)\,\langle\bar u_i,\bar v_i\rangle\leq\rho_i\,\|\bar u_i\|^2$ for all $u\in V$, $i\in\{1,\dots,k\}$
(5) $\sum_{i=1}^{k}\|\bar u_i\|^2=1$ for all $u\in V$
(6) $\|\bar u_i-\bar v_i\|^2+\|\bar v_i-\bar w_i\|^2\geq\|\bar u_i-\bar w_i\|^2$ for all $u,v,w\in V$, $i\in\{1,\dots,k\}$
(7) $\langle\bar u_i,\bar v_i\rangle\leq\|\bar u_i\|^2$ for all $u,v\in V$, $i\in\{1,\dots,k\}$
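As a quick sanity check of objective (1): for an integral solution, where $\bar u_i$ is the 0/1 indicator of the event $u\in P_i$, the objective counts exactly the cut edges. The short illustrative script below (names are ours) evaluates (1) on an integral assignment:

```python
# Evaluate objective (1), (1/2) * sum_{(u,v) in E} sum_i ||u_i - v_i||^2,
# on an integral solution: u_i = 1 if u is in P_i and 0 otherwise. For each
# cut edge, exactly two indices i contribute 1, so the total equals the
# number of cut edges.

def sdp_objective_integral(edges, assignment, k):
    """assignment[u]: index of the bin containing u."""
    total = 0.0
    for u, v in edges:
        for i in range(k):
            ui = 1.0 if assignment[u] == i else 0.0
            vi = 1.0 if assignment[v] == i else 0.0
            total += (ui - vi) ** 2
    return total / 2

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]       # a 4-cycle
assignment = {0: 0, 1: 0, 2: 1, 3: 1}          # P_0 = {0,1}, P_1 = {2,3}
print(sdp_objective_integral(edges, assignment, k=2))  # 2.0: edges (1,2), (3,0) cut
```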
Small Set Expansion and Orthogonal Separators. Our algorithm uses a technique called “orthogonal separators”. The notion of orthogonal separators was introduced in Chlamtac, Makarychev, and Makarychev (2006), where it was used in an algorithm for Unique Games. Later, Bansal et al. (2011) showed that the following holds. If the SDP solution satisfies constraints (3), (4), (6), and (7), then for every $\varepsilon\in(0,1)$, $\delta\in(0,1)$, and $i\in\{1,\dots,k\}$, there exist a distortion $D=O_\varepsilon(\sqrt{\log n\log(1/\delta)})$ and a probability distribution over subsets of $V$ such that for a random set $S$ (“orthogonal separator”) distributed according to this distribution, we have, for $\alpha=1/n$:
• $\mu_i(S)\leq(1+\varepsilon)\rho_i$ (always);
• For all $u\in V$, $\Pr(u\in S)\geq\alpha\|\bar u_i\|^2$;
• For all $(u,v)\in E$, $\Pr(u\in S,\ v\notin S\text{ or }v\in S,\ u\notin S)\leq\alpha D\,\|\bar u_i-\bar v_i\|^2$.
This statement was proved in Bansal et al. (2011) implicitly, so for completeness we prove it in the Appendix (see Theorem A.1). For graphs with excluded minors and bounded genus graphs, $D=O_\varepsilon(1)$.
Algorithm. Let us examine a somewhat naïve algorithm for the problem, inspired by the algorithm of Bansal et al. (2011) for Small Set Expansion. We shall maintain the set of active (yet unassigned) vertices $A_t$. Initially, all vertices are active, i.e., $A_1=V$. At every step $t$, we pick an index $i_t\in\{1,\dots,k\}$ uniformly at random and sample an orthogonal separator $S_t$ as described above. We assign all active vertices from $S_t$ to the bin number $i_t$:
$$P_{i_t}\leftarrow P_{i_t}\cup(S_t\cap A_t),$$
and mark all newly assigned vertices as inactive, i.e., we let $A_{t+1}=A_t\setminus S_t$. We stop when the set of active vertices is empty. We output the partition $P_1,\dots,P_k$ computed at step $T$, where $T$ is the index of the last iteration.
We can show that the expected number of edges cut by the algorithm is at most $O(D\cdot OPT)$, where $D$ is the distortion of the orthogonal separators. Furthermore, the expected weight of each $P_i$ is $O(\rho_i)$. However, the weights of some pieces may significantly deviate from their expectations and may be much larger than $\rho_i$. So we need to alter the algorithm to guarantee that all weights are bounded by $O(\rho_i)$ simultaneously. We face a problem similar to the one Krauthgamer, Naor, Schwartz and Talwar (2014) had to solve in their paper. Their solution is rather complex and does not seem to work in the weighted case. Here, we propose a very simple fix for the naïve algorithm presented above. We shall store the vertices of every bin in layers. When we add new vertices to a bin at some iteration, we put them in a new layer on top of the layers already stored in the bin. Now, if the weight of the bin number $i_t$ exceeds the bound allowed by our bicriteria guarantee, we remove bottom layers from this bin so that its weight is again at most the allowed bound. Then we mark the removed vertices as active and jump to the next iteration. It is clear that this algorithm always returns a solution satisfying $\mu_i(P_i)\leq O(\rho_i)$ for all $i$. But now we need to prove that the algorithm terminates, and that the expected number of cut edges is still bounded by $O(D\cdot OPT)$.
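The control flow of the modified algorithm (layers plus trimming) can be sketched as follows. This is only a toy illustration: the real algorithm picks the bin index at random and samples orthogonal separators from the SDP solution, whereas this stub deterministically cycles through the bins and grabs the two smallest active vertices, so none of the guarantees carry over. All names and the toy instance are ours.

```python
# Toy sketch of the partitioning loop with the "layers" fix: each bin is a
# stack of layers; when a bin goes over capacity, bottom layers are peeled
# off and their vertices become active again.

def sample_separator(active, i):
    # placeholder for sampling an orthogonal separator with small mu_i-weight
    return set(sorted(active)[:2])

def partition(vertices, mu, rho, k, max_iters=1000):
    active = set(vertices)
    bins = [[] for _ in range(k)]                  # each bin is a list of layers
    t = 0
    while active and t < max_iters:
        i = t % k                                  # the paper picks i at random
        layer = sample_separator(active, i) & active
        if layer:
            bins[i].append(layer)                  # new layer on top
            active -= layer
        # trim: peel off bottom layers while bin i is over capacity
        while sum(mu[i][u] for L in bins[i] for u in L) > rho[i]:
            removed = bins[i].pop(0)               # bottom layer
            active |= removed                      # removed vertices reactivated
        t += 1
    return [set().union(*b) if b else set() for b in bins]

mu = [{u: 1 / 8 for u in range(8)}] * 2            # uniform weights, two bins
P = partition(range(8), mu, rho=[0.25, 1.0], k=2)
print(P)  # [{6, 7}, {0, 1, 2, 3, 4, 5}]: trimming fired twice along the way
```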