I have an array of $N$ weights $w_i$, say $w_i = \{4, 5, 12, 16, 3, 10, 1\}$, and I need to divide this array into $P$ partitions such that the partitions are optimally balanced, i.e. the largest sum of weights in any partition is as small as possible. Fortunately the problem is constrained by the fact that the weights cannot be reordered. If the number of partitions is three, the example above gives the optimal partitions $\{4, 5, 12\}, \{16\}, \{3, 10, 1\}$.
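To make "optimally balanced" concrete: for small inputs you can brute-force every placement of the $P-1$ split points and keep the partitioning whose largest part-sum is smallest. A quick sketch of that check (function name is my own) confirms the example above:

```python
from itertools import combinations

def brute_force_min_max(weights, p):
    """Try every placement of p-1 split points; return (best max-sum, parts)."""
    n = len(weights)
    best, best_parts = None, None
    for cuts in combinations(range(1, n), p - 1):
        bounds = (0,) + cuts + (n,)
        parts = [weights[a:b] for a, b in zip(bounds, bounds[1:])]
        m = max(sum(part) for part in parts)
        if best is None or m < best:
            best, best_parts = m, parts
    return best, best_parts

print(brute_force_min_max([4, 5, 12, 16, 3, 10, 1], 3))
# → (21, [[4, 5, 12], [16], [3, 10, 1]])
```

This is exponential in the worst case, of course, so it only serves to verify small instances.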
I have found efficient algorithms (e.g. partition problem, subset sum, Optimal Partition of Book Chapters, A partition algorithm, An algorithm for k-way array partitioning) for many similar problems, covering the cases where the weights are unordered sets and/or the number of partitions is fixed at 2 or 3, but none that seem to exactly address my problem, where the number of partitions is arbitrary.
I have solved the problem myself using a divide-and-conquer algorithm (written in Python below), but it turns out to be awfully slow for many partitions (e.g. N=100, P=8). So I was thinking that there has to be a better way, using dynamic programming or some other clever trick?
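For what it's worth, the kind of dynamic program I was imagining goes like this: let `cost[k][i]` be the smallest achievable maximum part-sum when the first `i` weights are split into `k` parts, so `cost[k][i] = min over j of max(cost[k-1][j], sum(weights[j:i]))`. With prefix sums that is $O(P N^2)$, which should easily handle N=100, P=8. A rough sketch (names and structure are my own, not from any library, and I have not profiled it):

```python
def dp_partition(weights, p):
    """O(P * N^2) dynamic program for partitioning an ordered array into p
    contiguous parts, minimizing the largest part-sum.
    Returns (offsets, best_max_sum); offsets[k] is the start index of part k."""
    n = len(weights)
    # prefix[i] = sum of the first i weights, so sum(weights[j:i]) is O(1).
    prefix = [0] * (n + 1)
    for i, w in enumerate(weights):
        prefix[i + 1] = prefix[i] + w
    INF = float("inf")
    # cost[k][i]: minimal max part-sum splitting weights[:i] into k parts.
    cost = [[INF] * (n + 1) for _ in range(p + 1)]
    # last[k][i]: the split index j that achieves cost[k][i].
    last = [[0] * (n + 1) for _ in range(p + 1)]
    for i in range(1, n + 1):
        cost[1][i] = prefix[i]
    for k in range(2, p + 1):
        for i in range(k, n + 1):
            for j in range(k - 1, i):
                candidate = max(cost[k - 1][j], prefix[i] - prefix[j])
                if candidate < cost[k][i]:
                    cost[k][i] = candidate
                    last[k][i] = j
    # Walk the last[] table backwards to recover the partition offsets.
    offsets = []
    i, k = n, p
    while k > 0:
        j = last[k][i]
        offsets.append(j)
        i, k = j, k - 1
    offsets.reverse()
    return offsets, cost[p][n]
```

On the example, `dp_partition([4, 5, 12, 16, 3, 10, 1], 3)` gives offsets `[0, 3, 4]` and a best maximum of 21, matching the partitioning above. Is this the right direction, or is there something even better?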
Does anyone have any suggestions?
Slow Python divide-and-conquer algorithm:
import numpy

def findOptimalPartitions(weights, num_partitions):
    if num_partitions == 1:
        # If there is only one partition, it must start at the first index
        # and have a size equal to the sum of all weights.
        return numpy.array([0], dtype=int), sum(weights)
    # Initially we let all partitions start at zero, meaning that all but the
    # last partition get zero elements, and the last one gets all of them.
    partition_offsets = numpy.array([0] * num_partitions)
    max_partition_size = sum(weights)
    # We now divide the weights into two partitions that split at index n.
    # We know that each partition must have at least one element, so there
    # is no point in looping over all elements: the first partition can take
    # at most len(weights) - (num_partitions - 1) elements.
    for n in range(1, len(weights) - num_partitions + 2):
        first_partition_size = sum(weights[:n])
        if first_partition_size > max_partition_size:
            # If the first partition is already larger than the best found
            # so far, there is no point in searching further.
            break
        # The second partition, which starts at n, is split further into
        # subpartitions recursively.
        subpartition_offsets, best_subpartition_size = \
            findOptimalPartitions(weights[n:], num_partitions - 1)
        # If the maximum size of any of the current partitions is smaller
        # than the current best partitioning, we update the best partitions.
        if ((first_partition_size < max_partition_size)
                and (best_subpartition_size < max_partition_size)):
            # The first partition always starts at 0. The others start at
            # offsets from the subpartitions relative to the current index,
            # so add the current index to those.
            partition_offsets[1:] = n + subpartition_offsets
            # Record the new maximum partition size.
            max_partition_size = max(first_partition_size, best_subpartition_size)
    return partition_offsets, max_partition_size