# #StackBounty: #ds.algorithms #space-complexity Problem in the paper "Stable Minimum Space Partitioning in Linear Time"

### Bounty: 50

The paper Stable Minimum Space Partitioning in Linear Time describes an algorithm that stably sorts a binary array (an array whose elements take only two distinct values) in $$O(n)$$ time and $$O(1)$$ extra space. After reading the paper, I am not convinced that the algorithm is actually $$O(1)$$ in space, which is one of its main selling points and what distinguishes it from other partitioning algorithms.
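For context, this is what "stably partitioning a binary array" means. The sketch below (function name and API are mine, not the paper's) is the trivial baseline that uses $$O(n)$$ extra space; the paper's contribution is doing the same thing in place:

```python
def stable_binary_partition(arr, is_small):
    """Stable partition using O(n) extra space: every 'small' element
    precedes every 'large' element, each group keeping its input order."""
    small = [x for x in arr if is_small(x)]
    large = [x for x in arr if not is_small(x)]
    return small + large

# Tagged pairs make the stability visible: equal keys keep their order.
data = [(1, 'a'), (0, 'b'), (1, 'c'), (0, 'd')]
print(stable_binary_partition(data, lambda p: p[0] == 0))
# -> [(0, 'b'), (0, 'd'), (1, 'a'), (1, 'c')]
```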

Although the algorithm uses a constant amount of memory in practice (technically $$O(\log n)$$ bits, i.e. a constant number of machine words), it does not use a constant number of counters. The authors define $$O(1)$$ space as follows: "we assume that a constant number of extra storage locations, each capable for storing a word of $$O(\log_2 n)$$ bits, is available" (page 2). Yet later, in algorithm D, they clearly use $$O(\log n / \log\log n)$$ counters, contradicting their own definition of $$O(1)$$ space.

Having only a constant number of counters matters because, without that, the algorithm would most likely not work out well in practice. To stay within $$O(\log_2 n)$$ bits of extra space while using a non-constant number of counters, those counters must be of variable size. The only way I see to maintain a variable number of variable-size integer counters is to allocate them dynamically, which would drastically slow the algorithm down and make it impractical.
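For concreteness, here is the standard way multiple fixed-width counters can share a single machine word without any heap allocation (the class name and API are mine; whether the paper's $$O(\log n / \log\log n)$$ counters can actually be stored this way within the stated space bound is part of what I am asking):

```python
class PackedCounters:
    """Several fixed-width counters packed into one integer, emulating a
    single machine word. No per-counter allocation is ever performed."""

    def __init__(self, count, bits):
        self.bits = bits               # width of each counter in bits
        self.mask = (1 << bits) - 1    # extracts one counter's field
        self.count = count
        self.word = 0                  # all counters start at zero

    def get(self, i):
        """Read counter i from its bit field."""
        return (self.word >> (i * self.bits)) & self.mask

    def increment(self, i):
        """Add 1 to counter i, guarding against field overflow."""
        assert self.get(i) < self.mask, "counter overflow"
        self.word += 1 << (i * self.bits)

# 8 counters of 4 bits each fit in a single 32-bit word.
c = PackedCounters(count=8, bits=4)
c.increment(3)
c.increment(3)
print(c.get(3))  # -> 2
```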

Is the algorithm in fact not $$O(1)$$ in space, or am I misunderstanding the paper somehow?

