1. Introduction

In this tutorial, we’ll talk about the complexity of Shellsort.

2. Shellsort

The algorithm relies on the following property of arrays: every subsequence of a sorted array is also sorted.

So, if x_1 \leq x_2 \leq \ldots \leq x_n is a sorted array, however we choose indices i_1 < i_2 < \ldots < i_m, it will hold that x_{i_1} \leq x_{i_2} \leq \ldots \leq x_{i_m}. For instance:

A sorted subsequence
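Concretely, if x = (2, 3, 5, 7, 11) and we pick i_1 = 1, i_2 = 3, and i_3 = 5, we get (x_{i_1}, x_{i_2}, x_{i_3}) = (2, 5, 11), which is sorted as well.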

Shellsort considers the subsequences in which there’s a constant gap between the elements’ indices. For example, if x has 16 elements, and the gap is 4, we have four subsequences or chains, as we may call them:

Chains in Shellsort

In each pass, Shellsort sorts individual chains using Insertion Sort, reduces the gap, and sorts the new chains in the subsequent pass. The rationale is that by sorting the chains with large gaps, the elements may “jump” over large portions of the array, so there will be less work when sorting chains with smaller gaps. Additionally, as the gap gets smaller, the chains become longer and include more indices, so each pass places the elements closer to their correct positions.
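To make the chains concrete, here’s a small Python snippet (ours, not part of the algorithm) that lists the chains of a 16-element array for a gap of 4. It prints four chains of indices:

    n, gap = 16, 4

    # Chain k starts at index k and jumps by the gap (1-based indices, as in the text).
    chains = [list(range(start, n + 1, gap)) for start in range(1, gap + 1)]

    for chain in chains:
        print(chain)
    # [1, 5, 9, 13]
    # [2, 6, 10, 14]
    # [3, 7, 11, 15]
    # [4, 8, 12, 16]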

2.1. Pseudocode

Here’s the pseudocode of Shellsort:

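In Python-like notation, it looks as follows (a sketch based on the description above; the gap sequence h_1 > h_2 > \ldots > h_m is passed in as the parameter gaps, and the function name shellsort is ours):

    def shellsort(x, gaps):
        """Sort the list x in place using the given gap sequence (must end with 1)."""
        n = len(x)
        for gap in gaps:
            # One pass: a gapped Insertion Sort that handles all chains for this gap.
            for i in range(gap, n):
                current = x[i]
                j = i
                # Shift the larger elements of the same chain one gap to the right.
                while j >= gap and x[j - gap] > current:
                    x[j] = x[j - gap]
                    j -= gap
                x[j] = current
        return x

The outer loop performs one pass per gap; when the gap is 1, the inner loops reduce to ordinary Insertion Sort over the whole array.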

As we see, if the gap is h_k, there are h_k chains: the first chain starts at x[1], and each of the h_k - 1 positions between its first two elements, x[1] and x[1+h_k], starts another chain.

The gap sequence h_1 > h_2 > \ldots > h_m needs to end with 1. That means that in the last pass, Shellsort applies Insertion Sort to the entire input array. Because of that, we know the algorithm always sorts its input correctly.
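For example, running the sketch above with a sequence that ends with 1 always produces a sorted array:

    import random

    data = random.sample(range(100), 16)
    shellsort(data, gaps=[8, 4, 2, 1])  # any decreasing sequence ending with 1 works
    assert data == sorted(data)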

3. Complexity

The time complexity of Shellsort depends on the gap sequence \boldsymbol{h_1, h_2, \ldots, h_m}. Currently, there is no ready-to-use formula into which we could plug a specific sequence to get the complexity. The analysis must be done case by case.

Over time, several gap sequences have been proposed. The resulting complexities vary from O(n^2) to O \left(n (\log n)^2 \right). What’s more, there are sequences whose asymptotic complexity we still don’t know.

Regardless of the gap sequence, the space complexity is O(n): the input array itself takes \Theta(n) memory, while Shellsort sorts in place and needs only constant extra space for auxiliary variables.

3.1. Shell’s Sequence

This is the sequence in Shell’s original formulation of the algorithm. In the first pass, the gap is \ceil{\frac{n}{2}}. In the second, it’s \ceil{\frac{n}{4}}, then \ceil{\frac{n}{8}} in the third, and so on until the last pass in which the gap is 1.
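One way to generate this sequence in Python (a sketch; the helper name shell_gaps is ours):

    from math import ceil

    def shell_gaps(n):
        """Shell's gap sequence: ceil(n/2), ceil(n/4), ceil(n/8), ..., down to 1."""
        gaps, divisor = [], 2
        while not gaps or gaps[-1] > 1:
            gaps.append(ceil(n / divisor))
            divisor *= 2
        return gaps

    print(shell_gaps(16))  # [8, 4, 2, 1]
    print(shell_gaps(10))  # [5, 3, 2, 1]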

Without loss of generality, let’s assume that the input x contains the integers from 1 to n. The worst case occurs when n is a power of two, 1, 2, 3, \ldots, n/2 occupy the odd positions, and n/2+1, n/2+2, \ldots, n are at the even ones. Since every gap except the last one is even, the elements at odd and even positions never mix, so the sub-arrays [x_1, x_3, \ldots, x_{n-1}] and [x_2, x_4, \ldots, x_n] get sorted independently. As a result, after the second-to-last pass, the two sorted sub-arrays are still interleaved:

Interleaved sub-arrays

When Insertion Sort sorts such a permutation of x in the last pass, it performs swaps only in its odd iterations (except the very first one) since every element at an even position is greater than all the elements preceding it:

Swaps in the Insertion Sort in the last pass

So, the number of swaps Insertion Sort performs in the end is:

    \[1+2+3+\ldots+\left(\frac{n}{2}-1\right) = \frac{\left(\frac{n}{2}-1\right)\frac{n}{2}}{2} = \frac{n^2}{8}-\frac{n}{4} \in \Theta(n^2)\]

No matter the total number of comparisons and swaps in all passes but the last one, the final pass alone requires a quadratic number of operations, so Shellsort using this gap sequence has quadratic time complexity in the worst case.
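We can confirm the count empirically. The following sketch (ours) builds the worst-case permutation for a power-of-two n, runs all the passes, and counts the element moves in the final pass with gap 1; the result equals n^2/8 - n/4:

    def last_pass_moves(n):
        """n is a power of two; returns the number of moves in the gap-1 pass."""
        # Worst-case input: 1..n/2 at the odd positions, n/2+1..n at the even ones.
        x = [0] * n
        x[0::2] = range(1, n // 2 + 1)
        x[1::2] = range(n // 2 + 1, n + 1)

        moves, gap = 0, n // 2
        while gap >= 1:  # Shell's gaps for a power of two: n/2, n/4, ..., 2, 1
            for i in range(gap, n):
                current, j = x[i], i
                while j >= gap and x[j - gap] > current:
                    x[j] = x[j - gap]
                    j -= gap
                    if gap == 1:
                        moves += 1  # count moves only in the last pass
                x[j] = current
            gap //= 2
        return moves

    n = 16
    print(last_pass_moves(n), n * n // 8 - n // 4)  # 28 28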

3.2. Other Sequences

Another sequence we can use is Hibbard’s. Its general term is 2^k-1 for k=\floor{\log_2 n}, \floor{\log_2 n}-1, \ldots, 2, 1. For example, if n=14, the gap sequence is 7, 3, 1. Shellsort using this sequence has a worst-case time complexity of O(n^{1.5}).

The same holds for the sequence \frac{3^k-1}{2} with k=1,2,\ldots, whose largest term isn’t allowed to exceed \ceil{\frac{n}{3}}. The resulting time complexity is also O(n^{1.5}).

Pratt’s sequence yields the best asymptotic behavior currently known. It contains the integers of the form 2^p 3^q: 1, 2, 3, 4, 6, 8, 9, and so on. The time complexity in this case is O \left(n (\log n)^2 \right). However, since the sequence contains many gaps, Shellsort makes a lot of passes, which makes it slow in practice.
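For illustration, here’s one way to generate these three gap sequences in Python (a sketch; the helper names are ours):

    from math import floor, log2

    def hibbard_gaps(n):
        """2^k - 1 for k = floor(log2 n), ..., 2, 1."""
        return [2 ** k - 1 for k in range(floor(log2(n)), 0, -1)]

    def three_power_gaps(n):
        """(3^k - 1) / 2 for k = 1, 2, ..., with no term larger than ceil(n / 3)."""
        gaps, k = [], 1
        while (3 ** k - 1) // 2 <= -(-n // 3):  # -(-n // 3) is ceil(n / 3)
            gaps.append((3 ** k - 1) // 2)
            k += 1
        return gaps[::-1]

    def pratt_gaps(n):
        """All integers of the form 2^p * 3^q below n, in decreasing order."""
        bound = n.bit_length()
        return sorted((2 ** p * 3 ** q
                       for p in range(bound) for q in range(bound)
                       if 2 ** p * 3 ** q < n), reverse=True)

    print(hibbard_gaps(14))       # [7, 3, 1]
    print(three_power_gaps(100))  # [13, 4, 1]
    print(pratt_gaps(20))         # [18, 16, 12, 9, 8, 6, 4, 3, 2, 1]

Even for n = 20, Pratt’s sequence already contains ten gaps, which illustrates why Shellsort makes so many passes with it.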

4. Conclusion

In this article, we talked about Shellsort’s complexity. It depends on the gap sequence we use. As of now, we don’t know which sequence is optimal since the mathematical analysis is quite complex. With the right sequence, we can get a complexity of O\left(n (\log n)^2 \right), which is close to the theoretical lower bound of \Omega(n \log n) for comparison-based sorting algorithms.
