
Quick Sort

Quick Sort is a widely used sorting algorithm known for its efficiency and simplicity. It follows a divide-and-conquer strategy to sort an array or list of elements.

Here’s how Quick Sort works:

  1. Partitioning: Choose a pivot element from the array. Rearrange the array so that all elements less than the pivot come before it, and all elements greater than the pivot come after it. After this partitioning step, the pivot element is in its final sorted position.

  2. Recursion: Recursively apply the partitioning step to the sub-arrays formed by the pivot. Repeat this process until the entire array is sorted.

  3. Base Case: The base case of the recursion is when the sub-array has zero or one element, in which case it is already considered sorted.

Here’s a simple implementation of Quick Sort in Python:

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    
    pivot = arr[len(arr) // 2]  # Choosing the pivot as the middle element
    left = [x for x in arr if x < pivot]  # Elements less than pivot
    middle = [x for x in arr if x == pivot]  # Elements equal to pivot
    right = [x for x in arr if x > pivot]  # Elements greater than pivot
    
    return quick_sort(left) + middle + quick_sort(right)

# Example usage:
arr = [3, 6, 8, 10, 1, 2, 1]
print("Original array:", arr)
sorted_arr = quick_sort(arr)
print("Sorted array:", sorted_arr)

This code defines a function quick_sort() that takes a list arr as input and returns a new, sorted list. It uses list comprehensions to partition the input into elements less than, equal to, and greater than the pivot, recursively sorts the left and right partitions, and concatenates the results to produce the final sorted list.
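
The version above is easy to read but builds new lists on every call. As a rough sketch of the in-place approach described in step 1 (the names partition() and quick_sort_in_place() are my own, not from the code above), a Lomuto-style partition swaps elements within the original list, and the pivot really does land in its final sorted position after each pass:

def partition(arr, low, high):
    # Lomuto-style partition: use the last element of the range as the pivot
    # and move everything smaller than it to the front of the range.
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot into its final sorted position and return that index.
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort_in_place(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort_in_place(arr, low, p - 1)   # sort elements before the pivot
        quick_sort_in_place(arr, p + 1, high)  # sort elements after the pivot

# Example usage:
nums = [3, 6, 8, 10, 1, 2, 1]
quick_sort_in_place(nums)
print("Sorted in place:", nums)  # [1, 1, 2, 3, 6, 8, 10]

Unlike quick_sort(), this variant rearranges the list it is given instead of returning a new one, so its extra space is just the recursion stack (O(log n) on average) rather than new lists at every level.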

Time complexity

The time complexity of Quick Sort depends on several factors, including the choice of pivot and the initial distribution of the input data. In the average case, Quick Sort runs in O(n log n) time, where n is the number of elements in the array to be sorted.

Here’s a breakdown of why Quick Sort has this time complexity:

  1. Partitioning Step: The key operation in Quick Sort is partitioning, where the array is divided into two sub-arrays based on a chosen pivot element. This partitioning step takes O(n) time, where n is the number of elements in the array: every element is compared to the pivot exactly once.

  2. Recursion: After partitioning, Quick Sort recursively sorts the two sub-arrays created by the partitioning step. If the partitioning is balanced (i.e., the pivot divides the array into approximately equal halves), each recursive call processes roughly half of the array. In the average case, the partitioning is balanced, leading to O(log n) levels of recursion.

Combining the O(n) work done across all sub-arrays at each level of recursion with the O(log n) levels of recursion yields an average-case time complexity of O(n log n).
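
One quick way to see this in practice is to count pivot comparisons while sorting random inputs. The sketch below is my own instrumentation (quick_sort_counting() is not part of the code above); it tallies roughly one comparison per element per partitioning pass, and the totals grow close to n log n rather than n^2:

import random

def quick_sort_counting(arr, counter):
    # Same algorithm as quick_sort() above, but tallies element-to-pivot
    # comparisons so the growth rate can be observed.
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left, middle, right = [], [], []
    for x in arr:
        counter[0] += 1  # roughly one comparison against the pivot per element
        if x < pivot:
            left.append(x)
        elif x == pivot:
            middle.append(x)
        else:
            right.append(x)
    return quick_sort_counting(left, counter) + middle + quick_sort_counting(right, counter)

for n in (1_000, 10_000, 100_000):
    counter = [0]
    quick_sort_counting([random.random() for _ in range(n)], counter)
    print(f"n = {n:>7}, comparisons = {counter[0]}")

On random input the totals track n log n closely; the quadratic behavior discussed next only shows up when the pivots are consistently bad.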

It’s essential to note, however, that in the worst case Quick Sort can degrade to O(n^2) time complexity. This occurs when the chosen pivot consistently partitions the array into unbalanced sub-arrays, such as when the pivot is always the smallest or largest element in the array. The likelihood of this worst-case scenario drops significantly with randomized pivot selection or carefully chosen pivot strategies (e.g., median-of-three).
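
As a minimal sketch of one such mitigation (the name quick_sort_random_pivot() is illustrative, not a standard API), the only change needed to the earlier code is how the pivot is chosen:

import random

def quick_sort_random_pivot(arr):
    if len(arr) <= 1:
        return arr
    # Picking the pivot uniformly at random makes consistently unbalanced
    # partitions, and therefore the O(n^2) worst case, very unlikely.
    pivot = random.choice(arr)
    # A median-of-three strategy would instead use, for example:
    # pivot = sorted([arr[0], arr[len(arr) // 2], arr[-1]])[1]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort_random_pivot(left) + middle + quick_sort_random_pivot(right)

Randomized pivots cost one extra random.choice() call per partition, but they make performance largely independent of the input order.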

Case      Time Complexity    Space Complexity
Best      O(n log n)         O(log n)
Average   O(n log n)         O(log n)
Worst     O(n^2)             O(n)