Sorting is one of the most fundamental operations in computer science. While comparison-based sorting algorithms such as QuickSort, Merge Sort, and Bubble Sort are the most widely known, non-comparison-based techniques like Counting Sort can deliver significantly better performance in specific scenarios.
In this guide, we’ll explore Counting Sort, how it works, its advantages, disadvantages, and when you should use it over other sorting algorithms.
Table of Contents
- What is Counting Sort?
- How Does Counting Sort Work?
- Step-by-Step Example
- Counting Sort Algorithm Implementation
- Counting Sort in Python
- Counting Sort in JavaScript
- Time and Space Complexity Analysis
- Advantages and Disadvantages of Counting Sort
- Counting Sort vs. Other Sorting Algorithms
- When Should You Use Counting Sort?
- Real-World Applications
- Conclusion
What is Counting Sort?
Counting Sort is a non-comparison-based sorting algorithm that sorts elements by counting the frequency of each distinct value and using this information to determine their position in the sorted array.
It works best for sorting integers or elements with a limited range since it relies on indexing rather than element comparison.
Key Characteristics of Counting Sort:
- Non-comparison-based: It doesn’t compare elements directly.
- Works best for small-range values: Suitable when the range of numbers is not significantly larger than the number of elements.
- Stable sort: Maintains the relative order of equal elements.
How Does Counting Sort Work?
The Process of Counting Sort:
- Find the maximum value in the input array.
- Create a count array of size (max value + 1) and initialize all values to zero.
- Determine the frequency of each number in the input array.
- Modify the count array so that each index i stores the running total of counts up to and including i (the cumulative frequency).
- Place each element into the output array at the position given by its cumulative count.
- Transfer the sorted elements back into the original array.
Step-by-Step Example of Counting Sort
Let's sort the array:
[4, 2, 2, 8, 3, 3, 1]
Step 1: Find Maximum Value
The highest number is 8, so we create a count array of size 9 (0 to 8).
Step 2: Count Occurrences
We count how many times each number appears:
| Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|---|
| Count | 0 | 1 | 2 | 2 | 1 | 0 | 0 | 0 | 1 |
Step 3: Compute Cumulative Frequency
| Index | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 |
|---|---|---|---|---|---|---|---|---|---|
| Count | 0 | 1 | 3 | 5 | 6 | 6 | 6 | 6 | 7 |
Step 4: Place Elements in Sorted Order
We iterate through the original array in reverse and place each element at the index given by its cumulative count minus one, decrementing that count as we go. Traversing in reverse is what keeps the sort stable.
Final Sorted Array:
[1, 2, 2, 3, 3, 4, 8]
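To connect Steps 2 and 3 to code, here is a minimal Python sketch (assuming non-negative integers, as in this example) that reproduces the count and cumulative-count arrays for [4, 2, 2, 8, 3, 3, 1]:

```python
# Minimal sketch of Steps 2-3 for the example array (non-negative integers assumed)
arr = [4, 2, 2, 8, 3, 3, 1]

count = [0] * (max(arr) + 1)    # one slot for every value from 0 to the maximum (8)
for num in arr:                 # Step 2: count occurrences
    count[num] += 1
print("Counts:    ", count)     # [0, 1, 2, 2, 1, 0, 0, 0, 1]

for i in range(1, len(count)):  # Step 3: cumulative frequency
    count[i] += count[i - 1]
print("Cumulative:", count)     # [0, 1, 3, 5, 6, 6, 6, 6, 7]
```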
Counting Sort Implementation
Python Implementation
```python
def counting_sort(arr):
    max_val = max(arr)
    count = [0] * (max_val + 1)
    output = [0] * len(arr)

    # Count occurrences
    for num in arr:
        count[num] += 1

    # Cumulative sum
    for i in range(1, len(count)):
        count[i] += count[i - 1]

    # Build the output array
    for num in reversed(arr):
        output[count[num] - 1] = num
        count[num] -= 1

    return output

arr = [4, 2, 2, 8, 3, 3, 1]
print(counting_sort(arr))
```
JavaScript Implementation
```javascript
function countingSort(arr) {
  let maxVal = Math.max(...arr);
  let count = new Array(maxVal + 1).fill(0);
  let output = new Array(arr.length);

  // Count occurrences
  arr.forEach(num => count[num]++);

  // Cumulative sum
  for (let i = 1; i < count.length; i++) {
    count[i] += count[i - 1];
  }

  // Build the output array
  for (let i = arr.length - 1; i >= 0; i--) {
    output[count[arr[i]] - 1] = arr[i];
    count[arr[i]]--;
  }

  return output;
}

console.log(countingSort([4, 2, 2, 8, 3, 3, 1]));
```
Time and Space Complexity Analysis
| Complexity | Best Case | Average Case | Worst Case |
|---|---|---|---|
| Time Complexity | O(n + k) | O(n + k) | O(n + k) |
| Space Complexity | O(n + k) | O(n + k) | O(n + k) |
- n = number of elements
- k = range of numbers
Advantages and Disadvantages of Counting Sort
Advantages:
✅ Faster than comparison-based sorting (for small range values)
✅ Stable sorting algorithm
✅ Efficient for small integer datasets
Disadvantages:
🚫 Consumes extra space (not an in-place sort)
🚫 Inefficient when the value range is large, and unsuitable for floating-point data
🚫 Doesn’t handle negative numbers out of the box (a possible offset-based modification is sketched below)
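A common workaround for negative values is to shift every element by the minimum before counting. The sketch below is one possible way to do that in Python; the function name counting_sort_with_negatives is our own illustrative choice, not a standard API:

```python
def counting_sort_with_negatives(arr):
    # Illustrative sketch: counting sort extended to negative integers
    # by offsetting every value by min(arr). Assumes a list of integers.
    if not arr:
        return []

    min_val, max_val = min(arr), max(arr)
    count = [0] * (max_val - min_val + 1)    # one slot per value in [min, max]
    output = [0] * len(arr)

    for num in arr:                          # count occurrences of shifted values
        count[num - min_val] += 1

    for i in range(1, len(count)):           # cumulative counts
        count[i] += count[i - 1]

    for num in reversed(arr):                # reverse pass keeps the sort stable
        output[count[num - min_val] - 1] = num
        count[num - min_val] -= 1

    return output

print(counting_sort_with_negatives([4, -2, 2, -8, 3, 3, 1]))
# [-8, -2, 1, 2, 3, 3, 4]
```

Note that the count array now grows with the full value range (max - min + 1), so the extra-space concern above still applies.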
Counting Sort vs. Other Sorting Algorithms
| Algorithm | Time Complexity | Best For |
|---|---|---|
| Counting Sort | O(n + k) | Small-range integers |
| QuickSort | O(n log n) average | General-purpose sorting |
| Merge Sort | O(n log n) | Stable sorting with large datasets |
| Radix Sort | O(nk) | Sorting large numbers efficiently |
When Should You Use Counting Sort?
- When sorting integers within a small range
- When a stable sorting algorithm is required
- When comparison-based sorting is too slow
Avoid Counting Sort when dealing with floating-point numbers, large datasets with high variability, or when memory usage is a concern.
Real-World Applications of Counting Sort
- Sorting student grades (0-100), as sketched after this list
- Distribution of votes in elections
- Sorting limited-range datasets (e.g., age groups in a population study)
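To make the grades use case concrete (and to show the stability discussed earlier), here is a hedged Python sketch that sorts (name, grade) records by grade while preserving the input order of students with equal grades; the record format and the helper name counting_sort_records_by_grade are illustrative assumptions, not part of any fixed API:

```python
def counting_sort_records_by_grade(records, max_grade=100):
    # Illustrative sketch: stably sort (name, grade) tuples by grade.
    # Assumes integer grades in the range 0..max_grade.
    count = [0] * (max_grade + 1)
    output = [None] * len(records)

    for _, grade in records:             # count each grade
        count[grade] += 1

    for i in range(1, len(count)):       # cumulative counts
        count[i] += count[i - 1]

    for record in reversed(records):     # reverse pass preserves the order of ties
        grade = record[1]
        output[count[grade] - 1] = record
        count[grade] -= 1

    return output

students = [("Ali", 91), ("Sara", 78), ("Omar", 91), ("Zara", 60)]
print(counting_sort_records_by_grade(students))
# [('Zara', 60), ('Sara', 78), ('Ali', 91), ('Omar', 91)] -- Ali stays ahead of Omar
```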
Conclusion
Counting Sort is a powerful non-comparison-based sorting algorithm that provides excellent performance for small-range integer values. While it may not be suitable for general-purpose sorting, it outperforms QuickSort and Merge Sort in scenarios where elements have a limited range.
If you're dealing with integer datasets with known constraints, Counting Sort is a great choice for fast and stable sorting!
Common FAQs About Counting Sort
Q1. What is Counting Sort used for?
Counting Sort is used for sorting numbers with a limited range efficiently. It works best when the range of numbers is small compared to the input size, making it ideal for frequency-based sorting and integer-based datasets.
Q2. Is Counting Sort better than QuickSort?
Counting Sort is faster than QuickSort (O(n log n)) when sorting small integers but is not suitable for large datasets with a wide range of numbers. QuickSort is more versatile for general-purpose sorting.
Q3. What is the time complexity of Counting Sort?
Counting Sort has a time complexity of O(n + k), where n is the input size and k is the value range. It outperforms comparison-based algorithms when k is small.
Q4. Why is Counting Sort not used for all sorting problems?
Counting Sort is not efficient for large numbers or floating-point values because it requires additional memory for counting frequency. It also does not work for negative numbers without modifications.
Q5. Is Counting Sort stable?
Yes, Counting Sort is a stable sorting algorithm, meaning it preserves the relative order of equal elements in the sorted output, making it useful for sorting objects with multiple attributes.
Q6. Can Counting Sort be used for negative numbers?
By modifying the algorithm to shift negative numbers into the positive range, Counting Sort can handle negative numbers, but this increases complexity and may require additional space.
Q7. What is the space complexity of Counting Sort?
The space complexity of Counting Sort is O(n + k) due to the additional memory needed for the counting array, making it less space-efficient compared to in-place sorting algorithms like QuickSort.
Related Blogs
- Master JavaScript Sorting Algorithms: Efficiency & Analysis
- QuickSort Algorithm Explained in JavaScript & Python
- Merge Sort vs Quick Sort: Key Differences, Pros, & Use Cases
- Bucket Sort Algorithm: How It Works & When to Use It
- Radix Sorting Faster Algorithm Than QuickSort Explained
- Heap Sort Algorithm: Efficient Sorting Using a Binary Heap
- Shell Sort Algorithm: Fastest Sorting Method Explained
About Muhaymin Bin Mehmood
Front-end Developer skilled in the MERN stack, experienced in web and mobile development. Proficient in React.js, Node.js, and Express.js, with a focus on client interactions, sales support, and high-performance applications.