 # Improving worst-case running time of insertion sort using binary search

This was an interesting question I posted here: http://stackoverflow...g-binary-search

The while loop scans backwards with a linear search. However, the subarray it scans is already sorted, so we could replace the linear search with binary search, reducing the cost of finding the insertion point from O(n) to O(lg n). My take, though, is that this won't improve the overall running time: we still have to shift each larger element one position to the right, and those shifts always take time proportional to the number of backward steps. So the running time stays O(n^2), and there is no way to achieve O(n lg n) this way. Please tell me if I am approaching this the wrong way.

```
INSERTION-SORT(A)
  for j = 2 to length[A]
      key = A[j]
      i = j - 1
      while i > 0 and A[i] > key
          A[i+1] = A[i]
          i = i - 1
      A[i+1] = key
```
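To make the trade-off concrete, here is a minimal Python sketch of the binary-search variant (sometimes called binary insertion sort), using the standard library's `bisect` module. The function name is my own choice. Binary search cuts the *comparisons* to O(n lg n), but the slice assignment that shifts elements is still O(n) per iteration, so the worst case remains O(n^2), exactly as argued above.

```python
import bisect

def binary_insertion_sort(a):
    """Insertion sort that locates each insertion point with binary search.

    Comparisons: O(n lg n) total. Shifts: still O(n) per element in the
    worst case, so the overall worst-case running time remains O(n^2).
    """
    for j in range(1, len(a)):
        key = a[j]
        # O(lg j) comparisons to find where key belongs in the sorted prefix.
        # bisect_right keeps equal elements in order, so the sort is stable.
        i = bisect.bisect_right(a, key, 0, j)
        # O(j - i) work to shift the larger elements right -- this is the
        # part binary search cannot remove.
        a[i + 1 : j + 1] = a[i:j]
        a[i] = key
    return a
```

For example, `binary_insertion_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`, but on a reverse-sorted input every iteration still shifts the entire prefix.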

Anyone who has studied algorithm analysis should find this interesting.

You could end up creating a hybrid algorithm, but I would rather choose a well-established, proven one. There's no point reinventing the wheel for algorithms like this, which computer scientists with doctorates and countless years of research experience have already worked on. Is there any reason you're using insertion sort instead of an O(n log n) algorithm like merge sort (for example, the one built into Java)?

From what I know, insertion sort is only beneficial on very small datasets or data that is already nearly sorted.
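That claim can be checked by instrumenting plain insertion sort to count element shifts: each shift removes exactly one inversion, so nearly sorted input (few inversions) finishes in close to linear time, while reverse-sorted input needs about n^2/2 shifts. A small sketch (the function name is mine):

```python
def insertion_sort_counting_shifts(a):
    """Plain insertion sort that also counts element shifts (moves).

    The shift count equals the number of inversions in the input, so it
    is near 0 for almost-sorted data and n*(n-1)/2 for reversed data.
    """
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]  # one shift per inversion removed
            shifts += 1
            i -= 1
        a[i + 1] = key
    return a, shifts
```

On `[1, 2, 4, 3, 5, 6]` (one inversion) this does a single shift; on `[6, 5, 4, 3, 2, 1]` it does 15, the full n*(n-1)/2.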

P.S. I believe what you're trying to achieve has already been implemented in hybrid sorting algorithms like Timsort and patience sort. Not sure whether either of them is stable, though.