Improving worst-case running time of insertion sort using binary search

This is an interesting question I posted here: http://stackoverflow...g-binary-search

The while loop uses a linear search to scan backwards. However, the part of the array that the while loop scans is already sorted, so we could replace the linear search with a binary search, bringing the cost of finding the insertion point down from O(n) to O(lg n). My take, though, is that this won't reduce the overall running time: we would still have to shift every element between the insertion point and j one position to the right, which takes time proportional to the number of positions the key moves backward. So the overall running time stays O(n^2), and there is no way to achieve O(n lg n) this way. Please tell me if I am approaching this the wrong way.

INSERTION-SORT(A)
    for j = 2 to length[A]
        do key = A[j]
           i = j - 1
           while i > 0 and A[i] > key
               A[i+1] = A[i]
               i = i - 1
           A[i+1] = key
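
To make the arithmetic concrete: in the worst case (a reverse-sorted array) the shifts alone cost 1 + 2 + ... + (n-1) = n(n-1)/2 = O(n^2), even though binary search cuts the total number of comparisons to about lg(n!) = O(n lg n). Below is a minimal Java sketch of this "binary insertion sort"; the class and method names are my own illustration, not from the original post.

    import java.util.Arrays;

    public class BinaryInsertionSort {

        // Sorts a[] in place. Binary search cuts the comparisons per
        // insertion to O(lg n), but shifting elements to make room is
        // still O(n), so the worst case remains O(n^2).
        static void binaryInsertionSort(int[] a) {
            for (int j = 1; j < a.length; j++) {
                int key = a[j];
                // Binary search for the insertion point in the
                // already-sorted prefix a[0..j-1].
                int lo = 0, hi = j;
                while (lo < hi) {
                    int mid = (lo + hi) >>> 1;
                    if (a[mid] <= key) {   // "<=" keeps equal keys stable
                        lo = mid + 1;
                    } else {
                        hi = mid;
                    }
                }
                // This shift is the part binary search cannot remove:
                // it moves j - lo elements, which is O(n) in the worst case.
                System.arraycopy(a, lo, a, lo + 1, j - lo);
                a[lo] = key;
            }
        }

        public static void main(String[] args) {
            int[] a = {5, 2, 4, 6, 1, 3};
            binaryInsertionSort(a);
            System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
        }
    }

For what it's worth, Timsort uses this same binary insertion sort on short runs, precisely because the expensive part (the comparisons) shrinks while the shift is a single fast arraycopy.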

Anyone who has studied algorithm analysis should find it interesting.

You could end up creating a hybrid algorithm, but I would rather choose a well-established and proven one. There's no point in re-inventing the wheel for algorithms like this, which computer scientists with doctorates and countless years of research experience have already worked on. Is there any reason you're using insertion sort instead of an O(n log n) algorithm like merge sort (such as the one built into Java)?

From what I know, insertion sort is only beneficial on a very small dataset or on data that is already almost sorted.

P.S. I believe what you're trying to achieve has been implemented in hybrid sorting algorithms like timsort and patience sort. Not sure whether either of them is stable, though.
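
To illustrate that, here is a sketch of the usual hybrid (my own code, not from this thread; the CUTOFF of 16 is an arbitrary illustrative value, not a tuned constant): a merge sort that hands small subarrays to insertion sort, keeping merge sort's O(n log n) worst case while exploiting insertion sort's low overhead on tiny inputs.

    import java.util.Arrays;

    public class HybridMergeSort {

        private static final int CUTOFF = 16; // illustrative, not tuned

        public static void sort(int[] a) {
            int[] aux = new int[a.length];
            sort(a, aux, 0, a.length - 1);
        }

        private static void sort(int[] a, int[] aux, int lo, int hi) {
            // Small subarrays: insertion sort wins here because its
            // constant factors are tiny and the data fits in cache.
            if (hi - lo < CUTOFF) {
                insertionSort(a, lo, hi);
                return;
            }
            int mid = lo + (hi - lo) / 2;
            sort(a, aux, lo, mid);
            sort(a, aux, mid + 1, hi);
            merge(a, aux, lo, mid, hi);
        }

        private static void insertionSort(int[] a, int lo, int hi) {
            for (int j = lo + 1; j <= hi; j++) {
                int key = a[j];
                int i = j - 1;
                while (i >= lo && a[i] > key) {
                    a[i + 1] = a[i];
                    i--;
                }
                a[i + 1] = key;
            }
        }

        private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
            System.arraycopy(a, lo, aux, lo, hi - lo + 1);
            int i = lo, j = mid + 1;
            for (int k = lo; k <= hi; k++) {
                if (i > mid)              a[k] = aux[j++];
                else if (j > hi)          a[k] = aux[i++];
                else if (aux[j] < aux[i]) a[k] = aux[j++];
                else                      a[k] = aux[i++]; // ties take left: stable
            }
        }

        public static void main(String[] args) {
            int[] a = {9, 3, 7, 1, 8, 2, 5, 4, 6, 0};
            sort(a);
            System.out.println(Arrays.toString(a));
        }
    }

(On the stability question: this merge is stable because ties are taken from the left half; Timsort is stable too, which is why Java uses it for sorting object arrays.)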


Asad, I already knew about mergesort, quicksort, etc. I posted this question because it is somewhat tricky and confusing (think of it as an interview question). At first glance, anyone might think binary search would reduce the complexity to something less than O(n^2). However, to make room for each new element in the array, we would still need up to n moves, which defeats the purpose of the binary search. I was not talking about reinventing an algorithm, and in any case no comparison-based sort can have an average-case running time better than O(n log n). So reinventing was not what I asked about.
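
For completeness, that last claim is the standard decision-tree lower bound (my addition, not part of the original thread): a comparison sort must distinguish all n! input orderings, so its decision tree has at least n! leaves and therefore height at least lg(n!). Since n! >= (n/2)^(n/2), a one-line sketch is

$$h \;\ge\; \lg(n!) \;\ge\; \lg\!\left(\left(\tfrac{n}{2}\right)^{n/2}\right) = \tfrac{n}{2}\lg\tfrac{n}{2} = \Omega(n \lg n).$$

This bound applies only to comparison sorts; non-comparison sorts such as counting sort or radix sort can beat it under extra assumptions about the keys.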