Since we are dividing the subarray in a recursive manner, I think the time complexity of the algorithm should be O(n log n).
For example, take an array of size 1000 and assume we divide it into subarrays of size 5; the number of subarrays at the first level will be 1000/5 = 200. After finding the medians of those subarrays, which takes O(1) per subarray, we recursively find the medians of the medians: the new number of subarrays will be 200/5 = 40, and we find those medians as well, until the number of subarrays is less than 5. The total runtime will be something like this:
T(1000) = 200*O(1) + 40*O(1) + 8*O(1) + 2*O(1)
and for size n:
T(n) = n/5*O(1) + n/25*O(1) + n/125*O(1) + n/625*O(1) + ... = O(n) + O(n) + O(n) + O(n) + ...
The number of O(n) terms will be ceil(log5(n)). Thus:
T(n) = ceil(log5(n)) * O(n) = O(n log n)
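To sanity-check the per-level counts, here is a minimal sketch of my own (assuming subarrays of size 5 throughout) that reproduces the 200, 40, 8, 2 used in T(1000) above:

```python
import math

# Count the subarrays at each level, assuming groups of size 5.
n = 1000
level = 0
while n >= 5:
    subarrays = math.ceil(n / 5)
    print(f"level {level}: {subarrays} subarrays")
    n = subarrays  # the medians of this level are the next level's input
    level += 1
# prints 200, 40, 8, 2
```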
Why is this wrong?
You forget that n/5 * O(1) is O(n/5), not O(n). The per-level costs form a geometric series, so

T(n) = O(n/5) + O(n/25) + O(n/125) + ... = O(n * (1/5 + 1/25 + 1/125 + ...)) = O(n/4) = O(n).

Your bound T(n) = O(n log n) is not wrong, it is just not tight: by replacing each n/5^k * O(1) with O(n) you throw away the geometric decay. The lesson is that an identity like O(f) + O(f) = O(f), which holds for any constant number of summands, with n terms does not necessarily hold, so a sum with a growing number of terms has to be evaluated directly.
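To see the geometric decay concretely, here is a minimal sketch (my own check, assuming a constant cost c = 1 per 5-element median) that sums the exact per-level work and compares it with n:

```python
import math

def total_work(n, group=5, c=1):
    """Sum the exact cost of every level; each level finds the median
    of ceil(n/group) subarrays at an assumed constant cost c each."""
    work = 0
    while n >= group:
        subarrays = math.ceil(n / group)
        work += c * subarrays  # this level's cost shrinks geometrically
        n = subarrays          # recurse on the medians only
    return work

for n in (10**3, 10**6, 10**9):
    print(n, total_work(n), total_work(n) / n)
```

The ratio total_work(n) / n settles near 1/4 for every n, matching the geometric-series bound O(n/4) = O(n) rather than O(n log n).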