I am currently learning Java's Arrays.binarySearch and found this in the Oracle docs:
Arrays.binarySearch(int[] a, int fromIndex, int toIndex, int key)
Returns:
index of the search key, if it is contained in the array within the specified range; otherwise, (-(insertion point) - 1)
When the key is not found, why does it return -(insertion point) - 1? Why not just return -1 or -(insertion point)? What is the idea behind -(insertion point) - 1?
Because it provides extra information that would not otherwise be available: the return value encodes the insertion point, the index at which the key would need to be inserted to keep the array sorted. Negating the insertion point is what communicates that information. The problem is that the insertion point may be 0, and -0 == 0, so with a plain -(insertion point) scheme you could not tell the difference between a match at index 0 and a failure with insertion point 0. Subtracting 1 makes that distinction possible: every "not found" result is strictly negative, and you can recover the insertion point as -(result) - 1.
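A short sketch illustrating the encoding (the array values here are just an arbitrary example):

```java
import java.util.Arrays;

public class BinarySearchDemo {
    public static void main(String[] args) {
        int[] a = {10, 20, 30, 40};

        // Key present: returns its index.
        System.out.println(Arrays.binarySearch(a, 30));  // 2

        // Key absent, insertion point 1: returns -(1) - 1 = -2.
        System.out.println(Arrays.binarySearch(a, 15));  // -2

        // Key absent, insertion point 0: returns -(0) - 1 = -1.
        // With a plain -(insertion point) scheme this would be 0,
        // indistinguishable from a match at index 0.
        System.out.println(Arrays.binarySearch(a, 5));   // -1

        // Recovering the insertion point from a negative result:
        int result = Arrays.binarySearch(a, 15);
        if (result < 0) {
            int insertionPoint = -result - 1;            // 1
            System.out.println("insert at " + insertionPoint);
        }
    }
}
```

So a single int carries both answers: a non-negative value means "found at this index", and a negative value means "not found, but here is where it belongs".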