When reading the documentation for the Rust crate Voracious Sort, I noticed how much faster it is than the comparison-based algorithm (pattern-defeating quicksort) provided by the Rust standard library, with both stable and in-place variants available.
Why aren't we using radix sort for everything? Of course, radix sorts can only be applied directly to integral types.
However, most other types of data can be (more or less easily) transformed into integral types (common examples are IEEE754 floats or strings). My question thus is:
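To make the float example concrete, here is a sketch of the classic order-preserving bit trick for IEEE 754 floats (the function name is mine, not from any crate): positive floats get their sign bit set so they sort above negatives, and negative floats have all bits flipped so their order reverses correctly.

```rust
// Order-preserving map from f32 to u32.
// After this mapping, unsigned integer order matches total float order
// (ignoring NaN, which has no defined order anyway).
fn f32_to_radix_key(x: f32) -> u32 {
    let bits = x.to_bits();
    if bits & 0x8000_0000 != 0 {
        !bits               // negative: flip all bits to reverse their order
    } else {
        bits | 0x8000_0000  // positive: set sign bit to sort above negatives
    }
}

fn main() {
    let mut vals = [3.5f32, -1.0, 0.0, -7.25, 2.0];
    // Sorting by the integer key gives the same order as sorting the floats.
    vals.sort_by_key(|&x| f32_to_radix_key(x));
    println!("{:?}", vals); // [-7.25, -1.0, 0.0, 2.0, 3.5]
}
```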
What types of keys must (or are significantly faster to) be sorted with comparisons, and why?
Or, even more generally:
Is it possible for any (deterministic?) comparator of keys to be transformed into a (finite?) radix-sortable key mapping function?
It seems like it should be a simple proof, but I can't quite put it together.
I do understand that other algorithms are better for some distributions of input data. I'm asking about types of data, not their distributions.
You can actually use Radix sort for pretty much any kind of keys.
The problem is how to write the interface for a generic sorting function.
To use a generic comparison-based sorting function, you just need to provide a comparator that can tell you whether one key is greater than or less than another.
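For contrast, the comparison-based interface really is this small; all the sort needs is a closure returning an `Ordering`:

```rust
fn main() {
    let mut words = vec!["pear", "apple", "fig"];
    // The entire "key description" is one comparator closure.
    words.sort_by(|a, b| a.len().cmp(&b.len()));
    println!("{:?}", words); // ["fig", "pear", "apple"]
}
```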
To use a generic radix-sort function, you would need to implement a view of the keys that presents them to the sort function as a sequence of digits. For all but the simplest kinds of keys that can be pretty difficult, and not very efficient. It's not the kind of thing you want to do just to call a sort function, so there's not enough demand for a generic radix sort to have one included in standard libraries.
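To illustrate what such a digit-view interface might look like, here is a hypothetical sketch (trait and function names are mine, not from any real crate): the key type must declare how many digit positions it has and expose each one, and the sort then makes one stable counting pass per digit, least significant first.

```rust
// Hypothetical interface a generic radix sort would need from its keys.
trait RadixKey {
    /// Number of digit positions (position 0 is most significant).
    const DIGITS: usize;
    /// The digit at position `i`, as a byte.
    fn digit(&self, i: usize) -> u8;
}

impl RadixKey for u32 {
    const DIGITS: usize = 4;
    fn digit(&self, i: usize) -> u8 {
        (*self >> (8 * (3 - i))) as u8
    }
}

// LSD radix sort: one stable bucket pass per digit, least significant first.
fn radix_sort<T: RadixKey + Copy>(data: &mut Vec<T>) {
    for pos in (0..T::DIGITS).rev() {
        let mut buckets: Vec<Vec<T>> = (0..256).map(|_| Vec::new()).collect();
        for &x in data.iter() {
            buckets[x.digit(pos) as usize].push(x); // stable within a bucket
        }
        data.clear();
        for b in buckets {
            data.extend(b);
        }
    }
}

fn main() {
    let mut v = vec![170u32, 45, 75, 90, 802, 24, 2, 66];
    radix_sort(&mut v);
    println!("{:?}", v); // [2, 24, 45, 66, 75, 90, 170, 802]
}
```

Even in this toy version, the caller has to describe the whole shape of the key up front, which is exactly the burden a one-line comparator avoids.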