I have a sequence of 1760 integers, ranging from 129 to 250, and there is no discernible pattern to these integers. I'm working on a very small embedded system, and can't afford to waste almost 2 KB on a lookup table. I'd like to come up with a function that allows me to look up a value given an index (in the 0 to 1759 range).
I know that minimal perfect hashing would allow me to map distinct values onto a set of consecutive integers, but I'm looking to map a set of consecutive integers onto non-distinct values.
Is brute force over millions of years the only way to do this? Is there some approach that would allow for a much smaller lookup table (say, around 256 bytes or less)?
What process generates your 1760 integers? Unfortunately, without knowing more about the source of your data, finding such a function will be difficult (as you say, "millions of years"), if one exists at all. Claude Shannon showed that truly random data is already at maximum information entropy and therefore cannot be compressed. So if there really is no discernible pattern to your integers, they are effectively random noise for compression purposes.
Going back to the lookup table, you can shave 1/8 off the size of your table by recognizing that your integers all lie in the range 129-250: subtract the offset 129 and they fall in the range 0-121, which fits in 7 bits instead of 8. With some bit-manipulation tricks in the table lookup you will only require 1760 * 7/8 = 1540 bytes, a 12.5% savings. It's not a lot, but it's a start; here's some sample code to illustrate what I mean.
Sample code
Output
I'm just printing the raw value and the packed-then-unpacked value at each index to verify that storage and retrieval round-trip correctly.
There's some edge-case work to be done if your input data length is not a multiple of 8, but this should get you started.