Using the RandomForest algorithm for regression, I found on the internet that after predicting, they normalize the predicted results. That is, supposing the result is pred:
pred = pred * (np.exp(-pred/100) * 2 + 1)
Do you have any idea why the results of a prediction would be normalized, why this particular formula is used, and what kinds of normalization of the predicted results can be done?
In short: don't do such a random thing.
More detailed answer: there is no reason to do any kind of "fixed equation" post-processing. The only reason to apply f(x) to your predictions is if you applied f^-1(x) before training. In other words, if you somehow transformed your targets prior to training, you need to apply the inverse transformation to the predictions to get back to the original space.
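For illustration, here is a minimal sketch of that pattern (the data and variable names are made up, not from your source): log-transform a skewed positive target before fitting, then apply the matching inverse on the way out, and nothing else.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    # Toy data just for the example: 3 features, skewed positive target
    rng = np.random.default_rng(0)
    X_train = rng.uniform(0, 10, size=(200, 3))
    y_train = np.exp(X_train.sum(axis=1) / 5)
    X_test = rng.uniform(0, 10, size=(20, 3))

    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(X_train, np.log1p(y_train))     # transform the target before training
    pred = np.expm1(model.predict(X_test))    # undo that exact transform afterwards

scikit-learn's TransformedTargetRegressor wraps this same pattern (func/inverse_func) so you can't forget one side of it.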
Just to show how useless the provided equation is, consider a regression problem with negative outputs, for example between -10000 and 0. Let's say your model is not perfect and predicts -9900 instead of -10000; according to this "rule" you get (-9900)*(np.exp(-(-9900)/100)*2+1), which is something along the lines of -200,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 (-2e47).
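You can check that arithmetic directly:

    import numpy as np

    pred = -9900
    print(pred * (np.exp(-pred / 100) * 2 + 1))   # ~ -1.96e+47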