I am writing code where random numbers are sampled from a uniform distribution whose bounds vary for certain iterations within a for loop, e.g.,
#include <random>

std::mt19937 generator{0};
for (int i = 0; i < n; ++i)
{
    if (/* conditions are met */)
    {
        // low and hi bounds change for each iteration
        std::uniform_int_distribution<int> U(low, hi);
        auto sample = U(generator);
    }
}
This is the way I currently have the code written, but it creates and destroys a temporary std::uniform_int_distribution<int> object for each iteration where the conditions are met. Is this an expensive process? Could compiler optimizations construct the object outside of the for loop and instead reset the bounds of the uniform distribution within the if statement? I'm not sure that would necessarily be faster.
Are there other approaches that might be better?
std::uniform_int_distribution is usually not expensive to create; it contains only the min/max pair of values as its members. At least that is the case with libstdc++, libc++ and VC++. A good optimizing compiler should be able to eliminate any trace of it entirely.

However, that's not guaranteed, and furthermore a distribution is allowed to be stateful, though that's more often the case with normal_distribution than with uniform_int_distribution.

Technically it's possible to re-use a distribution and re-initialize its parameters in a loop.
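For example, a minimal sketch using the distribution's param_type (low, hi, n and the condition are placeholders taken from the question):

#include <random>

std::mt19937 generator{0};
std::uniform_int_distribution<int> U;  // constructed once, outside the loop
using param_t = std::uniform_int_distribution<int>::param_type;

for (int i = 0; i < n; ++i)
{
    if (/* conditions are met */)
    {
        // re-initialize the bounds for this iteration instead of constructing a new distribution
        U.param(param_t{low, hi});
        auto sample = U(generator);

        // alternatively, pass the parameters per call without storing them:
        // auto sample = U(generator, param_t{low, hi});
    }
}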
The generated code will probably be identical (or slightly worse - check to be sure), and it doesn't solve the issue with normal_distribution, as changing the params resets the state.