Say I have a std::vector with N elements. I would like to copy every n-th element of it to a new vector, or average each block of n elements and copy the result (i.e. downsample the original vector). So I want to do this:
std::vector<double> vec(N);
long n = 4;
std::vector<double> ds(N/n);
for(long i = 0; i < static_cast<long>(ds.size()); i++)
{
    ds[i] = vec[i*n];
}
or
for(long i = 0; i < static_cast<long>(ds.size()); i++)
{
    double tmp = 0;
    for(long j = 0; j < n; j++)
    {
        tmp += vec[i*n+j];
    }
    ds[i] = tmp/static_cast<double>(n);
}
Is there a way to do this using the standard algorithms of C++? Like using std::copy with binary functions? I have billions of elements that I want to treat this way, and I want this to be as fast as possible.
PS: I would prefer not to use external libraries such as boost.
For readability, the loop would be a good idea, as pointed out by Vlad in the comments. But if you really want to do something like this, you could try:
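A minimal sketch of the trick, assuming vec and n as defined in the question (every_nth is just an illustrative name): abuse copy_if() with a counting lambda that lets only every n-th element through.

#include <algorithm>
#include <iterator>
#include <vector>

// Sketch: keep every n-th element by (ab)using copy_if with a stateful
// predicate. Assumes vec and n as in the question; every_nth is a
// made-up name for illustration.
std::vector<double> every_nth(const std::vector<double>& vec, long n)
{
    std::vector<double> ds;
    ds.reserve(vec.size() / n);
    long i = 0;
    std::copy_if(vec.begin(), vec.end(), std::back_inserter(ds),
                 [&i, n](double) { return i++ % n == 0; });
    return ds;
}

The index is captured by reference because the standard allows copy_if() to copy its predicate; this also means the trick is only safe with the sequential (non-parallel) overload, which applies the predicate in order.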
If you want to average, it gets worse, as you'd have to use similar tricks combining transform() with copy_if().
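As a rough sketch of that kind of approach (my own approximation, using generate() plus accumulate() rather than literally transform() with copy_if(); downsample_avg is an illustrative name, and vec.size() is assumed to be a multiple of n, as in the question's code):

#include <algorithm>
#include <numeric>
#include <vector>

// Sketch: block-average downsampling with standard algorithms.
std::vector<double> downsample_avg(const std::vector<double>& vec, long n)
{
    std::vector<double> ds(vec.size() / n);
    long block = 0;
    std::generate(ds.begin(), ds.end(), [&] {
        auto first = vec.begin() + block++ * n;  // start of the current block
        return std::accumulate(first, first + n, 0.0) / static_cast<double>(n);
    });
    return ds;
}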
Edit: If you're looking for performance, you'd better stick to the loop, as stressed in the comments by davidhigh: it will avoid the overhead of the call to the lambda function for each element.
If you're looking for an algorithm because you're doing this very often, you'd better write your own generic one.
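For illustration, such a generic algorithm might look like this (a sketch under the assumption that incomplete tail blocks are dropped, matching the integer division N/n in the question; downsample_mean is a made-up name):

#include <iterator>
#include <numeric>

// Hypothetical generic downsampler: averages consecutive blocks of n
// elements and writes one value per complete block. Works for forward
// iterators; returns the output iterator past the last element written.
template <typename InIt, typename OutIt>
OutIt downsample_mean(InIt first, InIt last, OutIt out,
                      typename std::iterator_traits<InIt>::difference_type n)
{
    while (std::distance(first, last) >= n) {
        InIt next = std::next(first, n);
        *out++ = std::accumulate(first, next, 0.0) / static_cast<double>(n);
        first = next;
    }
    return out;
}

You'd call it e.g. as downsample_mean(vec.begin(), vec.end(), ds.begin(), n) with ds sized to vec.size() / n, or with a std::back_inserter.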