Define the displacement of a sequence to be the difference between the maximum and minimum elements.
Given a sequence of integers, find the maximum displacement over all contiguous subsequences of length m.
For example, if our sequence is [1, 5, 7, 0, 2, -4] and m = 3,
- [1, 5, 7] has displacement 6.
- [5, 7, 0] has displacement 7.
- [7, 0, 2] has displacement 7.
- [0, 2, -4] has displacement 6.
- So the maximum displacement is 7.
If we let n denote the length of the input sequence, then my solution below runs in O(n log m) time. Is there any way to do better? I feel like there must be a linear-time algorithm I'm missing. For the purposes of this question, all I care about is the asymptotic time complexity.
#include <vector>
#include <set>
#include <iostream>
int find_max_displacement(const std::vector<int>& seq, int m){
    std::multiset<int> subseq;
    // insert the m items of the first subsequence into the tree
    for (int i = 0; i < m; i++){
        subseq.insert(seq[i]);
    }
    int max_disp = *subseq.rbegin() - *subseq.begin(); // max minus min
    for (std::size_t i = 0; i + m < seq.size(); i++){
        subseq.erase(subseq.find(seq[i])); // kick oldest element out of subsequence
        subseq.insert(seq[i + m]);         // insert new element into subsequence
        int new_disp = *subseq.rbegin() - *subseq.begin();
        if (new_disp > max_disp){
            max_disp = new_disp;
        }
    }
    return max_disp;
}
int main(){
std::vector<int> arr {1, 5, 7, 0, 2, -4};
int max_disp = find_max_displacement(arr, 3);
std::cout << max_disp << std::endl;
return 0;
}
You are right: there is a linear-time algorithm for this. You can compute an array with the sliding maximum and an array with the sliding minimum, then take the largest elementwise difference between the two arrays.
Computing a sliding maximum in linear time is a standard problem; there is a good explanation of different techniques here. In case the link breaks, here is the description of the linear-time algorithm from that link: