Given an integer array nums and an integer k, return the maximum sum of a non-empty subsequence of that array such that for every two consecutive integers in the subsequence, nums[i] and nums[j], where i < j, the condition j - i <= k is satisfied.
A subsequence of an array is obtained by deleting some number of elements (can be zero) from the array, leaving the remaining elements in their original order.
Example 1:
Input: nums = [10,2,-10,5,20], k = 2
Output: 37
Explanation: The subsequence is [10, 2, 5, 20].

Example 2:
Input: nums = [-1,-2,-3], k = 1
Output: -1
Explanation: The subsequence must be non-empty, so we choose the largest number.

Example 3:
Input: nums = [10,-2,-10,-5,20], k = 2
Output: 23
Explanation: The subsequence is [10, -2, -5, 20].
Constraints:
1 <= k <= nums.length <= 10^5
-10^4 <= nums[i] <= 10^4
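
The solution below computes dp[i], the maximum sum of a valid subsequence ending at index i, via the recurrence dp[i] = nums[i] + max(0, max(dp[i-k..i])), and uses a monotonic deque so each window maximum is available in amortized O(1). For reference, here is a minimal O(n*k) sketch of the same recurrence (the helper name constrained_subset_sum_naive is only for illustration; it would be too slow for n up to 10^5):

fn constrained_subset_sum_naive(nums: &[i32], k: usize) -> i32 {
    let n = nums.len();
    // dp[i] = maximum sum of a valid subsequence ending at index i.
    let mut dp = vec![0i32; n];
    let mut best = i32::MIN;
    for i in 0..n {
        // Best dp value among the previous k indices, floored at 0
        // (a negative prefix is never worth extending).
        let mut prev_best = 0;
        for j in i.saturating_sub(k)..i {
            prev_best = prev_best.max(dp[j]);
        }
        dp[i] = nums[i] + prev_best;
        best = best.max(dp[i]);
    }
    best
}
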
use std::collections::VecDeque;

impl Solution {
    pub fn constrained_subset_sum(nums: Vec<i32>, k: i32) -> i32 {
        let k = k as usize;
        // Monotonic deque of (index, dp value), with dp values decreasing from front to back,
        // where dp[i] is the best sum of a valid subsequence ending at index i.
        let mut deque: VecDeque<(usize, i32)> = VecDeque::new();
        // The subsequence must be non-empty, so the answer is at least the largest element.
        let mut ret = *nums.iter().max().unwrap();
        for i in 0..nums.len() {
            // Drop the front entry if its index has fallen out of the window [i - k, i - 1].
            if deque.front().map_or(false, |&(j, _)| i - j > k) {
                deque.pop_front();
            }
            // dp[i] = nums[i] + max(0, best dp value within the window).
            // Only positive dp values are stored, so an empty deque means "start fresh at i".
            let x = nums[i] + deque.front().map_or(0, |&(_, v)| v);
            if x > 0 {
                // Keep the deque decreasing: entries with value <= x can never be the maximum again.
                while deque.back().map_or(false, |&(_, v)| v <= x) {
                    deque.pop_back();
                }
                deque.push_back((i, x));
                ret = ret.max(x);
            }
        }
        ret
    }
}
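
A quick sanity check against the three examples above, assuming a struct Solution; declaration is in scope as in LeetCode's Rust template:

#[cfg(test)]
mod tests {
    use super::Solution;

    #[test]
    fn matches_the_worked_examples() {
        // Expected outputs taken from the examples above.
        assert_eq!(Solution::constrained_subset_sum(vec![10, 2, -10, 5, 20], 2), 37);
        assert_eq!(Solution::constrained_subset_sum(vec![-1, -2, -3], 1), -1);
        assert_eq!(Solution::constrained_subset_sum(vec![10, -2, -10, -5, 20], 2), 23);
    }
}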