Why do we take the minimum of the current jump count dp[end] and dp[start] + 1?
In code, it's dp[end] = Math.min(dp[end], dp[start] + 1);
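For context, here is a minimal sketch of the DP loop I assume that line comes from (the class name MinJumps, the method name, and the test array are my own for illustration; jumps is the input array and dp[i] is the fewest jumps needed to reach index i from index 0):

import java.util.Arrays;

class MinJumps {
    // dp[i] = fewest jumps needed to reach index i starting from index 0
    static int minJumps(int[] jumps) {
        int n = jumps.length;
        int[] dp = new int[n];
        Arrays.fill(dp, Integer.MAX_VALUE);
        dp[0] = 0; // already standing at index 0, no jumps needed

        for (int start = 0; start < n; start++) {
            if (dp[start] == Integer.MAX_VALUE) continue; // start is unreachable
            // every index from start + 1 to start + jumps[start] is one jump away
            for (int end = start + 1; end <= start + jumps[start] && end < n; end++) {
                dp[end] = Math.min(dp[end], dp[start] + 1); // the line in question
            }
        }
        return dp[n - 1];
    }

    public static void main(String[] args) {
        System.out.println(minJumps(new int[]{2, 3, 1, 1, 4})); // prints 2
    }
}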
I understand dp[end] = dp[start] + 1 for every end in the range start + 1 <= end <= start + jumps[start].
But I don't understand why we compare dp[end] itself with dp[start] + 1. According to dp[end] = dp[start] + 1, the old dp[end] should always be greater than dp[start] + 1 before the update, so the Math.min looks redundant to me.
Did I miss anything?