
Time Complexity is O(N^2 * N!) not O(N * N!)

In every top-level iteration we increase the length of every permutation. The size of every permutation will be 1 in the first iteration, then 2 in the second iteration, and so on up to N. The time complexity of inserting into a list that grows every iteration is O(N^2), not O(N): O(1 + 2 + 3 + … + N) = O(N(N+1)/2) = O(N^2)


Hi Educative team,
Could you please update on this finding ^ by Ory_Band?
Thank you

Hi @Ory_Band, I think it is O(N) instead. For example: input = [1, 2, 3] --> N = 3.
At the end we will have 6 permutations, which equals 3! <=> N!. So the total number of steps is 1! + 2! + 3! = 9, and in each step we copy the old list to a new list, which takes at most N steps. The result is O((sum of the factorials of the first N numbers) * N) instead.

Here is how I understand the current algorithm’s runtime complexity:

  • Iterating over all elements in the input array takes O(n);

  • In each iteration we can have up to n! permutations in the queue (as @Nhat_Pham mentioned, the number of permutations in the queue grows at each iteration: 1!, then 2!, then 3!, up to n!; since I’m not sure how to express this with a formula, let’s just assume it is n! in the worst case for each iteration). So iterating through all of them takes O(n!) in the worst case.

  • Within each permutation we have to iterate through all of its indices (which takes O(n)), insert the current array element at that index position, and then copy the result and add it to the queue (copy + add takes O(n + n) -> O(n)). Overall this step takes O(n * n): iterating through each index of the current permutation and creating a new copy with the new element in it.

So the overall running time complexity should be something like O(n^3 * n!).
What am I missing here?


The key point here is how many times each element in the permutations, including the intermediate ones, gets copied. For example, take the set [1, 5, 3].

In the first step there is only one permutation, [1], so the number of elements copied is 1 * 1!.
In the 2nd step there are two permutations, [1, 5] and [5, 1], so the number of elements copied is 2 * 2!.
. . .
In the nth step there are n! permutations, and the number of elements copied is n * n!.

Therefore, the time complexity equals the total number of element copies, which is
1 * 1! + 2 * 2! + … + n * n! = (n+1)! - 1 = n * n! + n! - 1 = O(n * n!)
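A quick way to convince yourself of that identity is to check it numerically. Here is a small Python sketch (a throwaway check of mine, not part of the course material) comparing the sum against (n+1)! - 1:

```python
from math import factorial

# Check that 1*1! + 2*2! + ... + n*n! == (n+1)! - 1 for small n.
for n in range(1, 11):
    lhs = sum(k * factorial(k) for k in range(1, n + 1))
    rhs = factorial(n + 1) - 1
    print(f"n={n}: sum={lhs}, (n+1)!-1={rhs}, equal={lhs == rhs}")
```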


The time complexity is O(N * N!) as noted in the solution, but the explanation is not very thorough, so here is a step-by-step explanation for those who are interested and want to get a good understanding of general time complexity calculations for these harder problems.

Solution overview

The solution basically goes through the input number by number, creating the permutations of the numbers processed so far. For each new number, it just has to go through the permutations created so far and insert the new number at every possible position.

The first for loop is going through the levels, each level being the permutations of the numbers processed so far. Therefore:

Level 0 (0 numbers processed) = 0! permutations = 1 = [[]]
Level 1 (1 number processed) = 1! permutations = 1 = [[1]]
Level 2 (2 numbers processed) = 2! permutations = 2 = [[3,1],[1,3]]
Level 3 (3 numbers processed) = 3! permutations = 6 = [[5,3,1],[3,5,1],[3,1,5],[5,1,3],[1,5,3],[1,3,5]]

In its first iteration we are going to create the permutations for level 1. In its second iteration, for level 2, and so on.

The second for loop is going through the permutations created in the previous level.

The third for loop is adding the new number in each possible position for each of the permutations of the previous level (which are iterated through by the second for loop), therefore creating the permutations for the current level.
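For reference, here is a minimal Python sketch of the kind of solution being described: three nested loops, each level built from the previous one. This is my own reconstruction for illustration (the course solution may use a queue and different names), not the exact course code:

```python
def find_permutations(nums):
    permutations = [[]]  # level 0: one empty permutation

    # First for loop: process the input number by number (one level per number).
    for num in nums:
        next_level = []
        # Second for loop: go through the permutations of the previous level.
        for perm in permutations:
            # Third for loop: insert the new number at every possible position,
            # copying the previous permutation each time.
            for i in range(len(perm) + 1):
                next_level.append(perm[:i] + [num] + perm[i:])
        permutations = next_level

    return permutations


print(find_permutations([1, 3, 5]))
# [[5, 3, 1], [3, 5, 1], [3, 1, 5], [5, 1, 3], [1, 5, 3], [1, 3, 5]]
```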

Time complexity

The easiest way to understand the time complexity in this case is to think about how many times the inner-most loop (the third for loop) gets executed.

This is the for loop that is creating each of the permutations for each level, so for each level it executes l! times, where l is the number of the level (i.e., how many numbers we are creating the permutations for). In the last level, l = N, where N is the size of our input.

So, in total, the inner-most for gets executed ∑l! where l ranges from 1 to N.
Therefore, the inner-most for gets executed a total of 1! + 2! + ... + N! times.

Note: if you are thinking this should have been 0! + 1! + 2! + ... + (N-1)! then you are thinking of the second for, which is iterating through the permutations of the previous level. The third for is actually creating the new l! permutations.
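If you want to verify that count empirically, you can drop a throwaway counter into the sketch above (again, my own illustration, not course code) and compare it with 1! + 2! + ... + N!:

```python
from math import factorial

def count_inner_iterations(n):
    """Count how many times the inner-most loop body runs for an input of size n."""
    count = 0
    permutations = [[]]
    for num in range(1, n + 1):
        next_level = []
        for perm in permutations:
            for i in range(len(perm) + 1):
                count += 1
                next_level.append(perm[:i] + [num] + perm[i:])
        permutations = next_level
    return count

for n in range(1, 8):
    expected = sum(factorial(l) for l in range(1, n + 1))
    print(n, count_inner_iterations(n), expected)  # the two counts match
```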

Now, what is the time complexity of one iteration of the inner-most for loop? Its two most expensive operations are:

  1. Copying the previous level’s permutation, of size l-1.
  2. Adding the new number in a given position, which can take from 0 to l-1 operations internally depending on where the element is being added.

Both operations are O(l), so each iteration of the inner-most loop takes O(l + l) = O(l).

Combining both conclusions so far, we have that the overall “time complexity” (with some poetic license) is ∑(l! * O(l)), with l = 1..N, since that is the number of times the O(l) work inside the inner-most for loop gets executed. That translates to:

(1! * 1) + (2! * 2) + ... + (N! * N)
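The same kind of throwaway instrumentation can confirm this total: counting every element written while building the permutations (my own check, not course code) lines up with (1! * 1) + (2! * 2) + ... + (N! * N):

```python
from math import factorial

def count_copied_elements(n):
    """Count every element written while building all permutations of n numbers."""
    copies = 0
    permutations = [[]]
    for num in range(1, n + 1):
        next_level = []
        for perm in permutations:
            for i in range(len(perm) + 1):
                new_perm = perm[:i] + [num] + perm[i:]
                copies += len(new_perm)  # l-1 copied elements plus the inserted one
                next_level.append(new_perm)
        permutations = next_level
    return copies

for n in range(1, 8):
    expected = sum(factorial(l) * l for l in range(1, n + 1))
    print(n, count_copied_elements(n), expected)  # the two counts match
```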

And so the question now is how to simplify this. For that I’ll provide you with two ways. The first one is the formula way, but I prefer the second one because it’s more applicable to other problems.

The formula way

(1! * 1) + (2! * 2) + ... + (N! * N) = (N + 1)! – 1

This page explains how to arrive at this simplification.

Once we have (N + 1)! – 1, we can do what @Hongzhe did above and get that:

[(N + 1)!] – 1 = [(N + 1) * N!] – 1 = [(N * N!) + N!] - 1 = N * N! + N! - 1
which leads to
O(N * N! + N! - 1) = O(N * N!).

The more general way

Assuming we don’t know the formula above or don’t know how to arrive at it, are we doomed? No, there is a more general way that you can apply to other problems when you get to a similar place. It requires some exploration, as follows:

We start with our target: (1! * 1) + (2! * 2) + ... + (N! * N)

Then we ask: can we say that only the last term matters for complexity?

If that sounds like a weird question, let me preface it with an example: if we have a + b + c + ... + x + y + z and can prove that a + b + c + ... + x + y <= z, then the sum of a to z can never be greater than 2*z, since the max value for a + b + c + ... + x + y is z. And in that case the “complexity” would be O(2 * z) = O(z), which is the last term. That means that for that sum, only the last term (z) matters for the complexity.

So let’s explore if that is the case for our problem: (1! * 1) + (2! * 2) + ... + (N! * N)

Can we prove that (1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) <= (N! * N)?

Let’s decompose the right-hand side of the inequality to get terms similar to the left-hand side, in the hope that we can compare the two.

(1! * 1) + (2! * 2) + ... + (N-1)! * (N-1) <= (N-1)! * N * N

Interesting, we are now asking whether N times (N-1)! * N is greater than the left-hand side.

We know that the left-hand side has N-1 terms, and that they increase from left to right. So (N-1)! * (N-1) (the last term) is the largest term on the left-hand side. Replacing every term with that largest one bounds the left-hand side by (N-1) * ((N-1)! * (N-1)), which is in turn smaller than N * ((N-1)! * (N-1)).

What we are saying by this reasoning above is that (1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) < N * ((N-1)! * (N-1)).

And it’s easy to also say that N * ((N-1)! * (N-1)) < (N-1)! * N * N, because if you cancel the common factors on both sides you are left with (N-1) < N.

So given we have that:

(1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) < N * ((N-1)! * (N-1)) < (N-1)! * N * N

then we have that:

(1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) < (N-1)! * N * N.

Which means we just proved that:

(1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) <= (N! * N)

which is the inequality we had before we decomposed the right-hand side.

Therefore, the last term in (1! * 1) + (2! * 2) + ... + (N! * N) is the only one that matters for the complexity, since the sum of all the terms before it can never be greater than it.

So we have that the time complexity is O(N! * N).

If it sounds tricky, try arriving at this on paper and you should see it more clearly. Once you get this reasoning you will have a nice tool for other problems as well.

In summary, the trick is: increase the left-hand side in order to more easily compare it with the right-hand side. If the increased left-hand side is smaller, then surely the left-hand side before increasing is smaller too.

With this reasoning you could even solve it in different ways, such as:

  1. Ask: (1! * 1) + (2! * 2) + ... + ((N-1)! * (N-1)) <= (N! * N)?
  2. Increase the left-hand side to: (1! * N) + (2! * N) + ... + ((N-1)! * N) <= (N! * N)
  3. Simplify to N * (1! + 2! + ... + (N-1)!) <= N! * N
  4. Simplify to 1! + 2! + ... + (N-1)! <= N!
  5. Decompose the right-hand side: 1! + 2! + ... + (N-1)! <= N * (N-1)!
  6. Increase the left-hand side to: (N-1) * (N-1)! <= N * (N-1)!
  7. Simplify to: (N-1) <= N
  8. Since this inequality is true, the inequality we posed in step 1 is also true.
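And if you prefer to double-check the inequality from step 1 numerically before trusting the algebra, a tiny Python loop (again just a sanity check I wrote) does it:

```python
from math import factorial

# Check that 1!*1 + 2!*2 + ... + (N-1)!*(N-1) <= N!*N for a range of N.
for n in range(2, 15):
    lhs = sum(factorial(l) * l for l in range(1, n))
    rhs = factorial(n) * n
    print(f"N={n}: {lhs} <= {rhs} -> {lhs <= rhs}")
```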

And don’t worry, this is a harder time complexity calculation than usual. If you get this one, consider yourself Advanced.

Good luck on your studies!
