#### [SOLVED] Determining complexity for recursive functions (Big O notation)

I have a Computer Science midterm tomorrow and I need help determining the complexity of these recursive functions. I know how to solve simple cases, but I am still trying to learn how to solve these harder ones. These are just a few of the example problems that I could not figure out. Any help would be much appreciated and would greatly help my studies. Thank you!

``````
int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n-1);
}

int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n-5);
}

int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n/5);
}

void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n", m, o);
    }
    else
    {
        recursiveFun4(n-1, m+1, o);
        recursiveFun4(n-1, m, o+1);
    }
}

int recursiveFun5(int n)
{
    int i;
    for (i = 0; i < n; i += 2) {
        // do something
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n-5);
}
``````

#### @higlu 2020-04-18 22:18:38

The key here is to visualise the call tree. Once that is done, the complexity is:

``````nodes of the call tree * complexity of other code in the function
``````

the latter term can be computed the same way we do for a normal iterative function.

Instead, the total number of nodes of a complete tree is computed as

``````
# of nodes = (C^L - 1) / (C - 1)   when C > 1
# of nodes = L                     when C = 1
``````

where C is the number of children of each node and L is the number of levels of the tree (root included).

It is easy to visualise the tree: start from the first call (the root node), then draw as many children as there are recursive calls in the function. It is also useful to write the parameter passed to each sub-call as the "value" of its node.

So, in the examples above:

1. the call tree here is C = 1, L = n+1. Complexity of the rest of function is O(1). Therefore total complexity is L * O(1) = (n+1) * O(1) = O(n)
``````n     level 1
n-1   level 2
n-2   level 3
n-3   level 4
... ~ n+1 levels (n down to 0) -> L = n+1
``````
1. call tree here is C = 1, L = n/5. Complexity of the rest of function is O(1). Therefore total complexity is L * O(1) = (n/5) * O(1) = O(n)
``````n
n-5
n-10
n-15
... ~ n/5 levels -> L = n/5
``````
1. call tree here is C = 1, L = log(n). Complexity of the rest of function is O(1). Therefore total complexity is L * O(1) = log5(n) * O(1) = O(log(n))
``````n
n/5
n/5^2
n/5^3
... ~ log5(n) levels -> L = log5(n)
``````
1. call tree here is C = 2, L = n. Complexity of the rest of function is O(1). This time we use the full formula for the number of nodes in the call tree because C > 1. Therefore total complexity is (C^L-1)/(C-1) * O(1) = (2^n - 1) * O(1) = O(2^n).
``````               n                   level 1
n-1             n-1          level 2
n-2     n-2     n-2     n-2      ...
n-3 n-3 n-3 n-3 n-3 n-3 n-3 n-3    ...
...                ~ n levels -> L = n
``````
1. call tree here is C = 1, L = n/5. Complexity of the rest of the function is O(n). Therefore total complexity is L * O(n) = (n/5) * O(n) = O(n^2)
``````n
n-5
n-10
n-15
... ~ n/5 levels -> L = n/5
``````

We can prove it mathematically, which is something that was missing from the answers above.

It can dramatically help you understand how to calculate any method. I recommend reading it from top to bottom to fully understand how to do it:

1. `T(n) = T(n-1) + 1`. This means that the time the method takes to finish equals the time of the same method with `n-1`, which is `T(n-1)`, plus `1` for the remaining constant-time work. Now let's expand `T(n-1)`: `T(n-1) = T(n-2) + 1`. Substituting the right side of `T(n-1) = ...` into `T(n) = ...` gives `T(n) = T(n-2) + 2`, and a pattern emerges: `T(n) = T(n-k) + k`. The next step is to set `n-k = 1`, because at the end of the recursion the call takes exactly O(1) when `n <= 0`. From this simple equation we know that `k = n - 1`. Placing `k` in `T(n) = T(n-k) + k` gives `T(n) = 1 + n - 1`, which is exactly `n`, i.e. `O(n)`.
2. Is the same as 1. You can test it yourself and see that you get `O(n)`.
3. `T(n) = T(n/5) + 1`. As before, the time for this method to finish equals the time of the same method with `n/5`, hence the `T(n/5)` term. Expanding as in 1: `T(n/5) = T(n/5^2) + 1`, and substituting repeatedly gives `T(n) = T(n/5^k) + k`. Again we set `n/5^k = 1`, i.e. `n = 5^k`, which asks: what power of 5 gives us n? The answer is `k = log5(n)` (log base 5). Placing our findings in `T(n) = T(n/5^k) + k` gives `T(n) = 1 + log5(n)`, which is `O(log n)`.
4. `T(n) = 2T(n-1) + 1`. What we have here is basically the same as before, but this time the method invokes itself twice, so we multiply by 2. Let's find `T(n-1) = 2T(n-2) + 1`. Substituting: `T(n) = 2(2T(n-2) + 1) + 1`, which is `T(n) = 2^2T(n-2) + 2 + 1`, and in general `T(n) = 2^kT(n-k) + 2^k - 1` (each expansion doubles the accumulated constants and adds 1, giving the geometric sum `2^(k-1) + ... + 2 + 1 = 2^k - 1`). Let's find `k` by claiming `n-k = 1`, which gives `k = n - 1`. Placing `k`: `T(n) = 2^(n-1)T(1) + 2^(n-1) - 1`, which is `O(2^n)`.
5. `T(n) = T(n-5) + n + 1`. It's almost the same as 2, but now we add `n` because of the `for` loop. Let's find `T(n-5) = T(n-10) + (n-5) + 1`. Substituting: `T(n) = T(n-2*5) + (n-5) + n + 2`. Each expansion adds another term of at most `n`, so after `k` steps `T(n) <= T(n-5k) + kn + k`. As before, `n-5k = 1` gives `k = (n-1)/5`, which is roughly `k = n/5`. This gives `T(n) <= T(0) + n^2/5 + n/5`, which is `O(n^2)`.

I now recommend reading the rest of the answers, which will now give you a better perspective. Good luck winning those big O's :)

#### @nhahtdh 2012-11-20 07:55:00

For the case where `n <= 0`, `T(n) = O(1)`. Therefore, the time complexity is determined by the case `n > 0`.

We will consider the case `n > 0` in the part below.

1.

``````T(n) = a + T(n - 1)
``````

where a is some constant.

By induction:

``````T(n) = n * a + T(0) = n * a + b = O(n)
``````

where a, b are some constants.

2.

``````T(n) = a + T(n - 5)
``````

where a is some constant

By induction:

``````T(n) = ceil(n / 5) * a + T(k) = ceil(n / 5) * a + b = O(n)
``````

where a, b are some constants and k <= 0

3.

``````T(n) = a + T(n / 5)
``````

where a is some constant

By induction:

``````T(n) = a * log5(n) + T(0) = a * log5(n) + b = O(log n)
``````

where a, b are some constants

4.

``````T(n) = a + 2 * T(n - 1)
``````

where a is some constant

By induction:

``````T(n) = a + 2a + 4a + ... + 2^(n-1) * a + T(0) * 2^n
= a * 2^n - a + b * 2^n
= (a + b) * 2^n - a
= O(2 ^ n)
``````

where a, b are some constants.

5.

``````T(n) = n / 2 + T(n - 5)
``````

where the `n / 2` term accounts for the `for` loop, which runs about n/2 times per call

Rewrite `n = 5q + r` where q and r are integers and r = 0, 1, 2, 3, 4

``````T(5q + r) = (5q + r) / 2 + T(5 * (q - 1) + r)
``````

We have `q = (n - r) / 5`, and since r < 5, we can consider it a constant, so `q = O(n)`

By induction:

``````T(n) = T(5q + r)
= (5q + r) / 2 + (5 * (q - 1) + r) / 2 + ... + r / 2 +  T(r)
= 5 / 2 * (q + (q - 1) + ... + 1) +  1 / 2 * (q + 1) * r + T(r)
= 5 / 4 * (q + 1) * q + 1 / 2 * (q + 1) * r + T(r)
= 5 / 4 * q^2 + 5 / 4 * q + 1 / 2 * q * r + 1 / 2 * r + T(r)
``````

Since r <= 4, we can find some constant b so that `b >= T(r)`

``````T(n) = T(5q + r)
= 5 / 4 * q^2 + (5 / 4 + 1 / 2 * r) * q + 1 / 2 * r + b
= 5 / 4 * O(n ^ 2) + (5 / 4 + 1 / 2 * r) * O(n) + 1 / 2 * r + b
= O(n ^ 2)
``````

#### @Dimitar Dimitrov 2013-09-03 04:18:07

I recently failed an interview question (and by extension the interview) that had to do with analyzing the time and space complexity of a recursive Fibonacci function. This answer is epic and it helped a lot, I love it, I wish I could upvote you twice. I know it's old, but do you have anything similar for calculating space complexity, maybe a link, anything?

#### @Snowfish 2020-01-03 22:59:41

For No. 4, even though the result is the same, shouldn't the induction be the following? T(n) = a + 2T(n-1) = a + 2a + 4T(n-2) = a + 2a + 4a + 8T(n-3) = ... = a * (2^n - 1) + 2^n * T(0) = a * (2^n - 1) + b * 2^n = (a + b) * 2^n - a = O(2^n)

#### @Shubham 2017-05-16 01:29:31

One of the best ways I find for approximating the complexity of a recursive algorithm is drawing the recursion tree. Once you have the recursion tree:

``````Complexity = length of tree from root node to leaf node * number of leaf nodes
``````
1. The first function will have length of `n` and number of leaf node `1` so complexity will be `n*1 = n`
2. The second function will have a length of `n/5` and again `1` leaf node, so the complexity will be `n/5 * 1 = n/5`, which is approximated as `n`.

3. For the third function, since `n` is being divided by 5 on every recursive call, length of recursive tree will be `log(n)(base 5)`, and number of leaf nodes again 1 so complexity will be `log(n)(base 5) * 1 = log(n)(base 5)`

4. For the fourth function since every node will have two child nodes, the number of leaf nodes will be equal to `(2^n)` and length of the recursive tree will be `n` so complexity will be `(2^n) * n`. But since `n` is insignificant in front of `(2^n)`, it can be ignored and complexity can be only said to be `(2^n)`.

5. For the fifth function, there are two elements introducing complexity: the recursive nature of the function, and the `for` loop in each call. Doing the above calculation, the complexity introduced by the recursion is `~ n` and the complexity due to the `for` loop is `n`. The total complexity will be `n*n`.

Note: This is a quick and dirty way of calculating complexity (nothing official!). Would love to hear feedback on this. Thanks.

#### @Ben Forsrup 2018-01-29 15:16:15

Excellent answer! I have a question on the fourth function: if it had three recursive calls, would the answer be (3^n), or would you still say (2^n)?

#### @Julian A. 2018-02-25 20:55:57

@Shubham: #4 doesn't seem right to me. If the number of leaves is `2^n` then the height of the tree must be `n`, not `log n`. The height would only be `log n` if `n` represented the total number of nodes in the tree. But it doesn't.

#### @Shubham 2018-03-01 00:41:37

@BenForsrup: It will be 3^n, because every node will have three child nodes. The best way to be sure about this is to draw the recursion tree yourself with dummy values.

#### @Fintasys 2019-05-28 14:54:03

#2 should be n-5 not n/5

#### @coder 2012-11-20 06:37:40

The time complexity, in Big O notation, for each function, is in numerical order:

1. The first function is called recursively n times before reaching the base case, so it's `O(n)`, often called linear.
2. The second function is called with n-5 each time, so we deduct five from n before calling the function. It is actually called on the order of n/5 times, and O(n/5) = O(n), so it is also `O(n)`.
3. This function is log(n) base 5: every time, we divide by 5 before calling the function, so it's `O(log(n))` (base 5), often called logarithmic. Big O notation and complexity analysis most often use base 2, but the base only changes the result by a constant factor.
4. In the fourth, it's `O(2^n)`, or exponential, since each function call calls itself twice until it has been recursed n times.
5. As for the last function, the `for` loop runs n/2 times since we're increasing by 2, and the recursion makes about n/5 calls since we decrease n by 5 each time. Since the `for` loop runs in every recursive call, the total work is (n/5) * (n/2) = n^2/10. Due to asymptotic behaviour and the upper bound that big O strives for, we are only interested in the largest term, so it's `O(n^2)`.

Good luck on your midterms ;)

#### @coder 2012-11-20 07:19:41

You're right about the fifth: the n will decrease for the for loop. But for the fourth I don't think it's n^2; it's like a tree, since each time you're calling the recursion twice, so it should be 2^n. Plus, that was your answer in the comment earlier.

#### @nhahtdh 2012-11-20 07:26:37

Yes, the 4th one is 2^n; my deleted comment had a typo. But you should fix your post, since it is saying log(2^n)

#### @coder 2012-11-20 07:30:24

oh, seriously I didn't notice it, thank u, truly I wrote the log by mistake :$

#### @Gwater17 2017-03-04 01:48:26

Can someone explain the last one in more detail? I get that recursiveFun5 is 1/5 * n like problem 2, and I get that the loop is 1/2 * n. Since the loop runs for every recursive call, I thought the answer would be (1/5n ^ 1/2n), and since coefficients are dropped it'd ultimately be n^n.

#### @bjc 2017-09-14 09:08:56

@MJGwater Let the running time of the loop be m. When the recursion runs once, it takes m to execute the loop. When the recursion runs twice, the loop also runs twice, so it takes 2m, and so on. So it's '*', not '^'.

#### @Stanislav Iegorov 2019-02-04 19:51:32

I think the fifth point should be (n-5) * (n/2) = (n^2 - 5n) / 2, which still leads to O(n^2)

#### @Ridhwaan Shakeel 2019-03-14 00:11:43

For #4, is O(2^n) the worst case, considering it makes two recursive calls each time until it has been recursed n times?

#### @coder 2019-03-14 13:42:04

@RidhwaanShakeel No, there is no best case or worst case in the fourth question, since n is not affected by any factor; it is only decremented by one recursively. So each time it enters the function it will call itself twice; if n = 5 then it will make about 2^5 calls. You can try to draw it as a tree so you can visualize it. I hope this makes it clear.

#### @Jack 2019-04-17 17:28:12

@coder The explanation for 5 seems odd. If incrementing by 2 results in `n/2` iterations of the `for` loop, why would decrementing by 5 not result in `n/5` recursive calls? This would still result in `O(n^2)` but seems like a more intuitive explanation. Why mix subtraction and division when they're essentially doing the same thing?

#### @rmutalik 2020-04-10 20:00:58

@coder So for #4, if there were 3 recursive calls in the function definition, it would have a time complexity of O(3^n)? And for 5 recursive calls it would be O(5^n), correct?

#### @Optider 2020-05-03 10:02:55

@Jack Yes, I was also wondering the same. It should be `n/5`, not `n-5`. And ultimately, the whole thing boils down to `O(n^2)`.

#### @mavis 2020-06-01 17:47:41

After all, division is just repeated subtraction.
