2012-11-20 06:28:37 8 Comments

I have a Computer Science midterm tomorrow and I need help determining the complexity of these recursive functions. I know how to solve simple cases, but I am still learning how to solve these harder ones. These were just a few of the example problems that I could not figure out. Any help would be much appreciated and would greatly help in my studies. Thank you!

```
int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n - 1);
}

int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n - 5);
}

int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n / 5);
}

void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n", m, o);
    }
    else
    {
        recursiveFun4(n - 1, m + 1, o);
        recursiveFun4(n - 1, m, o + 1);
    }
}

int recursiveFun5(int n)
{
    int i;
    for (i = 0; i < n; i += 2) {
        // do something
    }
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n - 5);
}
```


## 5 comments

## @higlu 2020-04-18 22:18:38

The key here is to visualise the call tree. Once that is done, the complexity is:

(total number of nodes in the call tree) × (work done in each node)

The latter term can be computed the same way we do for a normal iterative function. The total number of nodes of a complete tree is

(C^L - 1) / (C - 1)

where C is the number of children of each node and L is the number of levels of the tree (root included).

It is easy to visualise the tree. Start from the first call (the root node), then draw as many children as there are recursive calls in the function. It is also useful to write the parameter passed to each sub-call as the "value of the node".

So, in the examples above:

1. O(n)
2. O(n)
3. O(log(n))
4. O(2^n)
5. O(n^2)

## @OhadM 2020-02-22 16:41:18

We can prove it mathematically, which is something I was missing in the above answers. It can dramatically help you understand how to calculate any method. I recommend reading it from top to bottom to fully understand how to do it:

1. `T(n) = T(n-1) + 1`

It means that the time it takes for the method to finish is equal to the same method but with n-1, which is `T(n-1)`, and we now add `+ 1` because it's the time it takes for the general operations to be completed (except `T(n-1)`). Now, we are going to find `T(n-1)` as follows: `T(n-1) = T(n-1-1) + 1`. It looks like we can now form a function that gives us some sort of repetition so we can fully understand. We will place the right side of `T(n-1) = ...` instead of `T(n-1)` inside the method `T(n) = ...`, which gives us: `T(n) = T(n-1-1) + 1 + 1`, which is `T(n) = T(n-2) + 2`; in other words, we need to find our missing `k`: `T(n) = T(n-k) + k`. The next step is to take `n-k` and claim that `n-k = 1`, because at the end of the recursion it will take exactly O(1) when `n <= 0`. From this simple equation we now know that `k = n - 1`. Let's place `k` in our final method `T(n) = T(n-k) + k`, which gives us: `T(n) = 1 + n - 1`, which is exactly `n`, or `O(n)`.

2. `O(n)` (the same unrolling as in 1, subtracting 5 at a time instead of 1).

3. `T(n) = T(n/5) + 1`

As before, the time for this method to finish equals the time for the same method but with `n/5`, which is why it is bounded by `T(n/5)`. Let's find `T(n/5)` like in 1: `T(n/5) = T(n/5/5) + 1`, which is `T(n/5) = T(n/5^2) + 1`. Let's place `T(n/5)` inside `T(n)` for the final calculation: `T(n) = T(n/5^k) + k`. Again as before, `n/5^k = 1`, which is `n = 5^k`, which is exactly asking what power of 5 gives us n; the answer is `log5(n) = k` (log of base 5). Let's place our findings in `T(n) = T(n/5^k) + k` as follows: `T(n) = 1 + log5(n)`, which is `O(log(n))`.

4. `T(n) = 2T(n-1) + 1`

What we have here is basically the same as before, but this time we are invoking the method recursively 2 times, thus we multiply it by 2. Let's find `T(n-1) = 2T(n-1-1) + 1`, which is `T(n-1) = 2T(n-2) + 1`. Next, as before, let's place our finding: `T(n) = 2(2T(n-2) + 1) + 1`, which is `T(n) = 2^2 * T(n-2) + 3`; repeating this gives us `T(n) = 2^k * T(n-k) + 2^k - 1`. Let's find `k` by claiming that `n-k = 1`, which is `k = n - 1`. Let's place `k` as follows: `T(n) = 2^(n-1) * T(1) + 2^(n-1) - 1`, which is roughly `O(2^n)`.

5. `T(n) = T(n-5) + n + 1`

It's almost the same as 2, but now we add `n` because we have one `for` loop. Let's find `T(n-5) = T(n-5-5) + n + 1`, which is `T(n-5) = T(n - 2*5) + n + 1` (we keep `n` rather than `n-5` as an upper bound on the loop cost). Let's place it: `T(n) = T(n-2*5) + n + n + 1 + 1`, which is `T(n) = T(n-2*5) + 2n + 2`, and for the k: `T(n) = T(n-k*5) + kn + k`. Again: `n-5k = 1`, which is `n = 5k + 1`, so roughly `k = n/5`. This gives us: `T(n) = T(0) + n^2/5 + n/5`, which is roughly `O(n^2)`.

I now recommend reading the rest of the answers, which will now give you a better perspective. Good luck winning those big O's :)

## @nhahtdh 2012-11-20 07:55:00

For the case where `n <= 0`, `T(n) = O(1)`. Therefore, the time complexity will depend on the case `n > 0`, which we consider below.

1. `T(n) = a + T(n-1)`, where a is some constant.

By induction: `T(n) = n*a + T(0) = n*a + b = O(n)`, where a, b are some constants.

2. `T(n) = a + T(n-5)`, where a is some constant.

By induction: `T(n) = ceil(n/5)*a + T(k) = ceil(n/5)*a + b = O(n)`, where a, b are some constants and k <= 0.

3. `T(n) = a + T(n/5)`, where a is some constant.

By induction: `T(n) = log5(n)*a + T(0) = log5(n)*a + b = O(log n)`, where a, b are some constants.

4. `T(n) = a + 2T(n-1)`, where a is some constant.

By induction: `T(n) = (2^n - 1)*a + 2^n*T(0) = (2^n - 1)*a + 2^n*b = O(2^n)`, where a, b are some constants.

5. `T(n) = a*n + T(n-5)`, where a is some constant (the `for` loop runs about n/2 times, which is O(n) per call).

Rewrite `n = 5q + r`, where q and r are integers and r = 0, 1, 2, 3, 4. We have `q = (n - r) / 5`, and since r < 5, we can consider it a constant, so `q = O(n)`.

By induction: `T(n) = a*(n + (n-5) + (n-10) + ... + r) + T(r)`. The sum has q terms, each at most n, so it is `O(n^2)`.

Since r < 5, we can find some constant b so that `b >= T(r)`, and therefore `T(n) = O(n^2)`.

## @Dimitar Dimitrov 2013-09-03 04:18:07

I recently failed an interview question (and by extension the interview) that had to do with analyzing the time and space complexity of a recursive Fibonacci function. This answer is epic and it helped a lot, I love it, I wish I could upvote you twice. I know it's old, but do you have anything similar for calculating space complexity, maybe a link, anything?

## @Snowfish 2020-01-03 22:59:41

For No. 4, even though the result is the same, shouldn't the induction be the following? T(n) = a + 2T(n-1) = a + 2a + 4T(n-2) = a + 2a + 4a + 8T(n-3) = ... = a * (2^n - 1) + 2^n * T(0) = a * (2^n - 1) + b * 2^n = (a + b) * 2^n - a = O(2^n)

## @Shubham 2017-05-16 01:29:31

One of the best ways I find for approximating the complexity of a recursive algorithm is drawing the recursion tree. Once you have the recursion tree:

The first function will have a length of `n` and number of leaf nodes `1`, so complexity will be `n*1 = n`.

The second function will have a length of `n/5` and number of leaf nodes again `1`, so complexity will be `n/5 * 1 = n/5`. It should be approximated to `n`.

For the third function, since `n` is being divided by 5 on every recursive call, the length of the recursion tree will be `log(n)(base 5)`, and number of leaf nodes again 1, so complexity will be `log(n)(base 5) * 1 = log(n)(base 5)`.

For the fourth function, since every node will have two child nodes, the number of leaf nodes will be equal to `(2^n)` and the length of the recursion tree will be `n`, so complexity will be `(2^n) * n`. But since `n` is insignificant in front of `(2^n)`, it can be ignored and the complexity can be said to be just `(2^n)`.

For the fifth function, there are two elements introducing the complexity: the complexity introduced by the recursive nature of the function, and the complexity introduced by the `for` loop in each call. Doing the above calculation, the complexity introduced by the recursive nature of the function will be `~ n`, and the complexity due to the for loop `n`. Total complexity will be `n*n`.

Note: This is a quick and dirty way of calculating complexity (nothing official!). Would love to hear feedback on this. Thanks.

## @Ben Forsrup 2018-01-29 15:16:15

Excellent answer! I have a question on the fourth function. If it had three recursive calls, would the answer be (3^n)? Or would you still just say (2^n)?

## @Julian A. 2018-02-25 20:55:57

@Shubham: #4 doesn't seem right to me. If the number of leaves is `2^n`, then the height of the tree must be `n`, not `log n`. The height would only be `log n` if `n` represented the total number of nodes in the tree. But it doesn't.

## @Shubham 2018-03-01 00:41:37

@BenForsrup: It will be 3^n, because every node will have three child nodes. The best way to be sure about this is to draw the recursion tree yourself with dummy values.

## @Fintasys 2019-05-28 14:54:03

#2 should be n-5 not n/5

## @coder 2012-11-20 06:37:40

The time complexity, in Big O notation, for each function, in numerical order:

1. `O(n)`, often called linear.
2. `O(n)`. (Actually of order n/5, and O(n/5) = O(n).)
3. `O(log(n))` (base 5), often called logarithmic; note that Big O notation and complexity analysis most often use base 2.
4. `O(2^n)`, or exponential, since each function call calls itself twice unless it has been recursed n times.
5. As for the last function, the for loop takes n/2 since we're increasing by 2, and the recursion takes n-5, and since the for loop is called recursively, the time complexity is in (n-5) * (n/2) = (2n-10) * n = 2n^2 - 10n; due to asymptotic behavior and worst-case considerations, or the upper bound that big O is striving for, we are only interested in the largest term, so `O(n^2)`.

Good luck on your midterms ;)

## @coder 2012-11-20 07:19:41

You're right about the fifth: the n will decrease for the for loop. But for the fourth, I don't think it's n^2, for it's like a tree: each time you're calling the recursion twice, so it should be 2^n. Plus, that was your answer in the comment earlier.

## @nhahtdh 2012-11-20 07:26:37

Yes, the 4th one is 2^n, my deleted comment has a typo. But you should fix your post since it is saying log(2^n)

## @coder 2012-11-20 07:30:24

Oh, seriously, I didn't notice it, thank you. Truly, I wrote the log by mistake :$

## @Gwater17 2017-03-04 01:48:26

Can someone explain the last one in more detail? I get that the recursiveFunc5 is 1/5 * n like problem 2 and I get that the loop is 1/2 * n. Since the loop runs for every recursive call I thought the answer would be (1/5n ^ 1/2n) and since coefficients are dropped it'd ultimately be n^n

## @bjc 2017-09-14 09:08:56

@MJGwater Let the running time of the loop be m. When the recursion runs 1 time, it takes m to execute the loop. When the recursion runs 2 times, the loop also runs 2 times, so it takes 2m, and so on. So it's '*', not '^'.

## @Stanislav Iegorov 2019-02-04 19:51:32

I think the fifth point should be (n-5) * (n/2) = (n^2 - 5n) / 2 which still leads to O(n^2)

## @Ridhwaan Shakeel 2019-03-14 00:11:43

For #4, is O(n^2) the worst case, considering it makes two recursive calls end-to-end unless it wasn't recursed n times?

## @coder 2019-03-14 13:42:04

@RidhwaanShakeel No, there is no best case or worst case in the fourth question, since n is not affected by any factor; it is only decremented by one recursively. So each time the function is entered it calls itself twice; if n = 5 then it will call itself about 2^5 times. You can try to draw it as a tree so you may visualize it. I hope this makes it clear.

## @Jack 2019-04-17 17:28:12

@coder The explanation for 5 seems odd. If incrementing by 2 results in `n/2` iterations of the `for` loop, why would decrementing by 5 not result in `n/5` recursive calls? This would still result in `O(n^2)` but seems like a more intuitive explanation. Why mix subtraction and division when they're essentially doing the same thing?

## @rmutalik 2020-04-10 20:00:58

@coder so for #4, if there were 3 recursive calls in the function definition, it would have a time complexity of O(3^n)? And for 5 recursive calls it would be O(5^n), correct?

## @Optider 2020-05-03 10:02:55

@Jack Yes, I was also wondering the same. It should be `n/5`, not `n-5`. And ultimately, the whole thing will boil down to `O(N^2)`.

## @mavis 2020-06-01 17:47:41

After all, division is just repeated subtraction.