#### [SOLVED] Computational complexity of Fibonacci Sequence

By Juliet

I understand Big-O notation, but I don't know how to calculate it for many functions. In particular, I've been trying to figure out the computational complexity of the naive version of the Fibonacci sequence:

``````int Fibonacci(int n)
{
    if (n <= 1)
        return n;
    else
        return Fibonacci(n - 1) + Fibonacci(n - 2);
}
``````

What is the computational complexity of the Fibonacci sequence and how is it calculated?

#### @bob 2019-12-20 22:40:22

It is simple to calculate by diagramming function calls. Simply add up the function calls for each value of n and look at how the number grows.

The Big O is O(Z^n) where Z is the golden ratio, about 1.618.

Both the Leonardo numbers and the Fibonacci numbers approach this ratio as we increase n.

Unlike other Big O questions, there is no variability in the input, and both the algorithm and the implementation of the algorithm are clearly defined.

There is no need for a bunch of complex math. Simply diagram out the function calls below and fit a function to the numbers.

Or, if you are familiar with the golden ratio, you will recognize it as such.

This answer is more correct than the accepted answer, which claims that the growth will approach f(n) = 2^n. It never will. It will approach f(n) = golden_ratio^n.

``````2 (2 -> 1, 0)

4 (3 -> 2, 1) (2 -> 1, 0)

8 (4 -> 3, 2) (3 -> 2, 1) (2 -> 1, 0)
(2 -> 1, 0)

14 (5 -> 4, 3) (4 -> 3, 2) (3 -> 2, 1) (2 -> 1, 0)
(2 -> 1, 0)

(3 -> 2, 1) (2 -> 1, 0)

24 (6 -> 5, 4)
(5 -> 4, 3) (4 -> 3, 2) (3 -> 2, 1) (2 -> 1, 0)
(2 -> 1, 0)

(3 -> 2, 1) (2 -> 1, 0)

(4 -> 3, 2) (3 -> 2, 1) (2 -> 1, 0)
(2 -> 1, 0)
``````
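The diagramming exercise above can also be done programmatically. A minimal sketch (the class and method names are made up for illustration) that instruments the naive recursion with a counter of child calls, so the growth rate can be inspected directly:

```java
public class FibCalls {
    static long calls = 0;

    // Same naive recursion as in the question, instrumented to count
    // the recursive calls it spawns.
    static int fib(int n) {
        if (n <= 1) return n;
        calls += 2;  // this invocation makes exactly two child calls
        return fib(n - 1) + fib(n - 2);
    }

    static long countCalls(int n) {
        calls = 0;
        fib(n);
        return calls;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 12; n++) {
            System.out.println(n + " -> " + countCalls(n));
        }
        // The ratio of successive totals tends toward ~1.618.
    }
}
```

Dividing each total by the previous one shows the ratio settling near the golden ratio rather than 2.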

#### @Nico Haase 2020-01-11 15:46:00

Can you provide any source for that claim about the golden ratio?

You model the time function to calculate `Fib(n)` as the sum of the time to calculate `Fib(n-1)` plus the time to calculate `Fib(n-2)` plus the time to add them together (`O(1)`). This assumes that repeated evaluations of the same `Fib(n)` take the same time - i.e. no memoization is used.

`T(n<=1) = O(1)`

`T(n) = T(n-1) + T(n-2) + O(1)`

You solve this recurrence relation (using generating functions, for instance) and you'll end up with the answer.

Alternatively, you can draw the recursion tree, which will have depth `n`, and intuitively figure out that this function is asymptotically `O(2^n)`. You can then prove your conjecture by induction.

Base: `n = 1` is obvious

Assume `T(n-1) = O(2^(n-1))`, therefore

`T(n) = T(n-1) + T(n-2) + O(1)` which is equal to

`T(n) = O(2^(n-1)) + O(2^(n-2)) + O(1) = O(2^n)`

However, as noted in a comment, this is not the tight bound. An interesting fact about this function is that T(n) is asymptotically the same as the value of `Fib(n)`, since both are defined as

`f(n) = f(n-1) + f(n-2)`.

The leaves of the recursion tree will always return 1. The value of `Fib(n)` is the sum of all values returned by the leaves in the recursion tree, which is equal to the count of such leaves. Since each leaf takes O(1) to compute, `T(n)` is equal to `Fib(n) x O(1)`. Consequently, the tight bound for this function is the Fibonacci sequence itself (~`Θ(1.6^n)`). You can find this tight bound by using generating functions, as mentioned above.
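The leaf-counting argument can be checked empirically. A sketch (hypothetical names) that counts the leaves contributing 1 to the final sum and confirms their count equals `Fib(n)` itself:

```java
public class FibLeaves {
    static long oneLeaves;

    static int fib(int n) {
        if (n <= 1) {
            if (n == 1) oneLeaves++;  // leaf contributing 1 to the final sum
            return n;
        }
        return fib(n - 1) + fib(n - 2);
    }

    static long countOneLeaves(int n) {
        oneLeaves = 0;
        fib(n);
        return oneLeaves;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 15; n++) {
            // The count of 1-returning leaves equals Fib(n),
            // which is why T(n) = Θ(Fib(n)).
            System.out.println(n + ": " + countOneLeaves(n));
        }
    }
}
```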

#### @Andrew Rollings 2008-12-11 20:38:34

Also Proof by induction. Nice. +1

#### @Captain Segfault 2008-12-11 21:15:20

Although the bound is not tight.

@Captain Segfault: Yeah. I clarified the answer. You'd get the tight bound using GF method as I had written above.

#### @David Rodríguez - dribeas 2008-12-11 23:08:44

I take your StackOverflowException as a joke. The exponential time is perceivable quite easily with rather small values of n.

#### @Devendra 2014-07-17 06:22:55

@MehrdadAfshari can you explain why you take T(n-1) = O(2^(n-1))? Shouldn't T(n-1) be (n^2)? Fibonacci makes the calls T(n-1) + T(n-2), so after summing all the costs (2*2*2...) it should be 2^n.

#### @Richard Fung 2016-04-28 05:33:00

Is the proof by induction really correct? It seems like you could equally just assume T(n) = O(n) and then you would have T(n) = O(n-1) + O(n-2) + O(1) = O(n) so T(n) = O(n) but that is obviously not true? If it is correct someone please explain..

#### @amnn 2016-07-03 10:36:14

@RichardFung The logic used here is not precise, the induction hypothesis is too weak, because it already includes the big-O inside it. The correct hypothesis is to say that T(n) <= c*2^n for some fixed c, and then from the conclusion of the inductive proof, you can infer that T(n) = O(2^n)

#### @M. Evers 2017-08-18 17:19:46

This proof looks wrong, but it's close. The hypothesis here is too weak. The hypothesis states `T(n-1) = O(2^(n-1))`, but NOT that `T(n-2) = O(2^(n-2))`, which Mehrdad uses in the last step of the proof. We don't know `T(n-2) = O(2^(n-2))`. The easiest solution to this is to use complete induction, rather than partial. Show the statement is true for Fib#1 and Fib#2; then you can assume `T(n-2) = O(2^(n-2))`, and the proof holds.

#### @bob 2019-12-20 22:32:21

"Alternatively, you can draw the recursion tree, which will have depth n and intuitively figure out that this function is asymptotically O(2n)." - This is completely false. The time complexity is O(golden_ratio^n). It never comes close to O(2^n). If you could reach out toward infinity it would get close to O(golden_ratio^n). That is what an asymptote is, the distance between the two lines must approach 0.

#### @nikhil kekan 2018-11-29 06:30:13

A recursive algorithm's time complexity can be better estimated by drawing a recursion tree. In this case, the recurrence relation for drawing the recursion tree is T(n) = T(n-1) + T(n-2) + O(1). Note that each step takes O(1), meaning constant time, since it does only one comparison to check the value of n in the if block. The recursion tree would look like:

``````               n
       (n-1)        (n-2)
   (n-2) (n-3)   (n-3) (n-4)   ...and so on
``````

Here, let's say each level of the above tree is denoted by i. Hence:

``````i
0                        n
1            (n-1)                 (n-2)
2        (n-2)    (n-3)      (n-3)     (n-4)
3   (n-3)(n-4) (n-4)(n-5) (n-4)(n-5) (n-5)(n-6)
``````

Let's say the tree ends at a particular value of i; that is the case when n-i = 1, hence i = n-1, meaning that the height of the tree is n-1. Now let's see how much work is done at each of the n levels of the tree. Note that each step takes O(1) time, as stated in the recurrence relation.

``````2^0=1                        n
2^1=2            (n-1)                 (n-2)
2^2=4        (n-2)    (n-3)      (n-3)     (n-4)
2^3=8   (n-3)(n-4) (n-4)(n-5) (n-4)(n-5) (n-5)(n-6)    ..so on
2^i for ith level
``````

Since i = n-1 is the height of the tree, the work done at each level will be:

``````i work
1 2^1
2 2^2
3 2^3..so on
``````

Hence the total work done will be the sum of the work done at each level: 2^0 + 2^1 + 2^2 + 2^3 + ... + 2^(n-1), since i = n-1. By the geometric series formula, this sum is 2^n - 1. Hence the total time complexity here is O(2^n).
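The geometric series at the end can be checked with a few lines (an illustrative sketch; the class and method names are made up):

```java
public class LevelWork {
    // Sum of the per-level work 2^0 + 2^1 + ... + 2^(n-1).
    static long totalWork(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += 1L << i;  // at most 2^i nodes on level i, O(1) each
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(totalWork(10));  // 1023, i.e. 2^10 - 1
    }
}
```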

#### @Jason Cohen 2008-12-11 20:26:21

Just ask yourself how many statements need to execute for `F(n)` to complete.

For `F(1)`, the answer is `1` (the first part of the conditional).

For `F(n)`, the answer is `F(n-1) + F(n-2)`.

So what function satisfies these rules? Try a^n (a > 1):

a^n == a^(n-1) + a^(n-2)

Divide through by a^(n-2):

a^2 == a + 1

Solve for `a` and you get `(1+sqrt(5))/2 = 1.6180339887`, otherwise known as the golden ratio.

So it takes exponential time.
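The derivation can be verified numerically; a quick sketch solving a^2 = a + 1 with the quadratic formula and checking that the root satisfies the original relation:

```java
public class GoldenRatio {
    public static void main(String[] args) {
        // Positive root of a^2 = a + 1 via the quadratic formula.
        double a = (1 + Math.sqrt(5)) / 2;
        System.out.println(a);  // ~1.6180339887

        // a^2 == a + 1 holds (up to floating-point error)...
        System.out.println(Math.abs(a * a - (a + 1)) < 1e-12);

        // ...and therefore a^n == a^(n-1) + a^(n-2) for any n,
        // since dividing through by a^(n-2) recovers a^2 = a + 1.
        int n = 20;
        double lhs = Math.pow(a, n);
        double rhs = Math.pow(a, n - 1) + Math.pow(a, n - 2);
        System.out.println(Math.abs(lhs - rhs) < 1e-6);
    }
}
```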

#### @Andrew Rollings 2008-12-11 20:37:58

Proof by induction. Nice. +1

#### @molbdnilo 2014-08-14 12:46:55

No, 1 isn't equal to 1 + 1. The function that satisfies those rules is mentioned in the question.

#### @Da Teng 2016-04-10 00:35:50

The answer is not wrong. It's right asymptotically. The other solution is negative, so it doesn't physically make sense.

#### @frank 2016-08-03 20:22:06

Can someone explain how a^n == a^(n-1) + a^(n-2) satisfy these rules? How is it satisfied exactly, please be specific.

#### @Lucia 2019-05-10 17:50:28

@frank Remember that the time complexity of any tree is O(no_of_branches ^ depth_of_tree)? For fib(N), given that it's not a perfect tree at the bottom, the height is N, but the "average number of branches" is slightly less than 2. So, to get this exact number `x`, we assume that for any branch fib(n) the cost is `x^n`. Thus `x^n = x^(n-1) + x^(n-2)`.

#### @Miguel 2017-09-30 22:21:52

The naive recursion version of Fibonacci is exponential by design due to repetition in the computation:

At the root you are computing:

F(n) depends on F(n-1) and F(n-2)

F(n-1) depends on F(n-2) again and F(n-3)

F(n-2) depends on F(n-3) again and F(n-4)

then at each level you have 2 recursive calls that repeat a lot of the computation. The time function will look like this:

T(n) = T(n-1) + T(n-2) + C, with C constant

T(n-1) = T(n-2) + T(n-3) + C > T(n-2), then

T(n) > 2*T(n-2)

...

T(n) > 2^(n/2) * T(1) = O(2^(n/2))

This is just a lower bound, which for the purpose of your analysis should be enough, but the real time function is a constant factor times the same Fibonacci formula, and the closed form is known to be exponential in the golden ratio.

In addition, you can find optimized versions of Fibonacci using dynamic programming like this:

``````static int fib(int n)
{
    /* memory (at least 2 slots so the base cases fit even for n = 0) */
    int f[] = new int[Math.max(n + 1, 2)];
    int i;

    /* init the base cases */
    f[0] = 0;
    f[1] = 1;

    /* fill the table bottom-up */
    for (i = 2; i <= n; i++)
    {
        f[i] = f[i-1] + f[i-2];
    }

    return f[n];
}
``````

That is optimized and does only n steps, but it is still exponential.

Cost functions are defined from input size to the number of steps needed to solve the problem. When you see the dynamic version of Fibonacci (n steps to fill the table) or the easiest algorithm to tell whether a number is prime (sqrt(n) steps to check the valid divisors of the number), you may think that these algorithms are O(n) or O(sqrt(n)), but this is simply not true, for the following reason: the input to your algorithm is a number n, and in binary notation the input size for an integer n is log2(n). So, doing a variable change of

``````m = log2(n) // your real input size
``````

let's find out the number of steps as a function of the input size:

``````m = log2(n)
2^m = 2^log2(n) = n
``````

then the cost of your algorithm as a function of the input size is:

``````T(m) = n steps = 2^m steps
``````

and this is why the cost is exponential.
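The change of variable can be illustrated concretely; a sketch (hypothetical helper name) computing the true input size m as the bit-length of n:

```java
public class InputSize {
    // Number of binary digits needed to write n: the real input size m.
    static int bitLength(long n) {
        int m = 0;
        while (n > 0) { m++; n >>= 1; }
        return m;
    }

    public static void main(String[] args) {
        long n = 1024;           // the table-filling DP runs ~n loop steps
        int m = bitLength(n);    // but the input occupies only m bits
        System.out.println("steps = " + n + ", input size m = " + m + " bits");
        // Here n = 2^(m-1), so the step count is exponential in m.
    }
}
```

Each additional input bit roughly doubles the number of loop steps, which is the pseudo-polynomial behavior the answer describes.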

#### @Tony Tannous 2017-08-10 15:39:12

You can expand it and have a visualization:

``````T(n) = T(n-1) + T(n-2) <
       T(n-1) + T(n-1)
     = 2*T(n-1)
     = 2*2*T(n-2)
     = 2*2*2*T(n-3)
     ....
     = 2^i*T(n-i)
     ...
     ==> O(2^n)
``````

#### @Quazi Irfan 2017-10-04 07:40:02

I understand the first line. But why is there a less than character `<` at the end? How did you get `T(n-1) + T(n-1)`?

#### @Tony Tannous 2017-10-04 07:42:30

@QuaziIrfan :D that is an arrow. -> [(not less than). Sorry for the confusion regarding the last line]. For the first line, well... `T(n-1) > T(n-2)`, so I can replace `T(n-2)` with `T(n-1)`. I will only get a higher bound, which is still valid for `T(n-1) + T(n-2)`.

#### @J.P. 2014-04-15 21:38:56

I agree with pgaur and rickerbh, recursive-fibonacci's complexity is O(2^n).

I came to the same conclusion by a rather simplistic but I believe still valid reasoning.

First, it's all about figuring out how many times the recursive Fibonacci function (`F()` from now on) gets called when calculating the Nth Fibonacci number. If it gets called once per number in the sequence 0 to n, then we have O(n); if it gets called n times for each number, then we get O(n*n), i.e. O(n^2), and so on.

So, when F() is called for a number n, the number of times F() is called for a given number between 0 and n-1 grows as we approach 0.

As a first impression, it seems to me that if we put it in a visual way, drawing a unit per time F() is called for a given number, we get a sort of pyramid shape (that is, if we center the units horizontally). Something like this:

``````n              *
n-1            **
n-2           ****
...
2           ***********
1       ******************
0    ***************************
``````

Now, the question is, how fast is the base of this pyramid enlarging as n grows?

Let's take a real case, for instance F(6)

``````F(6)                 *  <-- only once
F(5)                 *  <-- only once too
F(4)                 **
F(3)                ****
F(2)              ********
F(1)          ****************           <-- 16
F(0)  ********************************    <-- 32
``````

We see F(0) gets called 32 times, which is 2^5, which for this sample case is 2^(n-1).

Now, we want to know how many times F(x) gets called at all, and we can see the number of times F(0) is called is only a part of that.

If we mentally move all the *'s from F(6) to F(2) lines into F(1) line, we see that F(1) and F(0) lines are now equal in length. Which means, total times F() gets called when n=6 is 2x32=64=2^6.

Now, in terms of complexity:

``````O( F(6) ) = O(2^6)
O( F(n) ) = O(2^n)
``````

#### @Avik 2016-02-22 10:08:51

F(3) only gets called 3 times and not 4 times. The second pyramid is wrong.

#### @Dukeling 2017-06-13 19:20:06

F(3) = 3, F(2) = 5, F(1) = 8, F(0) = 5. I would fix it, but I don't think I can salvage this answer with an edit.

#### @benkc 2014-02-28 01:21:33

The proof answers are good, but I always have to do a few iterations by hand to really convince myself. So I drew out a small calling tree on my whiteboard, and started counting the nodes. I split my counts out into total nodes, leaf nodes, and interior nodes. Here's what I got:

``````IN | OUT | TOT | LEAF | INT
1 |   1 |   1 |   1  |   0
2 |   1 |   1 |   1  |   0
3 |   2 |   3 |   2  |   1
4 |   3 |   5 |   3  |   2
5 |   5 |   9 |   5  |   4
6 |   8 |  15 |   8  |   7
7 |  13 |  25 |  13  |  12
8 |  21 |  41 |  21  |  20
9 |  34 |  67 |  34  |  33
10 |  55 | 109 |  55  |  54
``````

What immediately leaps out is that the number of leaf nodes is `fib(n)`. What took a few more iterations to notice is that the number of interior nodes is `fib(n) - 1`. Therefore the total number of nodes is `2 * fib(n) - 1`.

Since you drop the coefficients when classifying computational complexity, the final answer is `θ(fib(n))`.
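The hand count can be reproduced by instrumenting the recursion. A sketch (made-up names), using base cases at n = 1 and n = 2 to match the call tree being counted here:

```java
public class NodeCount {
    static long total, leaf;

    static int fib(int n) {
        total++;                           // every invocation is a node
        if (n <= 2) { leaf++; return 1; }  // base cases are the leaves
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        System.out.println("IN | OUT | TOT | LEAF | INT");
        for (int in = 1; in <= 10; in++) {
            total = leaf = 0;
            int out = fib(in);
            System.out.println(in + " | " + out + " | " + total
                               + " | " + leaf + " | " + (total - leaf));
        }
    }
}
```

The printed table matches the whiteboard counts: leaves = fib(n), interior = fib(n) - 1, total = 2 * fib(n) - 1.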

#### @benkc 2014-02-28 01:23:04

(No, I didn't draw a full 10-deep call tree on my whiteboard. Just 5-deep.) ;)

#### @Peter Cordes 2018-03-01 18:49:12

Nice, I was wondering how many extra additions recursive Fib did. It's not just adding `1` to a single accumulator `Fib(n)` times, but interesting that it's still exactly `θ(Fib(n))`.

#### @Peter Cordes 2018-03-01 18:52:46

Note that some (most) recursive implementations spend time adding `0`, though: recursion base cases are `0` and `1`, because they do `Fib(n-1) + Fib(n-2)`. So probably the `3 * Fib(n) - 2` from this link-only answer is more accurate for the total number of nodes, not `2 * Fib(n) - 1`.

#### @pgaur 2012-11-18 00:46:27

Well, in my opinion it is `O(2^n)`, as in this function only the recursion takes considerable time (divide and conquer). We see that the above function will continue in a tree until the leaves are reached, when we get down to the level `F(n-(n-1))`, i.e. `F(1)`. So, when we jot down the time complexity encountered at each depth of the tree, the summation series is:

``````1 + 2 + 4 + ... + 2^(n-1)
= 1*((2^n) - 1)/(2 - 1)
= 2^n - 1
``````

that is, on the order of `2^n`, i.e. `O(2^n)`.

#### @Kunal B. 2016-07-31 01:07:48

No explanation provided

#### @Bob Cross 2008-12-11 21:10:03

There's a very nice discussion of this specific problem over at MIT. On page 5, they make the point that, if you assume that an addition takes one computational unit, the time required to compute Fib(N) is very closely related to the result of Fib(N).

As a result, you can skip directly to the very close approximation of the Fibonacci series:

``````Fib(N) = (1/sqrt(5)) * 1.618^(N+1) (approximately)
``````

and say, therefore, that the worst case performance of the naive algorithm is

``````O((1/sqrt(5)) * 1.618^(N+1)) = O(1.618^(N+1))
``````

PS: There is a discussion of the closed form expression of the Nth Fibonacci number over at Wikipedia if you'd like more information.
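That closed form is easy to sanity-check against an exact iterative computation. An illustrative sketch (hypothetical names), using the convention Fib(0) = 0, Fib(1) = 1, so the exponent is N rather than N+1; doubles stay accurate here well past N = 40:

```java
public class Binet {
    // Closed-form approximation: round(phi^n / sqrt(5)).
    static long fibClosed(int n) {
        double phi = (1 + Math.sqrt(5)) / 2;
        return Math.round(Math.pow(phi, n) / Math.sqrt(5));
    }

    // Exact iterative computation for comparison.
    static long fibExact(int n) {
        long a = 0, b = 1;
        for (int i = 0; i < n; i++) {
            long t = a + b;
            a = b;
            b = t;
        }
        return a;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 40; n++) {
            if (fibClosed(n) != fibExact(n)) {
                System.out.println("mismatch at n = " + n);
            }
        }
        System.out.println(fibClosed(30));  // 832040
    }
}
```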

#### @SwimBikeRun 2018-08-15 01:19:18

Thanks for the course link. Very nice observation too

#### @Dave L. 2008-12-11 21:03:08

It is bounded on the lower end by `2^(n/2)` and on the upper end by `2^n` (as noted in other comments). And an interesting fact about that recursive implementation is that it has a tight asymptotic bound of `Fib(n)` itself. These facts can be summarized:

``````T(n) = Ω(2^(n/2))  (lower bound)
T(n) = O(2^n)   (upper bound)
T(n) = Θ(Fib(n)) (tight bound)
``````

The tight bound can be reduced further using its closed form if you like.
