## What Is The Computational Complexity Of The (function to compute the) Fibonacci Series?

Bear Giles | January 14, 2016

This question indirectly came up during a recent interview: what is the computational complexity of the function that computes the Fibonacci series? I'm referring specifically to the classic recursive form:

```java
public int fib(int n) {
    if (n < 0) throw new IllegalArgumentException();
    if (n == 0) return 1;
    if (n == 1) return 1;
    return fib(n-1) + fib(n-2);
}
```


I could look it up but this is a chance to exercise my analysis of algorithms skills.

The memoized version is clearly *O(n)* in space and time. If you don’t need to reuse computations you can make it *O(n)* in time and *O(1)* in space.
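As a sketch of that *O(n)*-time, *O(1)*-space variant (an iterative loop that keeps only the last two values; the class name is mine, and the `fib(0) = fib(1) = 1` indexing matches the recursive code above):

```java
public class IterativeFib {
    // Iterative Fibonacci: O(n) time, O(1) space.
    // Same indexing as the recursive version: fib(0) = fib(1) = 1.
    public static int fib(int n) {
        if (n < 0) throw new IllegalArgumentException();
        int prev = 1;  // fib(i-2)
        int curr = 1;  // fib(i-1)
        for (int i = 2; i <= n; i++) {
            int next = prev + curr;  // fib(i) = fib(i-1) + fib(i-2)
            prev = curr;
            curr = next;
        }
        return curr;
    }
}
```

Each loop iteration performs one addition, so computing fib(n) takes n – 1 additions and two words of state.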

The non-memoized version is a little harder to compute. We can start by defining the function OpsFib(n) which is the number of operations required to compute Fib(n), where an “operation” is either an addition or a memory lookup. We can immediately look at the first few terms:

- OpsFib(0) = 1 = Fib(0) // lookup, f(0) = 0
- OpsFib(1) = 1 = Fib(1) // lookup, f(1) = 0
- OpsFib(2) = OpsFib(1) + OpsFib(0) + 1 = (Fib(1) + f(1)) + (Fib(0) + f(0)) + 1 = Fib(2) + 1 = Fib(2) + f(2)
- OpsFib(3) = OpsFib(2) + OpsFib(1) + 1 = (Fib(2) + f(2)) + (Fib(1) + f(1)) + 1 = Fib(3) + f(2) + f(1) + 1 = Fib(3) + f(3)
- OpsFib(4) = OpsFib(3) + OpsFib(2) + 1 = (Fib(3) + f(3)) + (Fib(2) + f(2)) + 1 = Fib(4) + f(3) + f(2) + 1 = Fib(4) + f(4)
- OpsFib(5) = OpsFib(4) + OpsFib(3) + 1 = (Fib(4) + f(4)) + (Fib(3) + f(3)) + 1 = Fib(5) + f(4) + f(3) + 1 = Fib(5) + f(5)


If we generalize to an arbitrary *n* we see that we have two equations:

- OpsFib(n) = OpsFib(n-1) + OpsFib(n-2) + 1 = (Fib(n-1) + f(n-1)) + (Fib(n-2) + f(n-2)) + 1 = Fib(n) + f(n)
- f(n) = f(n-1) + f(n-2) + 1


where *f(n)* is similar to the Fibonacci series but has different initial conditions (f(0) = f(1) = 0) plus an extra +1 at each step. By inspection (0, 0, 1, 2, 4, 7, 12, ...) we can also write it in closed form as *f(n) = Fib(n) – 1*. The details don't matter much since *OpsFib(n)* is dominated by the *Fib(n)* term.
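We can sanity-check OpsFib(n) = Fib(n) + f(n) = 2·Fib(n) – 1 empirically with an instrumented copy of the recursive function (a sketch; the operation counter is my addition, not part of the original code):

```java
public class OpsCount {
    static long ops;  // counts lookups (base cases) and additions

    // Classic recursive Fibonacci, instrumented to count operations.
    static int fib(int n) {
        if (n == 0) { ops++; return 1; }  // lookup
        if (n == 1) { ops++; return 1; }  // lookup
        ops++;                            // one addition
        return fib(n - 1) + fib(n - 2);
    }

    // Number of operations needed to compute fib(n).
    public static long opsFib(int n) {
        ops = 0;
        fib(n);
        return ops;
    }
}
```

For example, opsFib(5) is 15 = 2·Fib(5) – 1 = 2·8 – 1, matching the derivation above.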

We know that Fib(n) ~ *φ^n* for large *n*, where φ ≈ 1.618 is the golden ratio, so *OpsFib(n)* grows as *O(φ^n)*, which is in turn bounded above by *O(2^n)*. That is, the classic recursive implementation takes exponential time.

**What about the other recursive implementation?**

You might remember that I wrote a separate recursive implementation in [Fibonacci and Lucas Sequences](https://invariantproperties.com/?p=1553). What is its order?

This is a lot harder to determine. Normally any time you have a recursive function of the form F(n) = F(n/b) + O(1), with b > 1, then *O(F(n)) = O(lg n)*. However it's also true that a recursive function of the form F(n) = aF(n-1) + bF(n-2) is *O(2^n)* when evaluated naively. We can combine these to see that a recursive function of the form F(n) = aF(n/2) + bF(n/2) will have O(F(n)) = O(2^(lg n)) = O(n).
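I'm not reproducing the linked post's code here, but a common half-index recursion is the fast-doubling method, built on the identities F(2k) = F(k)·(2F(k+1) – F(k)) and F(2k+1) = F(k)² + F(k+1)². A minimal sketch (note this uses the conventional indexing F(0) = 0, F(1) = 1, unlike the code above):

```java
public class FastFib {
    // Fast doubling: returns {F(n), F(n+1)} with F(0) = 0, F(1) = 1.
    static long[] fibPair(long n) {
        if (n == 0) return new long[] {0, 1};
        long[] half = fibPair(n / 2);
        long a = half[0];             // a = F(k), where k = n/2
        long b = half[1];             // b = F(k+1)
        long c = a * (2 * b - a);     // F(2k)
        long d = a * a + b * b;       // F(2k+1)
        if (n % 2 == 0) return new long[] {c, d};
        else            return new long[] {d, c + d};
    }

    public static long fib(long n) {
        return fibPair(n)[0];
    }
}
```

Because this version returns the pair {F(n), F(n+1)}, it makes only one recursive call per level and so already runs in O(lg n); a variant that recurses on F(n/2) and F(n/2 + 1) separately revisits the same subproblems, which is exactly where memoization pays off.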

HOWEVER this implementation, like the traditional recursive definition, will hit F(n) for smaller n many, many times, so we will see tremendous benefits from memoization. As an informed guess I would put the memoized cost somewhere between *O(lg n)* and *O(lg(n)²)*, but it's nothing but a guess. It will definitely be faster than either a memoized classic recursive implementation or a non-memoized version of the new implementation.