Understanding Big O Notation - Time complexity

March 29, 2023

This section contains some math, but don't panic: the rest of the log will be a bit mathy, but we'll survive.

Objectives

  • Describe Big O notation
  • Simplify Big O expressions
  • Define "time complexity" and "space complexity"
  • Evaluate the time complexity and space complexity of different algorithms using Big O notation
  • Describe what a logarithm is

What is the idea here?

Any given problem can have many different solutions, maybe 20 or more. How do we know which one is the best? Imagine we have multiple implementations of the same function.

Example: "Write a function that accepts a string input and return a reversed copy"

If you search the internet you'll find several approaches (a couple are sketched after this list):

  • Using arrays
  • Built in methods
  • etc
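
For instance, a rough sketch of two of those approaches (the function names are just for illustration):

// Approach 1: built-in methods (split into characters, reverse, join back)
function reverseWithBuiltins(str) {
    return str.split("").reverse().join("");
}

// Approach 2: build the reversed string manually with a loop
function reverseWithLoop(str) {
    let reversed = "";
    for (let i = str.length - 1; i >= 0; i--) {
        reversed += str[i];
    }
    return reversed;
}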

It would be great to have a system that tells us whether a solution is great, pretty good, OK, etc., and that also lets us compare solutions against each other: how good, or how far from expectations, each one is. In other words, a numeric representation of how a solution or algorithm will behave.

Why is it important?

Sometimes it is not enough to find a solution that merely works. When we have to handle many millions of bytes of data, a different solution can save a significant amount of time every time a pipeline runs, as well as development time.

  • Performance matters when it can save time, money, and effort, so it is important to understand how our choices impact performance
  • Useful for discussing trade-offs between different approaches
  • When debugging, it helps to understand why something may be slowing down or crashing your application, and to spot inefficient chunks of code
  • It is a topic that comes up in job interviews

First example

We want to write a function that calculates the sum of all numbers from 1 to n.

Option 1

function add(n) {
    let result = 0;
    for (let i = 1; i <= n; i++) {
        result = result + i;
    }
    return result;
}

Option 2

function add(n) {
    return n * (n + 1) / 2;
}
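
As a quick sanity check, both versions return the same result (assuming we rename them addLoop and addFormula just so both can be defined at once):

// addLoop = the loop version (Option 1), addFormula = the formula version (Option 2)
console.log(addLoop(100));    // 5050
console.log(addFormula(100)); // 5050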

Which one is better? What does better mean in this scenario?

  • Is faster better?
  • Is less memory better?
  • How long is the code?
  • How important is readability?
  • Fewer characters?

Usually people prioritize faster and less memory-intensive code. That is not to say the other points are unimportant, but for the moment we can focus on speed.

How do we evaluate speed?

Option A

let timestamp1 = performance.now();
callFunction(10000); // example: call the function we want to measure
let timestamp2 = performance.now();
console.log(timestamp2 - timestamp1);
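
For instance, a rough sketch timing both versions of the sum function from the first example (again assuming they have been renamed addLoop and addFormula):

// addLoop = Option 1 (loop), addFormula = Option 2 (formula); names are just for illustration
let t1 = performance.now();
addLoop(100000000);
let t2 = performance.now();
addFormula(100000000);
let t3 = performance.now();
console.log(`Loop:    ${(t2 - t1) / 1000} seconds`);
console.log(`Formula: ${(t3 - t2) / 1000} seconds`);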

How precise is this? Even on the same machine the result may differ slightly each time you run or test a function. Try timing the two options above to see which one is faster. How do we then put a score on that speed?

  • By getting a percentage?
  • By just reporting milliseconds?
  • By normalizing and then reporting a range?

The problem with time

  • Different machines will record different times; that doesn't necessarily mean the measurement is inaccurate.
  • The same machine may record different times.
  • For fast algorithms, speed measurements may not be precise enough.
  • How can we talk about speed in a general way?

If not time, then what?

Rather than counting seconds, we can count the number of simple operations the computer has to perform.

What do we mean by simple operations?

  • multiplications
  • addition
  • divisions
  • subtractions
  • assignments

Examples

function add(n) {
	return n * (n + 1) / 2;
}

Number of operations: 3

  • 1 multiplication
  • 1 addition
  • 1 division

function add(n) {
    let result = 0;
    for (let i = 1; i <= n; i++) {
        result = result + i;
    }
    return result;
}

Number of operations: 5n + 2

  • n additions for result
  • n assignments for result
  • n additions for i
  • n assignments for i
  • n comparisons (i <= n)
  • 2 assignments for the initializations (result = 0 and i = 1)

We focus on the big picture

Regardless of the exact number, we only need the general idea: the number of operations grows roughly proportionally with n.

Visualize time complexities

Visit website: https://rithmschool.github.io/function-timer-demo/

Paste images from website

Definition

Big O notation allows us to talk formally about how the runtime of an algorithm grows as the inputs grow.

We won't care about the precise details, only about the broad trends.

We say that an algorithm is O(f(n)) if the number of simple operations the computer has to do is eventually less than a constant times f(n) as n increases.

  • Linear f(n) = n
  • Quadratic f(n) = n^2
  • Constant f(n) = 1
  • etc

An O(n) operation nested inside another O(n) operation is quadratic, O(n^2).
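
For example, a small sketch of a nested loop where an O(n) loop runs inside another O(n) loop (the function name is hypothetical):

function printAllPairs(n) {
    for (let i = 0; i < n; i++) {
        // the inner loop is O(n) and runs once per outer iteration
        for (let j = 0; j < n; j++) {
            console.log(i, j);
        }
    }
}
// roughly n * n = n^2 console.log calls in total, so O(n^2)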

Simplifying expressions

  • O(2n) -> O(n)
  • O(500) -> O(1)
  • O(13n^2) -> O(n^2)

Smaller terms also don't matter

  • O(n+10) -> O(n)
  • O(1000n + 50) -> O(n)
  • O(n^2 + 5n + 8) -> O(n^2)
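
For example, take O(n^2 + 5n + 8): at n = 1000 the n^2 term is already 1,000,000 while 5n + 8 is only 5,008, so as n grows the n^2 term dominates and the smaller terms can be dropped.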

Insert image of Big O graph

Examples

O(n)

function logUpTo(n) {
    for (var i = 1; i <= n; i++) {
        console.log(i);
    }
}

O(1)

function logAtMost10(n) {
    for (var i = 1; i <= Math.min(n, 10); i++) {
        console.log(i);
    }
}

O(n^2)

function subtotals(array) {
    var subtotalArray = Array(array.length);
    for (var i = 0; i < array.length; i++) {
        var subtotal = 0;
        for (var j = 0; j <= i; j++) {
            subtotal += array[j];
        }
        subtotalArray[i] = subtotal;
    }
    return subtotalArray;
}
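
For example, subtotals([1, 2, 3]) returns [1, 3, 6]: for each index the inner loop re-sums everything from the start up to i, which is why the number of operations grows quadratically with the length of the array.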
