In short, O(1) means that an operation takes constant time, whether that is 14 nanoseconds or three minutes, no matter the amount of data in the set.
def add_items(n):
    # Three additions regardless of the value of n: the work does not grow with the input, so O(1)
    return n + n + n

print(add_items(10))
Constant time is always the best scenario possible for an algorithm, which is why O(1) appears as the flat line at the bottom of the usual complexity charts.

O(n) means it takes an amount of time linear in the size of the set, so a set twice the size will take twice the time. You probably don't want to run such an operation on a set of a million objects.
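As a minimal sketch of linear growth (the function and list here are illustrative), summing a list touches each element exactly once, so doubling the list doubles the work:

def sum_items(items):
    total = 0
    for item in items:  # one pass over the data: O(n)
        total += item
    return total

print(sum_items([1, 2, 3, 4]))  # 4 additions; a list of 8 items would take 8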
O(n^2) is also known as quadratic time complexity: the running time of the algorithm grows as a square function of the input size, which can lead to very long running times for large inputs. The O in O(n^2) denotes the worst-case complexity.

An O(n^2) time complexity means that the running time of an algorithm grows quadratically with the size of the input. It often involves nested loops, where each element in the input is compared with every other element.

A code example is:
def print_items(n):
    for i in range(n):      # outer loop runs n times
        for j in range(n):  # inner loop runs n times per outer pass
            print(i, j)     # executed n * n = n^2 times in total

print_items(10)

When you have an algorithm like the following one, where there are several parts and the raw count would be O(n^2 + n), the rule is to keep only the dominant term, resulting in O(n^2):
def print_items(n):
    for i in range(n):
        for j in range(n):
            print(i, j)  # nested loops: n^2 operations

    for k in range(n):
        print(k)         # single loop: n more operations, so O(n^2 + n) -> O(n^2)

print_items(10)
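With n = 10, the nested loops print 100 pairs while the single loop prints only 10 lines; as n grows, the n^2 term dominates the total work, which is why the smaller term is dropped.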
Big O notation focuses on the worst case, which is O(n) for a simple search. It is a guarantee that the simple search will never be slower than O(n) time. In code where the worst scenario is to go through all the elements, the notation is O(n):
def print_items(n):
    for i in range(n):
        print(i)  # one operation per element: O(n)

print_items(10)

The running time increases proportionally with the size of the given input.
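To make the simple-search guarantee concrete, here is a minimal sketch of a linear search (the function name and sample list are illustrative): in the best case the target is the first element, and in the worst case it is absent and every element is checked.

def simple_search(items, target):
    for index, item in enumerate(items):
        if item == target:
            return index  # best case: found at the front after one check
    return -1             # worst case: target absent, all n elements checked

print(simple_search([4, 8, 15, 16, 23, 42], 23))  # found at index 4
print(simple_search([4, 8, 15, 16, 23, 42], 99))  # not found: -1 after 6 checks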
Big O notation is a way of describing the speed or complexity of a given algorithm.

Simply put, Big O notation tells you the number of operations an algorithm will perform. It takes its name from the “Big O” placed in front of the estimated number of operations.

What the Big O notation does not tell you is how fast the algorithm will be in seconds. There are too many factors that influence how long it takes for an algorithm to run. Instead, you will use the Big O notation to compare different algorithms by the number of operations they do.
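As a rough sketch of that idea (the counter functions here are illustrative), you can compare two algorithms by counting their operations instead of timing them:

def count_linear(n):
    ops = 0
    for i in range(n):
        ops += 1  # one operation per element
    return ops

def count_quadratic(n):
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1  # one operation per pair of elements
    return ops

print(count_linear(100))     # 100 operations: O(n)
print(count_quadratic(100))  # 10000 operations: O(n^2)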
The time complexity of an algorithm can be defined as a measure of the amount of time taken to run it as a function of the size of the input. In simple words, it tells you how the running time grows as the input data grows.

Time complexity is categorized into 3 types, as the sketch after this list illustrates:
  • Best-case time complexity: the minimum amount of time required by an algorithm to run, given the most suitable input.
  • Average-case time complexity: the average amount of time required by an algorithm to run, taking all possible inputs into consideration.
  • Worst-case time complexity: the maximum amount of time required by an algorithm to run, given the worst possible input.
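A minimal sketch of the three cases (the function and inputs are illustrative), counting how many comparisons a linear search makes before it stops:

def search_count(items, target):
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

items = list(range(100))
print(search_count(items, 0))    # best case: 1 comparison
print(search_count(items, 50))   # typical case: 51 comparisons, about n/2 on average
print(search_count(items, 999))  # worst case: 100 comparisons, every element checked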