# Understanding Time Complexity with Interview Examples

Prasun Das | July 26, 2022 at 7:26 PM | 8 minute read

# Introduction:

Understanding the time complexity of a function is imperative for every programmer who wants to write efficient programs. The time complexity of an algorithm is the amount of time the algorithm takes to run, expressed as a function of the size of its input. It is a key metric for judging which of two algorithms is faster and more efficient. Before we dive deep, let's take an example -

A relatively new MacBook and an old computer both run a linear search through an array of 1000 elements. The MacBook takes 1 second to produce the result, while the old computer takes 5 seconds.

This apparently hints that the program on the MacBook has a better time complexity, right?

No. The truth is that the time complexity remains the same on both machines.

The important point to note is that time complexity != time taken.

# Significance Of Time Complexity:

• It gives us a rough idea of how the running time grows as a function of the length of the input.
• It helps us distinguish between two algorithms on the basis of their efficiency; generally, the algorithm with the better time complexity is preferred.
• Understanding time complexity helps the programmer avoid the Time Limit Exceeded (TLE) error on competitive programming judges.

Also, note that time complexity of a particular algorithm is one of the most discussed topics in a typical tech interview round.

# Types Of Time Complexity Notations:

Because time complexity is a function of the length of the input, there is a relation between the size of the input data and the number of operations executed. The most important time complexity notations are -

• Big O notation
• Big Omega notation
• Little O notation
• Little Omega notation
• Big Theta notation

Note that you should always prioritise finding the worst-case scenario, as it matters the most. Worst-case analysis gives us the maximum number of basic operations that may have to be performed during the execution of the algorithm. It assumes that the input is in the worst possible state, so the maximum amount of work must be done.

Let’s discuss these notations in detail.

# Big O Notation:

The Big O notation gives the upper bound of a particular algorithm. It is denoted by O(). The complexity will never exceed this bound: the algorithm can execute with a lower complexity, but never a higher one. If an algorithm has a complexity of O(N^3), it simply means that the graph of the relationship between the input data size and the number of operations executed can't grow faster than N^3. Let's try to find the time complexity in Big O notation for some cases:

```
int a = 0, b = 0;

for (int i = 0; i < N; i++) {
    a = a + rand();
}

for (int j = 0; j < M; j++) {
    b = b + rand();
}
```

Now, the first loop is O(N) and the second loop is O(M). Since N and M are independent variables, we can't determine which one is the leading term. Therefore, the time complexity of the given code is O(N + M).

Since the number of variables does not depend on the size of the input, the space complexity in this case is constant, or O(1).

• When f(n) = 2n^2 + 67n

The time complexity will be O(n^2). Here, we ignore the lower-degree term and the constants. Make it a point to always drop the less dominant terms: lower-degree terms and constant factors do not affect the asymptotic growth.

• When f(n) = n^2 + log n

The time complexity will be O(n^2); since log n grows more slowly than n^2, the former is ignored.
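The rule of dropping lower-order terms can be justified with a short limit argument (a standard fact, spelled out here for reference):

```latex
\lim_{n \to \infty} \frac{2n^2 + 67n}{n^2}
  = \lim_{n \to \infty} \left( 2 + \frac{67}{n} \right)
  = 2
```

Since the ratio tends to a positive constant, 2n^2 + 67n grows at the same rate as n^2, so it is Θ(n^2) and in particular O(n^2).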

```
int a = 0;
for (int i = 0; i < N; i++) {
    for (int j = N; j > i; j--) {
        a = a + i + j;
    }
}
```

The outer loop runs N times, while for each i the inner loop runs N - i times. The total number of iterations is therefore N + (N - 1) + ... + 1 = N(N + 1)/2, which results in a time complexity of O(N^2).

# Big Omega Notation:

Big Omega notation is, by analogy, the opposite of Big O notation. It is denoted by Ω() and gives us the lower bound of the algorithm. When an algorithm is said to be Ω(n^2), it implies that the algorithm will take at least on the order of n^2 time: it can take more than that, but never less.

# Little O Notation:

Little o notation gives an upper bound that is not tight - the loose upper bound of f(n). It is denoted by o().

# Little Omega Notation:

Little omega notation, just like Big Omega notation, gives a lower bound of a certain f(n). The only difference is that little omega gives the loose (non-tight) lower bound of f(n). It is denoted by ω().

# Big Theta Notation:

Big Theta notation gives a tight bound on an algorithm's complexity: f(n) is 𝚹(g(n)) when g(n) bounds f(n) both above and below, up to constant factors. It is denoted by 𝚹(). It is sometimes loosely described as the "average case", but strictly it means that the upper and lower bounds coincide.
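For interview follow-ups it helps to know the formal versions of all five notations (these are the standard textbook definitions, stated here for reference):

```latex
\begin{aligned}
f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0 : f(n) \le c\,g(n) \ \ \forall n \ge n_0 \\
f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ \ \forall n \ge n_0 \\
f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n)) \\
f(n) = o(g(n))      &\iff \forall\, c > 0\ \exists\, n_0 : f(n) < c\,g(n) \ \ \forall n \ge n_0 \\
f(n) = \omega(g(n)) &\iff \forall\, c > 0\ \exists\, n_0 : f(n) > c\,g(n) \ \ \forall n \ge n_0
\end{aligned}
```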

# Space Complexity:

'Space complexity' is another term that often comes up when talking about time complexity. Space complexity is the working space or storage required by an algorithm, counting the variables it allocates (and, depending on convention, the input itself). To calculate space complexity, we calculate the space taken up by the variables in the algorithm. Note that time and space complexity are measured independently: an algorithm with low space usage is not automatically fast. Let's take an example:

```
int fibonacci(int n) {
    int x = 0, y = 1, z = 0;
    if (n == 0) {
        return x;
    }
    if (n == 1) {
        return y;
    }
    for (int i = 2; i <= n; ++i) {
        z = x + y;
        x = y;
        y = z;
    }
    return z;
}
```

In this case we use a fixed set of four variables regardless of n, so the space complexity can be said to be O(1).

It must be noted that there's no point in using more space to solve a problem if the same can be done with less.

# Important Notes:

There are different classes of time complexity, such as constant time, linear time, logarithmic time, quadratic time, etc. The next line shows the order of their respective growth rates:

INCREASING TIME COMPLEXITY →

O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), O(n!)

It is essential for a programmer to have basic knowledge of the time complexities of the common sorting and searching algorithms. Always keep the bigger picture in mind while choosing an algorithm on the basis of its time complexity. For instance, when we compare linear search and binary search, binary search may not be noticeably faster for small inputs because of constant factors. However, as the data grows considerably larger, the O(log n) algorithm takes far less time than the O(n) one. To summarise, for large inputs O(log n) is more efficient than O(n) - the time complexity of binary search is better than that of linear search.

A good way to understand the changing time complexities of different algorithms relative to each other is by referring to their graphical representations. The efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps, known as time complexity, or volume of memory, known as space complexity.

Worst case − The maximum number of steps taken on any instance of size n.

Best case − The minimum number of steps taken on any instance of size n.

Average case − The average number of steps taken over all instances of size n.

Amortised − The cost of a sequence of operations averaged over the operations in the sequence.

# Important Sample Questions:

## 1. What is the time and space complexity of the following code?

```
int a = 0, b = 0;
for (int i = 0; i < N; i++) {
    a = a + rand();
}
for (int j = 0; j < M; j++) {
    b = b + rand();
}
```

Assume that rand() is an O(1) time, O(1) space function.

O(N * M) time, O(1) space

O(N + M) time, O(N + M) space

O(N + M) time, O(1) space (Solution)

O(N * M) time, O(N + M) space

O(N * M) time, O(N * M) space

Solution:

The first loop is O(N) and the second loop is O(M). Since you don't know which is bigger, you say **this is O(N + M)**. This can also be written as O(max(N, M)).

Since there is no additional space being utilised, the space complexity is constant, O(1).

## 2. What is the time complexity of the following code?

```
int count = 0;
for (int i = N; i > 0; i /= 2) {
    for (int j = 0; j < i; j++) {
        count += 1;
    }
}
```

O(N * N)

O(N * log N)

O(N * log(log(N)))

O(N) (Solution)

O(N * Sqrt(N))

Solution:

In the first iteration, the j loop runs N times.

In the second iteration, the j loop runs N / 2 times.

In the i-th iteration, the j loop runs about N / 2^(i-1) times.

So, the total number of loop iterations = N + N/2 + N/4 + ... + 1 = **N * (1 + 1/2 + 1/4 + 1/8 + ...) < 2 * N**, which is O(N).


## 4. What is the time complexity of the following code?

```
int i, j, k = 0;
for (i = n / 2; i <= n; i++) {
    for (j = 2; j <= n; j = j * 2) {
        k = k + n / 2;
    }
}
```

O(n)

O(n log n)

O(n^2)

O(n^2 log n)

Solution:

2. O(n log n)

Explanation: Notice that j keeps doubling until it exceeds n. The number of times a number can be doubled before it reaches n is log(n).

Let’s take the examples here.

for n = 16, j = 2, 4, 8, 16

for n = 32, j = 2, 4, 8, 16, 32

So, j would run for O(log n) steps.

i runs for n/2 steps.

So, total steps = O(n/2 * log(n)) = O(n log n)

## 5. What will be the time complexity of the following code?

```
var value = 0;
for (var i = 0; i < n; i++)
    for (var j = 0; j < i; j++)
        value += 1;
```

n

(n+1)

n(n-1)/2

n(n+1)/2


Solution:

3. n(n-1)/2

Explanation: For each i, the inner loop runs i times, so the total number of increments is 0 + 1 + ... + (n-1) = n(n-1)/2.

# Conclusion:

In this blog, we introduced the basic concepts of time complexity and space complexity and their importance when designing algorithms. We also saw the different classes of time complexity for various kinds of functions. The main takeaway should be that we don't actually care about the absolute time taken, but about how the running time grows as the input changes. Make sure to go through these concepts before an interview round, as questions on them are inevitable.

#time-complexity #last-minute-notes #algorithms #fundamentals-to-learn #interview-tips-and-tricks
