Knowing approximately how long your algorithm will take is key to writing efficient code. The time complexity of your algorithm matters more and more as the dataset you are working with grows. As a developer, it is important to have a system for analyzing your code so you can make it as efficient as possible regardless of what machine it is running on. This is where asymptotic analysis comes in: asymptotic analysis is a way of describing an algorithm's time complexity that doesn't depend on any machine-specific constants. This process uses three types of notations: θ (theta), O (big O), and Ω (omega) (1). All three use the letter 'n' to denote the size of the input, treating the algorithm as taking 'n' tasks to complete, where each task takes the same amount of time. Each notation has a different exact use, but they all work essentially the same way; we will first briefly go over each one, then go over some time complexity examples using big O notation, as it is the most commonly used notation.
Omega notation bounds the algorithm from below; it describes the best possible time complexity your algorithm can achieve. This can be useful to know, especially for figuring out which kind of data set an algorithm handles most efficiently. Theta notation bounds the time complexity of an algorithm from both below and above, giving a kind of "average" time complexity. Given the structure of the algorithm, theta is useful for estimating how long your algorithm will most likely take on a non-edge-case data set. Lastly, there is big O, the most widely used notation for an obvious reason: it is the upper bound on the time complexity of a particular algorithm. It is always important to know the worst-case scenario; if you have an algorithm that runs extremely fast in most cases but takes 100 times longer in one case, that can be a huge problem.
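To make the three bounds concrete, here is a small sketch of my own (the class and method names are hypothetical, not from any library): a linear search finishes after one comparison when the target is the first element, which is its Ω(1) best case, but has to scan all n elements when the target is missing, which is its O(n) worst case.

```java
// Linear search: best case the target is at index 0 -> Omega(1);
// worst case the target is absent and we compare against all n elements -> O(n).
public class LinearSearch {
    public static int find(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return i; // best case: found on the first comparison
            }
        }
        return -1; // worst case: every element was checked
    }

    public static void main(String[] args) {
        int[] data = {7, 3, 9, 4};
        System.out.println(find(data, 7)); // prints 0  (best case, one comparison)
        System.out.println(find(data, 5)); // prints -1 (worst case, n comparisons)
    }
}
```

The same algorithm thus has different bounds depending on the input, which is exactly why having all three notations is useful.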
First, let's get the time complexity of a simple loop.
for(int i = 0; i < 5; i++) {
    System.out.println("Hello");
}
This loop will run 5 times, from i=0 through i=4, so it has a constant time complexity. Counting two operations per iteration (the loop check and the print), the total time is t(n) = 2*(5) = 10, but asymptotic analysis doesn't care about the exact time; it only cares that it is constant, so it denotes it as O(1), meaning it will always take the same amount of time no matter what. Now let's check out a loop with a variable bound.
for(int i = 0; i < n; i++) {
    System.out.println("Hello");
}
With this loop you no longer know exactly how long it will take; it depends on how big the value of 'n' is. So the time complexity is t(n) = 2*n. Again, we do not care about the constants, so the big O is O(n), as it will take roughly n steps to run this loop.
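The same reasoning extends to nested loops. As a sketch (the `countPrints` helper is my own, added so the count can be checked): if the variable loop above is placed inside another loop that also runs n times, the body executes n * n times, giving O(n²).

```java
// Nested loops: the inner loop runs n times for each of the n outer
// iterations, so the body executes n * n times -> O(n^2).
public class NestedLoop {
    public static int countPrints(int n) {
        int count = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                System.out.println("Hello");
                count++; // track how many times the body ran
            }
        }
        return count; // equals n * n
    }

    public static void main(String[] args) {
        System.out.println(countPrints(3)); // the body ran 9 times
    }
}
```

Just as before, any constant factor per iteration is dropped, so t(n) = 2*n*n is still written as O(n²).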
Thanks for reading my brief overview of time complexity; I hope it's a good jumping-off point for more in-depth time complexity analysis. Here is also a quick algorithm time complexity cheat sheet.
Sources: