Basic Time Complexity
Since this is the first chapter, I will not give an exhaustive explanation of time and space complexity. The detailed Practical Guide to Algorithm Time and Space Complexity Analysis is scheduled for after you have learned the core frameworks of several common algorithms; by then, your knowledge base will be sufficient to easily understand the various scenarios of time and space complexity analysis.
However, later in this chapter we will implement common sorting algorithms and data structures and analyze their time complexity, so it is necessary to introduce the concepts of time and space complexity in advance, along with a simplified way to analyze them, to avoid confusing beginners.
For beginners, you only need to remember the following points:
- Time and space complexity are expressed in Big O notation (such as O(1), O(n), O(n log n), O(n^2), etc.). These are estimates rather than precise calculations, and only the term with the highest growth rate is kept.
For example, O(n^2 + n) is equivalent to O(n^2), and O(2n) is equivalent to O(n).
- Time complexity measures the execution efficiency of an algorithm, while space complexity measures its memory consumption; for both, lower is better.
For instance, an algorithm with a time complexity of O(n) is more efficient than one with O(n^2), and an algorithm with a space complexity of O(1) consumes less memory than one with O(n). Of course, we generally need to specify what n represents, such as the length of the input array.
- How do you estimate them? For now, a simple rule of thumb is enough: time complexity is related to the number of nested for loops, and space complexity is related to how much space the algorithm allocates to store data.
Note
Estimating time complexity from the number of nested for loops is a simplified method and not entirely accurate. The correct approach will be introduced in the Practical Guide to Algorithm Time and Space Complexity Analysis, but this estimation method is sufficient for learning the content of this chapter.
Let's look at a few examples for a more intuitive understanding.
Example One, Time Complexity O(n), Space Complexity O(1):
// input an integer array, return the sum of all elements
int getSum(int[] nums) {
int sum = 0;
for (int i = 0; i < nums.length; i++) {
sum += nums[i];
}
return sum;
}
The algorithm contains one for loop that iterates over the nums array, so the time complexity is O(n), where n represents the length of the nums array.
Our algorithm only uses a single sum variable. Since nums is given as input in the problem, it is not counted toward our algorithm's space consumption, so the space complexity is O(1).
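As a side note, constant factors are dropped in the same way. Here is a minimal sketch (countMax is my own illustrative example, not from the original text): it scans the array twice with two separate, non-nested for loops, so it performs roughly 2n steps, but O(2n) still simplifies to O(n).

// my own illustrative example: two sequential (non-nested) for loops
// assumes nums is non-empty
int countMax(int[] nums) {
    // first pass: find the maximum element, about n steps
    int max = nums[0];
    for (int i = 0; i < nums.length; i++) {
        if (nums[i] > max) {
            max = nums[i];
        }
    }
    // second pass: count how many times the maximum appears, another n steps
    int count = 0;
    for (int i = 0; i < nums.length; i++) {
        if (nums[i] == max) {
            count++;
        }
    }
    // about 2n steps in total, but the time complexity is still O(n)
    return count;
}

The space complexity of this sketch is still O(1), since it only uses the max and count variables.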
Example Two, Time Complexity O(n^2), Space Complexity O(1):
// Does the array contain two numbers whose sum is target?
boolean hasTargetSum(int[] nums, int target) {
for (int i = 0; i < nums.length; i++) {
for (int j = i + 1; j < nums.length; j++) {
if (nums[i] + nums[j] == target) {
return true;
}
}
}
return false;
}
The algorithm contains two nested for loops, so the time complexity is O(n^2), where n represents the length of the nums array.
Our algorithm only uses two variables, i and j, which is a constant amount of space, so the space complexity is O(1).
You might point out that the inner for loop does not always traverse the entire array, and the function may return early, so the actual number of operations can be less than n^2. Is the time complexity still O(n^2)?
Yes, it is still O(n^2). As mentioned, Big O notation is an estimate and does not require precise calculation. For some inputs the actual number of operations may indeed be less than n^2, but we do not need to worry about that.
We just need to recognize that when we see two nested for loops, the time complexity is O(n^2).
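The same estimation works when single loops and nested loops are mixed. Below is a minimal sketch (countPairsSummingToMax is my own illustrative example, not from the original text): the single loop contributes about n steps and the nested loops about n^2 steps, so the total of about n^2 + n steps keeps only the fastest-growing term and is estimated as O(n^2).

// my own illustrative example: a single loop followed by nested loops
// assumes nums is non-empty
int countPairsSummingToMax(int[] nums) {
    // single loop: find the maximum element, about n steps
    int max = nums[0];
    for (int i = 0; i < nums.length; i++) {
        if (nums[i] > max) {
            max = nums[i];
        }
    }
    // nested loops: count pairs whose sum equals the maximum, about n^2 steps
    int count = 0;
    for (int i = 0; i < nums.length; i++) {
        for (int j = i + 1; j < nums.length; j++) {
            if (nums[i] + nums[j] == max) {
                count++;
            }
        }
    }
    // about n^2 + n steps in total, which is estimated as O(n^2)
    return count;
}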
Example Three, Time Complexity O(n), Space Complexity O(n):
void exampleFn(int n) {
int[] nums = new int[n];
}
This function creates an array of size n, so the space complexity is O(n). It is important to note that allocating and initializing the array also takes time, so the time complexity is O(n) as well.
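As a small extension of the same idea (this variant is my own, not part of the example), allocating an n-by-n two-dimensional array stores n^2 numbers, so both the time spent initializing it and the space it occupies are O(n^2).

// my own illustrative variant: allocate an n x n grid
void exampleFn2D(int n) {
    // stores n^2 integers, so both time and space complexity are O(n^2)
    int[][] grid = new int[n][n];
}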
Example Four, Time Complexity O(n), Space Complexity O(n):
// input an integer array, return a new array where each element is the square of the corresponding element in the original array
int[] squareArray(int[] nums) {
int[] res = new int[nums.length];
for (int i = 0; i < nums.length; i++) {
res[i] = nums[i] * nums[i];
}
return res;
}
Initializing the res array takes O(n) time, and the for loop also takes O(n) time, so the total time complexity is still O(n), where n represents the length of the nums array.
We declare a new res array with the same length as the nums array, so the space complexity is O(n).
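For contrast, here is a hypothetical in-place variant (squareInPlace is my own, not from the original text): if the problem allowed overwriting the input array instead of returning a new one, no extra array would be allocated, so the space complexity would drop to O(1) while the time complexity stays O(n).

// my own illustrative variant: square each element of nums in place
void squareInPlace(int[] nums) {
    for (int i = 0; i < nums.length; i++) {
        nums[i] = nums[i] * nums[i];
    }
    // no new array is allocated, so the extra space used is O(1);
    // the single loop still makes the time complexity O(n)
}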
For beginners, understanding these basic analyses of time and space complexity is sufficient for now. Let's continue learning.