I am wondering what the Big O would be in this case. I thought it was O(1), since the number of iterations is fixed (the array length is fixed), and even in the worst case (3999) the maximum number of iterations is still bounded. But I also saw that the answer can be O(n), because the running time has a linear relationship with the value of num: for smaller values of num the while loop iterates fewer times, for larger values it iterates more times, and the growth rate is linear.
So am I right to assume that even if the number of iterations is bounded, the time complexity is still O(n) as long as the growth is linear?
function convertRomanNumerals(num) {
  // value/symbol pairs from largest to smallest, including the subtractive forms
  let romanNumerals = [
    [1000, "M"],
    [900, "CM"],
    [500, "D"],
    [400, "CD"],
    [100, "C"],
    [90, "XC"],
    [50, "L"],
    [40, "XL"],
    [10, "X"],
    [9, "IX"],
    [5, "V"],
    [4, "IV"],
    [1, "I"]
  ];
  // 1 <= num <= 3999
  // 1990 = M CM XC
  // 1666 = M DC LX VI
  let result = "";
  for (let i = 0; i < romanNumerals.length; i++) {
    while (num >= romanNumerals[i][0]) {
      result += romanNumerals[i][1];
      num -= romanNumerals[i][0];
    }
  }
  return result;
}
console.log(convertRomanNumerals(2023))
2 Answers
We can count the number of iterations and gather some statistics from it.
The count increases, but very slowly, so it is definitely not O(1).
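For example, here is a minimal sketch of that idea, assuming a hypothetical helper countIterations that is just the question's loop with a counter instead of the result string:

// Hypothetical helper: same table and loop as in the question,
// but it only counts how many times the inner while-loop body runs.
function countIterations(num) {
  const romanNumerals = [
    [1000, "M"], [900, "CM"], [500, "D"], [400, "CD"],
    [100, "C"], [90, "XC"], [50, "L"], [40, "XL"],
    [10, "X"], [9, "IX"], [5, "V"], [4, "IV"], [1, "I"]
  ];
  let iterations = 0;
  for (let i = 0; i < romanNumerals.length; i++) {
    while (num >= romanNumerals[i][0]) {
      num -= romanNumerals[i][0];
      iterations++;
    }
  }
  return iterations;
}

// Sample a few inputs to see how the count grows with num.
for (const n of [1, 3, 38, 388, 3888]) {
  console.log(n, countIterations(n));
}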
To properly understand complexity, we need to understand the concept of infinity.
First, there is the concept of actual infinity, which is either infinitely large or infinitely small.
Second, there is the concept of a cluster point, that is, a threshold beyond which we no longer care about the actual bounds, because the quantity is already too large or too small to matter.
For example, if you have to walk 30 000 miles, that is not actual infinity, but it is beyond the threshold at which you still care about the exact size.
So, when people analyze functions, they choose an error limit, either arbitrarily or for a good reason: the maximum distance allowed between the value you are approaching and the actual value, that is, the maximum error you tolerate. This choice specifies where your cluster point is.
Now, complexity is the function that describes the pattern your algorithm's number of steps follows as the input size grows towards infinity.
In your situation the input is guaranteed to always be small, so you will never reach the cluster point where the function becomes too slow. But your function's complexity is not determined by how it performs when the input is next to nothing; it is determined by how quickly it approaches the cluster point beyond which you would no longer wait for it to finish.
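To make that concrete with a rough sketch (purely hypothetical, assuming the 3999 cap were removed, using a made-up countSteps helper that mirrors the question's loop): the [1000, "M"] row alone already forces the inner loop to run Math.floor(num / 1000) times, so the number of steps keeps growing in proportion to the value of num instead of levelling off.

// Hypothetical experiment: count the loop iterations for num beyond 3999.
function countSteps(num) {
  const table = [
    [1000, "M"], [900, "CM"], [500, "D"], [400, "CD"],
    [100, "C"], [90, "XC"], [50, "L"], [40, "XL"],
    [10, "X"], [9, "IX"], [5, "V"], [4, "IV"], [1, "I"]
  ];
  let steps = 0;
  for (const [value] of table) {
    while (num >= value) {
      num -= value;
      steps++;
    }
  }
  return steps;
}

// Doubling num roughly doubles the step count: linear in the value of num.
for (const num of [10000, 20000, 40000, 80000]) {
  console.log(num, countSteps(num));
}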