I am attempting to find the average-case time complexity (not the amortized complexity) of a binary counter. As I am not entirely confident in my complexity-analysis skills, I would like to confirm that my average-case analysis of the pseudocode provided below is correct.
Let k be the length of the array.
Increment(Array)
    i = 0
    while i < k and Array[i] == 1
        Array[i] = 0
        i = i + 1
    if i < k
        Array[i] = 1
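For reference, here is a runnable Python translation of the pseudocode (a sketch, assuming the array stores the least significant bit at index 0):

```python
def increment(array):
    """Increment a binary counter stored LSB-first as a list of 0/1 bits."""
    k = len(array)
    i = 0
    # Clear trailing 1s while the carry propagates.
    while i < k and array[i] == 1:
        array[i] = 0
        i = i + 1
    # Set the first 0 bit, unless the counter overflowed.
    if i < k:
        array[i] = 1
```

The running time of one call is proportional to the number of bits flipped, which is what the analysis below counts.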
To find the average time taken, I compute the average number of bits flipped per increment. I found this to be O(2 + k/2^k), which is O(1) for large k.
Is this the correct average case running time? If not, how would I begin to approach this problem?
I am assuming each input occurs with the same probability.
This means that each bit is independently on or off with probability 1/2.
The relevant distribution for the complexity is the geometric distribution: you flip coins and stop the experiment at the first tails outcome (at that point there is nothing further to carry).
The mean of this geometric distribution (success probability 1/2) is exactly 2, which you can derive from first principles, so the average number of flips per increment is at most 2 and the average-case complexity is indeed O(1).
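To sanity-check this conclusion, here is a small Monte Carlo sketch (`count_flips` is a hypothetical helper mirroring the pseudocode) that averages the number of flips over uniformly random k-bit counter states:

```python
import random

def count_flips(bits):
    """Increment the LSB-first counter in place; return the number of bits flipped."""
    i = 0
    while i < len(bits) and bits[i] == 1:
        bits[i] = 0          # clear a trailing 1 (one flip)
        i += 1
    if i < len(bits):
        bits[i] = 1          # set the first 0 bit (one more flip)
        return i + 1
    return i                 # overflow: only the trailing 1s were flipped

rng = random.Random(0)
k, trials = 20, 100_000
avg = sum(count_flips([rng.randint(0, 1) for _ in range(k)])
          for _ in range(trials)) / trials
print(avg)  # close to 2 for large k
```

The empirical average should land near 2 (the exact expectation is 2 - 2^(1-k), slightly below 2 because the counter cannot carry past bit k-1), independent of k.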