Time Complexity of the Miller-Rabin Algorithm

The short answer is: log n is proportional to the number of digits of n. It keeps coming up because many of the operations involved work digit by digit, so their running time is proportional to the number of digits of n.

Here n − 1 = 2^r · d with d odd, so r is O(log n) and d is O(n), which means log d is O(log n).

The initial factoring of n − 1 into 2^r · d takes O(log n) division-by-2 steps, because the loop runs exactly r times and r is O(log n).
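As a sketch, that factoring step looks like this (the function name is mine, not from any particular library):

```python
def factor_out_twos(n):
    """Write n - 1 as 2^r * d with d odd; the loop runs r = O(log n) times."""
    r, d = 0, n - 1
    while d % 2 == 0:
        r += 1
        d //= 2
    return r, d

# Example: 13 - 1 = 12 = 2^2 * 3, so this returns (2, 3).
```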

Raising a base to the power d modulo n takes O(log d) multiplications using exponentiation by squaring, which is within O(log n). Each multiplication multiplies two numbers of at most O(log n) bits, followed by a reduction modulo n of a number with at most O(log n) bits. Each of these costs O(log^2 n) using the naive schoolbook algorithm (it can certainly be done faster). So this step costs O(log^3 n) bit operations in total.
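A minimal sketch of exponentiation by squaring, reducing modulo n after every multiplication so intermediate values stay at O(log n) bits (in practice Python's built-in `pow(a, d, n)` does exactly this):

```python
def power_mod(a, d, n):
    """Compute a^d mod n with O(log d) multiplications (square-and-multiply)."""
    result = 1
    a %= n
    while d > 0:
        if d & 1:                      # current bit of d is set
            result = (result * a) % n  # multiply it in, reduce mod n
        a = (a * a) % n                # square the base for the next bit
        d >>= 1                        # move to the next bit of d
    return result

# Example: 2^10 mod 1000 = 1024 mod 1000 = 24.
```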

The inner loop runs at most r times, i.e. O(log n) times. Each iteration does one squaring and one reduction modulo n, costing O(log^2 n) by the same argument as above (naive schoolbook algorithm), for a total of O(log^3 n).
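For concreteness, here is a sketch of one Miller-Rabin round built around that inner loop; the function name and signature are my own choices, with `r` and `d` taken to satisfy n − 1 = 2^r · d, d odd:

```python
def passes_round(n, a, r, d):
    """One Miller-Rabin round with base a, where n - 1 = 2^r * d and d is odd.

    Returns True if n passes (probably prime), False if a proves n composite.
    """
    x = pow(a, d, n)                # exponentiation by squaring: O(log^3 n)
    if x == 1 or x == n - 1:
        return True
    for _ in range(r - 1):          # inner loop: at most r - 1 = O(log n) squarings
        x = (x * x) % n             # each squaring + reduction: O(log^2 n)
        if x == n - 1:
            return True
    return False                    # a is a witness that n is composite

# Example: n = 13 (prime, 12 = 2^2 * 3) passes with base 2;
# n = 15 (composite, 14 = 2^1 * 7) fails with base 2.
```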

So the total, per round (i.e. per base tested), is O(log^3 n).

You can certainly improve this bound by replacing the naive schoolbook multiplication with a faster algorithm. I believe state-of-the-art multiplication brings the total down to O(log^2 n log log n).
