Complexity Theory

Divide and conquer! 💪

The time complexity describes the amount of computer time it takes to run an algorithm. It is estimated by counting the number of elementary operations performed by the algorithm. Since an algorithm’s running time may vary among different inputs of the same size, one commonly considers the worst-case time. Algorithmic complexities are classified according to the type of function appearing in the big O notation. Here are some examples:

| Name | Time complexity |
| --- | --- |
| Constant time | $\mathcal{O}(1)$ |
| Logarithmic time | $\mathcal{O}(\log n)$ |
| Linear time | $\mathcal{O}(n)$ |
| Linearithmic time | $\mathcal{O}(n \log n)$ |
| Quadratic time | $\mathcal{O}(n^2)$ |
| Polynomial time | $2^{\mathcal{O}(\log n)} = \mathrm{poly}(n)$ |
| Exponential time | $2^{\mathrm{poly}(n)}$ |
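To make two of these classes concrete, here is a small sketch (the function names and inputs are illustrative, not from the notes) comparing a linear-time scan with a logarithmic-time binary search, counting the comparisons each performs in its worst case:

```python
def linear_search(xs, target):
    """O(n): the worst case inspects every element."""
    steps = 0
    for i, x in enumerate(xs):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps

def binary_search(xs, target):
    """O(log n): halves the search interval at each step (xs must be sorted)."""
    lo, hi, steps = 0, len(xs) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid, steps
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps
```

On a sorted list of $1024$ elements, the linear scan may need $1024$ comparisons while binary search needs about $\log_2 1024 + 1 = 11$.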

Multiplication algorithms

Schoolbook multiplication

Schoolbook multiplication forms one partial product per digit of the second factor, shifts each by its digit's position, and sums them; for two $n$-digit numbers this costs $\mathcal{O}(n^2)$ single-digit multiplications. Consider the example $102 \times 257$:

One step at a time! 🪜
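The paper procedure can be sketched as follows (a minimal base-10 implementation that also counts the single-digit multiplications, to exhibit the quadratic cost):

```python
def schoolbook_multiply(a, b):
    """Multiply a and b as on paper: one partial product per digit of b.

    Returns the product and the number of single-digit multiplications,
    which is len(str(a)) * len(str(b)) -- quadratic in the digit count.
    """
    a_digits = [int(d) for d in str(a)][::-1]   # least significant digit first
    b_digits = [int(d) for d in str(b)][::-1]
    total, digit_muls = 0, 0
    for j, db in enumerate(b_digits):
        partial = 0
        for i, da in enumerate(a_digits):
            partial += (da * db) * 10**i        # one single-digit multiplication
            digit_muls += 1
        total += partial * 10**j                # shift by the position of db
    return total, digit_muls
```

For $102 \times 257$ this performs $3 \times 3 = 9$ single-digit multiplications and returns $26214$.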

Scalar multiplication

Let $\mathbb{G}$ be a commutative group under the addition law ($+$), let $G \in \mathbb{G}$ be a generator, and let $p$ be the order of the group. We define the scalar multiplication $k \cdot G$ as $\underbrace{G + \cdots + G}_{k \text{ times}}$.

  1. Consider $105 \cdot G$. How many additions does computing it naively involve?
  2. Now consider the binary decomposition $105 = 2^6 + 2^5 + 2^3 + 2^0$, which can be rewritten as
     $105 = (((((1 \cdot 2 + 1) \cdot 2 + 0) \cdot 2 + 1) \cdot 2 + 0) \cdot 2 + 0) \cdot 2 + 1$.
     Given that a doubling (multiplication by $2$) has the same cost as an addition, how many additions does $105 \cdot G$ involve now?
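One way to check your answers is a small sketch of both strategies, counting group operations; it uses the integers under $+$ with $G = 1$ as a toy stand-in for the group (these choices are illustrative, not part of the exercise):

```python
def add(x, y):
    # Stand-in for the group law; integers under + with G = 1 form a toy group.
    return x + y

def naive_scalar_mul(k, G):
    """k * G computed as G + G + ... + G: k - 1 group additions."""
    acc, ops = G, 0
    for _ in range(k - 1):
        acc, ops = add(acc, G), ops + 1
    return acc, ops

def double_and_add(k, G):
    """Scan the bits of k from the most significant one:
    double at every step, and add G whenever the bit is 1."""
    bits = bin(k)[2:]
    acc, ops = G, 0                          # leading 1 bit: start from G
    for b in bits[1:]:
        acc, ops = add(acc, acc), ops + 1    # doubling
        if b == '1':
            acc, ops = add(acc, G), ops + 1  # addition
    return acc, ops
```

For $k = 105$ the naive method uses $104$ additions, while double-and-add uses $6$ doublings plus $3$ additions, i.e. $9$ operations in total, matching the $\mathcal{O}(\log k)$ cost suggested by the binary decomposition.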