One step at a time! 🪜
How complex is scalar multiplication?
- tags
- #Group Theory #Scalar Multiplication #Algorithmic
- categories
- Complexity Theory Algorithms
- reading time
- 2 minutes
Scalar multiplication
Let $\mathbb{G}$ be a commutative group under the addition law ($+$) and let $G \in \mathbb{G}$ be a generator. The group has order $p$. We define the scalar multiplication $k \cdot G$ as $\underbrace{G + \dots + G}_{k \text{ times}}$.
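As a concrete stand-in for the abstract group, here is a minimal Python sketch using $(\mathbb{Z}/p\mathbb{Z}, +)$ as the group; the values of `p` and `G` are illustrative assumptions, not taken from the text:

```python
# Toy commutative group: integers modulo p under addition.
# p and G are arbitrary illustrative choices.
p = 101  # group order
G = 5    # a generator (any nonzero element works, since p is prime)

def add(a, b):
    """The group law (+)."""
    return (a + b) % p

def naive_scalar_mul(k, P):
    """k * P computed as P + P + ... + P, costing k - 1 additions (k >= 1)."""
    acc = P
    for _ in range(k - 1):
        acc = add(acc, P)
    return acc
```

In this toy group, `naive_scalar_mul(k, G)` agrees with ordinary modular multiplication `(k * G) % p`, which makes the sketch easy to check against.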
- Consider $105 \cdot G$. How many additions does it involve?
- Now consider the binary decomposition $105 = 2^6 + 2^5 + 2^3 + 2^0$. By Horner's scheme it can be written as
$105 = ((((((1 \cdot 2 + 1) \cdot 2 + 0) \cdot 2 + 1) \cdot 2 + 0) \cdot 2 + 0) \cdot 2) + 1$. Given that a doubling (multiplication by 2) has the same cost as an addition, how many additions does $105 \cdot G$ involve?
- What is the complexity of the naive algorithm?
- Let $n$ be the number of bits in $k$. We can write $k = k_0 + 2 k_1 + 2^2 k_2 + … + 2^{n-1} k_{n-1}$ where $k_i \in \{0,1\}$. Can you derive an algorithm for computing $k \cdot G$ in fewer than $k-1$ additions?
- Assuming that on average half of the $k_i$’s are 1, how many additions do you need? What is the complexity of your algorithm?
- We now want to compute $k \cdot G_1 + k' \cdot G_2$ where $G_1, G_2 \in \mathbb{G}$. The obvious solution would be to compute $k \cdot G_1$ and $k' \cdot G_2$ separately and add them. Can you do better?
Hint: initialize an accumulator with $G_1 + G_2$ and scan the bits of $k$ and $k'$ simultaneously.
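One possible answer to the binary-decomposition questions above is the double-and-add algorithm, sketched here over the same toy group ($\mathbb{Z}/p\mathbb{Z}$ under addition; `p` and `G` are illustrative assumptions):

```python
# Toy group as before: integers modulo p under addition (illustrative values).
p = 101
G = 5

def add(a, b):
    """The group law (+)."""
    return (a + b) % p

def double_and_add(k, P):
    """Horner's scheme on the bits of k, most significant first:
    double the accumulator for every bit, add P when the bit is 1.
    Cost: (n - 1) doublings plus one addition per set bit after the first,
    so O(n) group operations for an n-bit scalar (k >= 1)."""
    bits = bin(k)[2:]        # binary digits of k, leading bit is always 1
    acc = P                  # accounts for the top bit
    for bit in bits[1:]:
        acc = add(acc, acc)          # doubling
        if bit == '1':
            acc = add(acc, P)        # addition for a set bit
    return acc
```

For $k = 105$ (binary `1101001`, 7 bits) this performs 6 doublings and 3 additions, i.e. 9 group operations instead of 104.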
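The hint describes what is commonly known as the Straus–Shamir trick: share one doubling chain between the two scalars, precomputing $G_1 + G_2$ for the positions where both bits are 1. A sketch in the same toy group (illustrative `p`, and the loop starts from an identity accumulator rather than $G_1 + G_2$ so that scalars of different bit lengths are handled uniformly):

```python
# Toy group as before: integers modulo p under addition (illustrative value).
p = 101

def add(a, b):
    """The group law (+)."""
    return (a + b) % p

def shamir(k1, P1, k2, P2):
    """Compute k1*P1 + k2*P2 with a single shared doubling chain.
    Precompute P1 + P2; then, per bit position, do one doubling
    and at most one addition -- cheaper than two separate chains."""
    T = add(P1, P2)                    # used when both bits are 1
    acc = 0                            # identity element
    n = max(k1.bit_length(), k2.bit_length())
    for i in reversed(range(n)):
        acc = add(acc, acc)            # shared doubling
        b1, b2 = (k1 >> i) & 1, (k2 >> i) & 1
        if b1 and b2:
            acc = add(acc, T)
        elif b1:
            acc = add(acc, P1)
        elif b2:
            acc = add(acc, P2)
    return acc
```

Per bit this costs one doubling plus, on average, about $3/4$ of an addition, versus two doublings and one addition for the two independent double-and-add runs.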
