@arthurpaulino
Last active March 17, 2024 14:31

ZK snippets for the non-gifted

This is a personal attempt to pierce through the hard walls of ZK constructions. I don't want to skip any detail that would obscure my own understanding. Hopefully this is enough to have the same effect on the reader.

Disclaimer

Elegance and succinctness are non-goals. If there is a slower argument, even a mechanical one, I will choose it.

R1CS / QAP equivalence

Let's start with the following R1CS

$$\mathbb{A}Z \circ \mathbb{B}Z = \mathbb{C}Z$$

for fixed matrices $\mathbb{A}$, $\mathbb{B}$ and $\mathbb{C}$, each with $n$ rows and $m$ columns, and a vector $Z$ of length $m$.

For each column $j$ of $\mathbb{A}$, $\mathbb{B}$ and $\mathbb{C}$, we construct the respective polynomial functions $p_\mathbb{A_\mathcal{j}}$, $p_\mathbb{B_\mathcal{j}}$ and $p_\mathbb{C_\mathcal{j}}$ such that:

  • $p_\mathbb{A_\mathcal{j}}(i) = \mathbb{A_\mathcal{ij}}$
  • $p_\mathbb{B_\mathcal{j}}(i) = \mathbb{B_\mathcal{ij}}$
  • $p_\mathbb{C_\mathcal{j}}(i) = \mathbb{C_\mathcal{ij}}$

where $i$ ranges from $0$ to $n - 1$, running over the row indices.

In other words, each matrix will have $m$ associated polynomials, indexed by $j$, built to return the $i$-th element from its $j$-th column.

Let's also construct the polynomial function $d(i) = i(i - 1)(i - 2)\cdots(i - (n - 1))$, which evaluates to $0$ precisely at $i = 0, 1, \ldots, n - 1$.
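As a concrete sketch, the column polynomials can be built by Lagrange interpolation. Everything below is an illustrative assumption: arithmetic is done over a toy prime field with $P = 97$, and the example matrix is arbitrary.

```python
# Toy sketch: column polynomials via Lagrange interpolation over F_P.
# P = 97 is an arbitrary illustrative choice of prime field.
P = 97

def lagrange_poly(points):
    """Evaluate the unique polynomial through `points` (distinct x), mod P."""
    def f(x):
        total = 0
        for i, (xi, yi) in enumerate(points):
            term = yi
            for j, (xj, _) in enumerate(points):
                if i != j:
                    # multiply by (x - xj) / (xi - xj) in F_P
                    term = term * (x - xj) * pow(xi - xj, -1, P) % P
            total = (total + term) % P
        return total
    return f

def column_polys(M):
    """One polynomial per column j of M, satisfying p_j(i) = M[i][j]."""
    n, m = len(M), len(M[0])
    return [lagrange_poly([(i, M[i][j]) for i in range(n)]) for j in range(m)]

def d(x, n):
    """The vanishing polynomial d(x) = x(x-1)...(x-(n-1)) mod P."""
    out = 1
    for i in range(n):
        out = out * (x - i) % P
    return out
```

For instance, for a matrix with rows $(0, 1, 0)$ and $(2, 0, 3)$, `column_polys` returns three functions that reproduce the matrix entries at $i = 0, 1$, and `d(x, 2)` vanishes exactly at $x = 0$ and $x = 1$.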

Claim

Define the polynomial function $t$ as in

$$t(i) = \sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i) \times \sum_{j = 0}^{m - 1} Z_j p_\mathbb{B_\mathcal{j}}(i) - \sum_{j = 0}^{m - 1} Z_j p_\mathbb{C_\mathcal{j}}(i)$$

$Z$ is a solution for the original R1CS iff the polynomial $d$ divides the polynomial $t$.
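Before the proof, here is a small numerical check of the forward direction. The R1CS below (encoding $x \cdot x = y$ and $x \cdot y = z$), the witness, and the field size $P = 97$ are all illustrative assumptions; at the constraint indices we can use $p_\mathbb{A_\mathcal{j}}(i) = \mathbb{A_\mathcal{ij}}$ directly instead of interpolating.

```python
# Toy check: for a satisfied R1CS, t(i) = 0 at every constraint index i.
# Constraints: x*x = y and x*y = z, with Z = (1, x, y, z); P = 97 is arbitrary.
P = 97
A = [[0, 1, 0, 0], [0, 1, 0, 0]]
B = [[0, 1, 0, 0], [0, 0, 1, 0]]
C = [[0, 0, 1, 0], [0, 0, 0, 1]]

def t(i, Z):
    # At i = 0..n-1, p_j(i) is just the matrix entry, so the sums in t
    # reduce to dot products of the i-th rows with Z.
    a = sum(Z[j] * A[i][j] for j in range(len(Z))) % P
    b = sum(Z[j] * B[i][j] for j in range(len(Z))) % P
    c = sum(Z[j] * C[i][j] for j in range(len(Z))) % P
    return (a * b - c) % P
```

With the valid witness $Z = (1, 3, 9, 27)$, `t(i, Z)` is $0$ for $i = 0, 1$; perturbing any entry of $Z$ makes some $t(i, Z)$ nonzero.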

Proof

$\Longrightarrow$

Since $Z$ is a solution to the R1CS, for every row $i$ of $\mathbb{A}$, $\mathbb{B}$ and $\mathbb{C}$ (call them $L_\mathbb{A_\mathcal{i}}$, $L_\mathbb{B_\mathcal{i}}$ and $L_\mathbb{C_\mathcal{i}}$ respectively) we have that

$$(L_\mathbb{A_\mathcal{i}} \cdot Z)(L_\mathbb{B_\mathcal{i}} \cdot Z) = L_\mathbb{C_\mathcal{i}} \cdot Z \hspace{1cm}(1)$$

Now let's take a closer look at the dot product $L_\mathbb{A_\mathcal{i}} \cdot Z$. It's the sum of the pairwise products between the elements of $\mathbb{A}$'s $i$-th row and the elements of $Z$. And that's precisely $\sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i)$ by construction of every $p_\mathbb{A_\mathcal{j}}$ for $0 \le j < m$. This holds because $i$ is in the range $0 \le i < n$, where these polynomials agree with the matrix entries.

Analogously, the same can be said about $L_\mathbb{B_\mathcal{i}} \cdot Z$ and $L_\mathbb{C_\mathcal{i}} \cdot Z$. So we substitute in $(1)$:

$$ \sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i) \times \sum_{j = 0}^{m - 1} Z_j p_\mathbb{B_\mathcal{j}}(i) = \sum_{j = 0}^{m - 1} Z_j p_\mathbb{C_\mathcal{j}}(i) $$

$$ \sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i) \times \sum_{j = 0}^{m - 1} Z_j p_\mathbb{B_\mathcal{j}}(i) - \sum_{j = 0}^{m - 1} Z_j p_\mathbb{C_\mathcal{j}}(i) = 0 $$

The LHS is $t(i)$, so $t(i) = 0$. Furthermore, $i$ has been kept as a generic row index for $\mathbb{A}$, $\mathbb{B}$ and $\mathbb{C}$, so the polynomial $t$ has roots at $0, 1, \ldots, n - 1$. Each root $k$ contributes a factor $(i - k)$ to $t$, so their product $d(i) = i(i - 1)\cdots(i - (n - 1))$ divides $t$.

$\Longleftarrow$

If $d$ divides $t$, then there exists a polynomial function $h$ such that $t(i) = d(i)h(i)$. Now, $d$ was constructed to evaluate to $0$ at every $0 \le i < n$, so $t$ also evaluates to $0$ at every $0 \le i < n$.

Now we basically do the reverse path of what we've done before:

$$ \sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i) \times \sum_{j = 0}^{m - 1} Z_j p_\mathbb{B_\mathcal{j}}(i) - \sum_{j = 0}^{m - 1} Z_j p_\mathbb{C_\mathcal{j}}(i) = 0 $$

$$ \sum_{j = 0}^{m - 1} Z_j p_\mathbb{A_\mathcal{j}}(i) \times \sum_{j = 0}^{m - 1} Z_j p_\mathbb{B_\mathcal{j}}(i) = \sum_{j = 0}^{m - 1} Z_j p_\mathbb{C_\mathcal{j}}(i) $$

$$(L_\mathbb{A_\mathcal{i}} \cdot Z)(L_\mathbb{B_\mathcal{i}} \cdot Z) = L_\mathbb{C_\mathcal{i}} \cdot Z$$

And the above being true for all $0 \le i < n$ is precisely what it means to have $\mathbb{A}Z \circ \mathbb{B}Z = \mathbb{C}Z$.

Sources

Constructing linearly homomorphic commitments

Folding relaxed R1CS instances

A relaxed R1CS is a system of equations encoded in

$$\mathbb{A}Z \circ \mathbb{B}Z = u\mathbb{C}Z + E$$

for fixed matrices $\mathbb{A}$, $\mathbb{B}$ and $\mathbb{C}$. An instance for the relaxed R1CS includes the vectors $Z$ and $E$ as well as the scalar $u$. An instance is a solution iff the system of equations is satisfied by it.
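A satisfaction check can be sketched in a few lines; the field size $P = 97$ is an arbitrary toy choice. Note that a plain R1CS solution $Z$ corresponds exactly to the relaxed instance $(Z, E = 0, u = 1)$.

```python
# Toy sketch: check whether (Z, E, u) solves the relaxed R1CS, entrywise
# over F_P. P = 97 is an arbitrary illustrative choice.
P = 97

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) % P for row in M]

def is_solution(A, B, C, Z, E, u):
    lhs = [a * b % P for a, b in zip(matvec(A, Z), matvec(B, Z))]  # AZ ∘ BZ
    rhs = [(u * c + e) % P for c, e in zip(matvec(C, Z), E)]       # uCZ + E
    return lhs == rhs
```

For the single constraint $x \cdot x = y$ (with $\mathbb{A} = \mathbb{B} = (0, 1, 0)$ and $\mathbb{C} = (0, 0, 1)$), the instance $((1, 3, 9), (0), 1)$ is a solution, and so is the genuinely relaxed instance $((1, 3, 9), (88), 2)$, since $9 = (2 \cdot 9 + 88) \bmod 97$.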

Claim

Take $(Z_1, E_1, u_1)$, $(Z_2, E_2, u_2)$ and a scalar $r$, and consider the equations

$$\mathbb{A}Z_1 \circ \mathbb{B}Z_1 = u_1\mathbb{C}Z_1 + E_1 \hspace{1cm}(1)$$

$$\mathbb{A}Z_2 \circ \mathbb{B}Z_2 = u_2\mathbb{C}Z_2 + E_2 \hspace{1cm}(2)$$

Then $(1)$ and $(2)$ hold iff the following holds:

$$\mathbb{A}(Z_1 + rZ_2) \circ \mathbb{B}(Z_1 + rZ_2) = u\mathbb{C}(Z_1 + rZ_2) + E \hspace{1cm}(3)$$

where

$$u = u_1 + ru_2$$

$$E = E_1 + r(\mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1 - u_1\mathbb{C}Z_2 - u_2\mathbb{C}Z_1) + r^2E_2$$

(Recovering $(1)$ and $(2)$ from $(3)$ requires $(3)$ to hold for two values of $r$ with distinct squares, as the proof below makes precise.)
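The folding formulas can be checked numerically. Everything below is an illustrative assumption: arithmetic over a toy prime field with $P = 97$, the single constraint $x \cdot x = y$, and two plain R1CS solutions viewed as relaxed instances with $u = 1$ and $E = 0$.

```python
# Toy sanity check of the folding claim over F_P (P = 97 is arbitrary).
P = 97

def matvec(M, v): return [sum(a * b for a, b in zip(row, v)) % P for row in M]
def had(x, y):    return [a * b % P for a, b in zip(x, y)]
def add(x, y):    return [(a + b) % P for a, b in zip(x, y)]
def scale(k, x):  return [k * a % P for a in x]

def fold(A, B, C, inst1, inst2, r):
    """Fold two relaxed R1CS instances into one, following the claim."""
    (Z1, E1, u1), (Z2, E2, u2) = inst1, inst2
    Z = add(Z1, scale(r, Z2))
    u = (u1 + r * u2) % P
    # the cross term that makes the folded instance satisfy equation (3)
    T = add(had(matvec(A, Z1), matvec(B, Z2)), had(matvec(A, Z2), matvec(B, Z1)))
    T = add(T, scale(-u1 % P, matvec(C, Z2)))
    T = add(T, scale(-u2 % P, matvec(C, Z1)))
    E = add(add(E1, scale(r, T)), scale(r * r % P, E2))
    return Z, E, u

def satisfies(A, B, C, Z, E, u):
    """AZ ∘ BZ == uCZ + E, entrywise over F_P."""
    lhs = had(matvec(A, Z), matvec(B, Z))
    rhs = [(u * c + e) % P for c, e in zip(matvec(C, Z), E)]
    return lhs == rhs
```

For the constraint $x \cdot x = y$, folding the solutions $Z_1 = (1, 3, 9)$ and $Z_2 = (1, 5, 25)$ with any $r$ yields a satisfying relaxed instance, even though the folded $Z$ is not itself a plain R1CS solution.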

Proof

Expanding the LHS of $(3)$:

$\mathbb{A}(Z_1 + rZ_2) \circ \mathbb{B}(Z_1 + rZ_2) =$

$(\mathbb{A}Z_1 + r\mathbb{A}Z_2) \circ (\mathbb{B}Z_1 + r\mathbb{B}Z_2) =$

$\mathbb{A}Z_1 \circ \mathbb{B}Z_1 + r^2(\mathbb{A}Z_2 \circ \mathbb{B}Z_2) + r(\mathbb{A}Z_2 \circ \mathbb{B}Z_1 + \mathbb{A}Z_1 \circ \mathbb{B}Z_2)$

Expanding the RHS of $(3)$:

$u\mathbb{C}(Z_1 + rZ_2) + E =$

$(u_1 + ru_2)\mathbb{C}(Z_1 + rZ_2) + E_1 + r(\mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1 - u_1\mathbb{C}Z_2 - u_2\mathbb{C}Z_1) + r^2E_2 =$

$u_1\mathbb{C}Z_1 + r^2u_2\mathbb{C}Z_2 + ru_1\mathbb{C}Z_2 + ru_2\mathbb{C}Z_1 + E_1 + r(\mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1 - u_1\mathbb{C}Z_2 - u_2\mathbb{C}Z_1) + r^2E_2 =$

$u_1\mathbb{C}Z_1 + E_1 + r^2(u_2\mathbb{C}Z_2 + E_2) + r(u_1\mathbb{C}Z_2 + u_2\mathbb{C}Z_1 + \mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1 - u_1\mathbb{C}Z_2 - u_2\mathbb{C}Z_1) =$

$u_1\mathbb{C}Z_1 + E_1 + r^2(u_2\mathbb{C}Z_2 + E_2) + r(\mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1)$

Now, equating the expanded LHS and RHS: the term $r(\mathbb{A}Z_1 \circ \mathbb{B}Z_2 + \mathbb{A}Z_2 \circ \mathbb{B}Z_1)$ appears on both sides and cancels, so $(3)$ is equivalent to:

$$\mathbb{A}Z_1 \circ \mathbb{B}Z_1 + r^2(\mathbb{A}Z_2 \circ \mathbb{B}Z_2) = u_1\mathbb{C}Z_1 + E_1 + r^2(u_2\mathbb{C}Z_2 + E_2) \hspace{1cm}(4)$$

$\Longleftarrow$

If $(4)$ holds for every $r$, then in particular it holds for two values $r_1$ and $r_2$ with $r_1^2 \ne r_2^2$:

$$\mathbb{A}Z_1 \circ \mathbb{B}Z_1 + r_1^2(\mathbb{A}Z_2 \circ \mathbb{B}Z_2) = u_1\mathbb{C}Z_1 + E_1 + r_1^2(u_2\mathbb{C}Z_2 + E_2)$$

$$\mathbb{A}Z_1 \circ \mathbb{B}Z_1 + r_2^2(\mathbb{A}Z_2 \circ \mathbb{B}Z_2) = u_1\mathbb{C}Z_1 + E_1 + r_2^2(u_2\mathbb{C}Z_2 + E_2)$$

Subtracting the two:

$$(r_1^2 - r_2^2)(\mathbb{A}Z_2 \circ \mathbb{B}Z_2) = (r_1^2 - r_2^2)(u_2\mathbb{C}Z_2 + E_2)$$

Since $r_1^2 - r_2^2 \ne 0$, we can conclude that $\mathbb{A}Z_2 \circ \mathbb{B}Z_2 = u_2\mathbb{C}Z_2 + E_2$.

Substituting this back into $(4)$, we conclude that $\mathbb{A}Z_1 \circ \mathbb{B}Z_1 = u_1\mathbb{C}Z_1 + E_1$.

$\Longrightarrow$

Multiply $(2)$ by $r^2$ and add the result to $(1)$ to produce exactly the linear combination $(4)$.

Conclusion

We've seen how to fold a pair of relaxed R1CS instances into a single instance such that checking whether the folded instance is a solution is enough to be convinced that both instances in the original pair are solutions. The construction also guarantees the converse: if both instances in the pair are solutions, then the folded instance is a solution as well.

Sources
