How to construct SNARKs for recursive functions (part 1)

(This gist is a semi-formal and hopefully intuitive summary of Nova: what it accomplishes and why, though the mechanics are not fully described. It assumes general knowledge of R1CS and SNARKs.)

Introduction

It is well known that polynomial-time computable functions can be represented as an R1CS problem, and thus a (zk)SNARK can be created for it. Is it possible to extend this to general recursive functions?

The first, naive approach is to simply unroll the recursive function up to a certain limit and cut off the remaining recursive calls. This is far from a good approach, however:

  1. We must decide statically what the maximum recursion depth is.
  2. The unrolled circuit could become very large.
  3. If the function recursively calls itself multiple times within a block, the unrolling grows exponentially!

We can do better, though. Instead of creating a circuit representing the entire computation of the function, let us try to create a circuit that performs only one step of the computation. We could then prove each iteration of the circuit and somehow combine all these steps into a single proof.

Recursive SNARKs

Consider the simplest case of a recursive function: the natural number recursion scheme. There are only two components to it, an initial value $x_0$ and a step function $F$. The recursion scheme gives the function $G$ defined by

$G(0, x_0) = x_0$
$G(n, x_0) = F(G(n-1, x_0))$
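
To make the scheme concrete, here is a tiny sketch in Python; the step function $F$ below is a made-up example (only the shape of the recursion matters).

```python
# A minimal sketch of the natural number recursion scheme, with an
# arbitrary example step function F(x) = x^2 + 1.
def F(x):
    return x * x + 1

def G(n, x0):
    # G(0, x0) = x0 ;  G(n, x0) = F(G(n-1, x0))
    return x0 if n == 0 else F(G(n - 1, x0))

assert G(3, 2) == F(F(F(2)))  # F applied 3 times to the initial value
```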

Instead of unrolling the recursive call $G(n-1, x_0)$ up to a certain point, the circuit could take as nondeterministic advice a value $rec$ claimed to be the result of the recursive call; the circuit would then simply apply $F$ to this value. But we must somehow constrain $rec$ so that it is indeed the value of $G(n-1, x_0)$. This could be done by requiring a proof of this claim... and a SNARK is exactly such a proof! The circuit would then verify the SNARK proof and apply $F$ to $rec$.

The extended version of $G$ that does this, which will be represented by a circuit, will have the form $G'(\textrm{vk}, n, x_0 | ω, u, π) → y,$ where $\textrm{vk}, n, x_0$ are the public inputs, $ω, u, π$ the nondeterministic advice (hidden inputs) and $y$ the (public) output. $\textrm{vk}$ is a verifier key for a SNARK, $ω$ the witness for the $F$ call, $u$ an instance of an R1CS, which should have a public IO size of 4 (same as $G'$), and $π$ a supposed SNARK proof of $u$. Here is how it will proceed (a code sketch follows the pseudocode):

  • if $n = 0$
    • return $x_0$
  • else:
    • check $u\textrm{.input} = (\textrm{vk}, n-1, x_0)$
    • check $\textrm{SNARK.V}(\textrm{vk}, u, π)$
    • compute $y ← F(u\textrm{.output})$ using ${ω}$ as witness
    • return $y$
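
As a plain-code sketch (not a circuit), the logic of $G'$ could look like the following; the helpers `F` and `snark_verify`, as well as the `Instance` record, are hypothetical stand-ins for the step function, the real SNARK verifier, and an R1CS instance with public IO.

```python
from dataclasses import dataclass

@dataclass
class Instance:
    """Hypothetical stand-in for an R1CS instance with public IO size 4."""
    input: tuple    # claimed public inputs, here (vk, n-1, x0)
    output: object  # claimed public output y

def G_prime(vk, n, x0, omega, u, pi, *, F, snark_verify):
    """One step of the extended G'. `omega` would be F's witness inside a
    real circuit; here F is simply computed directly."""
    if n == 0:
        return x0
    # The advice instance u must claim exactly the previous public inputs...
    assert u.input == (vk, n - 1, x0)
    # ...and come with a valid SNARK proof for that claim.
    assert snark_verify(vk, u, pi)
    # Apply one more step of the recursion to the previous output.
    return F(u.output)
```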

Now, because $G'$ can be computed in polynomial time, it too can be represented as an R1CS problem, meaning we can use SNARKs for it. We can then plug the verifier key for the SNARK of $G'$ into $G'$ itself! That is how $G'$ will be able to prove statements about itself in a recursive fashion. We are now at a point where we can give a protocol for proving, one step at a time, the correct execution of $G$, and for verifying this proof: the so-called incrementally verifiable computation (IVC):

$\textrm{IVC.P}(\textrm{pk}, \textrm{vk}, n, x_0, ω) → (u_n, π_n):$

  • if n = 0:
    • compute the trivial case $(u_0, w_0) ← \textrm{trace}(G', (\textrm{vk}, 0, x_0 | ø, ø, ø)),$ for $ø$ a dummy value, like $0$
    • compute $π_0 ← \textrm{SNARK.P}(\textrm{pk}, u_0, w_0)$
    • return $(u_0, π_0)$
  • else:
    • recursively compute $(u_{n-1}, π_{n-1}) ← \textrm{IVC.P}(\textrm{pk}, \textrm{vk}, n-1, x_0, ω)$
    • compute $(u_n, w_n) ← \textrm{trace}(G', (\textrm{vk}, n, x_0 | ω_{n-1}, u_{n-1}, π_{n-1}))$
    • compute $π_n ← \textrm{SNARK.P}(\textrm{pk}, u_n, w_n)$
    • return $(u_n, π_n)$

$\textrm{IVC.V}(\textrm{vk}, n, x_0, u_n, π_n):$

  • check $u_n\textrm{.input} = (\textrm{vk}, n, x_0)$
  • check $\textrm{SNARK.V}(\textrm{vk}, u_n, π_n)$

The trace function that appears in the IVC prover is the function that solves the circuit given the public inputs and all the necessary nondeterministic advice, returning an instance together with its full witness. The advice is the part of the witness that cannot be deterministically derived; here $ω$ is the array of per-step advice for $F$, with $ω_{i-1}$ the advice used at step $i$. A sketch of both routines in code follows.
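
Below is a minimal sketch of this prover/verifier pair, assuming hypothetical helpers `trace` (which solves the circuit and returns an instance-witness pair), `snark_prove` and `snark_verify`, and reusing the `Instance` shape from the previous sketch; $ω$ is modeled as a Python list of per-step advice.

```python
NULL = 0  # dummy advice for the base case

def ivc_prove(pk, vk, n, x0, omega, *, trace, snark_prove):
    """IVC.P: prove n steps of G by proving one more step of G' on top of
    the recursively obtained proof of the previous n-1 steps."""
    if n == 0:
        u0, w0 = trace((vk, 0, x0), (NULL, NULL, NULL))
        return u0, snark_prove(pk, u0, w0)
    u_prev, pi_prev = ivc_prove(pk, vk, n - 1, x0, omega,
                                trace=trace, snark_prove=snark_prove)
    u_n, w_n = trace((vk, n, x0), (omega[n - 1], u_prev, pi_prev))
    return u_n, snark_prove(pk, u_n, w_n)

def ivc_verify(vk, n, x0, u_n, pi_n, *, snark_verify):
    """IVC.V: only the last step is checked directly; all previous steps
    are enforced inside G' itself."""
    return u_n.input == (vk, n, x0) and snark_verify(vk, u_n, pi_n)
```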

Claim: if $\textrm{IVC.V}(\textrm{vk}, n, x_0, u_n, π_n)$ is satisfied, then, with very high probability,

$$y_n = u_n\textrm{.output} = G(n, x_0),$$

i.e., $y_n = F (F ... (x_0))$ where $F$ is applied $n$ times. Although handwavy when it comes to probability, here is the intuition behind this. If $\textrm{IVC.V}(\textrm{vk}, n, x_0, u_n, π_n)$ is satisfied for $n > 0$, then, because of the SNARK proof $π_n$, which attests with very high probability to the validity of $G'$ on instance $u_n$, there must be advice $ω_{n-1}, u_{n-1}, π_{n-1}$ such that $u_{n-1}\textrm{.input} = (\textrm{vk}, n-1, x_0)$, $\textrm{SNARK.V}(\textrm{vk}, u_{n-1}, π_{n-1})$ is satisfied, and $u_n\textrm{.output} = F (u_{n-1}\textrm{.output})$. In other words, $G'$ itself performs the $\textrm{IVC.V}$ checks for $(\textrm{vk}, n-1, x_0, u_{n-1}, π_{n-1})$, and so on inductively until reaching the base case, which is trivially true.
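
One way to phrase this inductive argument, still informally, is as an invariant on the verified instances:

$$\big(\textrm{SNARK.V}(\textrm{vk}, u_k, π_k) \;\wedge\; u_k\textrm{.input} = (\textrm{vk}, k, x_0)\big) \;\Longrightarrow\; u_k\textrm{.output} = G(k, x_0) \quad \text{(with high probability)}.$$

The base case $k = 0$ holds by construction of $G'$, and the inductive step is exactly the check $G'$ performs: a valid $π_k$ forces the existence of advice $(ω, u_{k-1}, π_{k-1})$ satisfying the left-hand side at $k-1$, with $u_k\textrm{.output} = F(u_{k-1}\textrm{.output})$.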

The scheme we constructed, although much better than the naive approach, still has a large issue: a SNARK proof must be constructed for each recursion step, which is incredibly expensive. There is a much better approach though, which we will discuss after a digression.

Folding schemes

A folding scheme is a non-interactive protocol between a prover and a verifier in which the prover combines two instances $u_1$ and $u_2$ with the same structure into another (folded) instance $u$ of the same structure, such that $u$ is satisfiable if and only if both $u_1$ and $u_2$ are satisfiable (or at least this relation holds with very high probability, being virtually impossible to break). The verifier is able to compute the folded instance given a folding scheme proof. A folding scheme, if it exists, gives us the ability to check multiple instances at the same time using a single SNARK proof.

A folding scheme for R1CS itself is not currently known. However, Kothapalli, Setty, and Tzialla (see [1]) found a folding scheme for a generalization of the R1CS problem, called relaxed R1CS, along with an efficient SNARK for this relaxed system. Although the scheme is more general, we will only use it to fold plain R1CS instances (a particular case of relaxed R1CS) with relaxed R1CS instances, resulting in relaxed R1CS instances; that is:

$fold : R1CS × R1CS_{relaxed} → R1CS_{relaxed}$

Although I won't show the details of the folding scheme or of the relaxed R1CS, it suffices to know that:

  1. folding two instances is a cheap operation (just a couple of linear combinations)
  2. for any R1CS, there exists an instance $u_⊥$ of the relaxed system, called the trivial instance, which is trivially satisfiable

We can then think of a relaxed R1CS instance as a list of instances: the trivial instance $u_⊥$ is the empty list, and other instances are constructed by consing, i.e. folding, R1CS instances onto it.
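
Just to make point 1 above concrete, here is a rough numerical sketch of the folding arithmetic, assuming the relaxed R1CS shape from [1]: an instance-witness pair $(W, x, u, E)$ satisfies $AZ \circ BZ = u \cdot CZ + E$ with $Z = (W, x, u)$, a plain R1CS instance being the special case $u = 1$, $E = 0$. The field is toy-sized and the commitments that Nova actually folds are omitted, so this only shows the "couple of linear combinations".

```python
import numpy as np

P = 97  # toy prime field; a real system uses a cryptographically large prime

def is_satisfied(A, B, C, inst):
    # inst = (W, x, u, E).  Relaxed R1CS check: Az * Bz == u*Cz + E (mod P),
    # where Z = (W, x, u) and * is the entrywise (Hadamard) product.
    W, x, u, E = inst
    Z = np.concatenate([W, x, [u]]) % P
    return np.array_equal((A @ Z) * (B @ Z) % P, (u * (C @ Z) + E) % P)

def fold(A, B, C, plain, relaxed, r):
    """Fold a plain R1CS instance-witness pair (u=1, E=0) into a relaxed one.
    r is the verifier's random challenge (Fiat-Shamir in practice)."""
    (W1, x1, u1, E1), (W2, x2, u2, E2) = plain, relaxed
    Z1 = np.concatenate([W1, x1, [u1]])
    Z2 = np.concatenate([W2, x2, [u2]])
    # Cross term T, chosen so that the folded pair satisfies the relaxed
    # equation whenever both inputs do.
    T = ((A @ Z1) * (B @ Z2) + (A @ Z2) * (B @ Z1)
         - u1 * (C @ Z2) - u2 * (C @ Z1)) % P
    # The fold itself: a couple of linear combinations.
    return ((W1 + r * W2) % P, (x1 + r * x2) % P,
            (u1 + r * u2) % P, (E1 + r * T + r * r * E2) % P)
```

If both inputs satisfy their equations then the output does too, for any $r$; the converse (that a satisfiable folded instance implies both originals are satisfiable) is where the randomness of $r$ comes in. In Nova itself the verifier only sees homomorphic commitments to $W$, $E$ and $T$, which fold with these same linear combinations, while the prover folds the openings as above.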

Nova

Now, how can folding schemes help us with the goal of verifying a recursive function? Recall how the extended $G'$ function had to verify a SNARK proof at every recursion step. We can, instead, postpone this verification by folding the instance of the recursive call into an auxiliary accumulator instance which is then output at every step. Then, at the very last return of $G'$, we can check this accumulator instance with a SNARK, and it will be as if we had checked the instances of every recursive step at once! This approach was first described in the Nova paper [1], and I will now try to convey it in a more informal and handwavy way.

Here is the shape of the new function: $G'(\textrm{vk}, n, x_0 | ω, u, U, π) → (y, h),$ where $\textrm{vk}, n, x_0$ are the public inputs, $ω, u, U, π$ the nondeterministic advice, and $y$ and $h$ the (public) outputs. $\textrm{vk}$ is the verifier key for a folding scheme, $ω$ the witness for the $F$ call, $u$ an R1CS instance and $U$ (the accumulator) a relaxed R1CS instance, both of which should have a public IO size of 5 (same as $G'$), and $π$ a supposed folding proof for $u$ and $U$. The value $h$ represents a (collision-resistant) hash of the updated accumulator instance. The reason we must output the hash of the instance instead of the instance itself is that the instance contains the output of $G'$ as part of its public IO, so outputting it directly would be circular.

This is how the new $G'$ should work (again, a code sketch follows the pseudocode):

  • if $n = 0$
    • return $(x_0, \textrm{hash}(u_⊥)),$ i.e. return $x_0$ and the hash of the trivial instance
  • else:
    • check $u\textrm{.input} = (\textrm{vk}, n-1, x_0)$
    • check $u\textrm{.output}.h = \textrm{hash}(U)$
    • compute $U' ← \textrm{fold.V}(\textrm{vk}, u, U, π)$
    • compute $y ← F(u\textrm{.output}.y)$ using ${ω}$ as witness
    • return $(y, \textrm{hash}(U'))$
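
In the same sketch style as before, the new step logic could look like this; `F`, `fold_verify`, `hash_inst` and the trivial instance `u_bottom` are hypothetical stand-ins, and `u.output` is the previous step's pair $(y, h)$.

```python
def G_prime_nova(vk, n, x0, omega, u, U, pi, *, F, fold_verify, hash_inst, u_bottom):
    """One Nova-style step of G': fold the previous step's instance into the
    accumulator instead of verifying a SNARK. `omega` would be F's witness
    inside a real circuit."""
    if n == 0:
        # Base case: output x0 together with the hash of the empty accumulator.
        return x0, hash_inst(u_bottom)
    y_prev, h_prev = u.output            # previous step's public outputs (y, h)
    assert u.input == (vk, n - 1, x0)    # u claims the previous public inputs
    assert h_prev == hash_inst(U)        # u's accumulator hash matches U
    U_next = fold_verify(vk, u, U, pi)   # cheap: a few linear combinations
    return F(y_prev), hash_inst(U_next)
```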

The $G'$ function now only has to fold instances and compute a hash, instead of verifying a SNARK. This means the prover does not have to produce a SNARK proof at every step, but only a fold and a hash. If the hash function is tailored for R1CS, like Poseidon, which takes only around 300 or so constraints, this should be much, much faster than producing a SNARK proof.

Now, as before, since $G'$ can be computed in polynomial time, it can be represented as R1CS and we can plug the verifier key for $G'$'s folding scheme into $G'$ itself. The protocol now comes in 3 parts:

$\textrm{IVC.F}(pk_{\textrm{fold}}, vk_{\textrm{fold}}, n, x_0, ω) → (u_n, w_n, U_n, W_n):$

  • if n = 0:
    • compute the trivial case $(u_0, w_0) ← \textrm{trace}(G', (vk_{\textrm{fold}}, 0, x_0 | ø, ø, ø, ø)),$ for $ø$ a dummy value, like $0$
    • return $(u_0, w_0, u_⊥, w_⊥)$
  • else:
    • recursively compute $(u_{n-1}, w_{n-1}, U_{n-1}, W_{n-1}) ← \textrm{IVC.F}(pk_{\textrm{fold}}, vk_{\textrm{fold}}, n-1, x_0, ω)$
    • compute $(U_n, W_n, π_n) ← \textrm{fold.P}(pk_{\textrm{fold}}, u_{n-1}, U_{n-1}, w_{n-1}, W_{n-1})$
    • compute $(u_n, w_n) ← \textrm{trace}(G', (vk_{\textrm{fold}}, n, x_0 | ω_{n-1}, u_{n-1}, U_{n-1}, π_n))$
    • return $(u_n, w_n, U_n, W_n)$

$\textrm{IVC.P}(pk_{\textrm{fold}}, pk_{\textrm{SNARK}}, u_n, w_n, U_n, W_n) → (π, Π):$

  • compute $(U_{n+1}, W_{n+1}, π) ← \textrm{fold.P}(pk_{\textrm{fold}}, u_n, U_n, w_n, W_n)$
  • compute $Π ← \textrm{SNARK.P}(pk_{\textrm{SNARK}}, U_{n+1}, W_{n+1})$
  • return $(π, Π)$

$\textrm{IVC.V}(vk_{\textrm{fold}}, vk_{\textrm{SNARK}}, n, x_0, u_n, U_n, π, Π):$

  • check $u_n\textrm{.input} = (vk_{\textrm{fold}}, n, x_0)$
  • check $u_n\textrm{.output}.h = \textrm{hash}(U_n)$
  • compute $U_{n+1} ← \textrm{fold.V}(vk_{\textrm{fold}}, u_n, U_n, π)$
  • check $\textrm{SNARK.V}(vk_{\textrm{SNARK}}, U_{n+1}, Π)$

The first part, $\textrm{IVC.F}$, produces, step by step, the folded accumulator, its folded witness, and a proof of how to fold the next step. Notice it needs the nondeterministic advice for each step of the computation, represented as an array $ω$, which must be precomputed. The second part folds in the last step's instance and produces a SNARK for the resulting folded instance, and the third part verifies it. A sketch of all three parts in code follows.
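
Putting the three parts together in the same sketch style, with hypothetical helpers `trace`, `fold_prove`, `fold_verify`, `snark_prove`, `snark_verify` and `hash_inst`, and with `u_bottom`, `w_bottom` the trivial relaxed instance and witness:

```python
NULL = 0  # dummy advice for the base case

def ivc_fold(pk_fold, vk_fold, n, x0, omega, *, trace, fold_prove,
             u_bottom, w_bottom):
    """IVC.F: fold step by step, returning the last step's instance/witness
    together with the running accumulator instance/witness."""
    if n == 0:
        u0, w0 = trace((vk_fold, 0, x0), (NULL, NULL, NULL, NULL))
        return u0, w0, u_bottom, w_bottom
    u_p, w_p, U_p, W_p = ivc_fold(pk_fold, vk_fold, n - 1, x0, omega,
                                  trace=trace, fold_prove=fold_prove,
                                  u_bottom=u_bottom, w_bottom=w_bottom)
    U_n, W_n, pi_n = fold_prove(pk_fold, u_p, U_p, w_p, W_p)
    u_n, w_n = trace((vk_fold, n, x0), (omega[n - 1], u_p, U_p, pi_n))
    return u_n, w_n, U_n, W_n

def ivc_prove(pk_fold, pk_snark, u_n, w_n, U_n, W_n, *, fold_prove, snark_prove):
    """IVC.P: fold in the last step's instance, then SNARK the accumulator."""
    U_f, W_f, pi = fold_prove(pk_fold, u_n, U_n, w_n, W_n)
    return pi, snark_prove(pk_snark, U_f, W_f)

def ivc_verify(vk_fold, vk_snark, n, x0, u_n, U_n, pi, Pi, *,
               fold_verify, snark_verify, hash_inst):
    """IVC.V: recompute the final fold and verify a single SNARK for it."""
    if u_n.input != (vk_fold, n, x0):
        return False
    if u_n.output[1] != hash_inst(U_n):  # u_n.output is (y, h)
        return False
    U_f = fold_verify(vk_fold, u_n, U_n, pi)
    return snark_verify(vk_snark, U_f, Pi)
```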

Claim: Again, if $\textrm{IVC.V}(vk_{\textrm{fold}}, vk_{\textrm{SNARK}}, n, x_0, u_n, U_n, π, Π)$ is satisfied, then, with very high probability,

$$y_n = u_n\textrm{.output}.y = G(n, x_0),$$

i.e., $y_n = F (F ... (x_0))$ where $F$ is applied $n$ times.

The intuition is as follows. Suppose $\textrm{IVC.V}(vk_{\textrm{fold}}, vk_{\textrm{SNARK}}, n, x_0, u_n, U_n, π, Π)$ is satisfied, for $n>0$. Since $U_{n+1}$ is verified by a SNARK proof, both $u_n$ and $U_n$ must also be satisfiable, by the folding property. Since $u_n$, with $u_n.io = (vk_{\textrm{fold}}, n, x_0, y_n, \textrm{hash}(U_n))$, is a pure R1CS instance of $G'$, we can invoke the properties of $G'$. That is, there must be advice $ω_{n-1}, u_{n-1}, U_{n-1}, π_{n-1}$ such that $u_{n-1}.io = (vk_{\textrm{fold}}, n-1, x_0, y_{n-1}, \textrm{hash}(U_{n-1}))$, $y_n = F(y_{n-1})$ and $U_n$ is the folded instance of $u_{n-1}$ and $U_{n-1}$. But since $U_n$ is satisfiable, both $u_{n-1}$ and $U_{n-1}$ must also be satisfiable, so the same argument applies recursively until reaching the trivial base case. In the end we get $y_n = F(F ... (x_0))$ with $F$ applied $n$ times, as we wanted.

References

[1]: "Nova: Recursive Zero-Knowledge Arguments from Folding Schemes", Abhiram Kothapalli, Srinath Setty, and Ioanna Tzialla, 2021.
[2]: "SuperNova: Proving universal machine executions without universal circuits", Abhiram Kothapalli, and Srinath Setty, 2022.
