zkcrypto / bellman
zk-SNARK library.
License: Other
Hello,
I'm working on a simple Merkle tree SNARK. I have printed logs to ensure that my circuit is working, but it seems to fail with this error before returning the proof. I'm unsure how to proceed, as I'm following examples from around the web that seem to work. Any ideas why I'm getting this error?
It occurs here: https://github.com/drewstone/rust-miximus/blob/master/src/zk_util.rs#L107
The unwrap fails with exactly this error:
thread 'test::test_generate_params' panicked at 'called `Result::unwrap()` on an `Err` value: IoError(Custom { kind: UnexpectedEof, error: StringError("expected more bases from source") })', src/libcore/result.rs:997:5
Snarkjs is a widely used zk-SNARK library. How can I read the files generated by snarkjs (e.g. proof.json, public.json, and verification_key.json)?
By "cost tree" I mean the number of constraints (and possibly other performance-relevant information) of each abstraction and the abstractions that it uses, recursively.
This replaces the groth16 feature flag with a separate crate.
It would be useful to add num_constraints, num_inputs, and num_aux methods to the ConstraintSystem trait to get a quick sense of its cost. I think these methods are general enough to apply to all implementors of ConstraintSystem. If you'd like, I can send in a pull request for this as well.
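For illustration, here is a minimal stdlib-only sketch of what a counting constraint system exposing the three proposed accessors could look like. This is not the actual bellman trait (which is generic over a field and allocates variables through closures); `CountingCs` and its methods are hypothetical names for the idea.

```rust
/// Hypothetical stand-in for a constraint-system builder that only tallies
/// sizes. The real bellman `ConstraintSystem` trait is richer; this keeps
/// just the counts that the proposed num_* methods would expose.
#[derive(Default)]
struct CountingCs {
    num_inputs: usize,      // public inputs (instance)
    num_aux: usize,         // private witness variables
    num_constraints: usize, // enforced R1CS constraints
}

impl CountingCs {
    fn alloc_input(&mut self) -> usize {
        self.num_inputs += 1;
        self.num_inputs - 1
    }
    fn alloc(&mut self) -> usize {
        self.num_aux += 1;
        self.num_aux - 1
    }
    fn enforce(&mut self) {
        self.num_constraints += 1;
    }
    // The accessors proposed above:
    fn num_constraints(&self) -> usize { self.num_constraints }
    fn num_inputs(&self) -> usize { self.num_inputs }
    fn num_aux(&self) -> usize { self.num_aux }
}

fn main() {
    // "Prove knowledge of x with x * x = y": one public input, one witness,
    // one multiplication constraint.
    let mut cs = CountingCs::default();
    let _y = cs.alloc_input();
    let _x = cs.alloc();
    cs.enforce(); // x * x = y
    println!("{} constraints, {} inputs, {} aux",
             cs.num_constraints(), cs.num_inputs(), cs.num_aux());
}
```

Because the accessors only read counters the builder already maintains, default trait methods would likely not even be needed; each implementor can return its own tallies.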
Hello,
I am looking to generate a trusted setup using an MPC. I was wondering how to do this properly, as currently I can only generate the trusted setup by myself.
Thank you
The above is a great article explaining the details. It mentions that the point is chosen randomly during setup, but is it safe to do so? Is it possible that the chosen point has extremely low order?
// Input constraints to ensure full density of IC query
// x * 0 = 0
for i in 0..assembly.num_inputs {
    assembly.enforce(|| "", |lc| lc + Variable(Index::Input(i)), |lc| lc, |lc| lc);
}
At Zcon0, some of us had really great conversations on how to speed up SNARKs, and some (I believe the codaprotocol folks) mentioned using GPUs to parallelize proving. Since SNARKs are highly parallelizable (and since there is some work showing that), what would it take to make bellman GPU-friendly?
Is anyone here an expert in Rust/GPU programming?
Getting an error saying (failed to resolve: could not find pairing in bellman).
Hi everyone, I'm new to zk-SNARKs and I'm trying to build an example with bellman:
https://github.com/bui-duc-huy/bellman-example
I don't know why my example runs very slowly during generate_random_parameters and create_random_proof. Can anyone explain the reason?
Many thanks.
It doesn't really matter whether we use the full field or not (as long as z != 0), but it makes various scalar mults cheaper.
Originally posted by @str4d in #59 (comment)
Took me a while to figure this one out, and I'm not entirely sure what the issue may be, but I've definitely narrowed it down to the fact that this API (bellman::groth16::Parameters::read<R: Read>(reader: R, checked: bool) -> Result<Self>) locks up my machine when I specify checked = true.
It seems to work just fine when I don't use it (checked = false), so I'm curious whether the flag is even necessary, or what it does wherever this behavior is encountered.
Any chance this is in the pipeline?
With Ethereum now supporting BLAKE2b on-chain explicitly to support the ZCash POW algorithm, there is now an opportunity to use this same operation inside Dapps that parallel SNARK circuits. This allows for newer proving schemes such as commit-and-prove, recursive-, and composable-SNARKs to operate selectively on- or off-chain.
The BLAKE2s hash function is currently supported in gadget and function form, but the similar BLAKE2b hash function is not. Would the new gadget and functions be useful?
Good day.
For some reason libsnark outputs a verifying key for Groth16 containing e(alpha_g1, beta_g2) among other elements, while bellman's verifying key contains separate alpha_g1 and beta_g2. The first form is not verifiable in the current Ethereum EVM, although the second is. Is there a specific reason why the libsnark and bellman formats are different?
I'll leave this open as a combination of tasks that people getting familiar with rust, zk-snarks, and bellman specifically can do to become proficient contributors to this codebase. Please respond or edit the list below.
I'll try to attempt some of the items mentioned as I get time. Hopefully other newbies to this codebase will also find this useful.
It might be worth eliminating gamma from the verification key. I guess this isn't a huge deal, but my understanding from talking to @arielgabizon and Mary Maller is that it isn't actually necessary and you can just use the generator of G2.
This is useful from the point of view of proof composition because it makes the verification key smaller.
Appendix B.2 of the Zcash protocol spec describes how to do batched verification. The Zcash ticket is zcash/zcash#320 .
[Note that bellman can just provide batched and unbatched verification as separate APIs. We don't need to bikeshed here about how these will be used to implement Zcash consensus policy, as happened on the PR to add batched PHGR13 verification to libsnark; please keep that to the Zcash ticket.]
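The core trick behind batched verification can be sketched without pairings at all: instead of checking each equation separately, check one random linear combination of all of them. The toy below (the names `Claim` and `verify_batch` are illustrative, and a 61-bit prime field stands in for the pairing target group) shows why a single false claim still gets caught with overwhelming probability.

```rust
// Batched verification sketch: accept a batch of claimed identities
// a_i * b_i = c_i (mod P) iff sum_i r_i * (a_i * b_i - c_i) == 0 (mod P)
// for unpredictable r_i. The real scheme applies the same random-linear-
// combination trick to pairing equations, trading n full checks for one.

const P: u128 = (1 << 61) - 1; // Mersenne prime 2^61 - 1

/// One claimed identity: a * b = c (mod P).
struct Claim { a: u128, b: u128, c: u128 }

/// With unpredictable r_i, a batch containing any false claim passes with
/// probability only about 1/P.
fn verify_batch(claims: &[Claim], rs: &[u128]) -> bool {
    assert_eq!(claims.len(), rs.len());
    let mut acc = 0u128;
    for (cl, r) in claims.iter().zip(rs) {
        let lhs = cl.a % P * (cl.b % P) % P;
        let diff = (lhs + P - cl.c % P) % P; // a*b - c, kept non-negative
        acc = (acc + r % P * diff) % P;
    }
    acc == 0
}

fn main() {
    // In practice the r_i must be sampled unpredictably per batch; fixed
    // values here only keep the demo deterministic.
    let rs = [0x1234_5678u128, 0x9abc_def0];
    let good = [Claim { a: 3, b: 5, c: 15 }, Claim { a: 7, b: 11, c: 77 }];
    assert!(verify_batch(&good, &rs));
    let bad = [Claim { a: 3, b: 5, c: 16 }, Claim { a: 7, b: 11, c: 77 }];
    assert!(!verify_batch(&bad, &rs));
    println!("batch accepts good claims, rejects corrupted batch");
}
```

Exposing batched and unbatched checks as separate APIs, as suggested above, keeps this randomness requirement visible to callers.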
I'm pretty sure these added constraints are needed only in BCTV to create linear independence of public input polynomials. I don't think it's important for Groth16, could give a slight performance boost to remove them.
Lines 147 to 149 in 10c5010
error: linking with `cc` failed: exit status: 1
[....]
= note: ld: library not found for -lgmp
clang: error: linker command failed with exit code 1 (use -v to see invocation)
error: could not compile `ecdsa-mpc` due to previous error
Install GMP with homebrew:
brew install gmp
Make sure gmp linked:
brew link --force gmp
Create a file .cargo/config in the source code folder:
[target.x86_64-apple-darwin]
rustflags = [
"-L", "/opt/homebrew/lib",
"-L", "/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib",
"-C", "link-arg=-undefined",
"-C", "link-arg=dynamic_lookup",
]
[target.aarch64-apple-darwin]
rustflags = [
"-L", "/opt/homebrew/lib",
"-L", "/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/lib",
"-C", "link-arg=-undefined",
"-C", "link-arg=dynamic_lookup",
]
These flags were used to guide rustc to look up libraries from Homebrew. I would prefer an official patch.
bellman/src/gadgets/boolean.rs
Line 474 in 0f2244f
#[derive(Debug, Clone, PartialEq)] // PartialEq is needed for assert_eq! below
pub enum Boolean {
    Constant(bool),
    Other,
}

impl Boolean {
    pub fn not(&self) -> Self {
        match *self {
            Boolean::Constant(c) => Boolean::Constant(!c),
            _ => unreachable!(),
        }
    }

    pub fn xor(&self, other: &Self) -> Self {
        match (self, other) {
            (&Boolean::Constant(false), x) | (x, &Boolean::Constant(false)) => x.clone(),
            (&Boolean::Constant(true), x) | (x, &Boolean::Constant(true)) => x.not(),
            _ => unreachable!(),
        }
    }
}

fn main() {
    let a = Boolean::Constant(true);
    let b = Boolean::Constant(true);
    let c = a.xor(&b);
    // Question: shouldn't the result c be `Boolean::Constant(true)`?
    assert_eq!(c, Boolean::Constant(false));
}
If x is Boolean::Constant(true), xor will return Ok(Boolean::Constant(false)). But shouldn't true BitXor true be true, not false?
Lines 124 to 128 in 9bb30a7
In order to improve performance during verification, it may be useful to precompute windows for input constraint terms during verification key preprocessing.
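A hedged sketch of what such windowed precomputation looks like, using u64 wrapping addition as a stand-in for elliptic-curve point addition (precompute and mul_with_table are illustrative names, not bellman APIs): one table row of 2^w multiples per w-bit window means multiply time needs only a lookup and an addition per window, with all doublings paid once at preprocessing.

```rust
const W: u32 = 4;                 // window width in bits
const TSIZE: usize = 1 << W;      // entries per window table

/// Precompute, for each w-bit window position, all 2^w multiples of `base`.
/// Here the "group" is u64 with wrapping addition standing in for point
/// addition, so k*base is just wrapping multiplication.
fn precompute(base: u64) -> Vec<[u64; TSIZE]> {
    let windows = (64 + W - 1) / W;
    let mut tables = Vec::new();
    let mut row_base = base; // base * 2^(i*W) for row i
    for _ in 0..windows {
        let mut row = [0u64; TSIZE];
        for i in 1..TSIZE {
            row[i] = row[i - 1].wrapping_add(row_base);
        }
        tables.push(row);
        // Shift the base up one window: W doublings in the group.
        for _ in 0..W {
            row_base = row_base.wrapping_add(row_base);
        }
    }
    tables
}

/// Scalar multiplication from the tables: one lookup plus one group
/// addition per window, and no doublings at multiply time.
fn mul_with_table(tables: &[[u64; TSIZE]], scalar: u64) -> u64 {
    let mut acc = 0u64;
    for (i, row) in tables.iter().enumerate() {
        let window = ((scalar >> (i as u32 * W)) & ((1 << W) - 1)) as usize;
        acc = acc.wrapping_add(row[window]);
    }
    acc
}

fn main() {
    let base = 0x9E37_79B9_7F4A_7C15u64; // arbitrary "point"
    let tables = precompute(base);
    let scalar = 123_456_789u64;
    assert_eq!(mul_with_table(&tables, scalar), base.wrapping_mul(scalar));
    println!("windowed result matches plain scalar mul");
}
```

For verification keys the bases (the input-constraint terms) are fixed across many verifications, which is exactly when paying the one-time table cost is worthwhile.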
I am a bit confused about saving the params, proofs, and pvk as JSON files so I can move them between machines.
I've followed the example repo at https://github.com/arcalinea/bellman-examples/blob/master/src/cube.rs#L154 and also see a write implementation on the proof structure here: https://github.com/zkcrypto/bellman/blob/master/src/groth16/mod.rs#L43
Can someone help me export these variables?
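The write implementation linked above serializes to raw bytes over std::io rather than JSON, so the usual approach is to move the byte blob between machines (hex- or base64-encode it if a text format is required). Below is a toy round-trip in that same write/read shape; the Proof struct and its fixed element sizes here are illustrative stand-ins, not the real bellman type.

```rust
use std::io::{self, Read, Write};

/// Hypothetical stand-in for a Groth16 proof: three compressed group
/// elements of fixed byte length (48/96/48, as for BLS12-381 G1/G2/G1).
struct Proof {
    a: [u8; 48],
    b: [u8; 96],
    c: [u8; 48],
}

impl Proof {
    /// Serialize to any writer (a File, a Vec<u8>, a socket, ...).
    fn write<W: Write>(&self, mut w: W) -> io::Result<()> {
        w.write_all(&self.a)?;
        w.write_all(&self.b)?;
        w.write_all(&self.c)
    }

    /// Deserialize from any reader; sizes are fixed, so no length prefix.
    fn read<R: Read>(mut r: R) -> io::Result<Self> {
        let (mut a, mut b, mut c) = ([0u8; 48], [0u8; 96], [0u8; 48]);
        r.read_exact(&mut a)?;
        r.read_exact(&mut b)?;
        r.read_exact(&mut c)?;
        Ok(Proof { a, b, c })
    }
}

fn main() -> io::Result<()> {
    let proof = Proof { a: [1; 48], b: [2; 96], c: [3; 48] };
    let mut bytes = Vec::new();
    proof.write(&mut bytes)?; // this blob is what moves between machines
    let back = Proof::read(&bytes[..])?;
    assert_eq!(bytes.len(), 192);
    assert_eq!(back.b[0], 2);
    println!("round-tripped {} bytes", bytes.len());
    Ok(())
}
```

The same pattern applies to the parameters and the prepared verifying key: call their write method into a Vec<u8> or File, ship the bytes, and read them back on the other machine.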
Groth16 proving and verification key serialization needs to be implemented.
The gamma variable should be removed if it can be proven to be unneeded in Groth's construction.
Provide for streamed proving (from a File).
This way the variable assignment closure can query exotic constraint system types in order to optimize away inversions or do other kinds of cached operations.
Ideally this would be a run-time rather than a compile-time option, so that clients of bellman such as zcashd could enable it based on a user setting. I know that constant-time MSM will be significantly slower; that's fine, and I think some people will want to take that trade-off.
Right now we use futures-cpupool for one purpose (work stealing) and crossbeam for another (scoped threads). I can get both of these with rayon, which is also more mature, but in the past when I switched to rayon it was slower.
I think there are tweaks and new features in its API which would allow me to adopt it entirely.
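As a stdlib-only aside on the scoped-threads half of this: std::thread::scope (stable since Rust 1.63) now covers crossbeam's scoped-thread use case directly. A sketch of fanning a multiexp-style accumulation out over scoped workers, with illustrative names:

```rust
use std::thread;

/// Split a slice across `workers` scoped threads and sum the pieces.
/// Scoped threads may borrow `data` without 'static bounds, which is the
/// property crossbeam's scoped threads were used for.
fn parallel_sum(data: &[u64], workers: usize) -> u64 {
    let chunk = ((data.len() + workers.max(1) - 1) / workers.max(1)).max(1);
    thread::scope(|s| {
        let handles: Vec<_> = data
            .chunks(chunk)
            .map(|part| s.spawn(move || part.iter().sum::<u64>()))
            .collect();
        // Join inside the scope; the borrow of `data` ends when it closes.
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1000).collect();
    assert_eq!(parallel_sum(&data, 4), 500_500);
    println!("scoped-thread sum ok");
}
```

This does not replace the work-stealing side (rayon's strength), but it shows one dependency whose role the standard library can now absorb.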
I'm sorry to ask this question, because it does not involve the correctness of the code but is purely about my own understanding. After looking into some material I did not find a suitable answer, so I think I can only ask the question here.
I don't quite understand how the eval_at_tau function in the generator works.
Here is the eval_at_tau function:
bellman/src/groth16/generator.rs
Lines 371 to 384 in 9bb30a7
Its second parameter is the QAP polynomial stored in point-value form:
bellman/src/groth16/generator.rs
Lines 313 to 316 in 9bb30a7
Its first parameter is powers_of_tau, but this powers_of_tau confuses me. At first, this variable is exactly what its name says: the powers of tau.
bellman/src/groth16/generator.rs
Lines 244 to 259 in 9bb30a7
But before actually using it, the code applies the following transformation:
bellman/src/groth16/generator.rs
Lines 294 to 295 in 9bb30a7
So as I understand it, from here on powers_of_tau becomes the coefficients (a0, a1, ..., an) of a polynomial of the form f(x) = a0 + a1 * x + a2 * x^2 + ... + an * x^n satisfying:
f(ω_0) = tau^0
f(ω_1) = tau^1
...
In this case, according to the implementation of the eval_at_tau
function
bellman/src/groth16/generator.rs
Lines 375 to 381 in 9bb30a7
How can this be the value of a QAP polynomial at tau? Because powers_of_tau has been iFFT-transformed before this, powers_of_tau[index].0 here should be the index-th coefficient of f(x). What is the significance of multiplying this coefficient by coeff?
My guess, based on the comments, is that this is equivalent to evaluating the point-value form of the QAP by Lagrange interpolation, using the formula:
Then coeff is equivalent to $$ {y}_j $$, and the j-th item of powers_of_tau is equal to the l_j(x) corresponding to y_j?
Is this guess correct? Is there any formula to prove that they are indeed equal? Thanks!
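One way to convince yourself of this guess is a small numeric check. Over the n-th roots of unity, L_j(x) = (1/n) * sum_k omega^(-jk) * x^k, which is exactly the j-th output of an inverse DFT applied to the vector (x^0, ..., x^(n-1)); so the iFFT of the powers of tau is the vector (L_0(tau), ..., L_{n-1}(tau)), and dotting it with point values f(omega^j) yields f(tau). The sketch below verifies this over the toy field F_17 (not BLS12-381); all names are illustrative.

```rust
// Toy check that iFFT(tau^0, ..., tau^{n-1})[j] == L_j(tau) over F_17.

const P: u64 = 17; // toy prime field

fn pow(mut b: u64, mut e: u64) -> u64 {
    let mut r = 1;
    while e > 0 {
        if e & 1 == 1 { r = r * b % P; }
        b = b * b % P;
        e >>= 1;
    }
    r
}

fn inv(a: u64) -> u64 { pow(a, P - 2) } // Fermat inversion

/// Naive inverse DFT of (tau^0, ..., tau^{n-1}); entry j equals L_j(tau).
fn lagrange_coeffs(n: u64, omega: u64, tau: u64) -> Vec<u64> {
    let n_inv = inv(n);
    (0..n)
        .map(|j| {
            let mut s = 0;
            for k in 0..n {
                s = (s + pow(tau, k) * pow(inv(omega), j * k)) % P;
            }
            s * n_inv % P
        })
        .collect()
}

fn main() {
    // omega = 4 is a primitive 4th root of unity mod 17 (4^2 = -1).
    let (n, omega, tau) = (4, 4, 3);
    let l_at_tau = lagrange_coeffs(n, omega, tau);

    // For any polynomial f given in point-value form f(omega^j), the dot
    // product with L_j(tau) recovers f(tau) -- the multiply-by-coeff step.
    let f = |x: u64| (x * x + 1) % P; // f(x) = x^2 + 1
    let dot = (0..n)
        .map(|j| f(pow(omega, j)) * l_at_tau[j as usize] % P)
        .sum::<u64>() % P;
    assert_eq!(dot, f(tau));
    println!("sum_j f(w^j) * L_j(tau) = {} = f(tau)", dot);
}
```

So the coeff values are the y_j (the QAP polynomial's evaluations at the roots of unity), and each entry of the transformed powers_of_tau is the matching L_j(tau), exactly as guessed.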
Hey everyone,
I want to make a verification system where I have to test the point-wise multiplication of arrays. For instance, [20,20]*[1,0] = [20,0] is what I want to verify. So I started with a multiplication circuit and then tried to use batch mode on the entire array, referring to https://github.com/zkcrypto/bellman/blob/main/benches/batch.rs
But this does not work and gives me an error that bellman::groth16::batch does not exist. Please let me know how I can fix this.
Thanks in advance
I don't see any way to serialize the Proof<E> struct; is something like this planned? (Presumably Zcash will require this as well once Sapling progresses enough.)
This issue was automatically generated. Feel free to close without ceremony if
you do not agree with re-licensing or if it is not possible for other reasons.
Respond to @cmr with any questions or concerns, or pop over to
#rust-offtopic
on IRC to discuss.
You're receiving this because someone (perhaps the project maintainer)
published a crates.io package with the license as "MIT" xor "Apache-2.0" and
the repository field pointing here.
TL;DR the Rust ecosystem is largely Apache-2.0. Being available under that
license is good for interoperation. The MIT license as an add-on can be nice
for GPLv2 projects to use your code.
The MIT license requires reproducing countless copies of the same copyright
header with different names in the copyright field, for every MIT library in
use. The Apache license does not have this drawback. However, this is not the
primary motivation for me creating these issues. The Apache license also has
protections from patent trolls and an explicit contribution licensing clause.
However, the Apache license is incompatible with GPLv2. This is why Rust is
dual-licensed as MIT/Apache (the "primary" license being Apache, MIT only for
GPLv2 compat), and doing so would be wise for this project. This also makes
this crate suitable for inclusion and unrestricted sharing in the Rust
standard distribution and other projects using dual MIT/Apache, such as my
personal ulterior motive, the Robigalia project.
Some ask, "Does this really apply to binary redistributions? Does MIT really
require reproducing the whole thing?" I'm not a lawyer, and I can't give legal
advice, but some Google Android apps include open source attributions using
this interpretation. Others also agree with
it.
But, again, the copyright notice redistribution is not the primary motivation
for the dual-licensing. It's stronger protections to licensees and better
interoperation with the wider Rust ecosystem.
To do this, get explicit approval from each contributor of copyrightable work
(as not all contributions qualify for copyright, due to not being a "creative
work", e.g. a typo fix) and then add the following to your README:
## License
Licensed under either of
* Apache License, Version 2.0 ([LICENSE-APACHE](LICENSE-APACHE) or http://www.apache.org/licenses/LICENSE-2.0)
* MIT license ([LICENSE-MIT](LICENSE-MIT) or http://opensource.org/licenses/MIT)
at your option.
### Contribution
Unless you explicitly state otherwise, any contribution intentionally submitted
for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any
additional terms or conditions.
and in your license headers, if you have them, use the following boilerplate
(based on that used in Rust):
// Copyright 2016 bellman developers
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
It's commonly asked whether license headers are required. I'm not comfortable
making an official recommendation either way, but the Apache license
recommends it in their appendix on how to use the license.
Be sure to add the relevant LICENSE-{MIT,APACHE} files. You can copy these from the Rust repo for a plain-text version.
And don't forget to update the license metadata in your Cargo.toml to:
license = "MIT/Apache-2.0"
I'll be going through projects which agree to be relicensed and have approval by the necessary contributors and doing these changes, so feel free to leave the heavy lifting to me!
To agree to relicensing, comment with:
I license past and future contributions under the dual MIT/Apache-2.0 license, allowing licensees to chose either at their option.
Or, if you're a contributor, you can check the box in this repo next to your
name. My scripts will pick this exact phrase up and check your checkbox, but
I'll come through and manually review this issue later as well.