
Interfaces and implementations of cryptographic primitives, along with R1CS constraints for them

Home Page: https://www.arkworks.rs

License: Apache License 2.0


crypto-primitives's Introduction

ark-crypto-primitives

The arkworks ecosystem consists of Rust libraries for designing and working with zero-knowledge succinct non-interactive arguments (zkSNARKs). This repository contains efficient implementations of cryptographic primitives, such as collision-resistant hash functions, hiding commitments, pseudo-random functions, and signatures, along with (optional) R1CS constraints for these primitives.

This library is released under the MIT License and the Apache v2 License (see License).

WARNING: This is an academic proof-of-concept prototype, and in particular has not received careful code review. This implementation is NOT ready for production use.

Build guide

The library compiles on the stable toolchain of the Rust compiler. To install the latest version of Rust, first install rustup by following the instructions at https://rustup.rs, or via your platform's package manager. Once rustup is installed, install the Rust toolchain by invoking:

rustup install stable

After that, use cargo, the standard Rust build tool, to build the library:

git clone https://github.com/arkworks-rs/crypto-primitives.git
cargo build --release

This library comes with unit tests for each of the provided primitives. Run the tests with:

cargo test

License

This library is licensed under either the Apache License, Version 2.0, or the MIT License, at your discretion.

Unless you explicitly state otherwise, any contribution submitted for inclusion in this library by you shall be dual licensed as above (as defined in the Apache v2 License), without any additional terms or conditions.

Acknowledgements

This work was supported by: a Google Faculty Award; the National Science Foundation; the UC Berkeley Center for Long-Term Cybersecurity; and donations from the Ethereum Foundation, the Interchain Foundation, and Qtum.

An earlier version of this library was developed as part of the paper "ZEXE: Enabling Decentralized Private Computation".

crypto-primitives's People

Contributors: autquis, drewstone, huitseeker, insun35, intx4, mercysjest, mmagician, npwardberkeley, pratyush, rozbb, sam-steffen, slumber, stechu, tsunrise, valardragon, weikengchen


crypto-primitives's Issues

Rename `PoseidonXYZ` types in `poseidon` module.

Summary

Rename the PoseidonParams, PoseidonSponge, PoseidonSpongeState types to Params, Sponge, SpongeState, respectively.

Problem Definition

This avoids repeating the word "poseidon" and gives API users the choice of how to name the types. In a context where it's important to specify that it's a Poseidon sponge, they can write `use poseidon;` and refer to `poseidon::Sponge`; in a context where it's clear, they can simply write `use poseidon::Sponge;` and refer to `Sponge`.
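A std-only sketch of the naming ergonomics (the module contents here are dummies; only the paths matter):

```rust
// Illustrative sketch: a `poseidon` module whose types drop the "Poseidon" prefix.
mod poseidon {
    pub struct Sponge {
        pub state: Vec<u64>,
    }

    impl Sponge {
        pub fn new() -> Self {
            Sponge { state: Vec::new() }
        }
    }
}

// Context where the Poseidon origin matters: qualify with the module path.
fn qualified() -> poseidon::Sponge {
    poseidon::Sponge::new()
}

// Context where it's clear from surrounding code: import the type directly.
fn imported() -> poseidon::Sponge {
    use crate::poseidon::Sponge;
    Sponge::new()
}

fn main() {
    assert!(qualified().state.is_empty());
    assert!(imported().state.is_empty());
}
```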

For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Project BHP hash output to one coordinate, ala `TECompressor`

We should change the output of the BHP hash to be a single field element, instead of a group element. This wasn't done earlier because BHP was generic over any projective curve, but that shouldn't be an issue any more, as it is generic only over Edwards curves now, and projection to a single coordinate is injective in the curves we care about.

Inconsistent output of Poseidon CRH between native and constraints

I was helping @mfaghihi debug the code, and found that the native code and constraints of the Poseidon CRH in crypto-primitives seem to return different results.

Here are the tests I wrote: https://github.com/tsunrise/Poseidon/blob/debug/src/test.rs (the test failed)

Is there a bug in the Poseidon CRH implementations, or is the Poseidon parameter not correct?
(Could someone add unit tests for the CRH implementations if they have time? Thanks)

(btw, @mfaghihi's implementation uses the 0.2 release, but I have not changed the implementation of Poseidon CRH since then, so I'd assume the same problem will happen in the 0.3 release)

cc: @drewstone @weikengchen

Is `ParametersVar` necessary?

pub trait FixedLengthCRHGadget<H: FixedLengthCRH, ConstraintF: Field>: Sized {
    type OutputVar: EqGadget<ConstraintF>
        + ToBytesGadget<ConstraintF>
        + CondSelectGadget<ConstraintF>
        + AllocVar<H::Output, ConstraintF>
        + R1CSVar<ConstraintF>
        + Debug
        + Clone
        + Sized;
    type ParametersVar: AllocVar<H::Parameters, ConstraintF> + Clone;

    fn evaluate(
        parameters: &Self::ParametersVar,
        input: &[UInt8<ConstraintF>],
    ) -> Result<Self::OutputVar, SynthesisError>;
}

Is there any circumstance where the user computes CRH parameters in-circuit? Allocating a ParametersVar requires extra constraints, so it would be better to allow an option to use native CRH parameters instead.

Relevant discussions: #30 (comment)

AssignmentMissing when using a private PathVar witness

Currently, it's not possible to use PathVar::new_witness for a placeholder PathVar in the merkle module. The reason is that it takes in a Result<Vec<T>, SynthesisError> instead of something more like Vec<Option<T>>, which is used in other sequence-like variables. The latter is necessary because the allocator needs to know how many variables to create.

This is an issue and not a PR because I don't have any good ways off the top of my head to fix this. The workaround is pretty easy (just make an arbitrary Path of the appropriate length), but that doesn't fit with how the rest of the library works.

Parameters for Sponge

Currently the CryptographicSponge trait only has a new method to initialize a new sponge. This doesn't suffice for the case where the sponge is parameterized. For example, a sponge based on Poseidon can have many different parameterizations for the rate, capacity, and for the underlying permutation.

How should we handle this?

  • Should the trait have an associated Parameters type, and should new() then take this type as input?
  • Or should it be left to implementations? For example, you would make a PoseidonParameters trait, have the Poseidon impl be generic over a type implementing this trait, so that the final call would be PoseidonSponge<F, ConcreteParams>::new()?

Originally posted by @ValarDragon in arkworks-rs/sponge#2 (comment)
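A std-only sketch of option (a), where the trait carries an associated `Parameters` type and `new` takes it as input. The trait shape and all type names (`ToySponge`, `ToySpongeParams`) are illustrative, not the actual ark-sponge API, and the toy state update stands in for a real permutation:

```rust
// Sketch of option (a): the sponge trait has an associated `Parameters` type,
// and `new` takes it as input.
trait CryptographicSponge: Sized {
    type Parameters;
    fn new(params: &Self::Parameters) -> Self;
    fn absorb(&mut self, input: u64);
    fn squeeze(&mut self) -> u64;
}

// A toy "sponge" over u64, parameterized by rate and capacity, standing in
// for a concrete Poseidon instantiation.
struct ToySpongeParams {
    rate: usize,
    capacity: usize,
}

struct ToySponge {
    state: Vec<u64>,
    rate: usize,
}

impl CryptographicSponge for ToySponge {
    type Parameters = ToySpongeParams;

    fn new(params: &Self::Parameters) -> Self {
        ToySponge {
            state: vec![0; params.rate + params.capacity],
            rate: params.rate,
        }
    }

    fn absorb(&mut self, input: u64) {
        // Toy absorption: accumulate into the first state element.
        self.state[0] = self.state[0].wrapping_add(input);
    }

    fn squeeze(&mut self) -> u64 {
        self.state[0]
    }
}

fn main() {
    let params = ToySpongeParams { rate: 2, capacity: 1 };
    let mut sponge = ToySponge::new(&params);
    sponge.absorb(41);
    sponge.absorb(1);
    assert_eq!(sponge.squeeze(), 42);
}
```

Option (b) would instead move `rate`/`capacity` into a trait implemented by a marker type, trading a runtime parameter for a compile-time one.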

Make all the subcomponents of `Path` public or serializable

Based on the discussion from Telegram, one would likely want to serialize/deserialize a Merkle tree path, and currently, this cannot be done.

https://github.com/arkworks-rs/crypto-primitives/blob/main/src/merkle_tree/mod.rs#L33

This is because our Merkle tree path is of the following structure:

pub struct Path<P: Config> {
    pub(crate) leaf_sibling_hash: LeafDigest<P>,
    /// The siblings of the path nodes, ordered from higher layer to lower layer (does not include the root node).
    pub(crate) auth_path: Vec<TwoToOneDigest<P>>,
    /// Stores the leaf index of the node.
    pub(crate) leaf_index: usize,
}

We will need to make a trade-off here, and the change could likely go into the next release. I suggest:

  • Let Path automatically derive the serialization traits' implementations.
  • Turn pub(crate) to pub.

Thoughts? @tsunrise
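A std-only sketch of what serializable, public `Path` fields would enable. In the real library this would be `#[derive(CanonicalSerialize, CanonicalDeserialize)]` over the generic digest types; here digests are fixed to `[u8; 32]` and the byte layout is illustrative:

```rust
use std::convert::TryInto;

// Simplified Path with public, serializable fields.
#[derive(Debug, PartialEq)]
pub struct Path {
    pub leaf_sibling_hash: [u8; 32],
    pub auth_path: Vec<[u8; 32]>,
    pub leaf_index: usize,
}

impl Path {
    pub fn serialize(&self) -> Vec<u8> {
        let mut out = Vec::new();
        out.extend_from_slice(&self.leaf_sibling_hash);
        // Length-prefix the auth path so deserialization knows how much to read.
        out.extend_from_slice(&(self.auth_path.len() as u64).to_le_bytes());
        for d in &self.auth_path {
            out.extend_from_slice(d);
        }
        out.extend_from_slice(&(self.leaf_index as u64).to_le_bytes());
        out
    }

    pub fn deserialize(bytes: &[u8]) -> Option<Path> {
        let mut leaf_sibling_hash = [0u8; 32];
        leaf_sibling_hash.copy_from_slice(bytes.get(0..32)?);
        let len = u64::from_le_bytes(bytes.get(32..40)?.try_into().ok()?) as usize;
        let mut auth_path = Vec::with_capacity(len);
        let mut off = 40;
        for _ in 0..len {
            let mut d = [0u8; 32];
            d.copy_from_slice(bytes.get(off..off + 32)?);
            auth_path.push(d);
            off += 32;
        }
        let leaf_index = u64::from_le_bytes(bytes.get(off..off + 8)?.try_into().ok()?) as usize;
        Some(Path { leaf_sibling_hash, auth_path, leaf_index })
    }
}

fn main() {
    let path = Path {
        leaf_sibling_hash: [1; 32],
        auth_path: vec![[2; 32], [3; 32]],
        leaf_index: 5,
    };
    let bytes = path.serialize();
    assert_eq!(Path::deserialize(&bytes), Some(path));
}
```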

Relocate `variable_length_crh` from `ark-pcd` to `ark-crypto-primitives`

In the PCD library, we have implemented the variable-length versions of the CRH in this repo.

These variable-length versions, at a high level, do not fix the length of the input to the CRH. The main idea is that the public parameters of the CRH, which depend on the input length, can be produced by running a seeded, cryptographically secure hash function (e.g., ChaCha), modeled as a random oracle.

An example is here: https://github.com/arkworks-rs/pcd/blob/master/src/variable_length_crh/bowe_hopwood/constraints.rs#L42

Thus, it would be useful to relocate this general-purpose variable-length CRH to ark-crypto-primitives, since it ideally belongs here.

Help needed and welcomed!

Add Proof of Work

It looks like arkworks does not have an implementation of proof of work (even the simplest one). We can probably add one here (perhaps similar to the one in libiop?).

We can use the existing CRH trait to make it generic. Also, we will need to write the constraints. (For now, every CRH has its constraints written, so we just need to wrap them.)

"one line algo"

For a message M and difficulty k, the prover will bruteforce a nonce N such that the last k bits of the output of H(M || N) are zero.
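A minimal std-only sketch of this, with `DefaultHasher` standing in for the generic CRH (a real version would be generic over the CRH trait, and `prove`/`verify` are illustrative names):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in for H(M || N); a real implementation would call a CRH.
fn h(message: &[u8], nonce: u64) -> u64 {
    let mut hasher = DefaultHasher::new();
    message.hash(&mut hasher);
    nonce.hash(&mut hasher);
    hasher.finish()
}

/// Brute-force a nonce N such that the last `k` bits of H(M || N) are zero.
fn prove(message: &[u8], k: u32) -> u64 {
    let mask = (1u64 << k) - 1;
    (0u64..).find(|&n| h(message, n) & mask == 0).unwrap()
}

fn verify(message: &[u8], nonce: u64, k: u32) -> bool {
    h(message, nonce) & ((1u64 << k) - 1) == 0
}

fn main() {
    let nonce = prove(b"hello", 12);
    assert!(verify(b"hello", nonce, 12));
}
```

Expected work for the prover is about 2^k hash evaluations, while verification is a single hash.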

Squeeze native field element with size for constraints

Summary

We have squeeze_field_elements_with_sizes in native code. Shall we have an equivalent in constraints?

Proposal

Add the following method to constraints code

fn squeeze_native_field_elements_with_sizes<F: PrimeField>(
    &mut self,
    sizes: &[FieldElementSize],
) -> Result<Vec<FpVar<F>>, SynthesisError>;

For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Add `TreeHash` API for use specifically within a Merkle Tree

This is not a blocker for this API, but maybe instead of trying to design a general API for hashing two inputs into one, we should have specialized traits for hashing in the context of a Merkle tree. For example:

pub trait TreeHash {
    type Parameters;
    type InputOutput;
    const ARITY: usize; // We can set this to just 2 for the time being.

    fn setup() -> Self::Parameters;

    /// Panics if `hashes.len() != Self::ARITY`.
    fn hash_inner_node(params: &Self::Parameters, hashes: &[Self::InputOutput]) -> Self::InputOutput;

    /// Hash the bottom layer. First, use `LeafHash` to hash the leaf object, and then
    /// convert the output of `LeafHash` to `Self::InputOutput` (maybe by internally re-hashing).
    fn hash_leaf<LeafHash: CRH>(
        params: &Self::Parameters,
        leaf_hash_params: &LeafHash::Parameters,
        leaf: &[u8]
    ) -> Self::InputOutput {
        // Hash the leaf using LeafHash, convert the output to bytes,
        // and then internally handle the conversion to Self::InputOutput.
        // If Self::InputOutput is already bytes, there's no need to convert.
        // We can add specializations to avoid the conversion when the types
        // already match, or when LeafHash is some known hash.
    }
}

For pedersen::CRH, bowe_hopwood::CRH, and blake2s/sha256::CRH, the InputOutput type would be Vec<u8>, while for poseidon::CRH, it would be Vec<F> (or maybe [F; 2], or whatever the output size is). This way, the Merkle tree doesn't incur the overhead of converting into bytes at each step, and we've also made it very clear that these hashes are for use in a Merkle tree, and not (necessarily) outside that.

What do you think? @ValarDragon @tsunrise?

Originally posted by @Pratyush in #30 (comment)

Transparent setup of Poseidon in Rust compile-time

From an interface perspective, we want to use Poseidon as a drop-in replacement for Pedersen and SHA-256.
However, this would be slightly complicated because searching for Poseidon's parameters requires some care.

  • \alpha must be coprime with p-1.
  • Nums of full/partial rounds are related to p, sponge rate, and the desired level of security.
  • The MDS matrix must go through a few steps: random generation, three security checks, and a retry if a check fails.
  • Round constants should be randomly chosen.

Or, in short, the parameters are application-dependent (sponge rate) and field-dependent (\alpha, p, MDS, constants).

Also, different choices of \alpha could end up with very different performances. As Fractal shows, choosing a larger \alpha could significantly reduce the number of rounds.

Though it is possible to leave all the jobs to the developers, it is not ideal since the developers would need to supply parameters for each field that the application may use. It is also possible that the developers did not generate secure parameters.


One possibility:

  • Developers only need to specify the field and sponge rate, which is the application-dependent part.
  • The program runs a search function, which starts with a deterministically generated seed (e.g., SHA256 of the string "Deterministic generation of sponge parameters [sponge rate] [field prime]" as the seed). This search function would need to find the most efficient & still secure parameters.

More specifically,

  • It needs to try different valid \alpha and compute the numbers of full/partial rounds. Then, it needs to compute their costs and find the most efficient one.
  • It needs to generate the MDS matrix using a PRNG with the seed, go through the security checks, and retry if it did not pass.
  • Round constants are also generated using the PRNG.

Ideas? Thoughts?

cc'ed: @ValarDragon
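The deterministic search loop described above can be sketched std-only as follows. `DefaultHasher` stands in for SHA-256, the toy LCG stands in for a proper seeded PRNG, and `passes_security_checks` is a placeholder for the real MDS security checks; all names here are illustrative:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Toy PRNG (linear congruential generator) standing in for a seeded CSPRNG.
struct Lcg(u64);
impl Lcg {
    fn next(&mut self) -> u64 {
        self.0 = self.0.wrapping_mul(6364136223846793005).wrapping_add(1442695040888963407);
        self.0
    }
}

// Derive a deterministic seed from a domain-separation string containing the
// application-dependent inputs (sponge rate and field prime).
fn seed_for(rate: usize, prime: &str) -> u64 {
    let mut h = DefaultHasher::new();
    format!("Deterministic generation of sponge parameters {} {}", rate, prime).hash(&mut h);
    h.finish()
}

// Placeholder check: rejects matrices whose first entry is even, purely to
// exercise the retry loop (the real checks are the three MDS security checks).
fn passes_security_checks(mds: &[u64]) -> bool {
    mds[0] % 2 == 1
}

// Generate the MDS matrix from the seeded PRNG, retrying until the checks pass.
// The whole process is deterministic in (rate, prime).
fn generate_mds(rate: usize, prime: &str) -> Vec<u64> {
    let mut rng = Lcg(seed_for(rate, prime));
    loop {
        let candidate: Vec<u64> = (0..rate * rate).map(|_| rng.next()).collect();
        if passes_security_checks(&candidate) {
            return candidate;
        }
    }
}

fn main() {
    // Same (rate, prime) inputs always yield the same parameters.
    assert_eq!(generate_mds(3, "17"), generate_mds(3, "17"));
    // Different inputs yield different parameters.
    assert_ne!(generate_mds(3, "17"), generate_mds(4, "17"));
}
```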

Purpose of this repo?

Hi, to clarify, is the purpose of this repo to create arithmetic hash functions that are snark friendly?

Or, are they perhaps also meant to facilitate creating STARKs like Fractal to be used in Zexe proof systems?

Pedersen hash test fails for MNT6-753 curve

I noticed that the pedersen hash native equality test fails on the MNT6-753 curve.
So far, I have not investigated it any further.

Here is a straightforward copy of the original test within the crate, just adapted to use the MNT6-753 curve.
It fails due to the result not being equal:

thread 'test_native_equality' panicked at 'assertion failed: `(left == right)`
  left: `(30390519998062178694725475488940532302589938923242267558107260219929746235682202286671549394454858372824954717174269307126111696685337959911401422533740374719533611898999068655889367618546677228247302093533059382636731943987153, 31577653328290403174636224464923809536312721930568898260924681142803173090303008532122247136406726160984825509478364846586928834029472596529985379920851904757723573215025326910498114850509986137058967739174592705707005691872426)`,
 right: `(28859985354968639743946055458264681860931873493266425400581389206408943549804757046100063253039557629051321858869964295459866261787662758167419177045205076114605735178873939272389854650616996123629301617779573535732334814960566, 24451171218383564241303970355918892269988964476566513651408375954536170738339336517864976400937097228888940980774708934032138427601324417814051388979167673529040688562864423189542386386911374815819363810050003455919964690178629, 1)`', tests/test.rs:52:5

I compared the result of the off-circuit hash to our reference implementation and it seems correct.
Thus, the problem must occur in the ZKP version of the hash.

Potential refactoring of `TwoToOneCRH`

Our current TwoToOneCRH takes left_hash_bytes and right_hash_bytes as input. As you can see, it has a strict requirement that both left and right must be bytes.

This is unsatisfactory for Poseidon, as evidenced by the fact that @drewstone and @filiplazovic have been working hard to bypass this limitation.

One potential solution is to make sure that TwoToOneCRH is merely a compressing CRH, and its inputs are of the same format as its output:

pub trait TwoToOneCRH {
    type Output: ToBytes
        + Clone
        + Eq
        + core::fmt::Debug
        + Hash
        + Default
        + CanonicalSerialize
        + CanonicalDeserialize;
    type Parameters: Clone + Default;

    fn setup<R: Rng>(r: &mut R) -> Result<Self::Parameters, Error>;

    /// Evaluates this CRH on the left and right inputs.
    fn evaluate(
        parameters: &Self::Parameters,
        left_input: &Self::Output,
        right_input: &Self::Output,
    ) -> Result<Self::Output, Error>;
}

This one will work, except for one other limitation that we previously discussed: it prohibits the use of more than one hash function of different types. Yet we refactored the Merkle tree precisely to support this!

An example use case comes from Fractal, where ideally the bottom level of the Merkle tree is some sort of hash function that can be efficiently computed outside SNARK, e.g., Blake2b/s or another Poseidon.

So, we have two needs that are in conflict:

(1) provide a general framework such that different kinds of CRH can interact, e.g., making inputs [u8] and making outputs ToBytes;

(2) provide an efficient framework so that CRHs need not go through [u8] all the time.

Proposed solution 1: making TwoToOneCRH dual-interface

An alternative interface is to support both:

pub trait TwoToOneCRH {
    /// The bit size of the left input.
    const LEFT_INPUT_SIZE_BITS: usize;
    /// The bit size of the right input.
    const RIGHT_INPUT_SIZE_BITS: usize;

    type Output: ToBytes
        + Clone
        + Eq
        + core::fmt::Debug
        + Hash
        + Default
        + CanonicalSerialize
        + CanonicalDeserialize;
    type Parameters: Clone + Default;

    fn setup<R: Rng>(r: &mut R) -> Result<Self::Parameters, Error>;
    /// Evaluates this CRH on the left and right inputs.
    ///
    /// # Panics
    ///
    /// If `left_input.len() != Self::LEFT_INPUT_SIZE_BITS`, or if
    /// `right_input.len() != Self::RIGHT_INPUT_SIZE_BITS`, then this method panics.
    fn evaluate(
        parameters: &Self::Parameters,
        left_input: &[u8],
        right_input: &[u8],
    ) -> Result<Self::Output, Error>;

    /// Compress two CRH outputs.
    fn compress(
        parameters: &Self::Parameters,
        left_hash: &Self::Output,
        right_hash: &Self::Output,
    ) -> Result<Self::Output, Error>;
}

where compress is preferred except when touching the bottom layer.

This is still suboptimal, as there is still an unnecessary conversion between the leaf and non-leaf layers, even if they are both Poseidon.

Proposed solution 2: wrapping the LeafHash to match the TwoToOneCRH

We can provide a wrapper such that, when it wraps a LeafHash, it matches the LeafHash's output to the TwoToOneCRH's output.

Then, in the Merkle tree, we can require that LeafHash's output = TwoToOneCRH's output, and if the LeafHash is of a different format, we use a wrapper, like:

BytesToFieldElementWrapper<LeafHash>

If the LeafHash and TwoToOneCRH are the same, this wrapper is not needed.

Thoughts? @ValarDragon @tsunrise
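A std-only sketch of solution 2. Here `u64` stands in for a field element, and the trait shapes and all names (`ToyLeafHash`, `ToyCompressor`, the packing logic) are illustrative, not the library's actual API:

```rust
use std::marker::PhantomData;

// A byte-output leaf hash.
trait LeafHash {
    fn hash_leaf(leaf: &[u8]) -> Vec<u8>;
}

// A compressing two-to-one CRH whose inputs have the same type as its output.
trait TwoToOneCRH {
    fn compress(left: u64, right: u64) -> u64;
}

// The wrapper: adapts the LeafHash's byte output into the TwoToOneCRH's domain.
struct BytesToFieldElementWrapper<H: LeafHash>(PhantomData<H>);

impl<H: LeafHash> BytesToFieldElementWrapper<H> {
    fn hash_leaf(leaf: &[u8]) -> u64 {
        let bytes = H::hash_leaf(leaf);
        // Pack the first 8 bytes little-endian into the "field element".
        // A real wrapper would do a proper bytes-to-field-element conversion.
        bytes.iter().take(8).enumerate().fold(0u64, |acc, (i, &b)| acc | (b as u64) << (8 * i))
    }
}

struct ToyLeafHash;
impl LeafHash for ToyLeafHash {
    fn hash_leaf(leaf: &[u8]) -> Vec<u8> {
        // Toy: sum of bytes, little-endian encoded.
        (leaf.iter().map(|&b| b as u64).sum::<u64>()).to_le_bytes().to_vec()
    }
}

struct ToyCompressor;
impl TwoToOneCRH for ToyCompressor {
    fn compress(left: u64, right: u64) -> u64 {
        left.wrapping_mul(31).wrapping_add(right)
    }
}

fn main() {
    // The wrapped leaf hash now produces inputs the compressor can consume directly.
    let l = BytesToFieldElementWrapper::<ToyLeafHash>::hash_leaf(b"ab");
    let r = BytesToFieldElementWrapper::<ToyLeafHash>::hash_leaf(b"cd");
    let root = ToyCompressor::compress(l, r);
    assert_eq!(l, (b'a' as u64) + (b'b' as u64));
    assert_eq!(root, l.wrapping_mul(31).wrapping_add(r));
}
```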

Split permutation from sponge construction

Summary

I think it would be useful to split the current API into two parts, a lower-level part with access to the "raw" permutation, and a higher-level part with a sponge or other duplex construction.

Problem Definition

Currently, it's not possible for API clients to use the permutation directly, only via the provided sponge construction. This presents some difficulties:

  • For fixed-size hashing, users have to figure out how to specify the hashing behavior they want in terms of the duplex operations, instead of being able to write it directly. This makes it more likely that the code will include some behavior accidentally derived from the initial implementation choice.

  • It's not possible for users to implement any other duplex modes than what's already provided, or to implement duplex behavior that's differently specified than the provided one. For instance, the current implementation puts the capacity at the end of the state, but other instantiations of Poseidon hashing, like the C2SP proposal C2SP/C2SP#3 , put it at the beginning.

Proposal

Rework the current structures as follows:

PoseidonParameters: remains unchanged.1

PoseidonSpongeState: renamed to PoseidonState, and redefined as

pub struct PoseidonState<F: PrimeField> {
    pub state: Vec<F>,
    pub params: PoseidonParameters<F>,
}

In other words, the state structure includes the field elements in the state, as well as the parameters needed to run the permutation. The permute function is moved to a pub fn permute(&mut self) on PoseidonState.

PoseidonSponge: redefined as

pub struct PoseidonSponge<F: PrimeField> {
    pub state: PoseidonState<F>,
    pub mode: DuplexSpongeMode,
}

In other words, the sponge structure consists of a state, together with the extra data tracking how that state is being used to implement a higher-level duplex construction.

This allows providing two levels of API: a lower-level API that provides precise control over the permutation, and a higher-level API that provides a duplex or other construction. Users who want to have more "batteries included" functionality can use the higher-level construction, but it's possible to implement other constructions using the lower-level API.
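A runnable std-only sketch of the two-level layering (a toy arithmetic update stands in for the real Poseidon round function, and the field is replaced by `u64`):

```rust
#[derive(Clone)]
struct PoseidonParameters {
    rounds: usize,
}

// Lower level: the raw state plus the parameters needed to run the permutation.
struct PoseidonState {
    state: Vec<u64>,
    params: PoseidonParameters,
}

impl PoseidonState {
    // Lower-level API: direct access to the permutation.
    fn permute(&mut self) {
        for _ in 0..self.params.rounds {
            for s in self.state.iter_mut() {
                *s = s.wrapping_mul(6364136223846793005).wrapping_add(1); // toy round
            }
        }
    }
}

#[allow(dead_code)]
enum DuplexSpongeMode {
    Absorbing,
    Squeezing,
}

// Higher level: a duplex construction layered on top of the state.
struct PoseidonSponge {
    state: PoseidonState,
    mode: DuplexSpongeMode,
}

impl PoseidonSponge {
    fn absorb(&mut self, input: u64) {
        self.state.state[0] = self.state.state[0].wrapping_add(input);
        self.state.permute();
        self.mode = DuplexSpongeMode::Absorbing;
    }

    fn squeeze(&mut self) -> u64 {
        self.mode = DuplexSpongeMode::Squeezing;
        self.state.state[0]
    }
}

fn main() {
    let params = PoseidonParameters { rounds: 8 };
    // A user who only wants the raw permutation uses PoseidonState directly...
    let mut raw = PoseidonState { state: vec![1, 2, 3], params: params.clone() };
    raw.permute();
    // ...while "batteries included" users go through the sponge.
    let mut sponge = PoseidonSponge {
        state: PoseidonState { state: vec![0; 3], params },
        mode: DuplexSpongeMode::Absorbing,
    };
    sponge.absorb(7);
    assert_ne!(sponge.squeeze(), 7); // the permutation has mixed the state
}
```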


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Footnotes

  1. The doc comment about "and RNG used" should be eliminated, correct? I don't see any use of an RNG in the public API.

Pedersen CRH and Commitment always fails

There's a check that's done in the constraints impl of Pedersen CRH and commitment that appears here and here. It asserts that padded_input.len() * 8 == W::WINDOW_SIZE * W::NUM_WINDOWS.

For most window configurations, however, this check fails and the program panics. The problem is that you cannot always pad bytes to a specific bit length. In particular, if W::WINDOW_SIZE * W::NUM_WINDOWS is not divisible by 8, then Pedersen computations are guaranteed to fail every time.

This isn't a PR because I'm not sure of the best way to fix this. Currently my hacky workaround is to just pick NUM_WINDOWS = 8, but that makes things multiple times slower in the benchmarks for my current project.

Help trying to build an Asymmetric Encryption Gadget

Hi!
First of all, if this issue is not meant to be here, please let me know.

That being said, I'm toying with the many repos of arkworks to try and create a gadget for asymmetric encryption using the ElGamal scheme, to then prove said encryption happened.
I was mainly following the example from this test, but it generates 'msg' as a point on JubJub, right? I'd like to know if it is possible to use a string instead, that is, map the string to a point and then encrypt it.

I'm not that versed in cryptography, so correct me if I've said anything wrong, and feel free to point me in another direction if there's an easier way to do this.

Incremental Merkle Tree

As we are building Manta (https://github.com/Manta-Network), we figured out that we need an incremental Merkle tree with log N internal nodes, which is not supported by the current MerkleTree implementation. The properties that we need for this Merkle tree:

  1. Able to generate ZK membership proofs.
  2. The Merkle tree can be fixed-size and append-only (i.e., mark the unused leaf nodes as fillers, etc.).
  3. The number of stored internal nodes is log N (since we need to store the internal nodes on chain, O(N) storage would be too expensive).

I think 3 is only possible if this Merkle tree only supports appends. Happy to submit a PR on this. Would also like to hear whether there are any design considerations that need to be taken care of during the implementation.
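A std-only sketch of the classic append-only construction that stores O(depth) = O(log N) state: one "filled subtree" digest per level, plus precomputed digests of all-empty (filler) subtrees. `DefaultHasher` stands in for the CRH, and the struct/field names are illustrative:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Stand-in two-to-one hash.
fn h2(l: u64, r: u64) -> u64 {
    let mut h = DefaultHasher::new();
    (l, r).hash(&mut h);
    h.finish()
}

struct IncrementalMerkleTree {
    depth: usize,
    next_index: usize,
    filled: Vec<u64>, // per level: digest of the most recent completed left sibling
    zeros: Vec<u64>,  // per level: digest of an all-empty (filler) subtree
    root: u64,
}

impl IncrementalMerkleTree {
    fn new(depth: usize) -> Self {
        let mut zeros = vec![0u64]; // empty leaf digest
        for i in 0..depth {
            let z = zeros[i];
            zeros.push(h2(z, z));
        }
        let root = zeros[depth];
        IncrementalMerkleTree { depth, next_index: 0, filled: vec![0; depth], zeros, root }
    }

    fn append(&mut self, leaf: u64) {
        let mut idx = self.next_index;
        let mut cur = leaf;
        for level in 0..self.depth {
            if idx % 2 == 0 {
                // Left child: remember our digest; the sibling is the empty filler.
                self.filled[level] = cur;
                cur = h2(cur, self.zeros[level]);
            } else {
                // Right child: the left sibling was filled by an earlier append.
                cur = h2(self.filled[level], cur);
            }
            idx /= 2;
        }
        self.root = cur;
        self.next_index += 1;
    }
}

// Naive O(N) recomputation for checking: pad with empty leaves to 2^depth.
fn naive_root(leaves: &[u64], depth: usize) -> u64 {
    let mut layer: Vec<u64> = leaves.to_vec();
    layer.resize(1 << depth, 0);
    for _ in 0..depth {
        layer = layer.chunks(2).map(|p| h2(p[0], p[1])).collect();
    }
    layer[0]
}

fn main() {
    let mut tree = IncrementalMerkleTree::new(4);
    let leaves = [10, 20, 30, 40, 50];
    for (i, &leaf) in leaves.iter().enumerate() {
        tree.append(leaf);
        assert_eq!(tree.root, naive_root(&leaves[..=i], 4));
    }
}
```

Each append touches only one node per level, which is what makes on-chain storage of the tree state affordable.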

Refactor Pedersen CRH impl

Right now, PedersenCRH impls CRH for any ProjectiveCurve, and outputs a group element. However, for TE curves, we can project this to just the x coordinate. Supporting this generically requires the hacky "InjectiveMap" infrastructure, so we should instead just impl CRH for PedersenCRH<SWCurve> which outputs a group element (maybe compressed?) and impl CRH for PedersenCRH<TECurve> which outputs the x coordinate, thus getting rid of the current hacks.

Make explicit the RNG's security needed for CRH parameter generation

In generating the group elements for Pedersen hashes, we want these elements to have some sort of independence. This can be done by (1) using truly random bits (but not verifiable) or (2) using a cryptographic hash function, treated as a random oracle.

This, indeed, places a requirement on the security of RNG that is qualified to set up Pedersen hashes.

Currently, the CRH interface does not explicitly ask for this security. In fact, any RNG suffices.

    pub fn generator_powers<R: Rng>(num_powers: usize, rng: &mut R) -> Vec<G> {
        let mut cur_gen_powers = Vec::with_capacity(num_powers);
        let mut base = G::rand(rng);
        for _ in 0..num_powers {
            cur_gen_powers.push(base);
            base.double_in_place();
        }
        cur_gen_powers
    }

This would be problematic if the developers did not instantiate the CRH with a secure RNG, but instead used a weak RNG.

Such as in dpc's test:

#[test]
fn integration_test() {
    let mut rng = XorShiftRng::seed_from_u64(23472342u64);
    ...
}

Proposed solution:

Cryptographic parameters whose setup relies on a random oracle should say so explicitly.

The rand crate provides a trait that extends Rng, called CryptoRng. Note that this is only a marker, and it needs to be used together with Rng.
https://docs.rs/rand/0.5.0/rand/trait.CryptoRng.html

We likely should change a number of setup functions to explicitly ask for this level. Also, some `prove` and `index` functions may require this level of security.
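A self-contained sketch of the proposed fix. The traits below mirror rand's `RngCore`/`CryptoRng` shape but are defined locally so the example compiles on its own; the RNG implementations are toys:

```rust
// Locally defined stand-in for rand's RngCore.
trait RngCore {
    fn next_u64(&mut self) -> u64;
}

/// Locally defined stand-in for rand's CryptoRng marker trait:
/// implementors promise cryptographic-quality output.
trait CryptoRng {}

// A weak RNG, like the XorShiftRng in dpc's test. Note: no CryptoRng impl.
#[allow(dead_code)]
struct WeakXorShift(u64);
impl RngCore for WeakXorShift {
    fn next_u64(&mut self) -> u64 {
        self.0 ^= self.0 << 13;
        self.0 ^= self.0 >> 7;
        self.0 ^= self.0 << 17;
        self.0
    }
}

// Stand-in for an OS-backed CSPRNG; output is a placeholder for the sketch.
struct FakeOsRng(u64);
impl RngCore for FakeOsRng {
    fn next_u64(&mut self) -> u64 {
        self.0 = self.0.wrapping_add(1);
        self.0
    }
}
impl CryptoRng for FakeOsRng {}

/// Parameter setup now demands a cryptographically secure RNG: the extra
/// `CryptoRng` bound rejects weak RNGs at compile time.
fn setup<R: RngCore + CryptoRng>(rng: &mut R) -> Vec<u64> {
    (0..4).map(|_| rng.next_u64()).collect()
}

fn main() {
    let params = setup(&mut FakeOsRng(0));
    assert_eq!(params.len(), 4);
    // setup(&mut WeakXorShift(23472342)); // rejected: WeakXorShift is not CryptoRng
}
```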

Refactor Merkle Tree

Dev, Weikeng, and I have discussed how to refactor the Merkle tree. We plan to use these new APIs:

  • update_leaf updates original_leaf to updated_leaf and returns the updated Merkle tree root.
pub fn update_leaf<CS: ConstraintSystem<F>>(&self, 
        path: &[Boolean], 
        root: CRH_output, 
        auth_path: &[CRH_output], 
        original_leaf: &[MerkleTree_LeafType],
        updated_leaf: &[MerkleTree_LeafType]) -> Result<CRH_output, SynthesisError>
  • check_update will update the leaf and check whether the updated root equals new_root. (Constraints will not be satisfied if the check fails.)
pub fn check_update<CS: ConstraintSystem<F>>(&self, 
        path: &[Boolean], 
        old_root: CRH_output, 
        new_root: CRH_output,
        auth_path: &[CRH_output], 
        original_leaf: &[MerkleTree_LeafType],
        updated_leaf: &[MerkleTree_LeafType]) -> Result<(), SynthesisError>
  • check_membership verifies if the leaf is in the location of the Merkle tree specified by path.
    pub fn check_membership(&self, 
        path: &[Boolean], 
        root: CRH_output, 
        auth_path: &[CRH_output], 
        leaf: &[MerkleTree_LeafType]) -> Result<(), SynthesisError>
  • conditionally_check_membership will check membership if should_enforce is true.
pub fn conditionally_check_membership(&self, 
        path: &[Boolean], 
        root: CRH_output, 
        auth_path: &[CRH_output], 
        leaf: &[MerkleTree_LeafType],
        should_enforce: &Boolean
  ) -> Result<(), SynthesisError>

In native code, we will additionally have an update function and check_update function. Detailed APIs will be posted later.

For now, we will continue using two hash functions, LeafHasher and TwoToOneNodeHasher. In the future, we will possibly use different hashes in different layers for better out-of-circuit and in-circuit performance.

Build fails because "cannot locate remote-tracking branch 'origin/update-serialize'"

After a fresh clone, building locally with cargo build gives the following output:

    Updating git repository `https://github.com/arkworks-rs/curves`
    Updating git repository `https://github.com/arkworks-rs/algebra`
    Updating git repository `https://github.com/arkworks-rs/r1cs-std`
    Updating git repository `https://github.com/arkworks-rs/snark`
    Updating git repository `https://github.com/arkworks-rs/sponge`
error: failed to load source for dependency `ark-sponge`

Caused by:
  Unable to update https://github.com/arkworks-rs/sponge?branch=update-serialize

Caused by:
  failed to find branch `update-serialize`

Caused by:
  cannot locate remote-tracking branch 'origin/update-serialize'; class=Reference (4); code=NotFound (-3)

Add some sample MT configs

Currently it's a bit hard to use the MTs in this library. It would be really useful if we could have some out-of-the-box MT configs that people can use to get started with when hacking on new projects.

I think we could do this by having a second feature flag called "examples", which, if enabled, lets you import some example configs. (And then you also know where to look to copy/paste examples.)

We basically already do this privately for testing, so it would be great to get it exported.
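A sketch of what the feature wiring could look like in Cargo.toml (the "examples" feature name comes from this issue; everything else is illustrative):

```toml
# Sketch: an opt-in "examples" feature exposing sample Merkle tree configs.
[features]
default = []
examples = []   # would gate a module like `merkle_tree::examples`

# Downstream users would opt in with:
# [dependencies]
# ark-crypto-primitives = { version = "...", features = ["examples"] }
```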

Merkle tree API does not support r1cs + no_std

Seems that I forgot to handle R1CS in a no-std context. The result is that r1cs does not work with no-std.

I will fix that shortly. Sorry for the inconvenience.

(I think in the future, we need to update the CI to ensure r1cs compiles with no-default-features. )

Add a CHANGELOG

This is to prepare for the release for 0.3 since there are some breaking changes.

On my way.

Add a `test_sponge` public function

For documentation and testing purposes, it would be great to obtain a sponge without much code overhead.
The simplest thing to do is to move the poseidon_sponge_for_test methods from ark-poly-commit, which are pub(crate) functions for testing there, and re-expose them under test_sponge in the root module here.

Alternatively this could go into ark-std, similar to test_rng: https://github.com/arkworks-rs/std/blob/4cee4bc18ff6b0159c00d48622181db145938601/src/rand_helper.rs#L37

Inconsistencies in the Blake2s implementation

In #103, the parameters function was removed from the Blake2sWithParameterBlock. However, for prf::blake2s::constraints::evaluate_blake2s_with_parameters, we need the parameters formatted as [u32; 8] as an argument, which would have been exposed by calling the removed function.

Additionally, the non-constraint evaluate function now only runs in MAC mode. Even if no key is provided, a full zero-padded block is prefixed. This differs from the constraint behavior.

Filecoin Poseidon/Neptune

Summary

Filecoin has an optimized implementation of Poseidon that has gone through an audit by ABDK Consulting (Mikhail Vladimirov and Dmitry Khovratovich):

https://github.com/filecoin-project/neptune/blob/master/spec/poseidon_spec.pdf

It seems to include two changes: (1) the treatment of the round constants and (2) the use of sparse MDS matrices.

Fortunately, Neptune is dual-licensed MIT/Apache-2.0.

Problem Definition

It may be interesting to look at their implementation and see whether it is compatible with the current implementation, and whether or not it can bring performance improvements.

Proposal

Analyze the compatibility and decide whether or not to implement it.


For Admin Use

  • Not duplicate issue
  • Appropriate labels applied
  • Appropriate contributors tagged
  • Contributor assigned/self-assigned

Minimize need for repeated generic arguments.

Using many gadgets requires repeating type parameters: once for the native version, and once for the constraint version. However, this seems redundant as usually there's only a single gadget for a particular primitive. So, it makes sense to tie the gadget to the native impl. We can do this via an extension trait, as follows:

pub trait CRHGadget<ConstraintF: Field>: CRH {
    type ParametersVar: AllocGadget<Self::Parameters>;
    type InputVar: AllocGadget<Self::Input>;
    type OutputVar: AllocGadget<Self::Output>;

    fn evaluate(params: &Self::ParametersVar, input: &[UInt8]) -> R1CSResult<Self::OutputVar>;
}

impl CRHGadget for PedersenCRH { ... } // Note: this is implemented for *PedersenCRH*, not PedersenCRHGadget.

We can invoke this as

fn hash<ConstraintF: Field, H: CRHGadget<ConstraintF>>(...) {
    H::evaluate(&params, &input)?;
}

We've greatly reduced the number of type parameters. This works for gadget traits that don't themselves contain variables. What about Var traits, which do contain variables? We can still use them, as follows:

trait FieldExt {
    type Var: FieldVar<...>; // figure out what to put in the type parameters
}

impl FieldExt for FpN {
    ...
}

This would allow us to access the "unique" variable type for a given field element, without introducing multiple type parameters.
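As a self-contained illustration of the pattern (toy traits and a made-up hash, not the real arkworks CRH/CRHGadget APIs), the extension trait is implemented on the native type itself, so a single type parameter drives both the native and the constraint versions:

```rust
// Simplified native trait, standing in for the real CRH trait.
trait CRH {
    type Output;
    fn evaluate(input: &[u8]) -> Self::Output;
}

// The gadget trait extends the native trait instead of being free-standing.
trait CRHGadget: CRH {
    type OutputVar;
    fn evaluate_gadget(input: &[u8]) -> Self::OutputVar;
}

struct PedersenCRH;

impl CRH for PedersenCRH {
    type Output = u64;
    fn evaluate(input: &[u8]) -> u64 {
        // Toy stand-in for a real hash function.
        input
            .iter()
            .fold(0u64, |acc, b| acc.wrapping_mul(31).wrapping_add(*b as u64))
    }
}

// Note: implemented for PedersenCRH, not a separate PedersenCRHGadget type.
impl CRHGadget for PedersenCRH {
    type OutputVar = u64;
    fn evaluate_gadget(input: &[u8]) -> u64 {
        // In R1CS this would allocate variables and enforce constraints.
        Self::evaluate(input)
    }
}

// One type parameter now gives access to both versions.
fn hash_both<H: CRHGadget>(input: &[u8]) -> (H::Output, H::OutputVar) {
    (H::evaluate(input), H::evaluate_gadget(input))
}
```

Here the toy gadget just mirrors the native computation; in the real library evaluate_gadget would build constraints, but the supertrait relationship is the same.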

Feedback on this idea is very welcome!

Parallelize the Merkle tree implementation

It would be nice if we could enable parallelization of the Merkle tree implementation (more specifically, the commitment phase).
In a few projects I am working on, the Merkle tree portion accounts for a majority of the runtime, and it seems like an obvious target for parallelization.

Ideally, we would gate this behind a feature flag (as it is done for parallel FFTs in algebra).

The parallelization is slightly non-trivial (naively, we would pick a layer k, compute the 2^k subtrees rooted at that layer in parallel, and then compute the remaining smaller Merkle tree over their roots), but it should be relatively doable.
There may be more clever ways to do this.
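A minimal sketch of the layered strategy described above (toy hash, scoped threads standing in for rayon; purely illustrative, not the library's code):

```rust
use std::thread;

/// Toy two-to-one compression function, standing in for a real hash.
fn hash_pair(l: u64, r: u64) -> u64 {
    l.wrapping_mul(0x100000001b3).wrapping_add(r) ^ 0x9e3779b97f4a7c15
}

/// Sequential Merkle root over a power-of-two number of leaves.
fn merkle_root(leaves: &[u64]) -> u64 {
    if leaves.len() == 1 {
        return leaves[0];
    }
    let mid = leaves.len() / 2;
    hash_pair(merkle_root(&leaves[..mid]), merkle_root(&leaves[mid..]))
}

/// Parallel version: compute the 2^k subtrees rooted at layer k in
/// parallel, then fold the remaining small tree over their roots.
fn parallel_merkle_root(leaves: &[u64], k: u32) -> u64 {
    assert!(leaves.len().is_power_of_two() && leaves.len() >= 1 << k);
    let chunks: Vec<&[u64]> = leaves.chunks(leaves.len() >> k).collect();
    let subtree_roots: Vec<u64> = thread::scope(|s| {
        let handles: Vec<_> = chunks
            .iter()
            .map(|chunk| s.spawn(move || merkle_root(chunk)))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    });
    merkle_root(&subtree_roots)
}
```

Because both versions build the same binary tree, the parallel root matches the sequential one; in the library this would sit behind a feature flag, as with the parallel FFTs in algebra.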

Rescue for Marlin

It might be worthwhile for us to consider Rescue, in addition to Poseidon.

The main reason is as follows:

  • In Marlin and a number of universal-setup primitives, the dominating measure is constraint-system weight, not the number of constraints.
  • Poseidon uses many partial rounds to save S-box computations, while the remaining operations are mostly linear combinations. In Marlin, however, linear combinations are not cheap either.
  • Rescue has no partial rounds, and its number of full rounds is in a reasonable range.

When trying to find optimized parameters for Poseidon, one experiment found the optimal point at alpha = 257, which is slightly too large.
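To see the effect of a large alpha, a back-of-the-envelope estimate (our own illustration, not from the experiments above): with plain square-and-multiply, computing the S-box x^alpha takes floor(log2(alpha)) squarings plus popcount(alpha) - 1 extra multiplications, and each multiplication is roughly one R1CS constraint.

```rust
// Rough per-S-box constraint count for x^alpha under square-and-multiply:
// floor(log2(alpha)) squarings + (popcount(alpha) - 1) multiplications.
fn sbox_constraints(alpha: u32) -> u32 {
    assert!(alpha >= 1);
    (31 - alpha.leading_zeros()) + alpha.count_ones() - 1
}
```

Under this estimate, x^5 costs 3 constraints per S-box while x^257 costs 9, which is why alpha = 257 is considered slightly too large.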

Based on the analysis in the Rescue paper, Rescue might be a better choice than Poseidon for Marlin. More numerical analysis is needed.

Efficiency aside, it might be good to have implementations of both Poseidon and Rescue: they share similarities but follow two different design strategies, and thus could be a good foundation for future hash functions.

Refactoring `MDS` component

Summary

Refactor the MDS component in Poseidon so that it can also be used by Rescue, which derives its MDS matrix deterministically, e.g. from a Vandermonde-style matrix such as:

Proposal

[              1               1               1               1               1               1               1               1]
[              1               5              25             125             625            3125           15625           78125]
[              1              25             625           15625          390625         9765625       244140625      6103515625]
[              1             125           15625         1953125       244140625     30517578125   3814697265625 476837158203125]
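The 4×8 matrix above is a Vandermonde matrix with entry (i, j) = (5^i)^j. A sketch of how such a matrix could be generated (over the integers here for simplicity; a real implementation would work over the field and, for Rescue, reduce the matrix further):

```rust
/// Build a rows x cols Vandermonde matrix whose i-th evaluation point
/// is base^i, so entry (i, j) = (base^i)^j. Integer arithmetic only,
/// purely for illustration.
fn vandermonde(rows: usize, cols: usize, base: u128) -> Vec<Vec<u128>> {
    (0..rows)
        .map(|i| {
            let x = base.pow(i as u32);
            (0..cols).map(|j| x.pow(j as u32)).collect()
        })
        .collect()
}
```

With rows = 4, cols = 8, base = 5, this reproduces the matrix quoted above.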

Concretely:

pub(crate) struct MDS {
    ...
}

impl Default for MDS {
    // use the Rescue method
}

impl MDS {
    fn sample<R: Rng>(rng: &mut R) -> Self {
        // use the Poseidon method
        // also need to decide if we want to do those three checks
    }
}


Add Rescue for the next release

Summary

We have a branch, rescue, which contains an implementation of Rescue.

Merging it will take some effort, and the implementation is also unverified, so this is left as a todo for the next release.

