lnp-bp / client_side_validation

Standard implementation of client-side-validation APIs

Home Page: https://docs.rs/client_side_validation

License: Apache License 2.0

Rust 97.78% Shell 0.13% TeX 1.20% Nix 0.89%
client-side-validation distributed-computing lnp-bp bitcoin single-use-seals distributed-systems

client_side_validation's People

Contributors

afilini, crisdut, dr-orlovsky, h4sh3d, inaltoasinistra, kaiwolfram, kixunil, nicbus, rajarshimaitra, ukolovaolga, yancyribbens, yanganto, zhiqiangxu, zoedberg


client_side_validation's Issues

When will `strict_encoding` reach a stable release?

I've been building some functionality on the master branch of lnp-node and ran into a nasty build error in the dependency strict_encoding 0.9.0. It looks like this:

error[E0432]: unresolved import `bitcoin::XpubIdentifier`
  --> /home/joemphilips/.cargo/registry/src/github.com-1ecc6299db9ec823/strict_encoding-0.9.0/src/bitcoin.rs:32:21
   |
32 |     XOnlyPublicKey, XpubIdentifier,
   |                     ^^^^^^^^^^^^^^
   |                     |
   |                     no `XpubIdentifier` in the root
   |                     help: a similar name exists in the module: `PubIdentifier`

I'm not sure where this `PubIdentifier` suggestion comes from; I could not find the name with `git log -p` in rust-bitcoin.

Anyway, I tried to debug it by forking strict_encoding, which used to live in this repository.
But I suppose it has been moved to https://github.com/strict-types/strict-encoding now?
If so, forking this repository does not seem like a good idea.
Should I just wait until the strict-encoding in the new repository becomes stable and the other lnp-bp/RGB libraries depend on it? How long should I expect to wait?

Increase multi-protocol commitment tree size limit

Currently, MPC trees (LNPBP-4) are limited to a maximum depth of 2^4 = 16, i.e. they may contain up to 2^16 = 65536 elements. The original design assumed that this is enough to host all assets which may be allocated to a single UTXO.

However, tests show that even 127 different assets may not fit UNIQUELY into a tree of width 65536 (the position of an asset is its 256-bit asset id modulo the size of the tree), so in practice we can assume only around 64 assets to be assignable to the same UTXO.

Here I propose extending the MPC tree depth to 32 and the tree width to 2^32, so that the tree may host up to ~16000 separate assets (number of assets < 2^(depth/2 - 2) = 2^14).
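Illustratively (this is only a sketch of the arithmetic described above, not the LNPBP-4 implementation): since the tree width is a power of two, reducing the 256-bit id modulo the width amounts to taking its low bits.

// Sketch only: leaf position = asset id mod tree width (width = 2^depth).
fn mpc_position(asset_id: [u8; 32], depth: u8) -> u64 {
    assert!(depth <= 32, "depth range assumed for this sketch");
    let width: u64 = 1 << depth;
    // Interpret the low 8 bytes of the id as an integer (byte order chosen
    // arbitrarily here; the consensus rules fix the real interpretation).
    let low = u64::from_le_bytes(asset_id[..8].try_into().unwrap());
    low % width
}

fn main() {
    let a = [0xAAu8; 32];
    let b = [0xBBu8; 32];
    // With depth 16 there are only 65536 slots, so independent ids collide
    // with birthday-problem probability; depth 32 gives 2^32 slots.
    println!("depth 16: {} / {}", mpc_position(a, 16), mpc_position(b, 16));
    println!("depth 32: {} / {}", mpc_position(a, 32), mpc_position(b, 32));
}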

CommitmentId produces different results in Wasm32 target

Hi @dr-orlovsky,

I noticed that the CommitmentId::commitment_id function returns different values depending on the compilation target.

I discovered this while testing rgb-schemata, and I created a test that can help with the bug investigation:

#[cfg(test)]
mod test {
    use super::*;

    #[test]
    fn show_schema_id() {
        let expected = "5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG";
        let schema = nia_schema();
        let id = schema.schema_id().to_string();
        assert_eq!(expected, id); // OK!
    }
}

#[cfg(target_arch = "wasm32")]
mod test_wasm32 {
    use super::*;
    use wasm_bindgen_test::*;
    wasm_bindgen_test_configure!(run_in_browser);
    
    #[wasm_bindgen_test]
    fn show_schema_id() {
        let expected = "5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG";
        let schema = nia_schema();
        let id = schema.schema_id().to_string();
        assert_eq!(expected, id); // Fail!
        /*
          panicked at 'assertion failed: `(left == right)`
          left: `"5YWfKW3CqANHsKqpxy3HaCpt5bgvsMXHUuXiHpoynEYG"`,
          right: `"12vPR9qthiLpPgoytGBbeerVYFKvdCXEUmbg891HVyHV"`', src/main.rs:143:9
        */
    }
}

Hope this helps.

WASM support

stens fails to compile to wasm32-unknown-unknown due to its dependency amplify_syn.

StrictEncoding issue with enum tuple struct field initialization

When fixing some clippy lints on the descriptor wallet, I ran into an issue here:

[Screenshot: clippy `init_numbered_fields` warning, 2022-02-09]

Docs on the linter rule are here:
https://rust-lang.github.io/rust-clippy/master/index.html#init_numbered_fields

Normally I wouldn't file a feature request just to satisfy a linter, but this might be indicative of some sort of oversight. Feel free to close if we're comfortable with using #![allow(clippy::init_numbered_fields)] in all projects using strict encoding.
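For context, a minimal illustration (not actual descriptor-wallet or strict_encoding code) of what the lint complains about on tuple variants, and the positional form clippy prefers:

// Hypothetical tuple enum, used only to demonstrate the lint.
enum Amount {
    Sats(u64),
}

fn decode(value: u64) -> Amount {
    // clippy::init_numbered_fields fires on the numbered-field initializer...
    #[allow(clippy::init_numbered_fields)]
    let flagged = Amount::Sats { 0: value };
    // ...while the positional constructor is the form the lint suggests.
    let _preferred = Amount::Sats(value);
    flagged
}

fn main() {
    let _ = decode(42);
}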

broken MerkleBlock conceal procedure

While running the rgb-lib tests we found an intermittent issue that seems very similar to RGB-WG/rgb-node#208, but I'm not sure it's the same.

The error we receive is MerkleBlock conceal procedure is broken and comes from ~/.cargo/git/checkouts/client_side_validation-8dff74d720902144/46cb8a7/commit_verify/src/mpc/block.rs:418:21.

@dr-orlovsky could you please investigate this?

Implement resolve_seal_issues

/// Method used to check whether seal resolution issues are resolved, for
/// instance allowing to try a different resolver, re-check after restored
/// network connectivity or after some time allowing unmined transactions to
/// be mined.
pub fn resolve_seal_issues<Resolver>(&mut self, _resolver: &mut Resolver)
where
    Resolver: SealResolver<
        <R::SealIssue as SealIssue>::SingleUseSeal,
        Error = R::SealIssue,
    >,
{
    todo!()
}
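For reference, a minimal self-contained sketch of the retry behaviour the doc comment describes; the traits and types below are simplified stand-ins, not the crate's actual SealResolver/SealIssue definitions:

// Simplified stand-in for the crate's resolver trait (illustrative only).
trait SealResolver<Seal> {
    type Error;
    fn resolve(&mut self, seal: &Seal) -> Result<(), Self::Error>;
}

// Hypothetical status holder keeping the seals that previously failed to resolve.
struct Status<Seal> {
    unresolved_seals: Vec<Seal>,
}

impl<Seal> Status<Seal> {
    /// Re-checks previously failed seals with a (possibly different) resolver,
    /// e.g. after network connectivity is restored or transactions get mined;
    /// seals that now resolve are dropped from the issue list.
    fn resolve_seal_issues<R>(&mut self, resolver: &mut R)
    where
        R: SealResolver<Seal>,
    {
        self.unresolved_seals
            .retain(|seal| resolver.resolve(seal).is_err());
    }
}

// Trivial resolver that accepts every seal, just to exercise the sketch.
struct AlwaysOk;
impl SealResolver<u32> for AlwaysOk {
    type Error = ();
    fn resolve(&mut self, _seal: &u32) -> Result<(), ()> { Ok(()) }
}

fn main() {
    let mut status = Status { unresolved_seals: vec![1u32, 2, 3] };
    status.resolve_seal_issues(&mut AlwaysOk);
    assert!(status.unresolved_seals.is_empty());
}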

Strict encoding should provide explicit guarantees on collection sizes

The main strict encoding rule was to restrict the size of any collection (string, vector of data, map) to 2^16 elements, i.e. to have the length encoded in 2 bytes. This happens implicitly, which leads to non-obvious bugs missed in code reviews.

To address the issue it is proposed to first perform #97 and then, in addition, do the following:

  • Abandon confined_encoding for unordered hash collections (HashSet, HashMap);
  • Remove confined_encoding from all other collection types;
  • Introduce new types for collections, specifically ConfinedString, ConfinedVec, ConfinedSet, ConfinedMap (backed by the inner types String, Vec, BTreeSet, BTreeMap), and require all consensus code to explicitly use them (a sketch follows below).
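A minimal sketch of what such a confined collection could look like; the names and API below are illustrative, not the proposed crate's. The size bound is checked explicitly on construction, so the 2-byte length prefix can never silently truncate:

use std::convert::TryFrom;

/// Illustrative confined vector: length is guaranteed to fit into a u16.
pub struct ConfinedVec<T> {
    inner: Vec<T>,
}

impl<T> TryFrom<Vec<T>> for ConfinedVec<T> {
    type Error = &'static str;
    fn try_from(inner: Vec<T>) -> Result<Self, Self::Error> {
        if inner.len() > u16::MAX as usize {
            return Err("collection exceeds the 2^16 element limit");
        }
        Ok(Self { inner })
    }
}

impl ConfinedVec<u8> {
    /// Encodes the 2-byte little-endian length followed by the data.
    pub fn encode(&self) -> Vec<u8> {
        let mut out = Vec::with_capacity(2 + self.inner.len());
        out.extend_from_slice(&(self.inner.len() as u16).to_le_bytes());
        out.extend_from_slice(&self.inner);
        out
    }
}

fn main() {
    let ok = ConfinedVec::try_from(vec![0u8; 10]).unwrap();
    assert_eq!(ok.encode().len(), 12);
    // Oversized collections are rejected explicitly instead of being
    // silently mis-encoded.
    assert!(ConfinedVec::<u8>::try_from(vec![0u8; 70_000]).is_err());
}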

Decide on compressed pubkey/signature strict encoding

Currently, bitcoin public keys/signatures allow encoding in both compressed and uncompressed form, depending on the value of the compression flag, but do not encode the flag itself. Secp256k1 keys and ECDSA signatures do not allow uncompressed encoding.

This

  • makes the two types incompatible;
  • creates unnecessary space consumption with the uncompressed option.

It is proposed to prohibit uncompressed keys and make the bitcoin and secp256k1 key/signature implementations identical. If an uncompressed flag is required, it must be represented as a separate field within the structure.

Alternative: use bitcoin-based data types that always encode the compression flag, but serialize even uncompressed keys in compressed form.
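A minimal sketch of the proposed rule using the rust secp256k1 crate (an assumed dependency here, not necessarily how the crate would implement it): the key is always written as its 33-byte compressed serialization, so there is no flag to encode.

use secp256k1::{PublicKey, Secp256k1, SecretKey};

/// Always emits the 33-byte compressed SEC encoding; the representation is
/// identical for bitcoin and secp256k1 key wrappers built on top of it.
fn strict_encode_pubkey(pk: &PublicKey) -> [u8; 33] {
    pk.serialize()
}

fn main() {
    let secp = Secp256k1::new();
    let sk = SecretKey::from_slice(&[0x01; 32]).expect("valid key");
    let pk = PublicKey::from_secret_key(&secp, &sk);
    assert_eq!(strict_encode_pubkey(&pk).len(), 33);
}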

Refactor strict_encoding crate

The original intent of strict_encoding was to provide an encoding standard for client-side-validation. That is why the crate is part of this repository and is used everywhere inside RGB and AluVM.

However, the crate has become much more widely used, including in commercial projects well outside the scope of client-side-validation. That happened because Rust has no other good compact binary encoding that guarantees determinism in data size and ordering.

Widespread use of strict_encoding creates pressure to add more dependencies and types to it, since Rust's foreign-type (orphan) rules prohibit deriving the strict encoding traits on types from other dependencies downstream. While the new dependencies are optional, they still bloat the codebase and require constant releases of new versions.
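To illustrate the constraint (hypothetical trait and type names): the orphan rule forbids implementing a foreign trait for a foreign type, so support for a type from another crate must live either in that crate or in the encoding crate itself; the usual downstream workaround is a newtype wrapper.

// Hypothetical stand-in for a trait like StrictEncode defined in an
// upstream encoding crate.
mod encoding_crate {
    pub trait Encode {
        fn encode(&self) -> Vec<u8>;
    }
}

// A downstream crate cannot write `impl encoding_crate::Encode for
// some_other_crate::ForeignType` — both the trait and the type would be
// foreign, which the orphan rule rejects. The workaround is a newtype:
use encoding_crate::Encode;
use std::net::Ipv4Addr; // stands in for a type from another dependency

struct Ipv4Wrapper(Ipv4Addr);

impl Encode for Ipv4Wrapper {
    fn encode(&self) -> Vec<u8> {
        self.0.octets().to_vec()
    }
}

fn main() {
    let bytes = Ipv4Wrapper(Ipv4Addr::LOCALHOST).encode();
    assert_eq!(bytes, vec![127, 0, 0, 1]);
}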

On the other hand, the client-side-validation requirements for the encoding are quite specific; for instance, we use bitcoin consensus encoding in many places, and that encoding is neither consistent with, nor takes the same approach as, the encoding of other types.

So basically, using strict encoding for anything other than client-side-validation, or adding more types here, should not be advised. Nevertheless, this is really hard to prevent, especially given the existing adoption of the crate.

The solution I propose is the following:

  • Fork the strict encoding crate into two different crates with different purposes: strict_encoding and confined_encoding;
  • Move strict encoding outside of this repository, to https://github.com/Internet2-wg/ (where it will be used for instance by AluVM);
  • Migrate all client-side-validation, BP Core and RGB Core libraries to use exclusively confined_encoding;
  • Let any other project use strict_encoding (including descriptor wallet etc);
  • Remove use of any custom encodings (like bitcoin consensus etc) from the strict_encoding;
  • Remove TLV support from the confined_encoding;
  • Move encoding_derive_helpers to https://github.com/rust-amplify/

This will restrict the use of the new confined_encoding crate to client-side-validation consensus code only. Any non-consensus code (including descriptor wallet etc) MUST NOT use confined_encoding. confined_encoding must be ossified, and new types must not be added to it.
