
zipson's Issues

Incorrect result when compressing a nested empty object in a template

I have two similar objects, each containing an empty object. When compressing with zipson, part of the state disappears.

I wrote two tests in test/full/object.ts:

it('nested empty object', function() {
  testPackUnpack({ a: { 1: {} }, c: 42 })
})

it('nested empty object template', function() {
  testPackUnpack({ a: { 1: {} }, b: { 1: {} }, c: 42 })
})

The second test fails:

1) object
       nested empty template:

      unpacked integrity
      + expected - actual

       {
      -  "a": {}
      -  "b": {}
      +  "a": {
      +    "1": {}
      +  }
      +  "b": {
      +    "1": {}
      +  }
         "c": 42
       }

Question: what would be a use-case for this?

First of all: thank you for your lib!
This is not meant as a rude question; I just want to better understand when it would make sense to use your lib.

When I have a pretty big JS object on the client and I want to send it to the server, is zipson useful then, or is gzip doing the same thing so that I don't really have to worry about it? Does zipson + gzip still save more than gzip alone?

Or does it "only" make sense when you are streaming big chunks of data between client and server?

Clarification on `parseIncremental()`

First of all, thanks for the great work on this library; it's a life saver for my project.

I'm wondering if you can clarify what parseIncremental() does?

I have a rather large JSON file and I'm hoping to parse it incrementally in the background as if I'm streaming in the data.

The name seems to suggest parseIncremental() is what I'm looking for, but after reading the documentation it's unclear whether that's actually what it does. Can you expand on the use case for this function?

PS: I know there's also the Zipson stream library, but I need to be able to stream it in a browser environment
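For what it's worth, this is the calling pattern I had in mind. The names here are hypothetical, and this sketch only buffers chunks before parsing at the end rather than spreading parse work across them, so it illustrates the shape of the API I'm hoping for, not zipson's internals:

```javascript
// Hypothetical incremental-parse shape: push chunks as they arrive,
// finalize once the stream ends. All names are illustrative only.
function makeIncrementalParser() {
  let buffer = '';
  return {
    push(chunk) { buffer += chunk; },        // called once per streamed chunk
    finish() { return JSON.parse(buffer); }, // runs the actual parse at the end
  };
}

const parser = makeIncrementalParser();
for (const chunk of ['{"a":', '1,"b":', '2}']) parser.push(chunk);
const result = parser.finish();
console.log(result); // { a: 1, b: 2 }
```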

Uint8Array convert

JSON.parse can convert a Uint8Array directly to a string and parse it.
Could you implement that as well?
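In the meantime, a workaround using only the standard TextDecoder (available in both Node and browsers) is to decode the bytes to a string first:

```javascript
// Decode UTF-8 bytes to a string before handing them to a string-based parser.
const bytes = new TextEncoder().encode('{"a": 1}'); // stand-in for received binary data
const text = new TextDecoder('utf-8').decode(bytes);
const obj = JSON.parse(text); // with zipson this would be parse(text) instead
console.log(obj.a); // 1
```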

RangeError: Maximum call stack size exceeded

I'm occasionally seeing the following exception thrown from zipson:

(node:17412) UnhandledPromiseRejectionWarning: RangeError: Maximum call stack size exceeded
    at buildTemplate (C:\workspaces\server\node_modules\zipson\lib\compressor\template\object.js:51:23)
    at new TemplateObject (C:\workspaces\server\node_modules\zipson\lib\compressor\template\object.js:13:33)
    at Object.compressObject [as object] (C:\workspaces\server\node_modules\zipson\lib\compressor\object.js:11:26)
    at Object.compressAny [as any] (C:\workspaces\server\node_modules\zipson\lib\compressor\any.js:31:21)
    at Object.compressObject [as object] (C:\workspaces\server\node_modules\zipson\lib\compressor\object.js:32:29)
    at Object.compressAny [as any] (C:\workspaces\server\node_modules\zipson\lib\compressor\any.js:31:21)
    at Object.compressObject [as object] (C:\workspaces\server\node_modules\zipson\lib\compressor\object.js:32:29)
    at Object.compressAny [as any] (C:\workspaces\server\node_modules\zipson\lib\compressor\any.js:31:21)
    at Object.compressObject [as object] (C:\workspaces\server\node_modules\zipson\lib\compressor\object.js:32:29)
    at Object.compressAny [as any] (C:\workspaces\server\node_modules\zipson\lib\compressor\any.js:31:21)
It doesn't appear to stop it from working though - the client still decodes correctly - but it would be nice not to see this in the logs.

many thanks
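The frames in the trace alternate between compressObject and compressAny, which suggests recursion that scales with nesting depth. A self-contained illustration (not zipson code) of how a deeply nested object overflows the call stack:

```javascript
// Build an object nested far deeper than the default call stack allows.
const root = {};
let cursor = root;
for (let i = 0; i < 1e6; i++) {
  cursor.child = {};
  cursor = cursor.child;
}

// Any naive recursive walk over it then throws, matching the report above.
function depth(node) {
  return node.child ? 1 + depth(node.child) : 0;
}

let error;
try {
  depth(root);
} catch (e) {
  error = e;
}
console.log(error instanceof RangeError); // true
```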

Is this great project still maintained?

I just did a quick benchmark of zipson vs various serialisers + compressors.

The results are (length in bytes, time in ms; 100 iterations to warm up and 100 iterations to measure):

    ┌─────────┬───────────────────┬────────┬─────────────────────┬──────────────────────┐
    │ (index) │       name        │ length │     encodeTime      │      decodeTime      │
    ├─────────┼───────────────────┼────────┼─────────────────────┼──────────────────────┤
    │    0    │  'zipson brotli'  │  417   │     2.52860035      │      0.12007231      │
    │    1    │ 'zipson deflate'  │  475   │ 0.20293473999999997 │      0.11486058      │
    │    2    │ 'msgpack brotli'  │  576   │     3.92180084      │      0.05375433      │
    │    3    │   'json brotli'   │  599   │  4.567029679999999  │      0.07787885      │
    │    4    │ 'msgpack deflate' │  627   │      0.2011609      │      0.06081639      │
    │    5    │   'zipson lz4'    │  642   │     0.38702751      │  0.6388705200000001  │
    │    6    │   'msgpack lz4'   │  690   │ 0.6580354100000001  │      0.6165729       │
    │    7    │  'json deflate'   │  711   │ 0.10077317999999999 │ 0.053146879999999994 │
    │    8    │     'zipson'      │  1121  │     0.14132796      │      0.09936688      │
    │    9    │    'json lz4'     │  1123  │     0.13947896      │      0.47471737      │
    │   10    │     'msgpack'     │  1640  │     0.09119614      │      0.03184669      │
    │   11    │      'json'       │  2328  │     0.02867283      │      0.02063535      │
    └─────────┴───────────────────┴────────┴─────────────────────┴──────────────────────┘

So, zipson combined with deflate is fast and saves a lot of storage!

For the sake of our climate, you should continue your great work!!!

Handle exponential floats

Hello.
Thanks for the great library. It achieves good compression with great performance.

The only issue I've faced is compressing exponential numbers.
For integers with a positive exponent it seems to work correctly, but it produces huge strings:

console.log(stringify(1e+123));
// ¢D6NHK4g4X000000000000000000000000000000000000000000000000000000000000

For a negative exponent it produces wrong output:

console.log(stringify(1e-123));
// £0.0

I'll fix this and provide an MR.
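In case it helps the MR, a compact representation that avoids both failure modes is to keep the mantissa and exponent separate instead of expanding the digits. This is just a sketch of the idea, not zipson's actual encoding:

```javascript
// Encode a number as mantissa/exponent so 1e+123 stays short
// and 1e-123 is not truncated to 0.0.
function encodeExponential(n) {
  const [mantissa, exponent] = n.toExponential().split('e');
  return mantissa + 'e' + exponent;
}

console.log(encodeExponential(1e+123)); // "1e+123"
console.log(encodeExponential(1e-123)); // "1e-123"
console.log(Number(encodeExponential(1e-123)) === 1e-123); // true
```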

Typo in package.json types field?

Hi,

I think there is a typo in the package.json types field.
It references the file lib/index.d.js, but only lib/index.d.ts exists.

It would be great if this could be fixed, if I am right.

Thanks!

Cheers
Johannes
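If I'm right, the fix would be a one-character change to the types field in package.json (assuming the declaration file ships at lib/index.d.ts):

```json
{
  "types": "lib/index.d.ts"
}
```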

fullPrecisionFloats not working in some cases

Hi, I have used the lib and it significantly reduces the size of the object.
However, I noticed that even with fullPrecisionFloats set to true,
some attributes' 13th decimal places were still rounded to 12 decimal places. This might sound a little picky, but I am wondering whether this is a JavaScript limitation or whether it can be avoided somehow.

The weird part is that some of the fields have 17 or even 18 decimal places, yet they remain identical to the original data.

  • workflow: original file -> stringify -> parsed

Left: original file; Right: parsed file
[Screenshot: Screen Shot 2022-07-15 at 11 07 07 am]
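This is most likely a limitation of JavaScript itself rather than the library: numbers are IEEE 754 doubles, which carry roughly 15-17 significant digits counted from the first non-zero digit, not from the decimal point. That would explain both observations: values near 1 lose precision around the 13th-16th decimal place, while tiny values can faithfully keep 17+ decimal places because their leading zeros cost nothing. A quick demonstration:

```javascript
// Doubles have a 53-bit mantissa: about 15-17 significant decimal digits.
// Integers above 2^53 already collapse to the same double:
console.log(9007199254740993 === 9007199254740992); // true

// The same budget applies to fractions: it is the 16th-17th *significant*
// digit that gets rounded, as in the classic example below.
console.log(0.1 + 0.2); // 0.30000000000000004
```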

Error: Unexpected scalar {VARIABLE} at {VARIABLE}-{VARIABLE}

I have a larger data set that I'm trying to decompress, and I'm getting this error message:

Error: Unexpected scalar f at 11696307-11696343
at Object.decompressScalar (/app/node_modules/zipson/lib/decompressor/scalar.js:98:11)
at Object.decompressElement (/app/node_modules/zipson/lib/decompressor/element.js:15:32)
at Object.decompressStages (/app/node_modules/zipson/lib/decompressor/stages.js:78:28)
at Object.decompress (/app/node_modules/zipson/lib/decompress.js:32:14)
at Object.parse (/app/node_modules/zipson/lib/index.js:16:25)
at Response. (/app/build/app.js:3549:49)
at Request. (/app/node_modules/aws-sdk/lib/request.js:364:18)
at Request.callListeners (/app/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
at Request.emit (/app/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
at Request.emit (/app/node_modules/aws-sdk/lib/request.js:683:14)
at Request.transition (/app/node_modules/aws-sdk/lib/request.js:22:10)
at AcceptorStateMachine.runTo (/app/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /app/node_modules/aws-sdk/lib/state_machine.js:26:10
at Request. (/app/node_modules/aws-sdk/lib/request.js:38:9)
at Request. (/app/node_modules/aws-sdk/lib/request.js:685:12)
at Request.callListeners (/app/node_modules/aws-sdk/lib/sequential_executor.js:116:18)
