msgpack-javascript's Introduction

MessagePack for JavaScript/ECMA-262


This library is an implementation of MessagePack for TypeScript and JavaScript, providing a compact and efficient binary serialization format. Learn more about MessagePack at:

https://msgpack.org/

This library serves as a comprehensive reference implementation of MessagePack for JavaScript with a focus on accuracy, compatibility, interoperability, and performance.

Additionally, this is a universal JavaScript library. It is compatible not only with browsers, but also with Node.js and other JavaScript engines that implement the ES2015+ standards. As it is written in TypeScript, this library bundles up-to-date type definition files (.d.ts).

Note that this is the second edition of "MessagePack for JavaScript". The first edition, which was implemented in ES5 and never released to npmjs.com, is tagged as classic.

Synopsis

import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

const object = {
  nil: null,
  integer: 1,
  float: Math.PI,
  string: "Hello, world!",
  binary: Uint8Array.from([1, 2, 3]),
  array: [10, 20, 30],
  map: { foo: "bar" },
  timestampExt: new Date(),
};

const encoded: Uint8Array = encode(object);

deepStrictEqual(decode(encoded), object);

Install

This library is published to npmjs.com as @msgpack/msgpack.

npm install @msgpack/msgpack

API

encode(data: unknown, options?: EncoderOptions): Uint8Array

It encodes data as a single MessagePack-encoded object and returns the result as a Uint8Array byte array. It throws an error if data is, or includes, a non-serializable object such as a function or a symbol.

For example:

import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });
console.log(encoded);

If you'd like to convert a Uint8Array to a NodeJS Buffer, use Buffer.from(arrayBuffer, offset, length) so that the underlying ArrayBuffer is not copied; note that Buffer.from(uint8array) copies it:

import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });

// `buffer` refers to the same ArrayBuffer as `encoded`.
const buffer: Buffer = Buffer.from(encoded.buffer, encoded.byteOffset, encoded.byteLength);
console.log(buffer);

EncoderOptions

Name | Type | Default
--- | --- | ---
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec
context | user-defined | -
useBigInt64 | boolean | false
maxDepth | number | 100
initialBufferSize | number | 2048
sortKeys | boolean | false
forceFloat32 | boolean | false
forceIntegerToFloat | boolean | false
ignoreUndefined | boolean | false
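
For example, a minimal sketch of passing a few of these options to encode() (the particular values are illustrative, not recommendations):

import { encode } from "@msgpack/msgpack";

const encoded = encode(
  { b: 2, a: 1, c: undefined },
  {
    sortKeys: true, // encode map keys in sorted order for canonical output
    ignoreUndefined: true, // skip map entries whose value is undefined
    initialBufferSize: 4096, // pre-allocate a larger internal buffer
  },
);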

decode(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): unknown

It decodes buffer, which contains a MessagePack-encoded object, and returns the decoded object typed as unknown.

buffer must be an array of bytes, typically a Uint8Array or ArrayBuffer. BufferSource is defined as ArrayBuffer | ArrayBufferView.

The buffer must contain exactly one encoded object. If the buffer includes extra bytes after an object, or if the buffer is empty, it throws a RangeError. To decode a buffer that contains multiple encoded objects, use decodeMulti() or decodeMultiStream() (recommended) instead.

For example:

import { decode } from "@msgpack/msgpack";

declare const encoded: Uint8Array;
const object = decode(encoded);
console.log(object);

NodeJS Buffer is also acceptable because it is a subclass of Uint8Array.
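
For example, a minimal sketch (here the Buffer stands in for MessagePack data read from a file or a socket):

import { decode } from "@msgpack/msgpack";

declare const buffer: Buffer; // a MessagePack-encoded NodeJS Buffer
const object = decode(buffer); // accepted because Buffer is a subclass of Uint8Array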

DecoderOptions

Name | Type | Default
--- | --- | ---
extensionCodec | ExtensionCodec | ExtensionCodec.defaultCodec
context | user-defined | -
useBigInt64 | boolean | false
maxStrLength | number | 4_294_967_295 (UINT32_MAX)
maxBinLength | number | 4_294_967_295 (UINT32_MAX)
maxArrayLength | number | 4_294_967_295 (UINT32_MAX)
maxMapLength | number | 4_294_967_295 (UINT32_MAX)
maxExtLength | number | 4_294_967_295 (UINT32_MAX)

You can use max${Type}Length to limit the length of each type decoded.
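
For example, a minimal sketch of tightening these limits for untrusted input (the numbers are illustrative):

import { decode } from "@msgpack/msgpack";

declare const encoded: Uint8Array;
const object = decode(encoded, {
  maxStrLength: 1024 * 1024, // reject strings longer than 1 MiB
  maxBinLength: 1024 * 1024, // reject bin data longer than 1 MiB
  maxArrayLength: 100_000, // reject arrays with more than 100,000 elements
  maxMapLength: 100_000, // reject maps with more than 100,000 entries
});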

decodeMulti(buffer: ArrayLike<number> | BufferSource, options?: DecoderOptions): Generator<unknown, void, unknown>

It decodes buffer that includes multiple MessagePack-encoded objects, and returns decoded objects as a generator. See also decodeMultiStream(), which is an asynchronous variant of this function.

This function is not recommended for decoding a MessagePack binary arriving via an I/O stream (such as a socket) because it is synchronous. Instead, decodeMultiStream() decodes a binary stream asynchronously, typically using less CPU and memory.

For example:

import { decodeMulti } from "@msgpack/msgpack";

declare const encoded: Uint8Array;

for (const object of decodeMulti(encoded)) {
  console.log(object);
}

decodeAsync(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): Promise<unknown>

It decodes stream, where ReadableStreamLike<T> is defined as ReadableStream<T> | AsyncIterable<T>, as an async sequence of byte arrays, and returns the decoded object typed unknown, wrapped in a Promise.

This function works asynchronously, and it might use CPU and memory resources more efficiently compared with the synchronous decode(), because it does not wait for the download to complete before it starts decoding.

This function is designed to work with whatwg fetch() like this:

import { decodeAsync } from "@msgpack/msgpack";

const MSGPACK_TYPE = "application/x-msgpack";

const response = await fetch(url);
const contentType = response.headers.get("Content-Type");
if (contentType && contentType.startsWith(MSGPACK_TYPE) && response.body != null) {
  const object = await decodeAsync(response.body);
  // do something with object
} else { /* handle errors */ }

decodeArrayStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>

It is similar to decodeAsync(), but it only accepts a stream that contains a single array of items, and it emits the decoded items one by one.

For example:

import { decodeArrayStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>;

// in an async function:
for await (const item of decodeArrayStream(stream)) {
  console.log(item);
}

decodeMultiStream(stream: ReadableStreamLike<ArrayLike<number> | BufferSource>, options?: DecoderOptions): AsyncIterable<unknown>

It is similar to decodeAsync() and decodeArrayStream(), but the input stream must consist of multiple MessagePack-encoded items. This is the asynchronous variant of decodeMulti().

In other words, it can decode an unbounded stream, emitting decoded items one by one.

For example:

import { decodeMultiStream } from "@msgpack/msgpack";

declare const stream: AsyncIterable<Uint8Array>;

// in an async function:
for await (const item of decodeMultiStream(stream)) {
  console.log(item);
}

This function has been available since v2.4.0; it was previously called decodeStream().

Reusing Encoder and Decoder instances

The Encoder and Decoder classes are provided so that you can get better performance by reusing instances:

import { deepStrictEqual } from "assert";
import { Encoder, Decoder } from "@msgpack/msgpack";

const encoder = new Encoder();
const decoder = new Decoder();

const object = { foo: "bar" }; // any MessagePack-encodable value
const encoded: Uint8Array = encoder.encode(object);
deepStrictEqual(decoder.decode(encoded), object);

According to our benchmark, reusing an Encoder instance is about 20% faster than the encode() function, and reusing a Decoder instance is about 2% faster than the decode() function. Note that the results will vary depending on the environment and the data structure.

Encoder and Decoder take the same options as encode() and decode() respectively.
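
For example, a minimal sketch, assuming the constructors accept an options object shaped like EncoderOptions and DecoderOptions above:

import { Encoder, Decoder } from "@msgpack/msgpack";

const encoder = new Encoder({ sortKeys: true, initialBufferSize: 4096 });
const decoder = new Decoder({ maxStrLength: 1024 * 1024 });

declare const object: unknown;
const bytes = encoder.encode(object);
const roundTripped = decoder.decode(bytes);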

Extension Types

To handle MessagePack Extension Types, this library provides ExtensionCodec class.

Here is an example that sets up custom extension types to handle the Map and Set classes in TypeScript:

import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

const extensionCodec = new ExtensionCodec();

// Set<T>
const SET_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: SET_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Set) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<unknown>;
    return new Set(array);
  },
});

// Map<T>
const MAP_EXT_TYPE = 1; // Any in 0-127
extensionCodec.register({
  type: MAP_EXT_TYPE,
  encode: (object: unknown): Uint8Array | null => {
    if (object instanceof Map) {
      return encode([...object], { extensionCodec });
    } else {
      return null;
    }
  },
  decode: (data: Uint8Array) => {
    const array = decode(data, { extensionCodec }) as Array<[unknown, unknown]>;
    return new Map(array);
  },
});

const encoded = encode([new Set<any>(), new Map<any, any>()], { extensionCodec });
const decoded = decode(encoded, { extensionCodec });

Ensure you include your extensionCodec in any recursive encode and decode statements!

Note that extension types for custom objects must be in the range [0, 127], while [-128, -1] is reserved for MessagePack itself.

ExtensionCodec context

When you use an extension codec, it might be necessary to have encoding/decoding state to keep track of which objects got encoded/re-created. To do this, pass a context to the EncoderOptions and DecoderOptions:

import { encode, decode, ExtensionCodec } from "@msgpack/msgpack";

class MyContext {
  track(object: any) { /*...*/ }
}

class MyType { /* ... */ }

const extensionCodec = new ExtensionCodec<MyContext>();

// MyType
const MYTYPE_EXT_TYPE = 0; // Any in 0-127
extensionCodec.register({
  type: MYTYPE_EXT_TYPE,
  encode: (object, context) => {
    if (object instanceof MyType) {
      context.track(object); // <-- like this
      return encode(object.toJSON(), { extensionCodec, context });
    } else {
      return null;
    }
  },
  decode: (data, extType, context) => {
    const decoded = decode(data, { extensionCodec, context });
    const my = new MyType(decoded);
    context.track(my); // <-- and like this
    return my;
  },
});

// and later
import { encode, decode } from "@msgpack/msgpack";

const context = new MyContext();

const encoded = encode({ myType: new MyType() }, { extensionCodec, context });
const decoded = decode(encoded, { extensionCodec, context });

Handling BigInt with ExtensionCodec

This library does not handle BigInt by default, but you have two options to handle it:

  • Set useBigInt64: true to map bigint to MessagePack's int64/uint64
  • Define a custom ExtensionCodec to map bigint to a MessagePack extension type

useBigInt64: true is the simplest way to handle bigint, but it has limitations:

  • A bigint is encoded as 8 bytes even if it is a small integer
  • A bigint must be no larger than the maximum value of uint64 and no smaller than the minimum value of int64; otherwise the behavior is undefined
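
For example, a minimal round-trip sketch with useBigInt64 (note that in this mode int64/uint64 values are decoded as bigint, not number):

import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

const value = { id: 1234567890123456789n }; // too large for a safe integer
const encoded = encode(value, { useBigInt64: true });
deepStrictEqual(decode(encoded, { useBigInt64: true }), value);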

So you might want to define a custom codec to handle bigint like this:

import { deepStrictEqual } from "assert";
import { encode, decode, ExtensionCodec, DecodeError } from "@msgpack/msgpack";

// to define a custom codec:
const BIGINT_EXT_TYPE = 0; // Any in 0-127
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: BIGINT_EXT_TYPE,
  encode(input: unknown): Uint8Array | null {
    if (typeof input === "bigint") {
      if (input <= Number.MAX_SAFE_INTEGER && input >= Number.MIN_SAFE_INTEGER) {
        return encode(Number(input));
      } else {
        return encode(String(input));
      }
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): bigint {
    const val = decode(data);
    if (!(typeof val === "string" || typeof val === "number")) {
      throw new DecodeError(`unexpected BigInt source: ${val} (${typeof val})`);
    }
    return BigInt(val);
  },
});

// to use it:
const value = BigInt(Number.MAX_SAFE_INTEGER) + BigInt(1);
const encoded = encode(value, { extensionCodec });
deepStrictEqual(decode(encoded, { extensionCodec }), value);

The temporal module as timestamp extensions

There is a proposal for new date/time representations in JavaScript (the Temporal proposal).

This library maps Date to the MessagePack timestamp extension by default, but you can re-map the temporal module (or Temporal Polyfill) to the timestamp extension like this:

import { Instant } from "@std-proposal/temporal";
import { deepStrictEqual } from "assert";
import {
  encode,
  decode,
  ExtensionCodec,
  EXT_TIMESTAMP,
  encodeTimeSpecToTimestamp,
  decodeTimestampToTimeSpec,
} from "@msgpack/msgpack";

// to define a custom codec
const extensionCodec = new ExtensionCodec();
extensionCodec.register({
  type: EXT_TIMESTAMP, // override the default behavior!
  encode(input: unknown): Uint8Array | null {
    if (input instanceof Instant) {
      const sec = input.seconds;
      const nsec = Number(input.nanoseconds - BigInt(sec) * BigInt(1e9));
      return encodeTimeSpecToTimestamp({ sec, nsec });
    } else {
      return null;
    }
  },
  decode(data: Uint8Array): Instant {
    const timeSpec = decodeTimestampToTimeSpec(data);
    const sec = BigInt(timeSpec.sec);
    const nsec = BigInt(timeSpec.nsec);
    return Instant.fromEpochNanoseconds(sec * BigInt(1e9) + nsec);
  },
});

// to use it
const instant = Instant.fromEpochMilliseconds(Date.now());
const encoded = encode(instant, { extensionCodec });
const decoded = decode(encoded, { extensionCodec });
deepStrictEqual(decoded, instant);

If the temporal module is standardized, this behavior will become the default in a future major version of this library.

Decoding a Blob

Blob is a binary data container provided by browsers. To read its contents, you can use Blob#arrayBuffer() or Blob#stream(). Blob#stream() is recommended if your target platform supports it, because streaming decode should be faster for large objects. Either way, you need to use an asynchronous API.

import { decode, decodeAsync } from "@msgpack/msgpack";

async function decodeFromBlob(blob: Blob): Promise<unknown> {
  if (blob.stream) {
    // Blob#stream(): ReadableStream<Uint8Array> (recommended)
    return await decodeAsync(blob.stream());
  } else {
    // Blob#arrayBuffer(): Promise<ArrayBuffer> (if stream() is not available)
    return decode(await blob.arrayBuffer());
  }
}

MessagePack Specification

This library is compatible with the "August 2017" revision of the MessagePack specification, at the point where the timestamp ext type was added:

  • str/bin separation, added at August 2013
  • extension types, added at August 2013
  • timestamp ext type, added at August 2017

The living specification is here:

https://github.com/msgpack/msgpack

Note that as of June 2019 there is no official "version" of the MessagePack specification. See msgpack/msgpack#195 for the discussion.

MessagePack Mapping Table

The following table shows how JavaScript values are mapped to MessagePack formats and vice versa.

The mapping of integers varies depending on the useBigInt64 setting.

With the default, useBigInt64: false, the mapping is:

Source Value | MessagePack Format | Value Decoded
--- | --- | ---
null, undefined | nil | null (*1)
boolean (true, false) | bool family | boolean (true, false)
number (53-bit int) | int family | number
number (64-bit float) | float family | number
string | str family | string
ArrayBufferView | bin family | Uint8Array (*2)
Array | array family | Array
Object | map family | Object (*3)
Date | timestamp ext family | Date (*4)
bigint | N/A | N/A (*5)
  • *1 Both null and undefined are mapped to the nil (0xC0) type, and are decoded into null
  • *2 Any ArrayBufferView, including NodeJS's Buffer, is mapped to the bin family and decoded into Uint8Array
  • *3 In handling Object, it is regarded as Record<string, unknown> in terms of TypeScript
  • *4 MessagePack timestamps may have nanoseconds, which will be lost when decoded into a JavaScript Date. This behavior can be overridden by registering -1 for the extension codec.
  • *5 bigint is not supported in useBigInt64: false mode, but you can define an extension codec for it.
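
As an illustration of rows *1 and *2 in the table above, here is a minimal round-trip sketch (the assertions simply restate the table):

import { deepStrictEqual } from "assert";
import { encode, decode } from "@msgpack/msgpack";

// undefined is encoded as nil and decoded back as null (*1)
deepStrictEqual(decode(encode(undefined)), null);

// an ArrayBufferView is encoded as bin and decoded as a Uint8Array over the same bytes (*2)
const f64 = Float64Array.from([1.5, 2.5]);
deepStrictEqual(decode(encode(f64)), new Uint8Array(f64.buffer));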

If you set useBigInt64: true, the following mapping is used:

Source Value | MessagePack Format | Value Decoded
--- | --- | ---
null, undefined | nil | null
boolean (true, false) | bool family | boolean (true, false)
number (32-bit int) | int family | number
number (except for the above) | float family | number
bigint | int64 / uint64 | bigint (*6)
string | str family | string
ArrayBufferView | bin family | Uint8Array
Array | array family | Array
Object | map family | Object
Date | timestamp ext family | Date
  • *6 If the bigint is larger than the max value of uint64 or smaller than the min value of int64, then the behavior is undefined.

Prerequisites

This is a universal JavaScript library that supports major browsers and NodeJS.

ECMA-262

  • ES2015 language features
  • ES2015-ES2022 standard library features, including:
    • Typed arrays (ES2015)
    • Async iterations (ES2018)
    • Features added in ES2015-ES2022
  • whatwg encodings (TextEncoder and TextDecoder)

The ES2022 standard library features used in this library can be polyfilled with core-js.

IE11 is no longer supported. If you'd like to use this library in IE11, use v2.x versions.

NodeJS

NodeJS v14 is required.

TypeScript Compiler / Type Definitions

This module requires type definitions for AsyncIterator, BufferSource, WHATWG streams, and so on. They are provided by "lib": ["ES2021", "DOM"] in tsconfig.json.
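
For reference, a minimal tsconfig.json along those lines might look like this (an illustrative sketch, not this project's actual configuration):

{
  "compilerOptions": {
    "moduleResolution": "node",
    "lib": ["ES2021", "DOM"]
  }
}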

Regarding the TypeScript compiler version, only the latest TypeScript is tested in development.

Benchmark

Run-time performance is not the only reason to use MessagePack, but it is an important factor when choosing a MessagePack library, so a benchmark suite is provided to monitor the performance of this library.

V8's built-in JSON has been improved over the years; in particular, JSON.parse() was significantly improved in V8 7.6, making it the fastest deserializer as of 2019, as the benchmark results below suggest.

However, MessagePack handles binary data effectively, and actual performance depends on the situation. You should benchmark your own use case if performance matters.

Benchmark on NodeJS/v18.1.0 (V8/10.1)

operation | op | ms | op/s
--- | --- | --- | ---
buf = Buffer.from(JSON.stringify(obj)); | 902100 | 5000 | 180420
obj = JSON.parse(buf.toString("utf-8")); | 898700 | 5000 | 179740
buf = require("msgpack-lite").encode(obj); | 411000 | 5000 | 82200
obj = require("msgpack-lite").decode(buf); | 246200 | 5001 | 49230
buf = require("@msgpack/msgpack").encode(obj); | 843300 | 5000 | 168660
obj = require("@msgpack/msgpack").decode(buf); | 489300 | 5000 | 97860
buf = /* @msgpack/msgpack */ encoder.encode(obj); | 1154200 | 5000 | 230840
obj = /* @msgpack/msgpack */ decoder.decode(buf); | 448900 | 5000 | 89780

Note that JSON cases use Buffer to emulate I/O where a JavaScript string must be converted into a byte array encoded in UTF-8, whereas MessagePack modules deal with byte arrays.

Distribution

NPM / npmjs.com

The NPM package distributed in npmjs.com includes both ES2015+ and ES5 files:

  • dist/ is compiled into ES2019 with CommonJS, provided for NodeJS v10
  • dist.es5+umd/ is compiled into ES5 with UMD
    • dist.es5+umd/msgpack.min.js - the minified file
    • dist.es5+umd/msgpack.js - the non-minified file
  • dist.es5+esm/ is compiled into ES5 with ES modules, provided for webpack-like bundlers and NodeJS's ESM-mode

If you use NodeJS and/or webpack, their module resolvers use the suitable one automatically.

CDN / unpkg.com

This library is available via CDN:

<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>

It loads the MessagePack module into the global object.
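
For example, a minimal sketch of using it from a page (this assumes the UMD bundle exposes a global named MessagePack):

<script crossorigin src="https://unpkg.com/@msgpack/msgpack"></script>
<script>
  // MessagePack here is assumed to be the global exported by the UMD bundle above
  const encoded = MessagePack.encode({ foo: "bar" });
  console.log(MessagePack.decode(encoded));
</script>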

Deno Support

You can use this module on Deno.

See example/deno-*.ts for examples.

deno.land/x is not supported yet.

Maintenance

Testing

For simple testing:

npm run test

Continuous Integration

This library uses Travis CI.

test matrix:

  • TypeScript targets
    • target=es2019 / target=es5
  • JavaScript engines
    • NodeJS, browsers (Chrome, Firefox, Safari, IE11, and so on)

See test:* in package.json and .travis.yml for details.

Release Engineering

# run tests on NodeJS, Chrome, and Firefox
make test-all

# edit the changelog
code CHANGELOG.md

# bump version
npm version patch|minor|major

# run the publishing task
make publish

Updating Dependencies

npm run update-dependencies

License

Copyright 2019 The MessagePack community.

This software uses the ISC license:

https://opensource.org/licenses/ISC

See LICENSE for details.


msgpack-javascript's Issues

Async decoding

Hi @gfx, I've been looking at the async decoding code, and it looks like it is not doing streaming decoding, but instead just waits until the buffer is fully available and then decodes it. Am I right?

I think we need to add actual stream decoding. It can speed up decoding significantly for large payloads.

Support for type information using TypeScript decorators?

Hi, thanks for your work on this library.
I have one question:
Do you have any plans to add features to serialize/deserialize using specific types, like the type reflection in C#, using TypeScript decorators and the reflect-metadata API?
https://www.typescriptlang.org/docs/handbook/decorators.html
The goal is to be able to encode/decode a class in a custom way, as with IMessagePackFormatter<T> in https://github.com/neuecc/MessagePack-CSharp.

First argument must be a Buffer (msgpack + forge aes encryption)

I am trying to execute this code, which does the following:

  • read a JSON file

  • convert the JSON to MessagePack

  • encrypt the result

  • decrypt the result

  • convert the MessagePack back to normal JSON (this is where my error occurs)

var fs = require('fs');
var forge = require('node-forge');
var msgpack = require('msgpack');

file = []
x= null
fs.readFile(process.cwd() + "/test.json", function(err, data)
{
  file  = data
  // json = JSON.stringify(JSON.parse(file),null,2);
  packedFile = msgpack.pack(file)
  key = 'SdI6qAn1602f942'

  // encrypt some bytes using ECB mode
  var cipher = forge.cipher.createCipher('AES-ECB', key);
  cipher.start();
  cipher.update(forge.util.createBuffer(packedFile));
  cipher.finish();
  var encrypted = cipher.output;

  // decrypt some bytes using ECB mode
  var decipher = forge.cipher.createDecipher('AES-ECB', key);
  decipher.start();
  decipher.update(encrypted);
  decipher.finish(); // check 'result' for true/false
  // outputs decrypted
  console.log(msgpack.unpack(decipher.output.data)) 
});

But I am getting this error in this line:

console.log(msgpack.unpack(decipher.output.data)) 
Uncaught TypeError: First argument must be a Buffer

My test file is just a JSON file:

{
"testin": "test"
}

Add "module" to package.json build for ES modules

Please would it be possible to add a module field in the package.json for an ES2019 bundle?

https://github.com/rollup/rollup/wiki/pkg.module

At the moment it seems like only browser and main are supported, targeting UMD and CommonJS respectively?

At the moment we are using rollup and the line that causes the trouble in bundling (using the commonJs() rollup plugin) is this one:

process.env.TEXT_ENCODING !== "never" && typeof TextEncoder !== "undefined" && typeof TextDecoder !== "undefined";

TextDecoder for string decoding

Hi @gfx, have you considered using TextDecoder for string decoding? I was able to achieve a significant performance improvement (up to 30%).

It's available in most browsers and in the Node.js engine.

Decoding only part of a stream?

Scenario: I have a small app where I store a binary blob. The binary blob has metadata associated with it, so, I think: I will store first some sort of header, and then the binary blob. I decide that the header should be a msgpack item. So I encode() my header, write() the result to a file, and then write() my binary blob. (There are reasons why I do not simply include the binary blob inside the msgpack item.)

When it comes time to read my file back in, I get a ReadableStream for the file, I call decodeAsync() on the ReadableStream, and… I get an error, Extra 512 of 529 byte(s) found at buffer[17]. Which, yes, that is expected, I put it there.

My only options for decoding msgpack seem to be decode/decodeAsync, which error if there is extra data at the end of the stream; and decodeStream, which understands there can be many consecutive data items but assumes they are all msgpack.
I can decode a single msgpack item at the start of a stream by doing for await (const idk of decodeStream(fileStream)) { and then doing a break inside the loop, but if I do this I find fileStream is exhausted (0 remaining bytes), so I cannot resume reading from the stream after that first msgpack item. (And I don't have a way of knowing how many bytes the first msgpack item was, so I can't even start over from the beginning and skip past it.)
Alternately, I can do the for await trick and attempt to read from the stream inside the loop after msgPack has decoded one item, but this won't work either because ReadableStreams are only allowed to have one Reader at a time.

How should I proceed? It seems like "read one item from this stream, but allow me to still use the stream when you are done with it" is not a particularly outlandish use case, but it does not seem to be supported.

`undefined` becomes `null` after deserializing

const msgpack = require('@msgpack/msgpack');

let original = {x: undefined};
let serialized = msgpack.encode(original, {extensionCodec});
let deserialized = msgpack.decode(serialized, {extensionCodec});
console.log(original, deserialized, original.x === deserialized.x);

prints "{ x: undefined } { x: null } false". This is a problem because now code checking for strict equality fails.

I was also not able to handle this in an extension codec because that is only called for non-primitive values.

ignoreUndefined: true doesn't work in all cases:

let original = undefined;
let serialized = msgpack.encode(original, {ignoreUndefined: true});
let deserialized = msgpack.decode(serialized, {ignoreUndefined: true});
console.log(original, deserialized, original === deserialized); // undefined null false

This is stated in the readme, but is there no way to retain this with an extension codec?

Module not in registry

When I run npm i @msgpack/msgpack, an error occurs:
(screenshot of the npm error output)

Before today everything was OK. Has anyone run into the same issue?

decode async problem

I'm trying to return
decodeAsync(response.body) and I get:

core.js:7187 ERROR Error: Uncaught (in promise): TypeError: e.getReader is not a function
TypeError: e.getReader is not a function

Too difficult to use decodeStream in browser

I pack multiple data items into a single message. When I try to unpack with decode(), I get an error.

Uncaught RangeError: Extra 6 of 7 byte(s) found at buffer[1]

I understand that in order to decode a multi-part message I must instead use decodeStream(); however, it seems this method is designed for Node users and is not very practical to use in the browser, since streams support is scarce. In particular, I don't know how to convert a number[]/Uint8Array to a ReadableStreamLike.

I suggest there should be an alternative non-streams method to decode multi-part messages, such as decodeMultiple(), or just to permit calling decode() multiple times until all messages are processed, instead of assuming extra bytes are an error.
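
For illustration, a rough sketch of what the suggested non-streams API could look like (decodeMultiple is the hypothetical function proposed above, not an existing export of this library):

// decodeMultiple is hypothetical; declared here only to sketch the proposed usage
declare function decodeMultiple(buffer: Uint8Array): Generator<unknown, void, unknown>;

declare const multiPartPayload: Uint8Array; // several MessagePack items back to back

for (const item of decodeMultiple(multiPartPayload)) {
  console.log(item);
}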

No ES6 Map support?

If I run the following test program with msgpack-javascript 2.1.0:

import { encode, decode } from "@msgpack/msgpack"
const map = new Map()
map.set("a", 1)
map.set("b", 2)
map.set("c", 3)
const encoded = encode(map,{initialBufferSize:128})
const decoded = decode(encoded)
console.log({map, encoded, decoded})

The output is:

(screenshot of the console output, showing the Map encoded as an empty object)

Conclusion: Asking msgpack-javascript to encode a Map causes it to encode it as if it were an empty object.

Supporting Maps in some capacity would be good because if anything it is better to store Maps in msgpack than to store objects. Sometimes deserializing in JavaScript can cause problems or even security flaws if keys overwrite "special" javascript objects like hasOwnProperty or __proto__. Here is an example of such a problem. I suspect msgpack-javascript itself will not have any such security flaws, but it is easy for client code to have such security flaws and the security issues can be sidestepped by using Maps instead of objects.

Fetch API with msgpack

Does anybody have a working example of using msgpack with the fetch API and promises?
The docs say decode can take an ArrayBuffer.
Here is what I tried among other variations:

  fetch('/data.msgpack').then(function(response) {
    return response.arrayBuffer();
  }).then(function(buffer) {
    data = msgpack.decode(buffer);
    // ...
  }).catch(function(error) {
    console.log(error.toString());
  });

The error is "invalid type: undefined".
I am sure the solution is quite close but I can't seem to find it.
I tried returning response.body, response.Blob() or response.text() as well.

I had this working with json and yaml before but I want to switch to msgpack.

Thank you in advance for any suggestion.

Factory to build encode and decode functions to share encoders and decoders

I'm thinking about a factory function that creates { encode, decode } functions, just like as https://github.com/mcollina/msgpack5

The interface is something like this:

import { factory } from "@msgpack/msgpack";
const { encode, decode } = factory({
  extensionCodec, // common options
  encodeOptions, // encoder options
  decodeOptions, // decoder options
});

The factory pattern is also adopted by msgpack-ruby, which is designed by the primary author of MessagePack.

"(!) `this` has been rewritten to `undefined`" error when bundling a TypeScript IIFE

This is the relevant rollup config:

{
        input: fileName,
        output: {
            file: "build/js/bubblewrap.js",
            format: "iife",
            name: "bubblewrap"

        },
        plugins: [
            commonjs(),
            nodeResolve(),
            sucrase({
                exclude: ["node_modules/**"],
                transforms: ["typescript"]
            })
        ]
        
    }

And these are the warnings:

(!) `this` has been rewritten to `undefined`
https://rollupjs.org/guide/en/#error-this-is-undefined
node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
1: var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, generator) {
                    ^
2:     function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }
3:     return new (P || (P = Promise))(function (resolve, reject) {
...and 3 other occurrences
node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs
1: var __extends = (this && this.__extends) || (function () {
                    ^
2:     var extendStatics = function (d, b) {
3:         extendStatics = Object.setPrototypeOf ||
...and 11 other occurrences
node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs
1: // utility for whatwg streams
2: var __generator = (this && this.__generator) || function (thisArg, body) {
                      ^
3:     var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;
4:     return g = { next: verb(0), "throw": verb(1), "return": verb(2) }, typeof Symbol === "function" && (g[Symbol.iterator] = function() { return this; }), g;
...and 5 other occurrences

I have the impression this is related to #104, but it might be totally different. Any idea?

Can't build with webpack

I'm trying to build my project with @msgpack/[email protected] and it is not possible (error log below).

When I downgrade to version 2.2.1 it works again; do you understand why that is?

Can't import the named export '__asyncGenerator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs 12:11-27
Can't import the named export '__asyncGenerator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 142:35-48
Can't import the named export '__asyncValues' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 213:35-48
Can't import the named export '__asyncValues' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 215:49-56
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 234:45-52
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 261:45-52
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs 24:41-48
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs 28:41-48
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs 32:41-48
Can't import the named export '__await' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs 7:11-20
Can't import the named export '__awaiter' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 133:15-24
Can't import the named export '__awaiter' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 32:4-13
Can't import the named export '__extends' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs 9:15-26
Can't import the named export '__generator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 114:15-26
Can't import the named export '__generator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 135:19-30
Can't import the named export '__generator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/Decoder.mjs 205:19-30
Can't import the named export '__generator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts

ERROR in ./node_modules/@msgpack/msgpack/dist.es5+esm/utils/stream.mjs 14:15-26
Can't import the named export '__generator' from non EcmaScript module (only default export is available)
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/decodeAsync.mjs
 @ ./node_modules/@msgpack/msgpack/dist.es5+esm/index.mjs
 @ ./src/services/helpers/rabbitmq.ts
 @ ./src/index.ts
npm ERR! code ELIFECYCLE
npm ERR! errno 2
npm ERR! [email protected] build: `webpack`
npm ERR! Exit status 2
npm ERR!
npm ERR! Failed at the [email protected] build script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.

npm ERR! A complete log of this run can be found in:
npm ERR!     /root/.npm/_logs/2021-04-07T16_47_29_571Z-debug.log

and here is my webpack config:

const { NODE_ENV = "production" } = process.env;
module.exports = {
  entry: path.resolve(__dirname, "src", "index.ts"),
  mode: NODE_ENV,
  target: "node",
  output: {
    path: path.resolve(__dirname, "dist"),
    filename: "bundle.js",
  },
  resolve: {
    extensions: [".ts", ".js"],
  },
  module: {
    rules: [
      {
        test: /\.ts$/,
        use: ["ts-loader"],
      },
    ],
  },
};

Support encoding ArrayBuffer to bin directly

Using @msgpack/msgpack version 1.12.0 from npm, encoding an ArrayBuffer results in an empty object when decoding it. It would be desirable to have it encode as bin, as it would reduce the amount of redundant code required when encoding the result of functions such as SubtleCrypto.digest().

RangeError: Extra x bytes found at buffer[1]

I am using this library and attempting to send some encoded data from a Koa server to a React client, and I keep getting the following error:

Uncaught (in promise) RangeError: Extra 12287 byte(s) found at buffer[1]

The code for the server is

import Koa from 'koa'
import Router from '@koa/router'
import * as fs from 'fs'
import * as util from 'util'
import cors from '@koa/cors'
import {encode} from '@msgpack/msgpack'

const app = new Koa()

app.use(cors())
const rt = new Router()

const read = async (file: string) => {
	const readFile = util.promisify(fs.readFile)
	return readFile(file)
}

rt.get('/image', async ctx => {
	const data = await read('/home/beckspoppn/Dev/trykoa/image.jpg')
	const resp = {name: 'gary', data: Uint8Array.from(data), age: 37}
	const body = encode(resp)
	console.log(body)
	ctx.status = 200
	ctx.body = body
	ctx.response.type = 'application/x-msgpack'
})

app.use(rt.routes())
app.listen(3000)

and here is the client code

import React from 'react';
import logo from './logo.svg';
import './App.css';
import {decodeAsync} from '@msgpack/msgpack'

const getData = async () => {
  const msgPackType = 'application/x-msgpack'
  const response = await fetch('http://localhost:3000/image')
  const contentType = response.headers.get('content-type')
  console.log(contentType)
  if (contentType !== null && contentType.startsWith(msgPackType) &&
      response.body !== null) {
    console.log(response.body)
    const object = await decodeAsync(response.body)
    console.log(object)
  }

}
function App() {
  return (
    <div className="App">
      <header className="App-header">
        <img src={logo} className="App-logo" alt="logo" />
        <p>
          click button to see error
        </p>
        <button onClick={getData}></button>
      </header>
    </div>
  );
}

export default App;

I also uploaded the code here so you can try it if need be: https://github.com/g5becks/msgpack_error/tree/master

System : Ubuntu 20.04
Node Version : 12.18.2

I'm not sure if there is something I am doing wrong on my part; if so, any help is appreciated.

Thanks.

decode() attempts to decode JavaScript Blob type but always fails

Currently, this results in the following fairly cryptic error message: RangeError: offset is outside the bounds of the DataView. It would be preferable to have a more useful error message.

A MessagePack blob can be obtained from a Python server sending a MessagePack over a WebSocket.

[bug] Maximum call stack size exceeded

It happens in my project. Maybe it is because of recursion?

RangeError: Maximum call stack size exceeded
    at Encoder.encodeBinary ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:202:10)
    at Encoder.encodeObject ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:175:12)
    at Encoder.encode ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:36:12)
    at Encoder.encodeMap ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:255:14)
    at Encoder.encodeObject ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:177:12)
    at Encoder.encode ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:36:12)
    at Encoder.encodeMap ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:255:14)
    at Encoder.encodeObject ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:177:12)
    at Encoder.encode ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:36:12)
    at Encoder.encodeMap ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:255:14)
    at Encoder.encodeObject ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:177:12)
    at Encoder.encode ($proj/node_modules/@msgpack/msgpack/src/Encoder.ts:36:12)
    at Object.encode ($proj/node_modules/@msgpack/msgpack/src/encode.ts:16:11)

BigInt should be handled by native codec instead of extension codec

It is not a good choice to handle 64-bit integers with an extension codec. While 64-bit integers are built-in types in msgpack and BigInt is a built-in type in JavaScript, solving the 64-bit problem with an extension codec is not a compatible approach when building cross-language systems. If JavaScript developers use an extension codec to support 64-bit integers, then developers in other languages will have to write a corresponding extension codec, even though msgpack implementations in other languages (C++, C, Python, Java, etc.) natively support mapping 64-bit integers to their language-specific types.

To support BigInt, I think this library should map 64-bit integers to BigInt natively, or it should add support for registering a custom codec for built-in types.

original issue #114

Export options types for encode/decode

If I want to write a wrapper around encode() or decode() (and the others), I might want to forward the types needed for these functions, e.g. EncodeOptions. But to reach this type I need to import

import { EncodeOptions } from "@msgpack/msgpack/dist/encode"

...which is kind of ugly. I don't want to import from a dist/ folder; that's a technical detail that may change in the future.

It would be great if these types, which are implicitly exposed, were also explicitly exported from the index file.

Add option to strip undefined values off object as JSON does

In JSON, undefined values are stripped from the object:

> JSON.parse(JSON.stringify({a: undefined}))
{}
> JSON.parse(JSON.stringify({a: undefined})).a
undefined

and undefined values in arrays are converted to null

It would be great if there were some option that removed undefined props at encode time.
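
Something like the ignoreUndefined flag listed in the EncoderOptions table above could cover this; a minimal sketch of the behavior I have in mind (the expected output is my assumption):

import { encode, decode } from "@msgpack/msgpack";

// Assumption: with ignoreUndefined, `a` is omitted from the encoded map
// instead of being encoded as nil.
const encoded = encode({ a: undefined, b: 1 }, { ignoreUndefined: true });
console.log(decode(encoded)); // expected: { b: 1 }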

Unable to pack int 64

Hello, when I packaged "number", I could not package "int64". After I checked the source code, I changed "Number.isSafeInteger(object)" to "object.toString().indexOf('.') <0" It is temporarily solved. Obviously this is not the best method, but it can temporarily solve my problem. If there is a better solution to identify ”int64“, I hope to inform, thanks

Feature request: map int64/uint64 to BigInt

BigInt is a built-in object that provides a way to represent whole numbers larger than 2^53 - 1, which is the largest number JavaScript can reliably represent with the Number primitive and is represented by the Number.MAX_SAFE_INTEGER constant. BigInt can be used for arbitrarily large integers.

Reference of BigInt here

can't compile with "error TS2304: Cannot find name 'AsyncGenerator'" (or 'AsyncIterable')

Hi, I'm really new to msgpack-javascript. First of all, I appreciate your great work 😄

I'm facing errors like "error TS2304: Cannot find name 'AsyncGenerator'" when I compile my TS code with tsc.

My environment is something like below.

package.json

{
  "name": "msgpack",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@msgpack/msgpack": "^2.1.0",
    "typescript": "^4.0.3"
  }
}

tsconfig.json

{
    "compilerOptions": {
        "moduleResolution": "node",
        "lib": ["ES2018", "DOM"]
    }
}

index.ts

import { encode } from "@msgpack/msgpack";

const encoded: Uint8Array = encode({ foo: "bar" });
console.log(encoded);

The following is the full error messages:

$ npx tsc index.ts
node_modules/@msgpack/msgpack/dist/decodeAsync.d.ts:5:162 - error TS2304: Cannot find name 'AsyncGenerator'.

5 export declare function decodeArrayStream<ContextType>(streamLike: ReadableStreamLike<ArrayLike<number>>, options?: DecodeOptions<SplitUndefined<ContextType>>): AsyncGenerator<unknown, void, unknown>;
                                                                                                                                                                   ~~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/decodeAsync.d.ts:6:157 - error TS2304: Cannot find name 'AsyncGenerator'.

6 export declare function decodeStream<ContextType>(streamLike: ReadableStreamLike<ArrayLike<number>>, options?: DecodeOptions<SplitUndefined<ContextType>>): AsyncGenerator<unknown, void, unknown>;
                                                                                                                                                              ~~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/Decoder.d.ts:27:25 - error TS2304: Cannot find name 'AsyncIterable'.

27     decodeAsync(stream: AsyncIterable<ArrayLike<number>>): Promise<unknown>;
                           ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/Decoder.d.ts:28:31 - error TS2304: Cannot find name 'AsyncIterable'.

28     decodeArrayStream(stream: AsyncIterable<ArrayLike<number>>): AsyncGenerator<unknown, void, unknown>;
                                 ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/Decoder.d.ts:28:66 - error TS2304: Cannot find name 'AsyncGenerator'.

28     decodeArrayStream(stream: AsyncIterable<ArrayLike<number>>): AsyncGenerator<unknown, void, unknown>;
                                                                    ~~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/Decoder.d.ts:29:26 - error TS2304: Cannot find name 'AsyncIterable'.

29     decodeStream(stream: AsyncIterable<ArrayLike<number>>): AsyncGenerator<unknown, void, unknown>;
                            ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/Decoder.d.ts:29:61 - error TS2304: Cannot find name 'AsyncGenerator'.

29     decodeStream(stream: AsyncIterable<ArrayLike<number>>): AsyncGenerator<unknown, void, unknown>;
                                                               ~~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/utils/stream.d.ts:1:45 - error TS2304: Cannot find name 'AsyncIterable'.

1 export declare type ReadableStreamLike<T> = AsyncIterable<T> | ReadableStream<T>;
                                              ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/utils/stream.d.ts:2:86 - error TS2304: Cannot find name 'AsyncIterable'.

2 export declare function isAsyncIterable<T>(object: ReadableStreamLike<T>): object is AsyncIterable<T>;
                                                                                       ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/utils/stream.d.ts:3:80 - error TS2304: Cannot find name 'AsyncIterable'.

3 export declare function asyncIterableFromStream<T>(stream: ReadableStream<T>): AsyncIterable<T>;
                                                                                 ~~~~~~~~~~~~~

node_modules/@msgpack/msgpack/dist/utils/stream.d.ts:4:83 - error TS2304: Cannot find name 'AsyncIterable'.

4 export declare function ensureAsyncIterabe<T>(streamLike: ReadableStreamLike<T>): AsyncIterable<T>;
                                                                                    ~~~~~~~~~~~~~

I would appreciate any advice. Thanks.

Omit property names

I'm using msgpack to transmit objects with a fixed schema, so it feels wasteful to encode the property names with each payload. Is there some way to provide some schema information to the encoder/decoder during initialization so that property names can be excluded from the serialization output?

A proposal of what this API could potentially look like:

import { ValueOnlyEncoder, ValueOnlyDecoder } from "@msgpack/msgpack";

const schema = ... // some representation of your object schema, contains the property names
const encoder = new ValueOnlyEncoder(schema);
const decoder = new ValueOnlyDecoder(schema);

const data = { foo: [1, 2, 3], bar: "baz" };

const encoded = encoder.encode(data); // fails if data doesn't conform to schema
const decoded = decoder.decode(encoded);

Why not use TextDecoder in browser?

These lines apparently set TEXT_ENCODING_AVAILABLE to false in browsers, as process is not defined:

const TEXT_ENCODING_AVAILABLE =
typeof process !== "undefined" &&
process.env.TEXT_ENCODING !== "never" &&
typeof TextEncoder !== "undefined" &&
typeof TextDecoder !== "undefined";

Is that intended? If yes, why?

Refs: #33.

problem when use Float32Array in object's property

Problem

The problem happens when using a Float32Array as an object's property.

A minimal Jest test is shown below:

import { encode, decode } from "@msgpack/msgpack";

test("issue", () => {
  const f32 = new Float32Array([1.1, 1.2, 1.3]);
  const uint8 = new Uint8Array(f32.buffer);
  expect(new Float32Array(uint8.buffer)).toEqual(f32); //success

  const data = encode({ audio: uint8 });
  const { audio } = decode(data) as { audio: Uint8Array };

  expect(uint8).toEqual(audio); //success

  expect(uint8.buffer).toEqual(audio.buffer); //success

  const fail = new Float32Array(audio.buffer);
  expect(fail).toEqual(f32); //fail

  const success = new Float32Array(new Uint8Array(Object.values(audio)).buffer);
  expect(success).toEqual(f32); //success
});

This test fails at:

 const fail = new Float32Array(audio.buffer);
 expect(fail).toEqual(f32); //fail

Environment

Node.js : 11.15.0
TypeScript : 3.5.2
Test Framework : Jest

Release 1.3.2

Hi @gfx, we need to release a new version with typing fixes for old TypeScript versions.

Website seems broken

The website repo doesn't support filing issues there, so raising it here.

The "Try!" seems to do nothing. And some content below for "Languages" and "API" seems blank. Tested on Firefox desktop and Chrome Android, same result. I noticed multiple errors in the web console logged:

Uncaught TypeError: document.getElementsByClassName(...)[index] is undefined
    <anonymous> https://msgpack.org/:33
    jQuery 4
msgpack.org:33:31

Embedded Search timelines have been deprecated. See https://twittercommunity.com/t/deprecating-widget-settings/102295. 
<a class="twitter-timeline twitter-timeline-error" data-dnt="true" href="https://twitter.com/sear…OR+MessagePack+lang%3Aen" data-widget-id="343624208380735490" width="350" height="500" data-twitter-extracted-i1614912872937101145="true">
widgets.js:1:8196

Request to access cookie or storage on “<URL>” was blocked because it came from a tracker and content blocking is enabled. 

Profile timeline for screen_name: search not found

decodeArrayStream vs decodeStream

The documentation of decodeArrayStream and decodeStream is a bit vague, and the signatures and examples are identical.

Do we need both? And what's the exact difference? One accepts "an array of items" and the other "independent MessagePack items". Huh?

My exact question is this; Do any of these support byte streams (of arbitrary size) of data consisting of MsgPack items, or do the inputs have to be "complete"?

What I'm explicitly after is support for this:

Consider MsgPack-encoded objects:
+---------------------+ +----------------+ +----------------+
|    MsgPack enc 1    | | MsgPack enc 2  | | MsgPack enc 3  |
+---------------------+ +----------------+ +----------------+
Although split into Uint8Array chunks:
+-------+ +-------+ +-------+ +-------+ +--------+ +--------+ 
|   1   | |   2   | |   3   | |   4   | |   5    | |   6    |
+-------+ +-------+ +-------+ +-------+ +--------+ +--------+ 
When these chunks are provided to decode*Stream(), I get a stream of:
                       +---------------------+
                       |    MsgPack dec 1    |
                       +---------------------+
                                          +----------------+
                                          | MsgPack dec 2  |
                                          +----------------+
                                                             +----------------+
                                                             | MsgPack dec 3  |
                                                             +----------------+
               ------------ Time ----------->

Is this possible?

[FeatureRequest] More types out of the box

Background

Migrating from msgpack-lite, which has support for many types out of the box.

I had a look at the types in @msgpack/msgpack/dist/encode.d.ts,
and it seems everything should also be OK out of the box.

export declare function encode<ContextType = undefined>(
  // seems everything is OK out-of-box
  value: unknown, 
  options?: EncodeOptions<SplitUndefined<ContextType>>
): Uint8Array;

Reproduce

var msgpack=require('@msgpack/msgpack')
var nop=x=>msgpack.decode(msgpack.encode(x))

Actual

> nop({x:Buffer.from('x'), y:new ArrayBuffer(0), z:new Error('msg')})
{ x: Uint8Array(1) [ 120 ], y: {}, z: {} }

Expected

> msgpack = require('msgpack-lite')
> nop({x:Buffer.from('x'), y:new ArrayBuffer(0), z:new Error('msg')})
{
  x: <Buffer 78>,
  y: ArrayBuffer { [Uint8Contents]: <>, byteLength: 0 },
  z: Error: msg
}

Suggestion

export declare function encode<ContextType = undefined>(
  value: unknown, 
  options?: EncodeOptions<SplitUndefined<ContextType>>
): Uint8Array;

export declare function encodeOutOfBox<ContextType = undefined>(
  // any types are tested, `decode(encode(value))` should be same out of box
  value: TESTED_TYPES, 
  options?: EncodeOptions<SplitUndefined<ContextType>>
): Uint8Array;

DataView usage overhead

Hi @gfx, have you considered using the buffer directly instead of DataView? DataView performance is still not optimal in many cases. I know it would require additional work on number decoding, but it could give a performance boost.

I hope I am not too annoying with the performance stuff :) We have a project with huge msgpack payloads (up to 10-20 MB) that we need to parse in browsers.

Error if node built-in globals are not polyfilled in browser (access to globalThisOrWindow.process)

Due to direct access to process.env:

process.env.TEXT_ENCODING !== "never" && typeof TextEncoder !== "undefined" && typeof TextDecoder !== "undefined";

the script fails in the browser when Node globals are not polyfilled (the default when building with Rollup).

Is it possible to feature-detect process.env (typeof process !== 'undefined' && typeof process.env !== 'undefined') so that the library runs without polyfilling Node globals in the browser?
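Something along these lines is what I have in mind (a sketch of the suggested guard, not the library's current code):

// Only touch process.env when it actually exists, so browser bundles
// without a Node polyfill do not throw a ReferenceError.
const TEXT_ENCODING_SETTING =
  typeof process !== "undefined" && typeof process.env !== "undefined"
    ? process.env.TEXT_ENCODING
    : undefined;

const useTextEncoding =
  TEXT_ENCODING_SETTING !== "never" &&
  typeof TextEncoder !== "undefined" &&
  typeof TextDecoder !== "undefined";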

Reuse Encoder/Decoder instance?

The readme says that encoding is 20% faster when reusing an Encoder instance. If that's the case, is there any reason not to just memoize them in encode (or at least reuse a default instance when no options are set)?
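For reference, this is what instance reuse looks like today (a sketch based on the exported Encoder/Decoder classes; whether encode() should memoize a default instance internally is exactly the open question):

import { Encoder, Decoder } from "@msgpack/msgpack";

// One long-lived pair, reused across calls so the internal buffers
// are not re-allocated for every payload.
const encoder = new Encoder();
const decoder = new Decoder();

const encoded: Uint8Array = encoder.encode({ foo: "bar" });
console.log(decoder.decode(encoded)); // { foo: "bar" }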

Use with AssemblyScript

Is there a way to use this library in an AssemblyScript project?

When I try to compile with asc, I get this error:

ERROR TS6054: File '~lib/@msgpack/msgpack.ts' not found.

 import { encode } from "@msgpack/msgpack";
                        ~~~~~~~~~~~~~~~~~~
 in assembly/index.ts(7,24)

FAILURE 1 parse error(s)

RangeError issue

Greetings,

I'm trying to figure out an issue. I have a CoAP server that decodes MessagePack-encoded packets, but every so often I get this error:

_packet: { code: '0.02', confirmable: true, reset: false, ack: false, messageId: 14, token: <Buffer >, options: [ [Object], [Object] ], payload: <Buffer dc 00 02 de 00 18 d9 02 45 56 cd 00 04 d9 04 47 53 50 54 c0 d9 02 53 56 c0 d9 02 48 50 c0 d9 02 43 51 d1 00 1f d9 02 47 53 c0 d9 02 47 54 ce 00 00 02 ... 462 more bytes>, piggybackReplyMs: 50 }, _payloadIndex: 0, [Symbol(kCapture)]: false }

parse error RangeError: Offset is outside the bounds of the DataView
    at DataView.getUint8 (<anonymous>)
    at Decoder.readU8 (/Users/braydonharris/dev/axcesspoint/coap-server/node_modules/@msgpack/msgpack/dist/Decoder.js:517:33)
    at Decoder.doDecodeSync (/Users/braydonharris/dev/axcesspoint/coap-server/node_modules/@msgpack/msgpack/dist/Decoder.js:219:31)
    at Decoder.decode (/Users/braydonharris/dev/axcesspoint/coap-server/node_modules/@msgpack/msgpack/dist/Decoder.js:83:29)
    at decode (/Users/braydonharris/dev/axcesspoint/coap-server/node_modules/@msgpack/msgpack/dist/decode.js:13:20)

I may just have a misconfiguration; I'm not sure if I'm missing something.

Cheers!
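One guess worth ruling out (an assumption, not a confirmed diagnosis): a RangeError from Decoder.readU8 usually means the decoder ran off the end of the buffer, i.e. the payload arrived truncated, which can happen if the CoAP payload is split (for example by block-wise transfer) and decoded per fragment. A defensive sketch to confirm or rule that out:

import { decode } from "@msgpack/msgpack";

// Treat RangeError as "payload looks incomplete" and log the raw bytes
// instead of crashing, so truncation can be confirmed or ruled out.
function tryDecode(payload: Uint8Array): unknown {
  try {
    return decode(payload);
  } catch (e) {
    if (e instanceof RangeError) {
      console.warn(`possibly truncated payload (${payload.byteLength} bytes)`, payload);
      return undefined;
    }
    throw e;
  }
}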

Contextual state shared across encode/decode

I need to pass state to the extension codecs, to be able to decode values properly. The decoding needs state to be able to construct the JavaScript values. It would be very cumbersome to rewrite the code to wrap all the decoding in a super function just to be able to capture the state, and it would mean overhead.

Would it be possible to add a custom field to DecodeOptions, and an optional second argument to encode()/decode() (and their siblings), so that you could pass state through?

E.g.:

const ctx = { // This is my local context
	session, // I have multiple sessions, and they "hold"/own encodable variables
};

const extensionCodec = new ExtensionCodec< typeof ctx >( );
extensionCodec.register( {
	type: 0,
	// In my case, encoding doesn't need ctx...
	encode( input, ctx ) {
		if ( input instanceof Foo )
			return input.encode( );
		else
			return null;
	},
	// But decode needs ctx to know where to "attach" Foo
	decode( input, ctx ) {
		const foo = Foo.decode( input );
		ctx.session.registerFoo( foo ); // <-- like so
		return foo;
	}
} );

decode( buf, { extensionCodec, ctx } );

Would you be interested in implementing this, or should I start doing it myself and provide a PR?
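For reference, the context option listed in the EncoderOptions table above appears to cover this use case. A sketch, under the assumption that the codec callbacks receive the user-supplied context as their last argument; Foo and session are illustrative stand-ins for the application's own types:

import { decode, ExtensionCodec } from "@msgpack/msgpack";

// Illustrative stand-ins for the application's own types.
class Foo {
  static decode(_data: Uint8Array): Foo { return new Foo(); }
  encode(): Uint8Array { return new Uint8Array(0); }
}
type Ctx = { session: { registerFoo(foo: Foo): void } };

const extensionCodec = new ExtensionCodec<Ctx>();
extensionCodec.register({
  type: 0,
  // encode does not need the context here, but it is passed along anyway
  encode: (input, _context) => (input instanceof Foo ? input.encode() : null),
  // decode receives the context and can "attach" Foo to the right session
  decode: (data, _extType, context) => {
    const foo = Foo.decode(data);
    context.session.registerFoo(foo);
    return foo;
  },
});

declare const buf: Uint8Array;         // a MessagePack-encoded payload
declare const session: Ctx["session"]; // application state
const decoded = decode(buf, { extensionCodec, context: { session } });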

Cross Platform Generation

Hi,

Well, this is more of a question than an issue :)
Is there any auto-generation solution for cross-platform use?
For example, say my server is written in C# (.NET) and my client is written in JavaScript (React Native); is there some auto-generation tool to convert C# MsgPack objects to JavaScript MsgPack objects?

Thanks,
Tomer

Bad Decoder state

Sometimes I get this error in my app when I use a cached instance of the Decoder class:

Error: The type of key must be string or number but object
    at Decoder.doDecodeSync (/usr/app/src/node_modules/@msgpack/msgpack/dist/Decoder.js:383:31)
    at Decoder.decode (/usr/app/src/node_modules/@msgpack/msgpack/dist/Decoder.js:83:29)
    at FileCacheStorage.get (/usr/app/src/app_out/cache/file-cache.js:20:33)
    at async Cache.get (/usr/app/src/app_out/cache/index.js:21:23)
    at async Promise.all (index 8)
    at async process (/usr/app/src/app_out/www/routers/user.js:37:167)
    at async /usr/app/src/app_out/www/routers/user.js:20:5
    at async wrappedMiddleware (/usr/app/src/node_modules/@awaitjs/express/index.js:116:7)

It goes away when I restart the app.

Looks like the Decoder.reinitializeState() method is failing to reset some vars.
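One thing worth ruling out (an assumption, not a confirmed diagnosis): a Decoder instance keeps internal state between calls, so state left over from an earlier decode that threw, or overlapping use via the async APIs, could leave a cached instance in a bad state. A defensive sketch that trades the reuse speed-up for isolation:

import { Decoder } from "@msgpack/msgpack";

// Defensive workaround: use a short-lived Decoder per call instead of a
// cached one, so each decode starts from a clean internal state.
function decodeFresh(buffer: Uint8Array): unknown {
  return new Decoder().decode(buffer);
}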

`Error: The type of key must be string or number but object` is thrown when decoding an object with null as a key

const b = Buffer.from("gaRkYXRhgaRzaG9wgqdhbGNoZW15gqltYXhfY291bnRGqmV4ZWNfY291bnQAr3JlY292ZXJfc3RhbWluYYbAAaVjb3VudB6pbWF4X2NvdW50HqpleGVjX2NvdW50AKhyZWNvdmVyeXikY29zdCg=", "base64");
msgpack5 = require("msgpack5");
console.log(msgpack5().decode(b));
msgpack = require("@msgpack/msgpack");
console.log(msgpack.decode(b));

With the above code, msgpack5 can successfully decode the content, but @msgpack/msgpack throws Error: The type of key must be string or number but object.

The version of @msgpack/msgpack I'm using is 2.4.1; the Node version is v10.14.2.

The JSON version of the message is:

{
    "data":{
        "shop":{
            "alchemy":{
                "max_count":70,
                "exec_count":0
            },
            "recover_stamina":{
                "":1,
                "count":30,
                "max_count":30,
                "exec_count":0,
                "recovery":120,
                "cost":40
            }
        }
    }
}

It seems that this is caused by an empty string key.

According to this link, an empty string is a valid JSON key; I think this applies to MessagePack as well?
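A quick check suggests the empty string key itself is not the problem (it round-trips fine); the rejected key may instead be the nil (null) key mentioned in the title. A minimal sketch of that check:

import { encode, decode } from "@msgpack/msgpack";

// An empty string key round-trips without error, so "" is accepted as a map key.
console.log(decode(encode({ "": 1, count: 30 }))); // { '': 1, count: 30 }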
