pino-multi-stream's Introduction

pino-multi-stream is a wrapper around the pino logger. The purpose of pino-multi-stream is to provide a stop-gap method for migrating from the Bunyan logger. Whereas pino allows only one destination stream, pino-multi-stream allows multiple destination streams via the same configuration API as Bunyan.

Please see the caveats section for some important information regarding the performance of this module.

Install

For Pino v7+

npm install -s pino-multi-stream

For Pino v5 and v6

npm install -s pino-multi-stream@legacy

pino-multi-stream does not provide the CLI that pino provides. Therefore, you should not install it globally.

Usage

var fs = require('fs')
var pinoms = require('pino-multi-stream')
var streams = [
  {stream: fs.createWriteStream('/tmp/info.stream.out')},
  {level: 'fatal', stream: fs.createWriteStream('/tmp/fatal.stream.out')}
]
var log = pinoms({streams: streams})

log.info('this will be written to /tmp/info.stream.out')
log.fatal('this will be written to /tmp/fatal.stream.out')

API

The API for pino-multi-stream is the same as that for pino. Please read pino's documentation for full details. Highlighted here are the specifics for pino-multi-stream:

  • The signature for the constructor remains the same, pino(opts, stream), but there are a few conditions that determine whether you get a real pino instance or one wrapped by pino-multi-stream (a short sketch after the example below illustrates these three cases):

    1. If the opts parameter is a writable stream, then a real pino instance will be returned.
    2. If the opts parameter is an object with a singular stream property, then a real pino instance will be returned. If there is also a plural streams property, the singular stream property takes precedence.
    3. If the opts parameter is an object with a plural streams property that is an array, and no singular stream property, then a pino-multi-stream wrapped instance will be returned. Otherwise, opts.streams is treated as a single stream and a real pino instance will be returned.

  • The pino options object accepts a streams option, as alluded to in the previous item. This option should be an array of stream objects. A stream object is one with at least a stream property and, optionally, a level property. For example:

    var logger = pinoms({
      streams: [
        {stream: process.stdout}, // an "info" level destination stream
        {level: 'error', stream: process.stderr} // an "error" level destination stream
      ]
    })
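
To make the constructor conditions above concrete, here is a minimal sketch of the three cases (file paths are illustrative):

var fs = require('fs')
var pinoms = require('pino-multi-stream')

// 1. opts is a writable stream: a real pino instance is returned
var plain = pinoms(fs.createWriteStream('/tmp/single.out'))

// 2. opts has a singular `stream` property: a real pino instance is returned;
//    `stream` wins even if a plural `streams` property is also present
var alsoPlain = pinoms({ stream: process.stdout })

// 3. opts has a plural `streams` array and no singular `stream` property:
//    a pino-multi-stream wrapped instance is returned
var wrapped = pinoms({
  streams: [
    { stream: process.stdout },
    { level: 'error', stream: process.stderr }
  ]
})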

pinoms.level set accessor

You can set the level of all streams by assigning to the level property. It accepts the same values as pino. If the level is changed on a child logger, it does not alter the levels of the parent's streams. As this is a costly operation, we recommend not changing the level for each child logger you create.
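
For example, a minimal sketch based on the behaviour described above (stream destinations are illustrative):

var pinoms = require('pino-multi-stream')

var log = pinoms({
  streams: [
    { stream: process.stdout },                 // an "info" level destination stream
    { level: 'error', stream: process.stderr }  // an "error" level destination stream
  ]
})

// Assigning to the level property applies to all of the logger's streams
log.level = 'debug'

// Changing the level on a child logger does not alter the parent's streams
var child = log.child({ component: 'worker' })
child.level = 'trace'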

pinoms.level get accessor

The behavior of the get accessor changes if { bunyan: true } is passed to pinoms. In that case, it implements the bunyan.level function.
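
A minimal sketch, assuming only what is stated above (with { bunyan: true } the getter follows bunyan.level semantics; the exact return value is therefore an assumption here, not a guarantee):

var pinoms = require('pino-multi-stream')

var compat = pinoms({
  bunyan: true,
  streams: [
    { level: 'debug', stream: process.stdout },
    { level: 'error', stream: process.stderr }
  ]
})

// With `bunyan: true`, reading `level` goes through Bunyan-style level()
// semantics; without it, the getter behaves as it does in plain pino.
console.log(compat.level)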

pinoms.prettyStream({ [prettyPrint], [prettifier], [dest] })

Manually create an output stream with a prettifier applied.

var fs = require('fs');
var pinoms = require('pino-multi-stream')

var prettyStream = pinoms.prettyStream()
var streams = [
    {stream: fs.createWriteStream('my.log') },
    {stream: prettyStream }
]

var logger = pinoms(pinoms.multistream(streams))

logger.info("HELLO %s!", "World")

The options object may additionally contain a prettifier property to define which prettifier module to use. When not present, prettifier defaults to pino-pretty (must be installed as a separate dependency).

The method may be passed an alternative write destination, but defaults to process.stdout.
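
For instance, a minimal sketch of passing an alternative destination via the dest option (the file name is just an example):

var fs = require('fs')
var pinoms = require('pino-multi-stream')

// prettified output written to a file instead of process.stdout
var prettyToFile = pinoms.prettyStream({
  dest: fs.createWriteStream('./pretty.log')
})

var logger = pinoms({ streams: [{ stream: prettyToFile }] })
logger.info('prettified output goes to ./pretty.log')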

Prettifying options (after 4.2.0) are to be set like this:

const prettyStream = pinoms.prettyStream({
  prettyPrint: {
    colorize: true,
    translateTime: "SYS:standard",
    ignore: "hostname,pid" // add 'time' to remove the timestamp
  },
  prettifier: require('pino-pretty') // not required, just an example of setting a prettifier
  // the destination option can be set here as well
})

Caveats

Stern warning: it cannot be overstated that the performance of this module depends on the number of streams you supply. This module is provided so that you can switch to pino from Bunyan and get some immediate improvement, but it is not meant to be a long-term solution. We strongly suggest that you use this module only for as long as it takes to overhaul the way you handle logging in your application. pino-multi-stream offers close to zero overhead if there is only one destination stream.

To illustrate what we mean, here is a benchmark of pino and Bunyan using "multiple" streams to write to a single stream:

benchBunyanOne*10000: 703.071ms
benchPinoMSOne*10000: 287.060ms

Now let's look at the same benchmark but increase the number of destination streams to four:

benchBunyanFour*10000: 2249.955ms
benchPinoMSFour*10000: 1017.886ms

And, finally, with ten destination streams:

benchBunyanTen*10000: 4950.301ms
benchPinoMSTen*10000: 3127.361ms

License

MIT License

pino-multi-stream's People

Contributors

bhicks, dependabot[bot], fdawgs, jchen038, jimmiehansson, jsumners, leorossi, mcollina, mmarchini, ovhemert, robertslando, sheldhur, shogunpanda, yaskevich, zhangyijiang


pino-multi-stream's Issues

Recreate logfile?

probably by design, but just checking.

I see that the specified logfile is not recreated if it was removed sometime after creation.
I expected it to be recreated if some process rotated the file, for example.
Renaming the file retains it as a destination, though.

const util = require('util')
const fs = require('fs')
const fsUnlink = util.promisify(fs.unlink)
const pinoms = require('pino-multi-stream')
const streams = [
  {stream: fs.createWriteStream('/tmp/info.stream.out')},
  {level: 'fatal', stream: fs.createWriteStream('/tmp/fatal.stream.out')}
]
var log = pinoms({streams: streams})

log.info('this will be written to /tmp/info.stream.out')
log.fatal('this will be written to /tmp/fatal.stream.out')

const moveLog = async () => {
  await fsUnlink('/tmp/info.stream.out')
}
moveLog()

log.info('this will NOT be written to /tmp/info.stream.out')

`name` field not being respected

It appears the multi-stream implementation isn't passing through the name property:

const pino = require('pino');
const pinoMulti = require('pino-multi-stream');

const parent = pino({
    name: 'foobar',
    safe: true,
    level: 'info'
});

const child = parent.child({
    component:'foo'
});

parent.info({ a:1, b:2 }, 'parent');
child.info({ a:1, b:2 }, 'child');

Using this snippet, the results look correct:

{"pid":46022,"hostname":"nfml-aliu","name":"foobar","level":30,"time":1501542340491,"msg":"parent","a":1,"b":2,"v":1}
{"pid":46022,"hostname":"nfml-aliu","name":"foobar","level":30,"time":1501542340493,"msg":"child","component":"foo","a":1,"b":2,"v":1}

If you change the snippet above to use pinoMulti, notice the missing name field:

{"pid":46046,"hostname":"nfml-aliu","level":30,"time":1501542418322,"msg":"parent","a":1,"b":2,"v":1}
{"pid":46046,"hostname":"nfml-aliu","level":30,"time":1501542418324,"msg":"child","component":"foo","a":1,"b":2,"v":1}

unknown level undefined error when using prettyPrint

const logger = pinoms({
  base: null,
  level: config.logLevel,
  // prettyPrint: process.env.NODE_ENV !== 'production',
  redact: {
    paths: [
      '*.headers.authorization',
      '*.headers.cookie',
      "*.headers['set-cookie']",
    ],
  },
  streams: [
    // Log everything to stdout
    { stream: process.stdout },
    // In addition to logging to stdout, log error and fatal messages to sentry
    { level: 'error', stream: pinoSentryStream },
    { level: 'fatal', stream: pinoSentryStream },
  ],
})

The above code works; however, when the prettyPrint line is uncommented, the following error is thrown:

/Users/pascal/code/node_modules/pino-multi-stream/node_modules/pino/lib/levels.js:86
  if (values[level] === undefined) throw Error('unknown level ' + level)
                                         ^
Error: unknown level undefined
    at Pino.setLevel (/Users/pascal/code/node_modules/pino-multi-stream/node_modules/pino/lib/levels.js:86:42)
    at Pino.set level [as level] (/Users/pascal/code/node_modules/pino-multi-stream/node_modules/pino/lib/proto.js:55:45)
    at fixLevel (/Users/pascal/code/node_modules/pino-multi-stream/index.js:33:16)
    at Object.pinoMultiStream [as default] (/Users/pascal/code/node_modules/pino-multi-stream/index.js:27:12)
    at Object.<anonymous> (/Users/pascal/code/projects/daemon/src/utils/logger.ts:15:22)
    at Module._compile (internal/modules/cjs/loader.js:956:30)
    at Module.m._compile (/Users/pascal/code/node_modules/ts-node/src/index.ts:814:23)
    at Module._extensions..js (internal/modules/cjs/loader.js:973:10)
    at Object.require.extensions.<computed> [as .ts] (/Users/pascal/code/node_modules/ts-node/src/index.ts:817:12)
    at Module.load (internal/modules/cjs/loader.js:812:32)

Using prettyPrint: true with the default pino package works just fine.

@mcollina Has this anything to do with #29 ?

Custom levels don't behave the same with and without pino-multi-stream

Hello,

I'm creating a pino logger with some custom levels.
This is my code:

const pino = require("pino");
const pinoms = require("pino-multi-stream");
const streams = [
   { stream: process.stdout }
];
const logger = pino({
    customLevels: {
      foo: 15,
      bar: 25,
      baz: 35,
      bazz: 45
    },
    level: "bar",
    useLevelLabels: true,
}, pinoms.multistream(streams));

When I send logs with level set to bar, this is what I get

  logger.trace("trace");  // doesn't log --> expected
  logger.foo("foo");      // doesn't log --> expected
  logger.bar("bar");      // doesn't log --> I should have a log
  logger.info("info");    // log --> expected
  logger.baz("baz");     // log --> expected
  logger.bazz("bazz");  // log --> expected

When I send logs with level set to foo, only info, baz and bazz are logging, which is not normal because I would expect foo and bar to also log.

Let's say that I change my levels to the following:

customLevels: {
      foo: 1,
      bar: 2,
      baz: 3,
      bazz: 4
    }

Only info will log. My custom levels will never log, whatever I do.

Finally, let's say that I change my levels to the following:

customLevels: {
      foo: 100,
      bar: 200,
      baz: 300,
      bazz: 400
    }

Here the logger behaves as it should. For example, when setting the level to foo, then foo, bar, baz and bazz will all log.

If I don't use pino-multi-stream, then all the custom levels behave as expected in any of the situations described above.

It seems to me that pino-multi-stream doesn't behave correctly for the custom levels which have a value below 30.

Stream into file

I use this code to stream into a file. But the created file is empty. Is there something wrong with my code?

const fileStream = pinoms.prettyStream(
  {
    prettyPrint: {
      colorize: true,
      levelFirst: true,
      translateTime: "yyyy-dd-mm, h:MM:ss TT",
    },
  },
  pinoms.destination({
    dest: './my-file', // omit for stdout
    minLength: 4096,   // buffer before writing
    sync: true         // synchronous writes
  })
)

const streams = [
	{stream: fileStream}
]

const logger = pinoms(pinoms.multistream(streams))

logger.info('HELLO %s!', 'World')

dedupe logs

If I have more than one stream, I get duplicate logs. For example, if one stream is set to the info level and the other to error, the first one will also contain the error logs. Is there a way to prevent this? That is, if multiple streams are provided, can each log be sent only to the stream with the highest matching level?
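
For illustration, a minimal sketch of the overlap described above (file paths are examples):

var fs = require('fs')
var pinoms = require('pino-multi-stream')

var log = pinoms({
  streams: [
    { level: 'info', stream: fs.createWriteStream('/tmp/info.log') },
    { level: 'error', stream: fs.createWriteStream('/tmp/error.log') }
  ]
})

// written only to /tmp/info.log
log.info('routine message')

// written to BOTH files: the 'info' stream also receives 'error' and above
log.error('something failed')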

multi-stream not working with pretty-print

Today I tried to pass the 'prettyPrint' option when creating the pino logger instance.

Working example:

const pinoStreams = pinoms.multistream([
  { stream: pino.destination('/tmp/log1') },
  { stream: pino.destination('/tmp/log2') },
])

const pinoLog = pino({
  level: 'trace'
}, pinoStreams)

Not working example:

const pinoStreams = pinoms.multistream([
  { stream: pino.destination('/tmp/log1') },
  { stream: pino.destination('/tmp/log2') },
])

const pinoLog = pino({
  level: 'trace',
  prettyPrint: { colorize: true }
}, pinoStreams)

I looked at the source and realized that the context of the call is lost (and with it, this.lastLevel is not defined).

Based on tracing, I assume the call context is lost in the extra wrapper.

Working example trace:

    at Object.write (/home/nik/workspace/dcloud-server/node_modules/pino-multi-stream/multistream.js:44:13)
    at Pino.write (/home/nik/workspace/dcloud-server/node_modules/pino/lib/proto.js:161:15)
    at Pino.LOG [as info] (/home/nik/workspace/dcloud-server/node_modules/pino/lib/tools.js:38:21)

Not working example trace:

    at Object.write (/home/nik/workspace/dcloud-server/node_modules/pino-multi-stream/multistream.js:44:13)
    at Object.write (/home/nik/workspace/dcloud-server/node_modules/pino/lib/tools.js:240:12)
    at Pino.write (/home/nik/workspace/dcloud-server/node_modules/pino/lib/proto.js:161:15)
    at Pino.LOG [as info] (/home/nik/workspace/dcloud-server/node_modules/pino/lib/tools.js:38:21)

Passing arguments to prettyStream

Currently, this method of pino-multi-stream looks like this:
pinoms.prettyStream({ [prettifier], [dest] }).

But the original function is getPrettyStream(opts, prettifier, dest).
So there is no way to pass the options object (with options like translateTime, ignore, colorize).
The module just puts an empty object there (index.js line 91):

return getPrettyStream({}, prettifier, dest)

Could you unify the API of this module's method with the original function from pino/lib/tools?

Not working in browser

Seems like using this in a browser environment fails due to the destructuring of a property that is not found in the pino browser version here.

Any reasons why this shouldn't be used in the browser?

Doesn't work with prettyPrint

var multistream = require('pino-multi-stream').multistream
var streams = [
  {stream: process.stdout},
  {stream: fs.createWriteStream(_base+'/logs/' + moment().format('YYYY-MM-DD') + '.pino.log')}
]

var pino = require('pino')({
    level: process.env.level || 'info',
    prettyPrint: process.env.stage ? true : false
}, multistream(streams));

 pino.error('test');

Gives you:

/home/ec2-user/workspace/app.js:26
    logger.error(err);
          ^

TypeError: Cannot read property 'error' of undefined
    at process.<anonymous> (/home/ec2-user/workspace/app.js:26:11)
    at emitOne (events.js:96:13)
    at process.emit (events.js:188:7)
    at process._fatalException (bootstrap_node.js:292:26)

Streams do not encompass lower levels

var pinoms = require('pino-multi-stream')
var log = pinoms({streams: [{level: 'debug', stream: process.stderr}]})

log.debug('foo') // logs 'foo' to stderr
log.info('bar') // does nothing because log.info is a noop

Serializers don't work with multi-stream

> var pinoms = require('pino-multi-stream');
undefined
> var msLog = pinoms({serializers: pinoms.stdSerializers, streams: [{stream: process.stdout}, {stream: process.stderr}]})
undefined
> msLog.info({req: {foo: 'bar'}});
{"pid":22638,"hostname":"lgud-yunong","level":30,"time":1497550501568,"req":{"foo":"bar"},"v":1}
{"pid":22638,"hostname":"lgud-yunong","level":30,"time":1497550501570,"req":{"foo":"bar"},"v":1}
undefined
> var log = pinoms({serializers: pinoms.stdSerializers})
undefined
> log.info({req: {foo: 'bar'}});
TypeError: Cannot read property 'remoteAddress' of undefined
    at Object.asReqValue [as req] (/home/yunong/workspace/nodequark/node_modules/pino/lib/serializers.js:8:34)
    at EventEmitter.asJson (/home/yunong/workspace/nodequark/node_modules/pino/pino.js:143:77)
    at EventEmitter.pinoWrite (/home/yunong/workspace/nodequark/node_modules/pino/pino.js:201:16)
    at EventEmitter.LOG (/home/yunong/workspace/nodequark/node_modules/pino/lib/tools.js:122:10)
    at repl:1:5
    at sigintHandlersWrap (vm.js:22:35)
    at sigintHandlersWrap (vm.js:73:12)
    at ContextifyScript.Script.runInThisContext (vm.js:21:12)
    at REPLServer.defaultEval (repl.js:346:29)
    at bound (domain.js:280:14)

Notice here that the multistream logger is not respecting the request serializer. Is this expected or am I using the API incorrectly?

module crashes at the configuration stage

Hi, I am trying to use "pino-multi-stream" but I get an error.

talentumtuum@mac pino-multi-stream-example % node ./src/main.js
/Users/talentumtuum/Documents/me/pino-multi-stream-example/node_modules/pino/lib/levels.js:90
  if (values[level] === undefined) throw Error('unknown level ' + level)
                                   ^

Error: unknown level undefined
    at Pino.setLevel (/Users/talentumtuum/Documents/me/pino-multi-stream-example/node_modules/pino/lib/levels.js:90:42)
    at Pino.set level [as level] (/Users/talentumtuum/Documents/me/pino-multi-stream-example/node_modules/pino/lib/proto.js:63:38)
    at fixLevel (/Users/talentumtuum/Documents/me/pino-multi-stream-example/node_modules/pino-multi-stream/index.js:33:16)
    at pinoMultiStream (/Users/talentumtuum/Documents/me/pino-multi-stream-example/node_modules/pino-multi-stream/index.js:27:12)
    at Object.<anonymous> (/Users/talentumtuum/Documents/me/pino-multi-stream-example/src/main.js:4:16)
    at Module._compile (internal/modules/cjs/loader.js:1068:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1097:10)
    at Module.load (internal/modules/cjs/loader.js:933:32)
    at Function.Module._load (internal/modules/cjs/loader.js:774:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:72:12)

nodejs version

talentumtuum@mac pino-multi-stream-example % node -v 
v14.17.0

my code

const pinoms = require('pino-multi-stream');
const fs = require('fs');

const logger = pinoms({
    prettyPrint: true,
    streams: [
        { stream: fs.createWriteStream('./all.out') },
        { level: 'info', stream: fs.createWriteStream('./info.out') }
    ]
});

logger.info('hello world');

my package.json file

{
  "name": "pino-multi-stream-example",
  "version": "1.0.0",
  "main": "index.js",
  "license": "MIT",
  "dependencies": {
    "pino": "6.13.1",
    "pino-multi-stream": "5.3.0",
    "pino-pretty": "6.0.0"
  }
}

Log no longer written to multiple streams

I'm having an issue since version 5.0 where a log only gets written to one destination stream.
Here is my config:

const logger = pinoms({
  base: null,
  redact: {
    paths: [
      '*.headers.authorization',
      '*.headers.cookie',
      "*.headers['set-cookie']",
    ],
  },
  streams: [
    // Log everything to stdout in production, prettify in dev environments.
    {
      level: config.logLevel,
      stream:
        process.env.NODE_ENV === 'production'
          ? process.stdout
          : pinoms.prettyStream(),
    },
    // In addition to logging to stdout, capture error and fatal messages with sentry
    { level: 'error', stream: pinoSentryStream },
  ],
})

When running logger.error({ err }, 'Something went wrong') I expect it to be logged both to stdout and to sentry, but it is only logged to sentry.

multilog

Hi, thank you for pino-multi-stream and your thoughtful warning about the multiple destinations. I came across the tool called multilog, and wanted to ask you if you have any opinions about it? It seems to work well.

https://cr.yp.to/daemontools/multilog.html

It can be started like this:

node src/bin/server.js | tee out >(multilog '-*' '+*"level":30*' ./log/info '-*' '+*"level":50*' ./log/other)      

and after further digging, I have also found a go rewrite that is modern and that can work with promtail to ship the logs and also log to syslog.

https://github.com/UweOhse/multislog

Deprecate this module

Given that both pino@7 and pino@8 shipped with this feature built in, I think we should deprecate this module and possibly archive it.

At a minimum, we should let people know this functionality is embedded in pino now.
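
For reference, a minimal sketch of the built-in equivalent, assuming the pino.multistream API that ships with pino v7+:

const pino = require('pino')

// pino v7+ ships multistream support directly; no extra package needed
const logger = pino(
  { level: 'debug' },
  pino.multistream([
    { stream: process.stdout },                 // defaults to 'info'
    { level: 'error', stream: process.stderr }
  ])
)

logger.info('written to stdout')
logger.error('written to stdout and stderr')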

Stream level changes the level of another stream

Hello,

I'm using multiple streams, and I set the level property for all of them. The problem I have is that when I change only one child logger's level, it changes another stream I have as well.

For example:

pinoMultiStream({
    name: 'name',
    level: 'error',
    streams: [
      {
        stream: ringBuffer,
        level: 'debug', 
      },
      {
        stream: process.stdout,
      },
    ],
  });

The level of process.stdout changes to debug, printing all the logs to the console. Am I missing some configuration?

Thank you.

log.child(obj).info(string) throws 'unknown level' error

const log = pinoms({streams: streams})
log.child({a: 'A'}).info('TEST')

throws

/core/node_modules/pino/pino.js:102
     throw Error('unknown level ' + level)
     ^
 
 Error: unknown level undefined
     at EventEmitter._setLevel (/core/node_modules/pino/pino.js:102:11)
     at EventEmitter.value (/core/node_modules/pino-multi-stream/index.js:49:18)
     at applyOptions (/core/node_modules/pino/lib/tools.js:27:8)
     at EventEmitter.child (/core/node_modules/pino/pino.js:201:3)
     at Object.<anonymous> (/core/framework/loggers/index.js:22:5)
     at Module._compile (module.js:624:30)
     at Object.Module._extensions..js (module.js:635:10)
     at Module.load (module.js:545:32)
     at tryModuleLoad (module.js:508:12)
     at Function.Module._load (module.js:500:3)

Provide an example of how to use standard pino + one more stream

I want to use pino in its plain API form (pino + an attached prettifier), but additionally add a Sentry logger:

But all the current examples show that I must create the default stream myself:

var streams = [
  {stream: fs.createWriteStream('/tmp/info.stream.out')},
  {level: 'fatal', stream: fs.createWriteStream('/tmp/fatal.stream.out')}
]
var log = pinoms({streams: streams})

How can I let pino decide the default streams?

const logger = require('pino')()

logger.info('hello world')

Here it just works; I don't have to figure out a path for stdout.

Destination write stream file is bloated with NUL characters at head

Title.

I use pino-multi-stream with fastify

details

fastify 1.14.6
pino 5.17.0
pino-multi-stream 4.3.0

My app runs in container with mapped volume. There are 2 streams, one is written to a file in the mapped volume, the other is written to std.out of the docker container.

There is only 1 node process.

Level is not being set upon initialization

var pinoms = require('pino-multi-stream');
var pretty = pinoms.pretty();
pretty.pipe(process.stdout);

var streams = [
  {stream: process.env.stage ? pretty : process.stdout },
  {stream: fs.createWriteStream(_base+'/logs/' + (process.env.stage ? 'debug.log' : 'prod.log'), { flags: 'a'})}
]
var pino = pinoms({
        streams: streams,
        level: process.env.level || 'debug',
});

pino.info('test'); //prints
pino.debug('test'); //silence

Only specifying the level explicitly solved the problem:
pino.level = 'debug'; //OK now

Log redaction not working in production

I noticed log redaction is not working in production.

Take the following code:
index.js

const pinoms = require('pino-multi-stream')
const express = require('express')
const expressPino = require('express-pino-logger')

const logger = pinoms({
    base: null,
    redact: {
      paths: ['*.headers.authorization'],
    },
    streams: [
      // Log everything to stdout in production, prettify in dev environments.
      {
        level: process.env.NODE_ENV === 'production' ? 'info' : 'debug',
        stream:
          process.env.NODE_ENV === 'production'
            ? process.stdout
            : pinoms.prettyStream(),
      },
    ],
  })
  
const expressLogger = logger.child({ name: 'express' })  

const app = express()

app.use(
  expressPino({
    logger: expressLogger,
    autoLogging: {
    ignorePaths: ['/ping'],
    },
  })  
)

app.listen(8000)

package.json

{
  "name": "pino-redaction-test",
  "version": "1.0.0",
  "main": "index.js",
  "scripts": {
    "dev": "NODE_ENV=development node index.js",
    "prod": "NODE_ENV=production node index.js"
  },
  "dependencies": {
    "express": "^4.17.1",
    "express-pino-logger": "^5.0.0",
    "pino-multi-stream": "^5.0.0",
    "pino-pretty": "^4.0.0"
  }
}

Now run yarn dev (npm run dev) and type the following into your terminal:
curl -H "authorization: Bearer test" http://localhost:8000/test

This will produce the following output:

[1590174448481] INFO  (express): request completed
    res: {
      "statusCode": 404,
      "headers": {
        "x-powered-by": "Express",
        "content-security-policy": "default-src 'none'",
        "x-content-type-options": "nosniff",
        "content-type": "text/html; charset=utf-8",
        "content-length": 143
      }
    }
    responseTime: 4
    req: {
      "id": 1,
      "method": "GET",
      "url": "/test",
      "headers": {
        "host": "localhost:8000",
        "user-agent": "curl/7.64.1",
        "accept": "*/*",
        "authorization": "[Redacted]"
      },
      "remoteAddress": "::ffff:127.0.0.1",
      "remotePort": 55458
    }

As you can see req.headers.authorization is redacted.

Now kill the process and run yarn prod (npm run prod). This runs the same code with the NODE_ENV env variable set to production. This changes two things in our code:
process.stdout is used instead of pinoms.prettyStream(), and level is set to info instead of debug. As far as I know, neither change should affect redact functionality in any way.

Now run the same request as before:
curl -H "authorization: Bearer test" http://localhost:8000/test

You will get the following log output:

{"level":30,"time":1590174859337,"name":"express","req":{"id":1,"method":"GET","url":"/test","headers":{"host":"localhost:8000","user-agent":"curl/7.64.1","accept":"*/*","authorization":"Bearer test"},"remoteAddress":"::ffff:127.0.0.1","remotePort":55556},"res":{"statusCode":404,"headers":{"x-powered-by":"Express","content-security-policy":"default-src 'none'","x-content-type-options":"nosniff","content-type":"text/html; charset=utf-8","content-length":143}},"responseTime":4,"msg":"request completed"}

As you can see req.headers.authorization is no longer redacted.

TypeError: Cannot read property 'flushSync' of undefined

Hi there, I'm trying to implement exit logging – should a pino-multi-stream instance work with pino.final?

My setup:

const stdout = isProduction
  ? pino.destination(1)
  : pinoMultiStream.prettyStream({
      prettyPrint: {
        colorize: true,
        ignore: 'pid,hostname',
        translateTime: 'UTC:yyyy-mm-dd HH:MM:ss.l Z',
      },
    });

export const logger = pinoMultiStream({
  streams: [
    { level: 'debug', stream: stdout },
    { level: 'trace', stream: pino.destination(logPath) },
  ],
});

const shutdown = (exitCode: 0 | 1) => {
  logger.info('Shutting down...');
  healthCheck.close(() => {
    logger.info('Health check terminated.');
    process.exitCode = exitCode;
  });
};

const gracefulExit = pino.final(logger, (err, finalLogger) => {
  const signal = (err as unknown) as string; // type bug: `err` is a `string` in this case
  finalLogger.info(signal, 'Exit Signal');
  shutdown(0);
});
process.on('SIGINT', gracefulExit);
process.on('SIGTERM', gracefulExit);
process.on(
  'uncaughtException',
  pino.final(logger, (err, finalLogger) => {
    finalLogger.error(err, 'uncaughtException');
    shutdown(1);
  })
);
process.on(
  'unhandledRejection',
  pino.final(logger, (err, finalLogger) => {
    finalLogger.error(err, 'unhandledRejection');
    shutdown(1);
  })
);

When I trigger an uncaughtException, I see the following trace:

.../node_modules/pino/lib/tools.js:364
  if (typeof stream.flushSync !== 'function') {
                    ^

TypeError: Cannot read property 'flushSync' of undefined
    at Function.final (/.../node_modules/pino/lib/tools.js:364:21)
