
justeat / nlog.structuredlogging.json


Structured logging for NLog using Json (formerly known as JsonFields)

License: Other

C# 96.88% PowerShell 2.74% Shell 0.38%
dotnet nlog json structured-logging kibana elk-stack

nlog.structuredlogging.json's Introduction

NLog.StructuredLogging.Json

Join the chat at https://gitter.im/justeat/NLog.StructuredLogging.Json

What

Structured logging with NLog. Generates log entries as JSON, which can then be sent to Kibana, e.g. via NXLog.

For each LogEventInfo message, one JSON object is rendered, with any parameters included as properties.

What problems does this solve

Structured logging

When logging without StructuredLogging.Json, the "Message" field is used to hold unstructured data, e.g.:

@LogType: nlog
Level: Warn
Message: Order 1234 resent to Partner 4567

When we want to query Kibana for all occurrences of this log message, we have to do partial string matching as the message is slightly different each time. When we want to query Kibana for all messages related to this order, we also have to do partial string matching on the message as the orderId is embedded in the message.

When logging with StructuredLogging.Json, the data is written as JSON, with extra fields containing any data that you add to the log entry. The log line written by NLog might look like this:

{
"TimeStamp":"2016-09-21T08:11:23.483Z","Level":"Info","LoggerName":"Acme.WebApp.OrderController",
"Message":"Order resent to partner","CallSite":"Acme.WebApp.OrderController.ResendOrder",
"OrderId":"1234","PartnerId":"4567",
"NewState":"Sent","SendDate":"2016-09-21T08:11:23.456Z"
}

This is well formatted for sending to Kibana.

In Kibana you get:

@LogType: nlog
Level: Info
Message: Order resent to partner
OrderId: 1234
PartnerId: 4567
NewState: Sent

This makes it much easier to search Kibana for the exact message text and see all the times that this log statement was fired, across time. We can also very easily search for all the different log messages related to a particular orderId, partnerId, or any other fields that can be logged.

Simpler, more flexible logging configuration

No need for a custom nxlog configuration file, and no need to specify all the columns used.

How to get it

  1. Update the dependencies as below
  2. Install the NLog.StructuredLogging.Json package from NuGet
  3. Update your NLog config so you write out JSON with properties
  4. Add additional properties when you log

Update the dependencies

  • Ensure you have a version of NLog >= 4.5.0 (assembly version 4.0.0.0 - remember to update any binding redirects)
  • Ensure you have a version of Newtonsoft.Json >= 9.0.1
Update-Package NLog
Update-Package Newtonsoft.Json

Install the NLog.StructuredLogging.Json package from NuGet

Make sure the DLL is copied to your output folder

Install-Package NLog.StructuredLogging.Json

Update your NLog config so you write out JSON with properties

NLog needs to write JSON using the structuredlogging.json layout renderer, which is declared in this project.
Any DLL whose name starts with NLog. is automatically loaded by NLog at runtime in your app.
Copy and replace the nlog.config in your solution with the example nlog.config, or adapt your existing configuration along the lines of the sketch below.
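A minimal sketch of such a configuration, assuming a simple file target (the target name and file path here are placeholders, not taken from the example nlog.config):

<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <!-- registers the structuredlogging.json layout renderer from this package -->
    <add assembly="NLog.StructuredLogging.Json" />
  </extensions>
  <targets>
    <target name="structuredFile"
            xsi:type="File"
            fileName="${basedir}/logs/app.json"
            encoding="utf-8"
            layout="${structuredlogging.json}" />
  </targets>
  <rules>
    <logger name="*" minlevel="Info" writeTo="structuredFile" />
  </rules>
</nlog>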

Usage

Use the log properties to add extra fields to the JSON. You can add any contextual data values here:

using NLog.StructuredLogging.Json;

...

logger.ExtendedInfo("Sending order", new { OrderId = 1234, RestaurantId = 4567 } );


logger.ExtendedWarn("Order resent", new { OrderId = 1234, CustomerId = 4567 } );

logger.ExtendedError("Could not contact customer", new { CustomerId = 1234, HttpStatusCode = 404 } );

logger.ExtendedException(ex, "Error sending order to Restaurant", new { OrderId = 1234, RestaurantId = 4567 } );

The last parameter can be a Dictionary of names and values, or an anonymous object. If an anonymous object is supplied, the property names and values on this object become field names and corresponding values as shown above.

Example of using a dictionary:

var logProperties = new Dictionary<string, object>
{
  {"orderId", 1234 },
  {"customerId", 3456 }
};

if (partner != null)
{
   logProperties.Add("partnerId", partner.Id)
}

logger.ExtendedInfo("Order received", logProperties);

Structured logging without additional properties

You might still want the JSON output and context information when there are no additional properties to log. In this case, all of the following are equivalent:

logger.ExtendedInfo("Order received", new {});
logger.ExtendedInfo("Order received", null);
logger.ExtendedInfo("Order received");

The last form is preferred as it is the simplest.

Logging data from exceptions

If exceptions are logged with ExtendedException, then the name-value pairs in the exception's Data collection are recorded.

e.g. where we do:

var restaurant = _restaurantService.GetRestaurant(restaurantId);
if (restaurant == null)
{
  throw new RestaurantNotFoundException();
}

We can improve on this with:

var restaurant = _restaurantService.GetRestaurant(restaurantId);
if (restaurant == null)
{
  var ex = new RestaurantNotFoundException();
  ex.Data.Add("RestaurantId", restaurantId);
  throw ex;
}

This is useful where the exception is caught and logged by a global "catch-all" exception handler which will have no knowledge of the context in which the exception was thrown.
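For example, a catch-all handler might look something like this (a sketch; ProcessOrder and the surrounding structure are hypothetical):

try
{
    ProcessOrder(orderId); // hypothetical operation that may throw with values added to ex.Data
}
catch (Exception ex)
{
    // the name-value pairs in ex.Data (e.g. "RestaurantId") are recorded as extra fields
    logger.ExtendedException(ex, "Unhandled error processing order", new { OrderId = orderId });
    throw;
}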

Use the exception's Data collection rather than adding properties to exception types to store values.

The best practices and pitfalls below also apply to exception data, as these values are serialised in the same way to the same destination.

Logging inner exceptions

You do not need to explicitly log inner exceptions, or exceptions contained in an AggregateException. They are automatically logged in both cases. Each inner exception is logged as a separate log entry, so that the inner exceptions can be searched for all the usual fields such as ExceptionMessage or ExceptionType.

When an exception has one or more inner exceptions, some extra fields are logged: ExceptionIndex, ExceptionCount and ExceptionTag.

  • ExceptionCount: Tells you how many exceptions were logged together.
  • ExceptionIndex: This exception's index within the group.
  • ExceptionTag: A unique GUID that is generated and applied to all the exceptions in the group. Searching for this GUID should show you all the grouped exceptions and nothing else.

e.g. logging an exception with 2 inner exceptions might produce these log entries:

ExceptionMessage: "Outer message"
ExceptionType: "ArgumentException"
ExceptionIndex: 1
ExceptionCount: 3
ExceptionTag: "6fc5d910-3335-4eba-89fd-f9229e4a29b3"

ExceptionMessage: "Mid message"
ExceptionType: "ApplicationException"
ExceptionIndex: 2
ExceptionCount: 3
ExceptionTag: "6fc5d910-3335-4eba-89fd-f9229e4a29b3"

ExceptionMessage: "inner message"
ExceptionType: "NotImplementedException"
ExceptionIndex: 3
ExceptionCount: 3
ExceptionTag: "6fc5d910-3335-4eba-89fd-f9229e4a29b3"

Logging data from context

Properties are also read from the Mapped Diagnostic Logical Context. This is an NLog class, and the data is stored on the logical call context and typed as a Dictionary<string, object>.

Add a value to the MDLC like this:

MappedDiagnosticsLogicalContext.Set("ConversationId", conversationId);

This value will then be attached to all logging that happens afterwards in the same logical thread of execution, even after await statements that change the actual thread.
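If the value should only apply to a bounded section of code, the MDLC also supports scoped values (as used in an example later in this document). A sketch, where conversationId, order and SendToPartnerAsync are hypothetical:

// the value is attached to every log entry written inside the using block,
// and removed when the scope is disposed
using (MappedDiagnosticsLogicalContext.SetScoped("ConversationId", conversationId))
{
    logger.ExtendedInfo("Order received", new { OrderId = 1234 });
    await SendToPartnerAsync(order); // logging inside awaited calls still carries ConversationId
}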

Logical scopes

To provide your logs with more logical context information, you can use the BeginScope extension method.

using(Logger.BeginScope("first scope description", firstScopeProps)) // first scope, Guid 5D646242-C5A3-4FA0-9A7A-779ED5EA56E2
{
  Logger.ExtendedInfo("scoped log", innerLogProps); // first message
  using(Logger.BeginScope("second scope description", secondScopeProps)) // second scope, Guid 74253AC8-11BB-4CBD-B68D-ED966DBDB478
  {
    Logger.ExtendedInfo("scoped log", null); // second message
    using(Logger.BeginScope("third scope description", thirdScopeProps)) // third scope, Guid 0721B91D-4764-4693-99D9-1AF4B63463A0
    {
      Logger.ExtendedInfo("scoped log", null); // third message
    }
  }
}

Each scope writes start and end log entries. In the properties of each message we get something like this:

  • For first message:
  "Scope":"first scope description",
  "ScopeId":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2", // first scope GUID
  "ScopeIdTrace":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2",
  "ScopeNameTrace":"first scope description",
  // innerLogProps go here
  // firstScopeProps go here
  • For second message:
  "Scope":"second scope description",
  "ScopeId":"74253AC8-11BB-4CBD-B68D-ED966DBDB478", // second scope GUID
  "ScopeIdTrace":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2 -> 74253AC8-11BB-4CBD-B68D-ED966DBDB478", // first scope GUID -> second scope GUID
  "ScopeNameTrace":"first scope description -> second scope description",
  // firstScopeProps go here
  // secondScopeProps go here
  • For third message:
  "Scope":"third scope description",
  "ScopeId":"0721B91D-4764-4693-99D9-1AF4B63463A0", // third scope GUID
  "ScopeIdTrace":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2 -> 74253AC8-11BB-4CBD-B68D-ED966DBDB478 -> 0721B91D-4764-4693-99D9-1AF4B63463A0", // first scope GUID -> second scope GUID -> third scope GUID
  "ScopeNameTrace":"first scope description -> second scope description -> third scope description"
  // firstScopeProps go here
  // secondScopeProps go here
  // thirdScopeProps go here

Logging additional json properties

You can add context to your log events by using the JsonWithProperties layout instead of the structuredlogging.json layout renderer. The example below shows how you can log the machine name and component version with each log event.

<target name="MyTarget"
       xsi:type="file"
       fileName="MyFilePath"
       encoding="utf-8">
  <layout xsi:type="JsonWithProperties">
    <property name="MachineName" layout="${machinename}" />
    <property name="ComponentVersion" layout="1.0.0.0" />
  </layout>
</target>

A log entry created using code similar to:

logger.ExtendedInfo("Order sent to partner", new { OrderId = 1234, RestaurantId = 4567 } );

might result in output similar to:

{"TimeStamp":"2016-09-21T08:11:23.483Z","Level":"Info","LoggerName":"Acme.WebApp.OrderController",
"Message":"Order sent to partner","CallSite":"Acme.WebApp.OrderController.ResendOrder",
"OrderId":"1234","PartnerId":"4567",
"NewState":"Sent","SendDate":"2016-09-21T08:11:23.456Z",
"MachineName":"MyMachineName","ComponentVersion":"1.0.0.0"}

where we can see that the properties specified in the layout are appended to the properties that come from the log event.

Best practices

  • The message logged should be the same every time. It should be a constant string, not a string formatted to contain data values such as ids or quantities. Then it is easy to search for.
  • The message logged should be distinct i.e. not the same as the message produced by an unrelated log statement. Then searching for it does not match unrelated things as well.
  • The message should be a reasonable length i.e. longer than a word, but shorter than an essay.
  • The data should be simple values. Use field values of types e.g. string, int, decimal, DateTimeOffset, or enum types. StructuredLogging.Json does not log hierarchical data, just a flat list of key-value pairs. The values are serialised to string with some simple rules:
    • Nulls are serialised as empty strings.
    • DateTime values (and DateTime?, DateTimeOffset and DateTimeOffset?) are serialised to string in ISO8601 date and time format.
    • Everything else is just serialised with .ToString(). This won't do anything useful for your own types unless you override .ToString(). See the "Code pitfalls" section below. A sketch of these rules in use follows this list.
  • The data fields should have consistent names and values. Much of the utility of Kibana is from collecting logs from multiple systems and searching across them. e.g. if one system logs request failures with a data field StatusCode: 404 and another system with HttpStatusCode: NotFound then it will be much harder to search and aggregate logging data across these systems.
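As a rough illustration of those serialisation rules (a sketch; OrderState is a hypothetical enum and the exact output formatting may differ):

logger.ExtendedInfo("Order state changed", new
{
    OrderId = 1234,                   // int, serialised as "1234"
    NewState = OrderState.Sent,       // enum, serialised via ToString() as "Sent"
    SendDate = DateTimeOffset.UtcNow, // serialised as an ISO8601 date and time string
    Comment = (string)null            // null, serialised as an empty string
});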

Code pitfalls

Reserved field names

Some attributes are generated automatically. Message is the text field, and others are added: CallSite, Level, LoggerName, Parameters, TimeStamp. If there is an exception, there are several other fields present as well, all starting with Exception.

In some cases of name clashes, the value is kept under a different name, e.g. with a data_ or ex_ prefix. If this fails, the extra value is discarded.

Don't do this:

_logger.ExtendedInfo("This text is the message", new { Message = someData, TimeStamp = DateTime.Now } );

Instead, do not add a timestamp at all (it is added automatically), and find a name for the message data that does not clash, e.g.:

_logger.ExtendedInfo("This text is the message", new { QueueMessageData = someData } );

No format strings

Don't do this:

_logger.ExtendedWarn("Order {0} resent", new { OrderId = 1234 } );

This is not a format string, so the {0} placeholder will not be filled in.
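Instead, supply the value as a named property:

_logger.ExtendedWarn("Order resent", new { OrderId = 1234 } );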

No simple data values

Don't do this:

int orderId = 1234;
_logger.ExtendedWarn("Order resent", orderId);

as the last parameter needs to be an object with named properties on it.
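Instead, wrap the value in a named property:

int orderId = 1234;
_logger.ExtendedWarn("Order resent", new { OrderId = orderId } );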

No nested data values

Don't serialise complex objects such as domain objects or DTOs as values, e.g.:

var orderDetails = new OrderDetails
  {
    OrderId = 123,
    Time = DateTimeOffset.UtcNow.AddMinutes(45)
  };

// let's log the OrderDetails
_logger.ExtendedInfo("Order saved", new { OrderDetails = orderDetails });

The orderDetails object will be serialised with ToString(). Unless this method is overridden in the OrderDetails type declaration, it will not produce any useful output. And if it is overridden, we only get one key-value pair, when instead the various values such as OrderId are better logged in separate fields.
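Instead, log the individual values as separate flat fields, e.g. (a sketch reusing the orderDetails object from above):

_logger.ExtendedInfo("Order saved", new
{
    OrderId = orderDetails.OrderId,
    OrderTime = orderDetails.Time
});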

Only some log levels are supported

We support ExtendedException (which uses LogLevel.Error), ExtendedError, ExtendedWarn, ExtendedInfo and ExtendedDebug. Other log levels could be added if need be, but we don't believe that fine-grained log levels add a lot of value.

The model where log messages are discarded immediately based on configuration, chiefly on log level, is one that we can leave behind. All messages of every level are sent to Kibana for later processing. Filtering is best done after the fact, when investigating an error. Log level is a field that can be searched or filtered on, but it is far from the only important one.

Field naming and special characters

When sending logs to the ELK stack, the field names are parsed, and some characters such as '.' have special meaning. So don't use them unless you know how they will be interpreted by the back end.
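For example, when building a property dictionary, prefer a separator such as an underscore over a dot (a sketch; the field names here are purely illustrative):

var props = new Dictionary<string, object>
{
    { "Payment_ProviderId", 42 }  // rather than "Payment.ProviderId", which ELK may treat as a nested object
};
logger.ExtendedInfo("Payment settled", props);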

Compatibility with the ILogger abstraction

From version 3.0.0, this library is fully compatible with the ILogger abstraction in Microsoft.Extensions.Logging.Abstractions and message templates used therein.

This is useful if you use libraries that need an ILogger implementation (such as JustSaying V6) and want the output to go to structured JSON files.

To use ILogger with this library's JSON formatted output, register NLog as the logging provider so that ILogger calls are written by the targets configured in nlog.config.
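A minimal sketch of that wiring, assuming the NLog.Extensions.Logging provider package is used (the exact API may vary by version):

using Microsoft.Extensions.Logging;
using NLog.Extensions.Logging;

// create an ILoggerFactory backed by NLog, so ILogger output is written by the
// targets (and the structuredlogging.json layout) configured in nlog.config
var loggerFactory = LoggerFactory.Create(builder =>
{
    builder.ClearProviders();
    builder.AddNLog();
});

ILogger logger = loggerFactory.CreateLogger("LogDemo.HomeController");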

ILogger example

When _logger is an ILogger instance set up as above, and the code is:

_logger.LogInformation("Templated information for order {OrderId} at {CustomProperty}",
    12345, DateTime.UtcNow);

The log entry will be like this:

{
  "TimeStamp":"2019-01-29T11:12:46.609Z",
  "Level":"Info",
  "LoggerName":"LogDemo.HomeController",
  "Message":"Templated information for order 12345 at 01/29/2019 11:12:46",
  "MessageTemplate":"Templated information for order {OrderId} at {CustomProperty}",
  "OrderId": "12345",
  "CustomProperty":"2019-01-29T11:12:46.6210834Z"
}

Contributors

Started for JustEat Technology by Alexander Williamson in 2015.

And then battle-tested in production with code and fixes from: Jaimal Chohan, Jeremy Clayden, Oleh Formaniuk, Andy Garner, Kenny Hung, Henry Keen, Payman Labbaf, João Lebre, Chris Mannix, Peter Mounce, Simon Ness, Mykola Shestopal, Anthony Steele.


nlog.structuredlogging.json's Issues

Failing test on Mac

Hi,

This test permanently fails when running on a Mac as a slightly different StackTrace format is generated.

Instead of the expected:

"   at NLog.StructuredLogging.Json.Tests.Helpers.MapperExceptionLoggingTests.GenerateExceptionWithStackTrace()"

It generates

"  at NLog.StructuredLogging.Json.Tests.Helpers.MapperExceptionLoggingTests.GenerateExceptionWithStackTrace ()"

Possible options

From my perspective the following options exist, so before doing the work I'd be interested in which one you think is more suitable:

  1. The assertion does a "starts with", so grab the beginning of the stack trace and sanitise it (strip spaces in this case). This is my personal preference.

  2. Conditionally assert against the expected value based on the OS, e.g.:

Assert.That(result["ExceptionStackTrace"], Does.StartWith(StackTraceByOs()));

  3. Do a Contains on the stack trace with the following string (excluding the parentheses):
NLog.StructuredLogging.Json.Tests.Helpers.MapperExceptionLoggingTests.GenerateExceptionWithStackTrace

Personally I'm thinking 2 would be the best way forward.

Basic variable layout renderer doesn't seem to be supported

I'm trying to output basic variables as in:

<variable name="foo" value="X.Y.Z" />
<variable name="bar" value="A-B-C" />
<extensions>
  <add assembly="NLog.Web" />
  <add assembly="NLog.StructuredLogging.Json" />
</extensions>
<targets>
  <target type="BufferingWrapper" name="textBuffer" bufferSize="100" flushTimeout="5000">
    <target name="jsonFile" type="File" fileName="${basedir}\log.json" archiveAboveSize="104857600" maxArchiveFiles="10" archiveNumbering="Sequence">
      <layout type="JsonWithProperties">
        <property name="foo" layout="${var:foo}" />
        <property name="bar" layout="${var:bar}" />
        <property name="machineName" layout="${machinename}" />
      </layout>
    </target>
  </target>
</targets>
<rules>
  <logger name="*" minlevel="Trace" writeTo="textBuffer" />
</rules>

I threw the machineName one in there to check my sanity. The log output has all of the StructuredLogging properties plus machineName, but foo and bar are nowhere to be found (not even an empty property). I also tried tinkering with the variable layout renderer by specifying "${var:name=foo}" to no avail.

Is this expected behavior? If so, is there a known workaround?

Two Layouts

We have StructuredLoggingLayoutRenderer for when a layout renderer is appropriate.
When a layout is appropriate, we have FlattenedJsonLayout and JsonWithPropertiesLayout. Do we need both, and if so, when do you choose between them? Are the differences between them just history, and can they be made more similar?

Scoped logs

Hi all,

After I saw #52 (Context, attaching properties) I got another idea for how logs can be improved. It is related mostly to tracing logs, meaning providing a better idea of the context of a log message by attaching some unique ID (this can also be used to attach other properties if useful).

To achieve this we can wrap NestedDiagnosticsLogicalContext of NLog.
I see it using the following syntax:

using(Logger.BeginScope("first scope description", firstLogProps)) // first scope, Guid 5D646242-C5A3-4FA0-9A7A-779ED5EA56E2
{
  Logger.ExtendedInfo("scoped log", innerLogProps); // first message
  using(Logger.BeginScope("second scope description", secondLogProps)) // second scope, Guid 74253AC8-11BB-4CBD-B68D-ED966DBDB478
  {
    Logger.ExtendedInfo("scoped log", null); // second message
    using(Logger.BeginScope("third scope description", thirdLogProps)) // third scope, Guid 0721B91D-4764-4693-99D9-1AF4B63463A0
    {
      Logger.ExtendedInfo("scoped log", null); // third message
    }
  }
}

Each scope writes start and end logs.
Not sure if the scope's logProps should be attached to each message in it (maybe it should be optional?).
And in properties we get something like this:

  • For first message:
  "ScopeId":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2", // first scope GUID
  "Stack":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2"
  • For second message:
  "ScopeId":"74253AC8-11BB-4CBD-B68D-ED966DBDB478", // second scope GUID
  "Stack":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2 -> 74253AC8-11BB-4CBD-B68D-ED966DBDB478" // first scope GUID -> second scope GUID
  • For third message:
  "ScopeId":"0721B91D-4764-4693-99D9-1AF4B63463A0", // third scope GUID
  "Stack":"5D646242-C5A3-4FA0-9A7A-779ED5EA56E2 -> 74253AC8-11BB-4CBD-B68D-ED966DBDB478 -> 0721B91D-4764-4693-99D9-1AF4B63463A0" // first scope GUID -> second scope GUID -> third scope GUID

Then we can have the following query in Kibana, for example:

{
  "query": {
    "wildcard": {
      "Stack": {
        "value": "*74253AC8-11BB-4CBD-B68D-ED966DBDB478*" // second scope GUID
      }
    }
  }
}

The query returns all log messages that are nested under the second scope (in this case the second and third messages).

Changing JsonSerializerSettings, e.g. CamelCasePropertyNames

I would like to serialize all the property names in camelCase instead of PascalCase which is the default.

I could not find a workaround, maybe because of the design of the ConvertJson class.

    public static class ConvertJson
    {
        private static readonly JsonSerializerSettings LogSettings = new JsonSerializerSettings
        {
            ContractResolver = new DefaultContractResolver(),
            Formatting = Formatting.None
        };

        public static string Serialize(Dictionary<string, object> data)
        {
            return JsonConvert.SerializeObject(data, LogSettings);
        }
    }

I am not able to do...
ContractResolver = new CamelCasePropertyNamesContractResolver(),

Is there any way to do that?

Update: I do not want to change the original code of the library; I am looking for a way to extend it.
In the unit tests associated with it, the global settings are explicitly ignored:

Class: ConvertJsonTests
Test: SerialisationIsNotAffectedByChangeToGlobalSettings

Support .NET Core

A version of this lib with .NET Core support would open some doors for us.
It's probably a lot harder to get the tests onto .NET Core, but that is also less important.

Logs in Kibana broken when using newer versions of NLog

When using certain combinations of NLog and NLog.StructuredLogging.Json versions, the logs in Kibana contain only properties with the @ suffix. See the example screenshot below, where the app version at the bottom (1.0.0.41) - with NLog 4.4.3 and NLog.StructuredLogging.Json 1.0.78 - has missing fields like Message, Level and ProcessId, while the one above (1.0.0.42), using another combination of versions in the same app, works:
[screenshot]
Here is a table of the different version combinations (Newtonsoft.Json 9.0.1 was used unless noted) and whether they work:
NLog | Nlog.StructuredLogging.Json | Is working?
4.4.3 | 1.0.78 | No (also tested with Newtonsoft 10.0.2)
4.4.8 | 1.0.78 | No
4.4.3 | 1.0.68 | No
4.4.1 | 1.0.68 | Yes
4.4.5 | 1.0.58 | No
4.3.11 | 1.0.58 | Yes

UTF-8 encoding missing?

Writing an application that fetches data from a Swedish accounting software. They have their error messages in Swedish, and this causes the error messages that we get to be stored in the JSON output as the diamond question mark symbol instead of the encoded character.
Using the setting suggested by standard NLog doesn't work, as attributes aren't allowed with structuredlogging. The encoding="UTF-8" property on the target doesn't help either.

Been looking around a bit, and see no mention of encoding in the documentation for structuredlogging either.

Attaching the nlog.config file I have:

<?xml version="1.0" encoding="utf-8" ?>
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    throwExceptions="true
    encoding="UTF-8">
    <extensions>
        <add assembly="NLog.StructuredLogging.Json" />
    </extensions>
    <targets>
        <target name="localError"
            xsi:type="File"
            fileName="${basedir}/logs/error/${shortdate}-error.json"
            encoding="UTF-8"
            layout="${structuredlogging.json}"
            maxArchiveFiles="90"/>
	<target name="localSuccess"
            xsi:type="File"
            fileName="${basedir}/logs/success/${shortdate}-success.json"
            encoding="UTF-8"
            layout="${structuredlogging.json}"
            maxArchiveFiles="90"/>
	</targets>
	<rules>
            <logger name="error"
                minlevel="Debug"
                writeTo="localError" />
	<logger name="success"
                minlevel="Debug"
		writeTo="localSuccess"/>
	</rules>
</nlog>

Include extra Exception properties

When we get exceptions like DbEntityValidationException, we only see the message, stack trace and some exception-related fields being serialized into a structured JSON object, but not any custom fields (in this case EntityValidationErrors) that would give us more detail.

Example:

System.Data.Entity.Validation.DbEntityValidationException: Validation failed for one or more entities. See 'EntityValidationErrors' property for more details.

This does not help us since the actual field is not serialized in the JSON message. Would it be possible to extend this Structured Logging library to include custom fields like EntityValidationErrors so that we can log those as well?

Some values should not be turned into strings

E.g. numbers and booleans, which can be represented in JSON, should not be quoted as strings in the output.

E.g. when an order id is supplied as type int, instead of the output "OrderId": "12345", it should be "OrderId": 12345

Extensions only work with NLog.ILogger

Hi,
Is it feasible to make the extensions (ExtendedError, ExtendedInfo and so on) work with Microsoft.Extensions.Logging.ILogger, instead of just with the NLog.ILogger interface? This way the package can be used with the standard DI inside web projects, for example :)

measure performance

Measure performance so that we know if we're good. This project would benefit from some characterisation of the performance, which might raise issues to change things if they are slower than they should be.

How to work with ILogger

Should have an answer to the question of how/if to use a Microsoft.Extensions.Logging.Abstractions.ILogger with this code.

NullReferenceException in mapper

 Exception type: NullReferenceException 
    Exception message: Object reference not set to an instance of an object.
   at NLog.JustEat.JsonFields.Helpers.Mapper.<>c.<ToDictionary>b__0_0(Object x)
   at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()
   at System.String.Join(String separator, IEnumerable`1 values)
   at NLog.JustEat.JsonFields.Helpers.Mapper.ToDictionary(LogEventInfo source)
   at NLog.JustEat.JsonFields.JsonFieldsLayoutRenderer.Append(StringBuilder builder, LogEventInfo logEvent)
   at NLog.LayoutRenderers.LayoutRenderer.Render(StringBuilder builder, LogEventInfo logEvent)
   at NLog.Layouts.SimpleLayout.GetFormattedMessage(LogEventInfo logEvent)
   at NLog.Layouts.Layout.Precalculate(LogEventInfo logEvent)
   at NLog.Targets.Target.PrecalculateVolatileLayouts(LogEventInfo logEvent)
   at NLog.Targets.Wrappers.AsyncTargetWrapper.Write(AsyncLogEventInfo logEvent)
   at NLog.Targets.Target.WriteAsyncLogEvent(AsyncLogEventInfo logEvent)

Add ability to log without parameters

Hi,

I'm trying to use this library to add structured logging to a project, but I've run into an issue when using it with the NLog "MappedDiagnosticsLogicalContext" functionality for adding context data to logs. This lets you define scoped (or global) key value pairs that can be added to all your log lines.

Quite often I find I'm adding several pieces of data to the log context, and for several logs within that context I have no need to add more data at the point of logging, as all the relevant data is in the context. Unfortunately, there's no overload for any of the "Extended*" methods that allows me to do this, as far as I can tell. Using an empty object as the log properties gives me the log data I want, but that's not a great solution.

As an example:

using NLog;
using NLog.StructuredLogging.Json;

namespace Console
{
    internal class Program
    {
        private static void Main()
        {
            var logger = LogManager.GetCurrentClassLogger();
            try
            {
                using (MappedDiagnosticsLogicalContext.SetScoped("ContextData", "ABC"))
                {
                    // Doesn't compile:
                    // logger.ExtendedInfo("Test context data - StructuredLogging, no log properties");
                    
                    logger.ExtendedInfo("Test context data - StructuredLogging, empty object", new { });
                    logger.ExtendedInfo("Test context data - StructuredLogging, with data", new { LocalData = "123" });
                    logger.Info("Test context data - nlog only");
                }
            }
            finally
            {
                // Ensure to flush and stop internal timers/threads before application-exit (Avoid segmentation fault on Linux)
                LogManager.Shutdown();
            }
        }
    }
}
<?xml version="1.0" encoding="utf-8" ?>
<!-- XSD manual extracted from package NLog.Schema: https://www.nuget.org/packages/NLog.Schema-->
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd" xsi:schemaLocation="NLog NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      autoReload="true"
      internalLogFile="c:\temp\console-example-internal.log"
      internalLogLevel="Info" >


    <!-- the targets to write to -->
    <targets>
        <!-- write logs to file -->
        <target xsi:type="File" name="target1" fileName="c:\temp\console-example.log"
                layout="${structuredlogging.json}" />
        <target xsi:type="Console" name="target2"
                layout="${date}|${level:uppercase=true}|${message} ${exception}|${logger}|${all-event-properties}" />


    </targets>

    <!-- rules to map from logger name to target -->
    <rules>
        <logger name="*" minlevel="Trace" writeTo="target1,target2" />

    </rules>
</nlog>

Results in:

{"TimeStamp":"2019-01-14T10:38:21.156Z","Level":"Info","LoggerName":"Console.Program","Message":"Test context data - StructuredLogging, empty object","CallSite":"Console.Program.Main","ContextData":"ABC"}
{"TimeStamp":"2019-01-14T10:38:21.373Z","Level":"Info","LoggerName":"Console.Program","Message":"Test context data - StructuredLogging, with data","CallSite":"Console.Program.Main","LocalData":"123","ContextData":"ABC"}
{"TimeStamp":"2019-01-14T10:38:21.379Z","Level":"Info","LoggerName":"Console.Program","Message":"Test context data - nlog only"}

Am I missing something here? If not, would it be possible to add this feature please? Happy to submit a PR for it if so, as I think it should pretty much just be a case of either adding a default value for the logProperties parameter, or perhaps tweaking the 'structuredlogging.json' layout to include the context properties when just using the original NLog methods.

Logging with a Dictionary not anon object

It would be nice to have an overload with a dictionary for the properties, e.g. typed
_logger.ExtendedInfo(string message, IDictionary<string, object> props)

The advantage is that a Dictionary is mutable, so you can add context to it as you go, add context in an if statement, etc.

e.g.

public void DoSomeOrderOperation(int orderId)
{
  var props = new Dictionary<string, object>();
  props.Add("OrderId", orderId);
  _logger.ExtendedInfo("Received DoSomeOrderOperation", props);

  if (someCondition) 
  {
     props.Add("SomeContext", someId);
  }

  var customer = GetCustomerForOrder(orderId); 
  props.Add("CustomerId", customer.Id);
  _logger.ExtendedInfo("Got Customer for order", props);
}

This is significantly less verbose than the equivalent implemented using a new anonymous object for each log statement.

Depend upon NLog and not NLog.Config, NLog.Schema

The generated package dependencies on nuget lists NLog.Config which in turn depends upon NLog and NLog.Schema.

But this package does not list NLog. What was the original reason to depend upon the Config and Schema packages?

Do we really need to depend on anything other than just NLog? It is confusing not to depend upon NLog directly. Because of the other two packages, when adding this package to a solution a schema file is added, which usually isn't needed, and an nlog.config file is written, usually overwriting the existing configuration with an example. This is actively harmful unless you know to revert it.

For the sake of problem-free installs we should just depend upon NLog, unless there is a good reason to do otherwise.

Timezones differ from NLog itself

I have noticed that there is a difference in how structuredlogging makes the timestamp as opposed to NLOG itself.

I am using NLog's own structured logging for logging locally, which creates a timestamp that follows the local computer's locale and timezone.
example:
{ "Timestamp": "2019-07-19 01:00:13.3124", "Level": "Info", "CallSite": "xxx", "LoggerName": "xxx", "Message": "xxx", "Status": "OK", ... }

Then I am using the structuredlogging.json layout for logging over a web service, which shows the timestamp in the wrong timezone.
example:
{ "Timestamp": "2019-07-18 23:59:13.3124", "Level": "Info", "CallSite": "xxx", "LoggerName": "xxx", "Message": "xxx", "Status": "OK", ... }

Basically, NLog seems to use the PC's local time, while structuredlogging always uses UTC.
Is it possible to make structuredlogging.json use the same timestamp as NLog?

This is my current nlog.config:


<extensions>
	<add assembly="NLog.StructuredLogging.Json" />
</extensions>

<variable name="log_dir" value="${basedir}/logs"/>
    <variable name="archive_dir" value="${basedir}/archive"/>
<variable name="human_readable_layout" value="${longdate}|${level}|${logger}|${message}|${onexception:EXCEPTION OCCURRED\:${exception:format=type,message,method,stackTrace:maxInnerExceptionLevel=5:innerFormat=shortType,message,method,stackTrace}}" />

<targets async="true">
	<target name="log_json"
					xsi:type="File"
					fileName="${log_dir}/${level:lowercase=true}/${shortdate}.json"
					encoding="utf-8"
					archiveEvery="Day"
					archiveNumbering="Date"
					archiveDateFormat="yyyyMMdd"
					archiveFileName="${archive_dir}\{#####}-${level:lowercase=true}.log"
					concurrentWrites="false"
					keepFileOpen="false"
					maxArchiveFiles="90">
		<layout xsi:type="JsonLayout"
          includeAllProperties="true"
          maxRecursionLimit="5">
			<attribute name="Timestamp"
               layout="${longdate}"/>
			<attribute name="Level"
               layout="${level}"/>
			<attribute name="CallSite"
               layout="${callsite}"/>
			<attribute name="LoggerName"
               layout="${logger}"/>
			<attribute name="Message"
               layout="${message}"/>
		</layout>
	</target>
	<target name="logstash"
        xsi:type="WebService"
        protocol="JsonPost"
        encoding="UTF-8"
        url="http://xxx">
		<parameter name="message" type="System.String" layout="${structuredlogging.json}"/>
		<header name="Authorization" layout="${Authentication}"/>
	</target>
</targets>
<rules>
	<logger name="*"
					minlevel="Info"
					writeTo="log_json"/>

	<logger name="*"
					minlevel="Debug"
					writeTo="logstash"/>
</rules>


Field names contain dots

Mykola Shestopal says:

can we rename exception fields to not contain dots in them (for example Data.SomeField becomes Data_SomeField or so) in Structured Logging, since this causes errors in Elasticsearch: logs are not indexed if they have a "Data" field (since an earlier log had Data.SomeField, so Data becomes an object and now needs to have properties, but the Data field is treated as a usual field and can't be parsed as an object)
[12:45 PM] Mykola Shestopal: can this be configurable by some option, like renaming fields with dots (and which separator to use instead of the dot)

See also:
elastic/elasticsearch#2354
http://blog.endpoint.com/2013/04/elasticsearch-object-mapping-eof-400.html

Context, attaching properties

This is just speculation, nothing concrete yet.

The more that we deal with flowing state as context down to lower callees, the more that it seems a bad idea. I mean mainly HttpContext, LogicalThreadContext and SynchronisationContext

e.g. in a MVC controller, a database repository class is invoked, which logs a message about the database operation. It is very useful for business reasons to attach to this message a correlation Id read from http request headers. But for Software SRP reasons, it is better if the http request is only used in the controller that receives it. Use of HttpContext.Current.Request.Headers deep in the call stack or in some logging plugin is not ideal.

Is there some way of setting values on the logger up the call stack, and automatically attaching them to everything logged below? e.g.

var myCorrelationId = ReadItFromTheHttpHeaders(request);
using (logger.AddValue("CorrelationId", myCorrelationId))
{
   var results = await _repo.GetResults(id);
}

And all structured logging inside that block will automagically get the property "CorrelationId" populated.

It would be easy to do this if there is only one request at a time. Hard to do it with concurrent requests. Or is this not possible without re-inventing context as a logging context?

Message templates

Open thread about pros and cons of message templates.
See https://messagetemplates.org/

So far we have not used message templates since we have not felt that they are ideal: the message is presented twice wasting bytes, and the string formatting used to render the "human-readable" version takes time. When logging frequently in high throughput high performance systems, you don't want to squander these.

However, message templates might be an emerging standard and therefore useful for logging consistently with other logging libraries.

Cannot install package when targeting .NET Framework 4.5.1

Hi!

I am trying to integrate your library into my project; however, when I run the command Install-Package NLog.StructuredLogging.Json I'm getting the following error:

You are trying to install this package into a project that targets '.NETFramework,Version=v4.5.1', but the package does not contain any assembly references or content files that are compatible with that framework

Is there a specific version targeting this version of the framework? Any suggestions on how to troubleshoot this issue?
