Logging as a first-class citizen in software development

Structured logging in a .NET Core application using Serilog

Have you ever faced a production issue and realized you lacked the information you needed, and had to reproduce the issue with verbose logging enabled just to troubleshoot it?

We, as developers, build features for our end-users; we build logging and monitoring capabilities for ourselves. Undoubtedly, it is a very important aspect of any application. However, it is often not given much thought during development - perhaps because we usually do not have any acceptance criteria associated with it.

Robust logging is an investment.

Deciding what to log, where to log and at what level to log is important -

  1. If I'm developing a WebAPI that serves many users, a logging statement like "Request processed" without context isn't of much use. The context is as important as the event itself.

  2. While inadequate logging is undesirable, too much information makes debugging a nightmare as well. A log file with 80,000 lines, of which 75,000 are of no use, just adds noise while troubleshooting an issue. Logging inside a big iterative block that does not provide much insight is something to avoid!

  3. Usually developers log exceptions as errors but do not differentiate among the Debug, Information and Warning levels. Logging levels exist for a reason, and they should be used actively.

Structured Logging

Logs are unstructured text representing a series of events. We usually need to filter logs based on attributes, for example, a specific user or transaction, which requires some sort of semantics in the logs. Structured logs are really useful in such scenarios. Structured logging uses message templates for capturing and rendering logs, which provides user-friendly rendering as well as a machine-friendly schema for analytics. A message template is a block of text with embedded holes that name a property to be captured and inserted into the text.

Most structured logging providers support useful attributes like timestamp, thread, level and process out of the box. We need to extend this list with attributes that are specific to our software.
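
For example, a transaction or correlation identifier can be attached to every event raised within a scope. The snippet below is a minimal sketch using logging scopes from Microsoft.Extensions.Logging; the TransactionId property and the txnId and orderId variables are illustrative, and structured providers such as Serilog capture scope values as event properties.

// Every event logged inside the scope also carries the TransactionId property
using (log.BeginScope(new Dictionary<string, object> { ["TransactionId"] = txnId }))
{
    log.LogInformation("Payment authorized for {OrderId}", orderId);
}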

An event captured using a message template is not immediately rendered into text unless it is being displayed directly to a user. Instead, the template itself is captured, along with the property values for each argument. This is stored in an intermediate representation, like the JSON shown in the snippet below, for processing. It enables slicing and dicing of the data based on the attributes (the User property, for example).

log.LogInformation("User {User} logged in from {IpAddress}", username, ipAddress);
  // JSON translation of the log -
  //   {
  //      "time": "2021-09-26T13:02:11.888",
  //      "messageTemplate": "User {User} logged in from {IpAddress}",
  //      "level": "Information",
  //      "User": "Ishan", 
  //      "IpAddress": "123.45.67.89"
  //    }
  // UI Rendering of the log
  // [13:02:11 INF] User Ishan logged in from 123.45.67.89

Structured logging in .NET applications using Serilog

Structured logging can easily be included in any .NET Core application by registering a structured logging provider such as Serilog.

Using Serilog, the logs can be emitted to various sinks for storage and visualization.

Many Serilog sinks aggregate and emit logs asynchronously in batches, so logging does not have a significant impact on the performance of the application.
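
For sinks that do not batch on their own, the Serilog.Sinks.Async package can wrap a sink so that writes happen on a background worker. A minimal sketch (the file path is just a placeholder):

// Wrap the file sink so writes happen off the calling thread
var asyncLogger = new LoggerConfiguration()
    .WriteTo.Async(a => a.File("log.txt"))
    .CreateLogger();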

The following semantics should be kept in mind while using structured logging with Serilog -

  • Property names are written between { and } brackets
  • Brackets can be escaped by doubling them, e.g. {{ will be rendered as {
  • Property names may be prefixed with an optional operator, @ or $, to control how a property is captured (a short example follows this list).
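
As a rough illustration of the capturing operators, using Serilog's own logger API (logger here is a Serilog ILogger, created as shown in the next section) and a made-up anonymous object:

// '@' destructures the object into its properties; '$' forces string rendering
var order = new { Id = 42, Total = 19.99 };
logger.Information("Created {@Order}", order);   // captured as { Id: 42, Total: 19.99 }
logger.Information("Created {$Order}", order);   // captured as the object's ToString() output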

Registering Serilog in a .NET application's logging pipeline is very simple using the Serilog NuGet packages -

var logger = new LoggerConfiguration()
    .WriteTo.Console()
    .WriteTo.File("log.txt")
    .Enrich.WithEnvironmentName()
    .Enrich.WithProperty("CustomProperty", customValue)
    .CreateLogger();

builder.Services.AddLogging(lb => lb.AddSerilog(logger, dispose: true));

Serilog is added to the default logging pipeline using the AddSerilog method from Serilog.Extensions.Logging.

In .NET Core applications, we can keep the logger implementation transparent to the business layer by injecting the ILogger<TCategoryName> abstraction provided by Microsoft.Extensions.Logging throughout the application. With this pattern, our application can seamlessly switch between logging providers without any change to the functional components of the application.
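
A rough sketch of this pattern (OrderService and its members are illustrative names):

using Microsoft.Extensions.Logging;

public class OrderService
{
    private readonly ILogger<OrderService> _logger;

    // ILogger<OrderService> is resolved by the DI container; the category name
    // defaults to the fully-qualified type name
    public OrderService(ILogger<OrderService> logger) => _logger = logger;

    public void PlaceOrder(string orderId, string user)
    {
        _logger.LogInformation("Order {OrderId} placed by {User}", orderId, user);
    }
}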

In the configuration snippet earlier, for illustration purposes, the application is configured to emit logs to the Console and a File. Serilog has an extensive list of sinks that can be plugged in with different configurations. Serilog also provides options to enrich the logs with useful pre-defined attributes as well as custom properties.

However, the real power of structured logging comes when it is used with a centralized search/analytics engine and a visualization framework such as the ELK stack. This integration can be achieved using the Elasticsearch sink.
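
A minimal sketch of wiring up this sink (the URL and index format are placeholders, and the available options depend on the version of the Serilog.Sinks.Elasticsearch package in use):

// Requires the Serilog.Sinks.Elasticsearch NuGet package
var logger = new LoggerConfiguration()
    .Enrich.FromLogContext()
    .WriteTo.Elasticsearch(new ElasticsearchSinkOptions(new Uri("http://localhost:9200"))
    {
        AutoRegisterTemplate = true,
        IndexFormat = "myapp-logs-{0:yyyy.MM.dd}"
    })
    .CreateLogger();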

Now, the ELK stack itself is beyond the scope of this blog, but in short it acts as a powerful log ingestion pipeline, using Elasticsearch for log persistence and Kibana for visualization. It is helpful for monitoring the application, since we can slice and dice the logs as needed using the Kibana Query Language.

I will try to cover ELK stack and other monitoring tools in detail in my subsequent blogs.

Hopefully you guys find this short blog helpful.

See you guys in my next blog posts!

Cheers!!