
Serilog.Sinks.Http - A Serilog sink sending log events over HTTP


Package - Serilog.Sinks.Http | Platforms - .NET 4.5/4.6.1, .NET Standard 2.0/2.1

Table of contents

- Introduction
- Super simple to use
- Typical use cases
- Sample applications
- Install via NuGet
- Contributors

Introduction

This project started out with a wish to send log events to the Elastic Stack. I had prior experience of Elastic Filebeat and didn't like it. I thought the value it added was lower than the complexity it introduced.

Knowing that Serilog.Sinks.Seq existed, and knowing that the code was of really good quality, I blatantly copied many of the core files into this project and started developing a general HTTP sink.

And here we are today. I hope you'll find the sink useful. If not, don't hesitate to open an issue.

Super simple to use

In the following example, the sink will POST log events to https://www.mylogs.com over HTTP. We configure the sink using named arguments instead of positional arguments because, historically, most breaking changes were the result of a new parameter describing a new feature. Using named arguments means that, more often than not, you can migrate to new major versions without any changes to your code.

ILogger log = new LoggerConfiguration()
  .MinimumLevel.Verbose()
  .WriteTo.Http(requestUri: "https://www.mylogs.com", queueLimitBytes: null)
  .CreateLogger();

log.Information("Logging {@Heartbeat} from {Computer}", heartbeat, computer);

Used in conjunction with Serilog.Settings.Configuration, the same sink can be configured in the following way:

{
  "Serilog": {
    "MinimumLevel": "Verbose",
    "WriteTo": [
      {
        "Name": "Http",
        "Args": {
          "requestUri": "https://www.mylogs.com",
          "queueLimitBytes": null
        }
      }
    ]
  }
}
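
For completeness, here is a minimal sketch of loading that JSON and handing it to Serilog. The file name appsettings.json is an assumption; use whatever configuration source your application already has.

using Microsoft.Extensions.Configuration;
using Serilog;

// Requires the Microsoft.Extensions.Configuration.Json and Serilog.Settings.Configuration packages
var configuration = new ConfigurationBuilder()
  .AddJsonFile("appsettings.json")
  .Build();

// The "Serilog" section of the configuration drives the sink setup shown above
ILogger log = new LoggerConfiguration()
  .ReadFrom.Configuration(configuration)
  .CreateLogger();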

The sink can also be configured to be durable, i.e. log events are persisted on disk before being sent over the network, which protects them against data loss after a system or process restart. For more information, please read the wiki.
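
As a hedged sketch of the durable mode, the snippet below uses one of the sink's durable variants; verify the method name DurableHttpUsingFileSizeRolledBuffers and its parameter names against the wiki for the version you have installed.

using Serilog;

ILogger log = new LoggerConfiguration()
  .MinimumLevel.Verbose()
  .WriteTo.DurableHttpUsingFileSizeRolledBuffers(
    requestUri: "https://www.mylogs.com",
    bufferBaseFileName: "Buffer")  // log events are buffered in rolling files on disk before being sent
  .CreateLogger();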

The sink batches multiple log events into a single request. The following hypothetical payload is sent over the network as JSON.

[
  {
    "Timestamp": "2016-11-03T00:09:11.4899425+01:00",
    "Level": "Information",
    "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
    "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
    "Properties": {
      "Heartbeat": {
        "UserName": "Mike",
        "UserDomainName": "Home"
      },
      "Computer": "Workstation"
    }
  },
  {
    "Timestamp": "2016-11-03T00:09:12.4905685+01:00",
    "Level": "Information",
    "MessageTemplate": "Logging {@Heartbeat} from {Computer}",
    "RenderedMessage": "Logging { UserName: \"Mike\", UserDomainName: \"Home\" } from \"Workstation\"",
    "Properties": {
      "Heartbeat": {
        "UserName": "Mike",
        "UserDomainName": "Home"
      },
      "Computer": "Workstation"
    }
  }
]
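
The batching behavior itself can be tuned when configuring the sink. The parameter names below, logEventsInBatchLimit and period, are assumptions based on recent versions of the sink; check the overloads of WriteTo.Http in the version you have installed.

using System;
using Serilog;

ILogger log = new LoggerConfiguration()
  .WriteTo.Http(
    requestUri: "https://www.mylogs.com",
    queueLimitBytes: null,
    logEventsInBatchLimit: 1000,        // assumed parameter: upper bound on events per HTTP request
    period: TimeSpan.FromSeconds(2))    // assumed parameter: how often a batch is sent
  .CreateLogger();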

Typical use cases

Producing log events is only half the story. Unless you are consuming them in a manner that benefits you in development or production, there is really no need to produce them in the first place.

Integration with the Elastic Stack (formerly known as ELK, an acronym for Elasticsearch, Logstash and Kibana) is powerful beyond belief, but there are many alternatives for getting the log events into Elasticsearch.

Send log events from Docker containers

A common solution, given your application is running in Docker containers, is to have stdout (standard output) and stderr (standard error) passed on to the Elastic Stack. There is a multitude of ways to accomplish this, but one using Logspout is linked in the Sample applications chapter.

Send log events to Elasticsearch

The log events can be sent directly to Elasticsearch using Serilog.Sinks.Elasticsearch. In this case you've solved your problem without using this sink, and all is well in the world.

Send log events to Logstash

If you would like to send the log events to Logstash for further processing instead of sending them directly to Elasticsearch, this sink in combination with the Logstash HTTP input plugin is the perfect match for you. It is a much better solution than having to install Filebeat on all your instances, mainly because it involves fewer moving parts.
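
For illustration, pointing the sink at a Logstash http input could look like the snippet below. The host and port are placeholders and must match whatever your Logstash pipeline actually listens on.

using Serilog;

ILogger log = new LoggerConfiguration()
  .MinimumLevel.Verbose()
  .WriteTo.Http(requestUri: "http://logstash.example.com:31311", queueLimitBytes: null)
  .CreateLogger();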

Sample applications

The following sample applications demonstrate the usage of this sink in various contexts:

The following sample application demonstrates how Serilog events from a Docker container end up in the Elastic Stack using Logspout, without using Serilog.Sinks.Http.

Install via NuGet

If you want to include the HTTP sink in your project, you can install it directly from NuGet.

To install the sink, run the following command in the Package Manager Console:

PM> Install-Package Serilog.Sinks.Http
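
Alternatively, using the .NET CLI:

> dotnet add package Serilog.Sinks.Http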

Contributors

The following users have made significant contributions to this project. Thank you so much!


JetBrains 🚇
C. Augusto Proiete 💵 💬 💻 🤔
Louis Haußknecht 💻 🤔 🐛
rob-somerville 💻 🤔 🐛
Kevin Petit 💻 🤔 🐛
aleksaradz 💻 🤔 🐛
michaeltdaniels 🤔
dusse1dorf 🤔
vaibhavepatel 🐛 💻 🤔
KalininAndreyVictorovich 🐛 💻
Sergios 🐛
Anton Smolkov 💻 🐛
Michael J Conrad 📖
Yuriy Sountsov 🤔
