
Logstash Custom Parsing

You can set up Logstash to do custom parsing of your logs and then send the output to Loggly. Logstash can parse logs using grok filters, which is useful if your log format is not one of our automatically parsed formats. Parsing lets you use advanced features like statistical analysis on value fields, faceted search, filters, and more.

1. Set Up Logstash

If you haven’t already, configure Logstash to output to Loggly.

2. Configure a Grok Filter

Grok is a library of expressions that make it easy to extract data from your logs. You can choose from hundreds of available grok patterns.
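For instance (a generic illustration, not a kernel log), the pattern below pulls an IP address, HTTP method, request path, and byte count out of a line like 55.3.244.1 GET /index.html 15824:

%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes}

Each %{PATTERN:field_name} pair matches one piece of the line and stores it under the given field name, which is what makes the value searchable and filterable once it reaches Loggly.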

Below, we take a sample line from the kern.log file, define a custom log format for the kernel logs, and use it in the grok filter.

Sample log line from kern.log file

Sep 24 08:12:51 ubuntu kernel: [ 0.000000] Calgary: Unable to locate Rio Grande table in EBDA - bailing!

Custom format used

%{CISCOTIMESTAMP:timestamp} %{HOST:host} %{WORD:program}%{NOTSPACE} %{NOTSPACE}%{NUMBER:duration}%{NOTSPACE} %{GREEDYDATA:kernel_logs}
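Before sending anything to Loggly, you can sanity-check the pattern locally. The configuration below is a minimal sketch (it assumes the standard stdin and stdout plugins and a file name of your choosing, here grok-test.conf): it reads lines typed on standard input, applies the same grok match, and prints the parsed event so you can confirm the field names come out as expected.

input {
   stdin { }
}
filter {
   grok {
      match => { "message" => "%{CISCOTIMESTAMP:timestamp} %{HOST:host} %{WORD:program}%{NOTSPACE} %{NOTSPACE}%{NUMBER:duration}%{NOTSPACE} %{GREEDYDATA:kernel_logs}" }
   }
}
output {
   # Print the full parsed event to the console for inspection.
   stdout { codec => rubydebug }
}

Run it with something like bin/logstash -f grok-test.conf, paste the sample kern.log line, and check that the printed event contains the expected fields and is not tagged with _grokparsefailure.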

Paste the following configuration into your Logstash configuration file, replacing TOKEN with your own customer token.

input {
   file {
      # Tail the kernel log; start_position => "beginning" reads the file
      # from the start the first time Logstash sees it.
      path => "/var/log/kern.log"
      start_position => "beginning"
   }
}
filter {
   grok {
      # Add a field identifying events shipped through Logstash
      # (applied when the pattern matches).
      add_field => { "source" => "logstash" }
      match => { "message" => "%{CISCOTIMESTAMP:timestamp} %{HOST:host} %{WORD:program}%{NOTSPACE} %{NOTSPACE}%{NUMBER:duration}%{NOTSPACE} %{GREEDYDATA:kernel_logs}" }
   }
}
output {
   # Send the parsed events to Loggly over HTTP.
   loggly {
      host => "logs-01.loggly.com"
      key => "TOKEN"
      proto => "http"
   }
}

Replace TOKEN in the output section with your Loggly customer token.
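If you want the events encrypted in transit, or tagged so they are easier to find, the loggly output accepts a few more options. The variant below is a sketch based on the logstash-output-loggly plugin's proto and tag options; check the plugin version you have installed, since option names and defaults can differ.

output {
   loggly {
      host => "logs-01.loggly.com"
      key => "TOKEN"
      # Ship over HTTPS instead of plain HTTP.
      proto => "https"
      # Tag events so they can be filtered easily in Loggly.
      tag => "logstash-kern"
   }
}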

3. Verify Events

Search your Loggly account for recent events from this source and confirm they are parsed into the fields defined by the custom format above (timestamp, host, program, duration, and kernel_logs).
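Because the loggly output ships the whole event as JSON, the fields produced by the grok filter normally appear under the json namespace in Loggly search. Assuming default field handling on the Loggly side, a search like the one below should return the parsed kernel events:

json.source:"logstash" AND json.program:"kernel"

If the events show up but the custom fields do not, the grok match is likely failing; see the troubleshooting section below.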

Advanced Logstash Custom Parsing Options

Troubleshooting Logstash Custom Parser

If you don’t see any data show up in the verification step, then check for these common problems.

Check your Logstash configuration: make sure the path to your log file is correct, the grok pattern matches your log lines, and TOKEN has been replaced with your customer token, then restart Logstash after any changes.
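If events reach Loggly but the custom fields are missing, the grok match is probably failing; Logstash tags such events with _grokparsefailure. The sketch below (standard Logstash conditional syntax) prints failed events to the console so you can compare them against the pattern:

output {
   # Show events whose grok match failed so the pattern can be adjusted.
   if "_grokparsefailure" in [tags] {
      stdout { codec => rubydebug }
   }
   loggly {
      host => "logs-01.loggly.com"
      key => "TOKEN"
      proto => "http"
   }
}

You can also ask Logstash to validate the configuration file before starting it; depending on your version the flag is -t/--configtest or --config.test_and_exit.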

Still Not Working?
