Log Processing - Grok

  • The grok filter parses unstructured log data into structured, queryable fields

  • Log data:

    2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)
  • Custom parse patterns added:

    For module: (?<= - )(.+)(?=\[)
    For loglevel: (?<=\[)(.+)(?=\])
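  • To sanity-check these lookarounds outside Logstash, here is a minimal Python sketch (Python's re module stands in for grok's Oniguruma regex engine purely for illustration) applying both patterns to the sample log line:

    import re

    line = "2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)"

    # Lookbehind " - " and lookahead "[" isolate the module name.
    module = re.search(r"(?<= - )(.+)(?=\[)", line).group(0)

    # Lookbehind "[" and lookahead "]" isolate the log level.
    loglevel = re.search(r"(?<=\[)(.+)(?=\])", line).group(0)

    print(module, loglevel)  # util.py DEBUG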
  • Capture into a field: %{TIMESTAMP_ISO8601:datetime} where datetime is a field name

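  • A rough sketch of what that capture does: grok expands %{PATTERN:field} into a regex and stores the matched text under the field name, much like a named capture group (assumption: the simplified timestamp regex below is only a stand-in, not the real TIMESTAMP_ISO8601 definition):

    import re

    # Simplified stand-in for %{TIMESTAMP_ISO8601:datetime}; the matched text
    # ends up in a field called "datetime".
    datetime_re = re.compile(r"(?P<datetime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})")

    line = "2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)"
    print(datetime_re.match(line).group("datetime"))  # 2017-06-21 22:18:25,276
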
  • Full capture pattern:

    %{TIMESTAMP_ISO8601:datetime}%{SPACE}%{SPACE}-%{SPACE} (?<module>(?<= - )(.+)(?=\[))(\[)(?<loglevel>(.+)(?=\]))(\]: )%{GREEDYDATA:message}
  • Transformed data (1st log record):

    {
      "datetime": "2017-06-21 22:18:25,276",
      "module": "util.py",
      "loglevel": "DEBUG",
      "message": "2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)"
    }
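  • End-to-end check of the full pattern, as a rough Python equivalent (assumptions: TIMESTAMP_ISO8601 approximated by a hand-written timestamp regex, %{SPACE} by \s*, %{GREEDYDATA} by .*). Note that in this sketch message holds only the text captured by %{GREEDYDATA}, while the transformed record above keeps the full original line in message; grok's overwrite option controls whether a capture replaces an existing field:

    import re

    pattern = re.compile(
        r"(?P<datetime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})"  # timestamp
        r"\s*-\s*"                                                  # " - " separator
        r"(?P<module>[^\[]+)"                                       # e.g. util.py
        r"\[(?P<loglevel>[^\]]+)\]: "                               # e.g. [DEBUG]:
        r"(?P<message>.*)"                                          # remainder
    )

    line = "2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)"
    print(pattern.match(line).groupdict())
    # {'datetime': '2017-06-21 22:18:25,276', 'module': 'util.py',
    #  'loglevel': 'DEBUG', 'message': 'Reading from /proc/uptime (quiet=False)'}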
  • Configuration settings:

    filter {
      grok {
        match => {
          "message" => "%{TIMESTAMP_ISO8601:datetime}%{SPACE}%{SPACE}-%{SPACE} (?<module>(?<= - )(.+)(?=\[))(\[)(?<loglevel>(.+)(?=\]))(\]: )%{GREEDYDATA:message}"
        }
      }
    }
  • Invaluable tools:
