Log Processing - Grok
The grok filter parses unstructured log data into something queryable.
Log data:
2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)
Custom parse patterns added:
For module: (?<= - )(.+)(?=\[)
For loglevel: (?<=\[)(.+)(?=\])
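These regexes could also be kept in a patterns file and loaded through grok's patterns_dir option, so the main match expression can refer to them by name. A minimal sketch, assuming a ./patterns directory; the file name and the pattern names PYMODULE and PYLOGLEVEL are made up for illustration:

# ./patterns/python-log (hypothetical file): one pattern name plus its regex per line
PYMODULE (?<= - )(.+)(?=\[)
PYLOGLEVEL (?<=\[)(.+)(?=\])

A grok filter pointed at the directory with patterns_dir => ["./patterns"] can then use %{PYMODULE:module} and %{PYLOGLEVEL:loglevel} like any built-in pattern.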
Capture into a field:
%{TIMESTAMP_ISO8601:datetime}
where datetime is a field name.
Full capture pattern:
%{TIMESTAMP_ISO8601:datetime}%{SPACE}%{SPACE}-%{SPACE} (?<module>(?<= - )(.+)(?=\[))(\[)(?<loglevel>(.+)(?=\]))(\]: )%{GREEDYDATA:message}
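With the two custom regexes registered as named patterns (see the patterns_dir sketch above), an equivalent match expression could be written more readably. This variant is an assumption rather than something from the original notes; against the sample line it captures the same four fields:

%{TIMESTAMP_ISO8601:datetime}%{SPACE}-%{SPACE}%{PYMODULE:module}\[%{PYLOGLEVEL:loglevel}\]:%{SPACE}%{GREEDYDATA:message}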
Transformed data (1st log record):
{ "datetime": "2017-06-21 22:18:25,276", "module": "util.py", "loglevel": "DEBUG", "message": "2017-06-21 22:18:25,276 - util.py[DEBUG]: Reading from /proc/uptime (quiet=False)" }
Configuration settings:
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:datetime}%{SPACE}%{SPACE}-%{SPACE} (?<module>(?<= - )(.+)(?=\[))(\[)(?<loglevel>(.+)(?=\]))(\]: )%{GREEDYDATA:message}" }
  }
}
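To try this end to end, the grok filter can be wrapped in a small stdin-to-stdout pipeline. The following is an illustrative sketch, not the original configuration: overwrite makes the captured %{GREEDYDATA} replace the original message value instead of being added alongside it, and a date filter turns the datetime string into the event's @timestamp:

input { stdin { } }

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:datetime}%{SPACE}%{SPACE}-%{SPACE} (?<module>(?<= - )(.+)(?=\[))(\[)(?<loglevel>(.+)(?=\]))(\]: )%{GREEDYDATA:message}" }
    overwrite => ["message"]
  }
  date {
    # e.g. 2017-06-21 22:18:25,276
    match => ["datetime", "yyyy-MM-dd HH:mm:ss,SSS"]
  }
}

output { stdout { codec => rubydebug } }

Saved as something like grok-test.conf (the file name is arbitrary), it can be run with bin/logstash -f grok-test.conf and fed the sample log line on stdin.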
Invaluable tools:
grok debugger
Kibana Grok Debugger (as part of X-Pack)