For troubleshooting and getting insight into application problems, logs can be an invaluable resource. But collecting logs in a way that makes them useful is a challenge.
The main problem is that, without proper configuration, log management tools and services do not treat a multiline log (such as a stack trace) as a single event.
When related information is split across separate log messages instead of a single one, each line gets processed as its own event, which increases logging overhead and makes it difficult to interpret your applications' activity.
Luckily, there are a few things you can do about it.
➡️ JSON
By logging in JSON, you ensure your multiline logs are processed as a single event: newlines inside a field value are escaped, so an entire stack trace travels as one field of one log record. Of all the approaches, this is by far the simplest.
➡️ Log shippers
But there may be situations where you can't log in JSON: perhaps your environment uses a third-party logging tool you cannot configure to emit JSON, or you simply can't change your code or logging strategy.
In this case, you can configure a log shipper to handle multiline logs by looking for specific patterns. For example, a shipper can look for timestamps, since a timestamp typically appears only at the start of a new log entry, never on the continuation lines of a stack trace.
There are plenty of log shippers to choose from. You can try Rsyslog, an open-source extension of syslog with enhanced configuration capabilities. Alternatively, Logstash can stitch multiline logs back together with its multiline codec, configured as part of an input in your pipeline.
Our go-to log processor and forwarder is Fluent Bit: it is lightweight, fast, and extensible, and it can parse multiline logs and route them to multiple destinations.
👋 Say hi at hello@devolut.io to learn how you can get the most out of Kubernetes.