I have an application running inside a container on GCP, emitting logs that look like this:

{"severity":"INFO","message":"Http request served","timestamp":{"seconds":1712394651,"nanos":76426063},"httpRequest":{"requestMethod":"GET","requestUrl":"/health","requestSize":"0","responseSize":"0","userAgent":"curl/7.76.1","latency":"0","referer":""}}
I've tried the Docker gcplogs logging driver (straight to Cloud Logging), the Docker fluentd logging driver with the Ops Agent, and the Docker journald logging driver with the Ops Agent (the Docker-side flags are sketched after the config below). In the Ops Agent setups, the parse_json processor is always enabled:
logging:
  receivers:
    rec-v1:
      type: fluent_forward
      listen_port: 24224
    rec-v1-journald:
      type: systemd_journald
  processors:
    rec-v1-processor:
      type: parse_json
  service:
    pipelines:
      default_pipeline:
        receivers: []
      tokenz_backend_v1:
        receivers:
          - rec-v1
          - rec-v1-journald
        processors: [v1-processor]
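For context, the Docker side is wired to the receivers above roughly like this (per-container flags shown for illustration; the image name is a placeholder):

  # fluentd transport into the Ops Agent's fluent_forward receiver on 24224
  docker run --log-driver=fluentd --log-opt fluentd-address=localhost:24224 my-app-image

  # journald transport, picked up by the systemd_journald receiver
  docker run --log-driver=journald my-app-image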
But in every case the logs end up in Cloud Logging unparsed, as plain text.
With the fluentd transport, the line ends up in the log field inside jsonPayload (as described here: https://docs.docker.com/config/containers/logging/fluentd/); with the journald transport, it ends up in the MESSAGE field.
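Concretely, the resulting entries look roughly like this (values abbreviated, container names are placeholders):

  # via the fluentd driver / fluent_forward receiver
  jsonPayload:
    log: "{\"severity\":\"INFO\",\"message\":\"Http request served\",...}"
    container_id: "..."
    container_name: "/my-app"
    source: "stdout"

  # via the journald driver / systemd_journald receiver
  jsonPayload:
    MESSAGE: "{\"severity\":\"INFO\",\"message\":\"Http request served\",...}"
    CONTAINER_NAME: "my-app"
    PRIORITY: "6"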
How can I get these logs parsed correctly as JSON, so that fields like severity and httpRequest end up structured in the log entry?