How to include multiple JSON fields when using JSON logging with SLF4J?


I'm working with Dropwizard 1.3.2, which logs through SLF4J over Logback. I'm writing logs for ingestion into Elasticsearch, so I thought I'd use JSON logging and build some Kibana dashboards. But I really want more than one JSON field per log message: if I'm recording a status update with ten fields, I'd ideally like to log the object and have its fields show up as top-level entries in the JSON log. I did get MDC working, but it is very clumsy and doesn't flatten objects.

That has turned out to be difficult! I have it logging in JSON, but I can't find a clean way to log multiple JSON fields per message. How can I do that?

Things I've done:

My Dropwizard configuration has this appender:

  appenders:
    - type: console
      target: stdout
      layout:
        type: json
        timestampFormat: "ISO_INSTANT"
        prettyPrint: false
        appendLineSeparator: true
        additionalFields:
          keyOne: "value one"
          keyTwo: "value two"
        flattenMdc: true

The additional fields show up, but their values appear to be fixed in the configuration file and can't change at runtime. There is a "customFieldNames" option, but no documentation on how to use it, and no matter what I put there I get a "no String-argument constructor/factory method to deserialize from String value" error. (The docs show an example value of "@timestamp" with no explanation, and even that triggers the error. They also show examples like "(requestTime:request_time, userAgent:user_agent)", but again it's undocumented, and everything similar I've tried produces the same error.)

I did get MDC to work, but it seems silly to put each item into MDC individually and then clear it again afterwards.
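For reference, the clumsy pattern I mean looks roughly like this (a sketch; the logger class and field names are invented for illustration):

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.slf4j.MDC;

public class StatusLogger {
    private static final Logger log = LoggerFactory.getLogger(StatusLogger.class);

    void logStatus(String jobId, String state, int itemsProcessed) {
        // Every field has to be put into MDC individually, as a String...
        MDC.put("jobId", jobId);
        MDC.put("state", state);
        MDC.put("itemsProcessed", Integer.toString(itemsProcessed));
        try {
            log.info("status update");
        } finally {
            // ...and cleared again, or it leaks into later log lines on this thread.
            MDC.remove("jobId");
            MDC.remove("state");
            MDC.remove("itemsProcessed");
        }
    }
}
```

With flattenMdc: true these do come out as top-level JSON fields, but everything is stringly-typed and the put/clear dance is pure boilerplate.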

And I can serialize an object and log it as nested JSON, but that also seems like a workaround.

All the answers I've seen on this are old - does anyone have any advice on how to do this nicely inside Dropwizard?

1 Answer

You can configure Logback explicitly in Dropwizard using a custom logger factory, set it up with logstash-logback-encoder, and have it write out through a JSON encoder.
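One way to hand logging over to your own Logback setup (a sketch, assuming Dropwizard's bundled external logging factory) is to disable Dropwizard's logging configuration in the YAML, so that a plain logback.xml on the classpath takes over:

```yaml
# config.yml: Dropwizard stops configuring Logback itself;
# logging is then driven entirely by logback.xml on the classpath.
logging:
  type: external
```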

The JSON encoder may look like this:

<included>

    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
        <providers>
            <pattern>
                <pattern>
                    {
                      "id": "%uniqueId",
                      "relative_ns": "#asLong{%nanoTime}",
                      "tse_ms": "#asLong{%tse}",
                      "start_ms": "#asLong{%startTime}",
                      "cpu": "%cpu",
                      "mem": "%mem",
                      "load": "%loadavg"
                    }
                </pattern>
            </pattern>
            <timestamp>
                <!-- UTC is the best server consistent timezone -->
                <timeZone>${encoders.json.timeZone}</timeZone>
                <pattern>${encoders.json.timestampPattern}</pattern>
            </timestamp>
            <version/>
            <message/>
            <loggerName/>
            <threadName/>
            <logLevel/>
            <logLevelValue/><!-- numeric value is useful for filtering >= -->
            <stackHash/>
            <mdc/>
            <logstashMarkers/>
            <arguments/>
            <provider class="com.tersesystems.logback.exceptionmapping.json.ExceptionArgumentsProvider">
                <fieldName>exception</fieldName>
            </provider>

            <stackTrace>
                <!--
                  https://github.com/logstash/logstash-logback-encoder#customizing-stack-traces
                -->
                <throwableConverter class="net.logstash.logback.stacktrace.ShortenedThrowableConverter">
                    <rootCauseFirst>${encoders.json.shortenedThrowableConverter.rootCauseFirst}</rootCauseFirst>
                    <inlineHash>${encoders.json.shortenedThrowableConverter.inlineHash}</inlineHash>
                </throwableConverter>
            </stackTrace>
        </providers>
    </encoder>
</included>

File on Github

and produce output like this:

{"id":"FfwJtsNHYSw6O0Qbm7EAAA","relative_ns":20921024,"tse_ms":1584163814965,"start_ms":null,"@timestamp":"2020-03-14T05:30:14.965Z","@version":"1","message":"Creating Pool for datasource 'logging'","logger_name":"play.api.db.HikariCPConnectionPool","thread_name":"play-dev-mode-akka.actor.default-dispatcher-7","level":"INFO","level_value":20000}
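With the `<arguments/>` and `<logstashMarkers/>` providers enabled in the encoder above, application code can then attach per-event fields using logstash-logback-encoder's structured arguments, and they come out as top-level JSON fields. A sketch (the class and field names here are invented):

```java
import static net.logstash.logback.argument.StructuredArguments.kv;
import static net.logstash.logback.marker.Markers.append;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class StatusUpdateLogger {
    private static final Logger log = LoggerFactory.getLogger(StatusUpdateLogger.class);

    void logUpdate(String jobId, String state, int itemsProcessed) {
        // Each kv() becomes a top-level JSON field via the <arguments/> provider;
        // the {} placeholders still render as key=value in plain-text layouts.
        log.info("status update {} {} {}",
                kv("jobId", jobId),
                kv("state", state),
                kv("itemsProcessed", itemsProcessed));

        // Markers work too, via the <logstashMarkers/> provider, without
        // touching the formatted message at all.
        log.info(append("jobId", jobId), "status update");
    }
}
```

There is also `StructuredArguments.fields(pojo)`, which writes each field of an object as a top-level JSON entry, which matches the original goal of flattening a ten-field status object without going through MDC.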