Logstash TCP encoder

The logs reach the Logstash server, but they are not indexed because they are not in the format I want.

How should I configure it?

output {

        elasticsearch {
                hosts => "elasticsearch:9200"
                manage_template => true
                index => "%{[fields][app]}"
        }
}

The appender is set up in code like this:

import net.logstash.logback.appender.LogstashTcpSocketAppender;

//LOGSTASH TCP
LogstashTcpSocketAppender logstashTcpSocketAppender = new LogstashTcpSocketAppender();
logstashTcpSocketAppender.setEncoder(new EncoderJSON());

import ch.qos.logback.classic.spi.ILoggingEvent;
import com.fasterxml.jackson.databind.ObjectMapper;
import net.logstash.logback.encoder.LogstashEncoder;

import java.nio.charset.StandardCharsets;
import java.text.SimpleDateFormat;
import java.util.*;

public class EncoderJSON extends LogstashEncoder {

    private final ObjectMapper objectMapper = new ObjectMapper();
    @Override
    public byte[] encode(ILoggingEvent event) {

        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");

        try {

            String timestamp = dateFormat.format(new Date(event.getTimeStamp()));
            String threadName = event.getThreadName();
            String level = event.getLevel().toString();
            String loggerName = event.getLoggerName();
            String requestId = String.valueOf(event.getMDCPropertyMap().get("requestId") != null ? event.getMDCPropertyMap().get("requestId") : "");
            String message = event.getFormattedMessage();
            String exception = event.getThrowableProxy() != null ? event.getThrowableProxy().toString() : "";

            Map<String, Object> logMap = new LinkedHashMap<>();
            logMap.put("timestamp", timestamp);
            logMap.put("level", level);
            logMap.put("logId", requestId);
            logMap.put("message", message);
            logMap.put("threadName", threadName);
            logMap.put("loggerName", loggerName);
            logMap.put("exception", exception);

            // NOTE: this map is built but never added to logMap,
            // so it never appears in the JSON sent to Logstash
            Map<String, String> fields = new HashMap<>();
            fields.put("app", "test");

            String jsonLog = objectMapper.writeValueAsString(logMap)  + "\n";

            return jsonLog.getBytes(StandardCharsets.UTF_8);
        } catch (Exception e) {
            return new byte[0];
        }
    }
}
Do you have any errors in Logstash? Logs not being indexed is an Elasticsearch issue that will also generate an error log in Logstash.

What does your Logstash pipeline look like? You only shared the output.

Also, what does your output look like?

Badly formatted index, after interpolation still contains placeholder: [%{[fields][app]}]

The reason it gives this error is that the EncoderJSON class did not do its job; otherwise it would have added the fields and Logstash would have been able to resolve the index pattern in the output.
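The missing piece described above is that the `fields` map is built but never attached to the event map, so `%{[fields][app]}` has nothing to interpolate. A minimal sketch of the intended map structure, using only plain Java collections (the class name `FieldsSketch` is just for illustration, not from the thread):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FieldsSketch {

    // Builds an event map with "fields" nested inside it, which is
    // what Logstash needs to resolve index => "%{[fields][app]}"
    public static Map<String, Object> buildLogMap() {
        Map<String, Object> logMap = new LinkedHashMap<>();
        logMap.put("message", "hello");

        Map<String, String> fields = new LinkedHashMap<>();
        fields.put("app", "test");

        // This is the step missing in the original encoder:
        logMap.put("fields", fields);
        return logMap;
    }

    public static void main(String[] args) {
        Map<String, Object> logMap = buildLogMap();
        Map<?, ?> fields = (Map<?, ?>) logMap.get("fields");
        System.out.println(fields.get("app")); // prints "test"
    }
}
```

Serializing this map with the existing `objectMapper` would produce a JSON document containing a nested `fields.app` key, which is what the `index` setting expects.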

How do I implement the LogstashTcpSocketAppender encoder?

And what is the issue with Logstash? This does not seem to be an issue with Logstash, but with the code you are using to send data to Logstash, which is out of the scope of this forum.

You didn't share anything else, like what your input is, what your filters are, and what output Logstash is generating.

Without that, it is not possible to know whether the issue is in your Logstash configuration or in your code.
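For context, a complete minimal pipeline of the kind being asked about might look like the sketch below. The tcp port and the `json_lines` codec are assumptions (they are not stated anywhere in the thread), but the output section matches the one shared above:

```
input {
  tcp {
    port => 5000
    codec => json_lines
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    manage_template => true
    index => "%{[fields][app]}"
  }
}
```

With a `json_lines` codec on the input, each newline-delimited JSON document from the appender is parsed into event fields, and `[fields][app]` only resolves if the incoming JSON actually contains a nested `fields.app` key.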

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.