Logs not getting to Elasticsearch from Logstash

I have an Elasticsearch cluster with a Logstash server as the first point of entry. In this ecosystem, I have Filebeat running on several Windows servers. They send logs to Logstash, which then forwards them to Elasticsearch. I use Kibana to view the documents.

All of this is working correctly, with no issues.

I recently installed Filebeat on a Linux server to collect more application logs. I modified the Filebeat config and started the service. Using tcpdump, I can see traffic from the Linux server to the Logstash server on the correct port. However, I can't find the logs in Kibana.

How can I troubleshoot further? What could be the issue?

Thanks in advance.

Can you show us your configuration? Is the output conditional? Can you add output { stdout {} } and see if the events from Linux show up?

Below is my Logstash pipeline. I've modified a few things to keep it private. Where would I add the "output { stdout {} }" that you mentioned?

input {
    beats {
            port => 1234
            tags => ["logstash"]
    }
}
# Parse each log type below based on its tag.
filter {
    if "xx-log" in [tags] {
        grok {
            break_on_match => false
            tag_on_failure => [ ]
            match => { "message" => [
                     "(?<log_timestamp>[0-9]{1,4}/[0-9]{1,2}/[0-9]{1,2} [0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2}\.[0-9]{1,3})\s{1,}(?<ServerName>[^\s]+|[^\t]+)\s{1,}(?<UUID>[^\s]+|[^\t]+)\s{1,}\[(?<UserId>[0-9]+)[-](?<DeviceId>[0-9]+)\]\s{1,}\S(?<ThreadID>[0-9]{1,})\S\s{1,}(?<LogLevel>[^\s]+|[^\t]+)\s{1,}(?<Logger>\S+)(?<Message>.*)", #AW Logger 1 DONE
                     "\*\*\*(\s*)EXCEPTION(\s*)\*\*\*(\s|\n|\t|\r)*(?<aspect_exception>[^:]*):(\s*)?", #aspect_exception
                     "(\s|;)\s+LocationGroupID:\s+(?<aspect_location_group_id>-?\d+);(\s+)UserID", #aspect_location_group_id
                     "(\s)+Method:(\s+)(?<aspect_method>[^; ]*);(\s+)(Duration|Returns|Parameters|LocationGroupID)", #aspect_method
                     "(\s|;)Duration:(\s)+(?<aspect_duration>-?\d+)\s+(ms)", #aspect_method_duration
                     "RequestUri:\s+(https|http)\:\/\/(.*).(com|net|org)(?<aspect_method_param_api_url>.*); RequestMethod", #aspect_method_param_api_url
                     "(\s|;)Parameters:(\s)?(?<aspect_method_params>.*)(;)?", #aspect_method_params
                     "(\s|;)Platform=(\s*)(?<aspect_method_params_platform>[^:]*);", #aspect_method_param_platform
                     ", Platform:\s+(?<aspect_method_param_device_platform>[^,]*)\, (Version|\s+)", #aspect_method_param_device_platform
                     ",\s+Version:\s+(?<aspect_method_param_device_version>[^,]+)\, (\s+|IMEI)", #aspect_method_param_device_version
                     "systemApplication =\s+(?<aspect_method_param_system_application>.*),", #aspect_method_param_system_application
                     "(\s|;)Returns:(\s)+(?<aspect_method_returns>.*);\sDuration", #aspect_method_returns
                     "(\s|;)\s+UserID:\s+(?<aspect_user_id>-?\d+);(\s+)(UserName|Returns)", #aspect_user_id
                     "(\s|;)\s+UserName:\s+(?<aspect_user_name>.*);\s+(Parameters|Return)"
                     ] #aspect_user_name
            }
        }
    }
    if "iis-log" in [tags] {
        grok {
            match => { "message" => ["(?<timestamp>(\d){4}-(\d){1,2}-(\d){1,2}\s(\d){1,2}:(\d){1,2}:(\d){1,2})\s(?<Source_IP>(\d){1,3}.(\d){1,3}.(\d){1,3}.(\d){1,3})\s(?<Method>GET|POST|HEAD|PUT)\s(?<Url>/.+\w)\s(?:-)?(?:\s)?(?<Port>\d{2,5})\s-\s(?<Dest_IP>(\d){1,3}.(\d){1,3}.(\d){1,3}.(\d){1,3})\s(?<Agent>[^ ]+)\s(?<Referer>[^ ]+)\s(?<StatusCode>\d{3})\s(?<SubStatusCode>\d{1,3}?)\s(?<WinStatus>\d{1,2})\s(?<TimeTaken>\d+)"]
           }
        }
    }
    if "xx1-services-log" in [tags] {
        grok {
            match => { "message" => [
                "(?<log_timestamp>[0-9]{1,4}/[0-9]{1,2}/[0-9]{1,2} [0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2}\.[0-9]{1,3})%{SPACE}%{NOTSPACE:machinename}%{SPACE}%{NOTSPACE:ActivityId}%{SPACE}\[%{INT:UsertId}-%{INT:DeviceId}\]%{SPACE}\(%{INT:ThreadId}\)%{SPACE}%{LOGLEVEL:LogLevel}%{SPACE}%{NOTSPACE:Logger}%{SPACE}%{SPACE}%{NOTSPACE:Username}%{SPACE}%{NOTSPACE:AuthenticationType}%{SPACE}%{URIPATH:Url}%{SPACE}%{NOTSPACE:QueryString}%{SPACE}%{WORD:Method}%{SPACE}%{NUMBER:StatusCode}%{SPACE}%{NUMBER:TimeTaken:int}%{SPACE}%{URIPATH:BaseUrl}%{SPACE}"
                ,"(?<log_timestamp>[0-9]{1,4}/[0-9]{1,2}/[0-9]{1,2} [0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2}\.[0-9]{1,3})%{SPACE}%{NOTSPACE:machinename}%{SPACE}%{NOTSPACE:ActivityId}%{SPACE}\[%{INT:UsertId}-%{INT:DeviceId}\]%{SPACE}\(%{INT:ThreadId}\)%{SPACE}%{LOGLEVEL:LogLevel}%{SPACE}%{NOTSPACE:Logger}%{SPACE}%{GREEDYDATA:Message}"
                ]
            }
        }
    }
    if "xx-1234-log" in [tags] {
        grok {
            match => { "message" => ["(?<log_timestamp>[0-9]{1,4}-[0-9]{1,2}-[0-9]{1,2} [0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2}\,[0-9]{1,3})\s(?<LogLevel>[a-zA-Z]{1,})\s\(pool-(?<ProcessId>[^\-]+)-thread-(?<ThreadId>[^\)]+)\)\s\[(?<Function>[^\]]+)\]\s\-\s(?<Message>.*)"]
            }
        }
    }
    date {
       match => [ "log_timestamp","yyy2y-MM-dd'T'HH:mm:ss.SSSZ", "yyyy/MM/dd HH:mm:ss.SSS", "HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss", "MM/dd/yyy HH:mm:ss.SSS", "yyyy-MM-dd HH:mm:ss,SSS" ]
       timezone => "GMT"
       target => ["log_timestamp"]
       }
    mutate {
       gsub => [
         "message", "\r\n", " ",
         "message", "\n", " ",
         "message", "\t", " "
       ]
       remove_field => ["beat","offset","input_type","_grokparsefailure"]
       rename => { "source" => "path" }
       copy => { "ServerName" => "host" }
      }
      if "beats_input_codec_plain_applied" in [tags] {
            mutate {
                    remove_tag => ["beats_input_codec_plain_applied"]
            }
    }
}
output {
        elasticsearch {
            hosts => [""]
            index => "applogs-%{+YYYY.MM.dd}"
            user => ""
            password => ""
            ssl => true
            cacert => ""
        }
}

The elasticsearch output is unconditional, and you never drop {} events, so they should be reaching Elasticsearch unless the output is hitting errors. Is the Linux box in the same timezone? Is it possible the date is not matching, so you are just looking in the wrong time window in Kibana?

You could add

output { stdout {} }

to the end of your configuration file, and Logstash will dump the events to stdout as it processes them.

For a more readable dump, use the rubydebug codec:

stdout { codec => rubydebug }

Thank you for the info. Just so I'm clear, would this replace the current output section, or would I add it after the current output section?

Add it after the current one.

Update:

I added the suggested output at the end of my pipeline config. I still don't see the logs in Kibana, but I now see some errors in the Logstash log. Please see below. Thanks for your help.
[2019-01-29T14:35:58,924][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logs-2019.01.29", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x3c8d0b9c>], :response=>{"index"=>{"_index"=>"logs-2019.01.29", "_type"=>"doc", "_id"=>"DEwcm2gB_KkFSMcobyzy", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:276"}}}}}

Is there an error in the Elasticsearch log at that time?

Thank you for the help. It turned out to be caused by running a mix of Filebeat versions: 6.0 on the other servers and 6.3 on this one. The host and host.name attributes sent by Filebeat 6.3 apparently collide somewhere between Logstash and Elasticsearch (hence the "failed to parse [host]" error above). The solution is either to add a line to this config that ignores host.name, or to go back and install the 6.0 version instead.
