Logstash not sending to Elasticsearch

Hello. I have a Logstash configuration with a custom template. The template appears when I ask for it with "GET /_template/*". I followed this example: Logstash mapping
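For reference, this is how I check it (assuming Elasticsearch is on localhost:9200):

curl -XGET 'http://localhost:9200/_template/sigas-control.log?pretty'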

If I send the data to a file with the json codec, this is what I get:

{
  "@version": "1",
  "@timestamp": "2016-07-12T22:35:03.423Z",
  "date": "20160704",
  "time": "105522.383935",
  "thread": "13340",
  "uuid": "18d1ccf3-9d2d-4602-bdb7-52a84351a57d",
  "resource_id_module": "S2",
  "method": "tryDeliver",
  "file": "XMLdispatcher.cpp.532",
  "severity": "info",
  "interpreter": "InterpreterTimestamp",
  "execute_time": "0",
  "id_log": "sigas-control.log",
  "received_from": "zero"
}

My template looks like this:

{
  "template" : "sigas-control.log",
  "settings" : { "index.refresh_interval" : "60s" },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : false },
      "dynamic_templates" : [{
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : { "type" : "string", "index" : "not_analyzed" }
        }
      }, {
        "string_fields" : {
        "match" : "*",
        "match_mapping_type" : "string",
        "mapping" : { "type" : "string", "index" : "not_analyzed" }
        }
      }],
      "properties" : {
        "@timestamp" :        { "type" : "date", "format" : "dateOptionalTime" },
        "@version" :          { "type" : "integer", "index" : "not_analyzed" },
        "thread":             { "type" : "integer", "index" : "not_analyzed" },
        "uuid":               { "type" : "string", "index" : "not_analyzed"  },
        "resource_id_module": { "type" : "string", "index" : "not_analyzed"  },
        "file":               { "type" : "string", "index" : "not_analyzed"  },
        "severity":           { "type" : "string", "index" : "not_analyzed"  },
        "log_message":        { "type" : "string", "index" : "not_analyzed"  },
        "interpreter":        { "type" : "string", "index" : "not_analyzed"  },
        "execute_time":       { "type" : "integer", "index" : "not_analyzed" },
        "date":               { "type" : "date", "format" : "yyyyMMdd"},
        "time":               { "type" : "date", "format" : "HHmmss" },
        "id_log" :            { "type" : "string", "index" : "not_analyzed" },
        "received_from" :     { "type" : "string",  "index" : "not_analyzed" }
      }
    }
  }
}

What am I doing wrong?

What's the problem? Is Logstash not sending anything to ES? Or do the events end up with the wrong mappings? What does your Logstash configuration look like?

The mapping is created, but Logstash is not sending data to ES.

This is my Logstash conf:

input {
  file {
    path => "/home/spavez/Desktop/sigas.log"
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{DATE2:date}\|%{TIME2:time}\|%{NUMBER:thread}\|%{UUID2:uuid}-%{MODULERES:resource_id_module}\|%{WORD:method}\|%{JAVACLASS:file}\|%{LOGLEVEL:severity}\|%{WORD:interpreter} %{WORD:method2} %{NUMBER:execute_time}" }
    add_field => [ "id_log", "sigas-control.log" ]
    add_field => [ "received_from", "%{host}" ]
    remove_field => [ "method2", "host", "message", "path" ]
  }
}

output {
  if [id_log] == "sigas-control.log" {
    elasticsearch {
      index => "sigas-control.log"
      template => "/etc/logstash/sigas-control.log.json"
      template_name => "sigas-control.log"
      template_overwrite => true
    }
    stdout {
      codec => "json"
    }
    file {
      codec => "json"
      path => "/tmp/debug-filters.json"
    }
  }
}
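As an aside, syntax problems can be ruled out with Logstash's built-in config test (a sketch; the path assumes a standard Logstash 2.x package install):

/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/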

Is Logstash even reading anything from the file, or is it waiting for more data to be appended, i.e. is its sincedb position pointing to the end of the file?

It's waiting for more data to be appended. I need to add new lines to the file in order for logstash to do anything.
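While debugging, a common way to force a full re-read on every restart is to point sincedb at /dev/null. A sketch of the input, assuming nothing else changes:

input {
  file {
    path => "/home/spavez/Desktop/sigas.log"
    start_position => "beginning"
    # Don't persist the read position, so the whole file is re-read on each restart
    sincedb_path => "/dev/null"
  }
}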

So.... problem solved then?

No, because I can see in my temp file that events are being parsed but not going to Elasticsearch. I can read a line and send it to a file, but not to ES.

What do the events logged to debug-filters.json look like?

Like the event JSON I posted at the top; that is exactly what ends up in /tmp/debug-filters.json.

Okay, that's odd. Things I'd do:

  • Crank up Logstash's logging and look for clues (see the sketch after this list).
  • Make sure that it's not buffering in the elasticsearch output that's making you think Logstash isn't sending anything. Appending enough messages to fill the buffer should rule that out.
  • Maybe Logstash is sending to ES and you're just looking in the wrong place? Capturing the network traffic will tell for sure.
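A sketch of the first and third suggestions (the flags assume Logstash 2.x; eth0 is a placeholder for your network interface):

# Run Logstash with debug-level logging to surface errors from the elasticsearch output
/opt/logstash/bin/logstash --debug -f /etc/logstash/conf.d/

# Capture traffic to Elasticsearch (default HTTP port 9200) to confirm requests go out
tcpdump -i eth0 -A 'tcp port 9200'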

The thing is, when I don't use my template, the data appears in ES as new docs.

health status index             pri rep docs.count docs.deleted store.size pri.store.size
yellow open   sigas-control.log   5   1          2            0      6.2kb          6.2kb
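For reference, that listing comes from the cat indices API (assuming ES on localhost):

curl -XGET 'http://localhost:9200/_cat/indices/sigas-control.log?v'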
I know that the problem is in my template, but I don't know why. I'm new to ELK so there is a lot I don't know.


This is what I get in the Logstash log:

"reason"=>"failed to parse [time]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"105522.383935\" is malformed at \".383935\""}}}}, :level=>:warn}

Even though my grok pattern captures that value as a time, ES does not accept it as HHmmss because of the fractional seconds. I'm going to map that field as a plain string and see what happens.
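For the record, this is the one-line change in the template's properties, mapping time like the other string fields (a date format with a fractional-seconds pattern might also work, but I haven't verified that):

"time": { "type" : "string", "index" : "not_analyzed" },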

Edit:

That was the problem: ES didn't accept 105522.383935 as the HHmmss time format. Now the data is being sent to ES and I can access it via Kibana with my template format. Sorry for the trouble, I should have just started by debugging Logstash.

Thanks for your help @magnusbaeck .

Here Logstash is creating logs but not sending them to ES.
This is my log_config file:

input {
  tcp {
    port => 5022
    type => "syslog"
  }
  udp {
    port => 5022
    type => "syslog"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.1.103:9200"]
    user => "elastic"
    password => "changeme"
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout { codec => "rubydebug" }
}
Can anyone help me with this?

@honey, please start a new thread for your question.
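In the meantime, a quick sanity check is to confirm that Elasticsearch is reachable from the Logstash machine, using the host and credentials from your config:

curl -u elastic:changeme http://192.168.1.103:9200/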