Something wrong with multiline

Hello,

I'm trying to use a multiline pattern for my ELK system (Elasticsearch + Logstash + Kibana + Filebeat). Here is my filebeat.yml:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /testvar/test.log
  multiline:
    pattern: '^\['
    negate: true
    match: after

output.logstash:
  hosts: ["localhost:5044"]

setup.template.overwrite: true
setup.dashboards.overwrite: true
setup.template.enabled: true

setup.kibana:
  host: "localhost:5601"

This configuration seems to be the simplest possible, but this example is not combined into one log message:

[beat-logstash-some-name-832-2015.11.28] IndexNotFoundException[no such index]
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:566)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:133)
    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:77)
    at org.elasticsearch.action.admin.indices.delete.TransportDeleteIndexAction.checkBlock(TransportDeleteIndexAction.java:75)

Hi @g.myznikov.tinkoff,

In order to narrow down the scope of your problem, I just tested your configuration without Logstash in the mix. Doing that I was able to get multiline messages into Elasticsearch as expected.
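For reference, the grouping that your three multiline settings should produce can be sketched in a few lines of Python. This is only an illustration of the intended logic (not Filebeat's actual implementation): with negate: true and match: after, any line that does not match the pattern is appended to the event started by the previous matching line.

```python
import re

# pattern: '^\[' -- an event starts with a line beginning with '['
pattern = re.compile(r'^\[')

lines = [
    "[beat-logstash-some-name-832-2015.11.28] IndexNotFoundException[no such index]",
    "    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(...)",
    "    at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(...)",
]

events = []
for line in lines:
    if pattern.match(line) or not events:
        events.append(line)           # matching line -> starts a new event
    else:
        events[-1] += "\n" + line     # non-matching line -> appended to the previous event

print(len(events))  # prints 1 -- all three lines collapse into a single event
```

So with your sample stack trace, Filebeat should emit a single event, which is exactly what I saw when shipping directly to Elasticsearch.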

Do you mind testing without Logstash in the mix as well on your end and confirming? To do this, comment out:

# output.logstash
#   hosts: ["localhost:5044"]

And make sure that output.elasticsearch is configured correctly.
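For example, a minimal sketch (adjust the host and port to your setup):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]
```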


If this "solves" the issue, then could you post your Logstash pipeline configuration as well?

Thanks,

Shaunak

Thanks for the advice. I have just tested without Logstash and it worked fine, but it does not work with Logstash. Here is my pipeline.conf:

input {
  beats {
    port => [5044]
#    codec => multiline {
#      pattern => "^\["
#      what => "previous"
#    }
  }
}

}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logtime" }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "filebeat-testing"
  }
}

Are there any mistakes?

A few points:

  1. There seem to be a few syntax errors in the Logstash pipeline configuration you posted. I've cleaned them up and re-posted the configuration below. See my comments starting with ###
input {
  beats {
    port => [5044]
#    codec => multiline {
#      pattern => "^\["
#      what => "previous"
#    }
  }
}

# } ### You probably meant to delete this line?
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logtime}" } ### There was a } missing in the grok pattern
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "filebeat-testing"
  }
}

In the future, you can run ./bin/logstash -t -f /path/to/your/pipeline.conf to check the syntax of your pipeline configuration.

  2. I tried sending the sample multiline message using filebeat + the (syntactically valid) logstash pipeline configuration. I'm getting a grok parse failure error. So I would suggest commenting out the filter section until you've got the multiline parsing and ingestion working as expected.

  3. To easily test what documents will be indexed into Elasticsearch, without actually indexing them, you can temporarily comment out the elasticsearch output section and instead add a stdout output section that looks like this:

output {
  stdout {
    codec => rubydebug
  }
}

You can use this to debug any issues until you get the event structure just the way you want it. After that, you can remove (or comment out) the stdout output and put back the elasticsearch output.
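With a multiline event correctly assembled, the rubydebug output will look roughly like the following. The exact field set and values depend on your Filebeat and Logstash versions; the values here are placeholders:

```
{
       "message" => "[beat-logstash-some-name-832-2015.11.28] IndexNotFoundException[no such index]\n    at org.elasticsearch...",
    "@timestamp" => <event timestamp>,
      "@version" => "1",
          "tags" => [ "beats_input_codec_plain_applied" ]
}
```

The key thing to check is that the stack-trace lines appear inside a single message field rather than as separate events.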

Hope that helps,

Shaunak

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.