How can I parse 2 log files into 2 indices in Kibana?

Hi ELK community,

I am an absolute newbie to Elasticsearch.
I am stuck creating 2 pipelines towards 2 indices for 2 log files (corresponding to 2 services).
My topology looks like this: Filebeat ==> Logstash ==> Elasticsearch ==> Kibana.
In filebeat.yml, I have a configuration for 2 files (of course, I have commented out the Elasticsearch output and uncommented the Logstash output):

filebeat.inputs:
- type: log
  paths:
    - /var/log/message
  fields: {log_type: syslog}
- type: log
  paths:
    - /var/log/apache2/access.log
  fields:
    apache: true
  fields_under_root: true

In the logstash-sample.conf file, I have 3 parts:

  • input
input {
  beats {
    port => 5044
  }
}
  • filter
filter {
  if ([fields][log_type] == "syslog") {
   grok {
    match => { "message" => ['%{WORD:month} %{NUMBER:date} %{NOTSPACE:hour} %{NOTSPACE:hostname} %{GREEDYDATA:rest}'] }
    }
  }

  else if {
   grok {
    match => { "message" => ['%{COMBINEDAPACHELOG:message}'] }
    }
  }
}
  • output
output {
  if ([fields][log_type] == "sys") {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "test"
    }
  }
  else if ([fields][log_type] == "apa") {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "apach"
    }
  }
}

Kindly help me figure out my mistake, and please share any ideas you have on how to approach my goal.
P/S: With a single pipeline, I can do this well.

Thanks in advance!

Additional information
I got an issue. It seems to come from the pipeline and Logstash configuration.
Now Logstash is continuously restarting by itself :frowning:

Aug 17 22:36:21 thien-virtual-machine logstash[11138]: [2021-08-17T22:36:21,781][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"(\", \"!\", '\"', \"'\", \"-\", [0-9], \"[\", [A-Za-z_], '/' at line 17, column 11 (byte 345) after filter {\n  if ([fields][log_type] == \"syslog\") {\n   grok {\n    match => { \"message\" => ['%{WORD:month} %{NUMBER:date} %{NOTSPACE:hour} %{NOTSPACE:hostname} %{GREEDYDATA:rest}'] }\n    }\n  }\n\n  else if ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:391:in `block in converge_state'"]}

This condition is the source of your error.

I don't know what it should be.
I have read some articles saying that "else if {" is the correct format.

According to the documentation, an if needs to be followed by an expression.

If you want to run the second grok plugin whenever [fields][log_type] does not equal "syslog", then just use else {
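For example, a minimal sketch of the filter block with a plain else branch, reusing the grok patterns from the original config (COMBINEDAPACHELOG is used without a capture name, since it is a composite pattern that already produces named fields):

filter {
  if [fields][log_type] == "syslog" {
    grok {
      match => { "message" => ['%{WORD:month} %{NUMBER:date} %{NOTSPACE:hour} %{NOTSPACE:hostname} %{GREEDYDATA:rest}'] }
    }
  }
  else {
    grok {
      match => { "message" => ['%{COMBINEDAPACHELOG}'] }
    }
  }
}

An else if would also work, as long as it is followed by an expression, e.g. else if [apache] { — note that because the Apache input in filebeat.yml sets fields_under_root: true, its flag is a top-level [apache] field rather than [fields][apache].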

Thanks for the documentation.
Let me try with else only.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.