How to route logs to different indexes

Hi,

I am trying to route certain logs to index 1 and other logs to index 2. I know this is not best practice, but it would solve many problems for us.

One log looks like:
/opt/mod/a.log
The other looks like:
/opt/mod/b.log

In log type a.log I have a term called "apiMedTimeSent". Therefore, after looking at the link attached at the bottom, I tried the following configuration:

output {
  if "apiMedTimeSent" in (What to put here) {
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

But as you can see, I don't know how to continue from here. Also, is there a better way to do this?

Many thanks,
Tomer

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html

I am not sure you can use a conditional like that (what to put there) inside the output section.
Having said that, you can write a small piece of Ruby code in the filter section to fetch the file's "path" value, then set the log type based on that value.

filter {
  ruby {
    code => "
      if event.get('path') == '/opt/mod/a.log'
        event.set('type', 'a')
      else
        event.set('type', 'b')
      end
    "
  }
}

Now you can check the type in the output section:

if [type] == "a" {
  elasticsearch {

  }
} else {
  elasticsearch {

  }
}

in log type a.log I have a term called: "apiMedTimeSent"

What does this mean, exactly? That the string "apiMedTimeSent" occurs in the log message? If so:

if "apiMedTimeSent" in [message] {
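
A complete output section along these lines might look like this (the host and the index names are just examples, not taken from your setup):

output {
  if "apiMedTimeSent" in [message] {
    elasticsearch {
      hosts => ["192.168.1.116:9200"]
      index => "traffic-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["192.168.1.116:9200"]
      index => "other-%{+YYYY.MM.dd}"
    }
  }
}

Note that the conditional wraps the whole elasticsearch block; it can't go inside it.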

How do these logs end up in Logstash? A file input in Logstash? Filebeat? Would it be possible to classify them at the source? In other words, would it be an option to configure the input so that it tags the messages in a way that lets later stages in the pipeline know what to do with them?


Hi, they come in via Filebeat, but I don't know how to tag multiple files in Filebeat. Also, if I tag them, how can I send the differently tagged files to different indexes?

Hi, they come in via Filebeat, but I don't know how to tag multiple files in Filebeat.

https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#_tags
https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#configuration-fields
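
Roughly, tags is a per-prospector option rather than a per-path one, so each group of files needs its own prospector entry. A sketch (reusing the paths from earlier in the thread) might look like:

filebeat.prospectors:
- input_type: log
  paths:
    - /opt/mod/a.log
  tags: ["a"]
- input_type: log
  paths:
    - /opt/mod/b.log
  tags: ["b"]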

Also, if I tag them, how can I send the differently tagged files to different indexes?

You're already using conditionals to send different events to different indexes so I don't know what's unclear.

Hi,

Thanks for the help so far. Sorry for the late response; I've been stuck on a few issues.

Well I tried:

output {
  elasticsearch {
    hosts => ["192.168.1.116:9200"]
    manage_template => false
    if "apiMedTimeSent" in [message] {
        index => "%{[@metadata_traffic][beat]}-%{+YYYY.MM.dd}"
    }
    index => "%{[@metadata_else][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

but this gives me the following error in the Logstash logs:

[2017-06-27T11:11:05,012][ERROR][logstash.agent ] fetched an invalid config {:config=>"input {\n beats {\n port => 5044\n }\n}\n\nfilter {\n json{\n source => "message"\n }\n date {\n match => [ "msgSubmissionTime", "UNIX_MS" ]\n target => "msgSubmissionTime"\n }\n date {\n match => [ "msgDeliveryTime", "UNIX_MS" ]\n target => "msgDeliveryTime"\n }\n date {\n match => [ "eventTs", "UNIX_MS" ]\n target => "eventTs"\n }\n\n\n mutate {\n convert => { \n\t"concatenated" => "boolean" \n\t"msgLength" => "integer"\n }\n }\n\n}\n\n\noutput {\n elasticsearch {\n hosts => ["192.168.1.116:9200"]\n manage_template => false\n if "apiMedTimeSent" in [message] {\n index => "%{[@metadata_traffic][beat]}-%{+YYYY.MM.dd}"\n }\n index => "%{[@metadata_else][beat]}-%{+YYYY.MM.dd}"\n document_type => "%{[@metadata][type]}"\n }\n}\n\n\ninput {\n beats {\n port => 5044\n }\n}\n\nfilter {\n json{\n source => "message"\n }\n date {\n match => [ "msgSubmissionTime", "UNIX_MS" ]\n target => "msgSubmissionTime"\n }\n date {\n match => [ "msgDeliveryTime", "UNIX_MS" ]\n target => "msgDeliveryTime"\n }\n date {\n match => [ "eventTs", "UNIX_MS" ]\n target => "eventTs"\n }\n\n\n mutate {\n convert => { \n\t"concatenated" => "boolean" \n\t"msgLength" => "integer"\n }\n }\n\n}\n\n\noutput {\n elasticsearch {\n hosts => ["192.168.1.116:9200"]\n manage_template => false\n index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"\n document_type => "%{[@metadata][type]}"\n }\n}\n\n\n", :reason=>"Expected one of #, => at line 39, column 8 (byte 561) after output {\n elasticsearch {\n hosts => ["192.168.1.116:9200"]\n manage_template => false\n if "}

Any idea why this is?

I also tried to tag the different logs by doing the following:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /opt/modulo/log/smsc.full.*.log
    - /opt/modulo/smsc/cdr/*.cdr
    tags: ["cdr"]
    - /opt/modulo/smsc/cdr/traffic*
    tags: ["traffic"]
    #- c:\programdata\elasticsearch\logs\*

but this resulted in:

Exiting: error loading config file: yaml: line 23: did not find expected '-' indicator

You can't have conditionals inside the elasticsearch output. You need this:

if ... {
  elasticsearch {
    index => "x"
    ...
  }
} else {
  elasticsearch {
    index => "y"
    ...
  }
}
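
Putting that together with the Filebeat tags, a sketch of the full output section could be as follows (the host, tag names, and index names are illustrative; substitute your own):

output {
  if "traffic" in [tags] {
    elasticsearch {
      hosts => ["192.168.1.116:9200"]
      index => "traffic-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["192.168.1.116:9200"]
      index => "cdr-%{+YYYY.MM.dd}"
    }
  }
}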

Understood :slight_smile:
Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.