Create multiple indexes

I am trying to send two JSON logs from server A to my Logstash server. Previously, when we sent only one JSON log file from server A to the Logstash server,
it was possible to create a data view and view the log in Discover. But since sending two JSON log files, it seems I am not receiving anything anymore.
Below is my Filebeat config, followed by my Logstash config.
Does anyone have an idea what I might be doing wrong? I expect to get two different indexes.

Elastic version 8.1

 input {
   beats {
     port => 5044
   }
}

filter {
   if [document_type] == "test.json" { # Check the "document_type" field
     json {
       source => "message"
     }
     mutate {
       remove_field => ["sequence", "loggerClassName", "ndc", "processName", "processId"]
     }
   }
   else if [document_type] == "abc.json" { # Check the "document_type" field
     json {
       source => "message"
     }
     mutate {
       remove_field => ["sequence", "loggerClassName", "ndc", "processName", "processId"]
     }
   }
}

output {
   if [document_type] == "test.json" { # Check the "document_type" field
     elasticsearch {
       hosts => ["XXX.XXX.XXX.XXX:9200"]
       index => "test.logs-%{+YYYY.MM.dd}"
       user => "admin"
       password => "unknown"
     }
   }
   else if [document_type] == "abc.json" { # Check the "document_type" field
     elasticsearch {
       hosts => ["XXX.XXX.XXX.XXX:9200"]
       index => "abc.server-logs-%{+YYYY.MM.dd}"
       user => "admin"
       password => "unknown"
     }
   }
}

filebeat.conf.yaml

filebeat.inputs:
  - type: log
    enabled: true
    paths:
     - /data/test.json
    multiline.pattern: '^{'
    multiline.negate: true
    multiline.match: after
    scan_frequency: 60000
    fields:
      type: "test.json" # Add a "type" field here
  - type: log
    enabled: true
    paths:
     - /data/abc.json
    multiline.pattern: '^{'
    multiline.negate: true
    multiline.match: after
    scan_frequency: 60000
    fields:
      type: "abc.json" # Add a "type" field here

processors:
- drop_fields:
    fields: ["sequence", "loggerClassName", "ndc", "processName", "processId"]

output.logstash:
  hosts: ["XXX.XXX.XXX.XXX:5044"]

Thanks in advance

Hello,

Please share a sample of your messages before Logstash and what the output looks like.

There are a couple of things that seem wrong, though.

First, you are not parsing the message coming from Beats; you need to parse it.

Add a json filter as your first filter.

filter {
    json {
        source => "message"
        remove_field => "message"
    }
}

Second, where is the document_type field coming from? In Filebeat you are adding a field named type; this will be added to your document as fields.type, not document_type.

So your conditionals should use [fields][type] in your output block.
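
For example, the output block could look like this (a sketch based on your posted config, keeping your hosts, credentials, and index names as-is):

output {
   if [fields][type] == "test.json" {
     elasticsearch {
       hosts => ["XXX.XXX.XXX.XXX:9200"]
       index => "test.logs-%{+YYYY.MM.dd}"
       user => "admin"
       password => "unknown"
     }
   }
   else if [fields][type] == "abc.json" {
     elasticsearch {
       hosts => ["XXX.XXX.XXX.XXX:9200"]
       index => "abc.server-logs-%{+YYYY.MM.dd}"
       user => "admin"
       password => "unknown"
     }
   }
}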

Also, from what you shared, you do not need those conditionals in your filter block. They won't work because the document_type field does not exist, and even if you change it to [fields][type] they still won't work, because the message has not been parsed yet at that point. Since the conditionals exist only to parse the message, you can simplify your filter block to just this:

filter {
    json {
        source => "message"
        remove_field => "message"
    }
    mutate {
        remove_field => ["sequence", "loggerClassName", "ndc", "processName", "processId"]
    }
}

As for the drop_fields processor in your Filebeat configuration, I don't think it does anything: you are not parsing your JSON document in Filebeat, so those fields do not exist there.
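
If you did want Filebeat itself to drop those fields, you would first have to parse the JSON there, for example with the decode_json_fields processor (a sketch, untested against your setup; with target "" the decoded keys are merged into the event root so drop_fields can see them):

processors:
  - decode_json_fields:
      fields: ["message"]
      target: ""
      overwrite_keys: true
  - drop_fields:
      fields: ["sequence", "loggerClassName", "ndc", "processName", "processId"]
      ignore_missing: true

But since you are already parsing and dropping fields in Logstash, it is simpler to leave this out of Filebeat entirely.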

Hello,
First of all, thanks for your reply.
Here are two examples. Mostly we get short, normal JSON messages, but sometimes they include a multiline stack trace:

{
    "timestamp": "2023-11-07T06:23:44.777+01:00",
    "sequence": 50441671,
    "loggerClassName": "logger",
    "loggerName": "test",
    "level": "DEBUG",
    "message": "hello",
    "threadName": "default task-1704",
    "threadId": 3876,
    "mdc": {
    },
    "ndc": "",
    "hostName": "google.com.just.named.something",
    "processName": "dunno",
    "processId": 1640
}
{
    "timestamp": "2023-11-07T06:23:44.777+01:00",
    "sequence": 50441671,
    "loggerClassName": "logger",
    "loggerName": "test",
    "level": "DEBUG",
    "message": "hello",
    "threadName": "default task-1704",
    "threadId": 3876,
    "mdc": {
    },
    "ndc": "",
    "hostName": "google.com.just.named.something",
    "processName": "dunno",
    "processId": 1640,
    "stackTrace": ": org. cxf.interceptor.Fault: Unmarshalling Error: 
	cvc-pattern-valid: Value 'wat' 
	is not facet-valid with respect to   multiline stacktrace much longer than this example
}

The document_type field I found on the internet. I got a warning in my Logstash logs, "`type` event field won't be used to determine the document _type {:es_version=>8}", and I didn't know what to do anymore, so I tried something. I had first used the type field to distinguish between the two log files in Logstash, but that didn't work.
I will try to make the changes tonight and will keep you posted.

This topic can be closed.

leandrojmp helped me enough with his comments; I got it working now. Thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.