Unable to send data from filebeat 7.7.0 > logstash 7.7.0 > elasticsearch 7.7.0

Hello,

I've got a server "A" that collects information from a local directory, and I'm trying to send this information to a server "B" with Filebeat.

My Logstash pipeline running on server B looks like:

input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    patterns_dir => [ "/etc/logstash/conf.d/patterns" ]
    match => { "message" => [ "%{DATA:DATE};%{WORD:ELD};%{INT:ETAT}" ] }
  }
  mutate {
    convert => { "ETAT" => "integer" }
  }
  date {
    match => [ "DATE", "yy-MM-dd HH:mm" ]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "disponibilite_production_mldx"
  }
  stdout { codec => rubydebug }
}

The data in the file that Filebeat collects looks like:

20-07-01 11:31;TAL;0

But when I run Filebeat on A, Logstash on B sees the logs but is not able to index them, and shows this error message:

[WARN ] 2020-07-01 11:35:13.271 [[main]>worker1] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"disponibilite_production_mldx", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x46f5f323>], :response=>{"index"=>{"_index"=>"disponibilite_production_mldx", "_type"=>"_doc", "_id"=>"T6i6CXMB_0id-bFc8URk", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [host] of type [text] in document with id 'T6i6CXMB_0id-bFc8URk'. Preview of field's value: '{hostname=lpansmld1, os={kernel=3.10.0-957.1.3.el7.x86_64, codename=Maipo, name=Red Hat Enterprise Linux Server, family=redhat, version=7.6 (Maipo), platform=rhel}, containerized=false, ip=[192.168.210.101], name=lpansmld1, id=a7213c3e390e4ed1bca684e5f5fe37f9, mac=[00:50:56:84:fd:b3], architecture=x86_64}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:9"}}}}}

The event, as shown by rubydebug, looks like:

{
          "host" => {
             "hostname" => "lpansmld1",
        "containerized" => false,
                   "os" => {
              "family" => "redhat",
            "platform" => "rhel",
              "kernel" => "3.10.0-957.1.3.el7.x86_64",
            "codename" => "Maipo",
                "name" => "Red Hat Enterprise Linux Server",
             "version" => "7.6 (Maipo)"
        },
                  "mac" => [
            [0] "00:50:56:84:fd:b3"
        ],
         "architecture" => "x86_64",
                   "id" => "a7213c3e390e4ed1bca684e5f5fe37f9",
                   "ip" => [
            [0] "192.168.210.101"
        ],
                 "name" => "lpansmld1"
    },
           "ELD" => "TAL",
          "ETAT" => 0,
         "input" => {
        "type" => "log"
    },
         "agent" => {
             "version" => "7.7.0",
            "hostname" => "lpansmld1",
                  "id" => "d4508889-9e51-4717-b04e-5616647d3273",
                "type" => "filebeat",
        "ephemeral_id" => "73272905-8b24-41ff-a8b0-881cc1091b62"
    },
           "ecs" => {
        "version" => "1.5.0"
    },
       "message" => "20-07-01 11:31;TAL;0",
      "@version" => "1",
          "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
           "log" => {
          "file" => {
            "path" => "/etc/ansible/roles/SCRIPTS/gestion_MLDV13/GESTION_PRODUCTION/files/LOG_GESTION_PRODUCTION_H24/DISPONIBILITE_20-07-01"
        },
        "offset" => 11130
    },
          "DATE" => "20-07-01 11:31",
    "@timestamp" => 2020-07-01T09:31:00.000Z
}

Is there something wrong? I don't understand.

Thanks

Your field "host" is mapped as text in your index, but you're trying to insert an object. You'll need to reindex your data into a new index with the correct mapping.

I didn't choose that.

Is it possible to drop all the info provided by Filebeat in my Logstash pipeline? I just want these 3 fields:

DATE
ELD
ETAT

=> which are in my data files.

I don't care about the rest of the info.

If you don't configure a mapping, Elasticsearch creates one automatically as soon as a field appears for the first time. When that happened, host was a string, not an object.
If you only want those few fields anyway, you can use a prune filter in Logstash to get rid of everything else, along the lines of the sketch below.
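
A minimal sketch, assuming the whitelist should keep only the parsed fields plus @timestamp (whitelist_names takes regular expressions, and prune only operates on top-level fields):

filter {
  # Keep only the grok-parsed fields and the event timestamp;
  # host, agent, ecs, input, log, tags, etc. are all dropped.
  prune {
    whitelist_names => [ "^DATE$", "^ELD$", "^ETAT$", "^@timestamp$" ]
  }
}

An equivalent, more explicit option is a mutate filter with remove_field listing the unwanted fields one by one.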
