Additional fields when inserting into Elasticsearch

Hello, I am very new to Logstash. I want to import the entries in a file into Elasticsearch.

First, I created this index template in Elasticsearch:

{
  "mytype-template": {
    "template": "mytype-*",
    "settings": {
      "refresh_interval": "60s"
    },
    "mappings": {
      "mytype": {
        "_source": {
          "enabled": true
        },
        "_all": {
          "enabled": true
        },
        "properties": {
          "timestamp": {
            "type": "date",
            "format": "dateTime",
            "doc_values": true
          },
          "state": {
            "type": "string",
            "index": "not_analyzed",
            "doc_values": true
          },
          "msisdn": {
            "type": "string",
            "index": "not_analyzed"
          }
          *** other fields
        }
      }
    }
  }
}

Sample lines in the file:

20170808155336480|Success|tel:a tel number|other fields...
20170808155416380|Success|tel:a tel number|other fields...

I want to import the contents of the file into Elasticsearch after some transformation, so I created the config file below:

input {
  file {
    type => "mytype"
    path => "/path/*.edr"
    start_position => "beginning"
  }
}
filter {
  if [type] == "mytype" {
    csv {
      separator => "|"
      columns => ["timestamp","state","msisdn"]
    }
    date {
      match => [ "timestamp", "yyyyMMddHHmmssSSS" ]
      target => "timestamp"
    }
  }
}
output {
  if [type] == "mytype" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "mytype-write"
      document_type => "mytype"
    }
    stdout { codec => json }
  }
}

Logstash is able to import the entries into Elasticsearch, but when I query them I see that there are additional fields such as "path", "type", "@version", "@timestamp", and "message". Also, in the Elasticsearch logs I see this line:

[2017-08-11T11:43:23,647][INFO ][o.e.c.m.MetaDataMappingService] [r5XrGDs] [mytype-2017.08/EPv0T8MzR0CL0tTuhSlFSQ] update_mapping [mytype]

I don't want my mappings to be updated. Is there a way to prevent the additional fields from being imported?

Note: I tried to modify the config with the mutate filter below (after the date filter), but this time nothing happened; no data was imported.

mutate {
  remove_field => ["path", "@version", "host", "message", "type", "@timestamp"]
}

Any help would be appreciated.
Thanks

I don't want my mappings to be updated. Is there a way to prevent the additional fields from being imported?

Yes, you can disable dynamic mappings so that all fields must be explicitly defined.
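For example, a minimal sketch of the template above with dynamic mapping turned off (only the relevant part is shown; with "strict", indexing a document that contains an undeclared field fails, while false keeps undeclared fields in _source without adding them to the mapping):

{
  "mytype-template": {
    "template": "mytype-*",
    "mappings": {
      "mytype": {
        "dynamic": "strict",
        "properties": {
          ...
        }
      }
    }
  }
}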

You can also use the prune filter to delete all fields except a certain set of fields.
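A rough sketch of that approach, using the field names from your config (whitelist_names takes an array of regular expressions; any field not matching one of them is removed):

filter {
  prune {
    # keep only the mapped fields plus "type", which the output conditional checks
    whitelist_names => ["^timestamp$", "^state$", "^msisdn$", "^type$"]
  }
}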

Note: I tried to modify the config with the mutate filter below (after the date filter), but this time nothing happened; no data was imported.

Did you clear the sincedb file before running Logstash again?
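If you keep re-reading the same test files, one option for testing only (a sketch assuming a *nix system) is to stop the file input from persisting its read position at all:

input {
  file {
    type => "mytype"
    path => "/path/*.edr"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # forget read positions between runs; testing only
  }
}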

Hi Magnus, thanks for the quick response. I didn't clear the sincedb, but I was creating new input files with different names.

Also, after your comment I deleted the sincedb files under LOGSTASH_PATH/data/plugins/inputs/file, created new input files, and restarted Logstash, but nothing was imported into Elasticsearch.

It is very strange, but if I let "type" be imported (using the mutate filter below), Logstash is able to import the data into Elasticsearch. As seen below, "type" is not removed:

mutate {
  remove_field => ["path", "@version", "host", "message", "@timestamp"]
}

But since "type" is allowed, it updates my mapping in Elasticsearch. I tried the same thing with the prune filter, but it didn't work either.

Is there a way to handle this?
