Convert a field from string to ip type

Hello,

I am trying to load my text files into Elasticsearch using Logstash. Below is what my config file looks like. When I index the files, the src_ip and dst_ip fields end up as string type by default. Can you please tell me how I can convert the src_ip and dst_ip fields to the ip type in the filter?

Thanks!

input {
  file {
    path => "/root/Downloads/new_logs/*.txt"
    type => "txt_file"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => ["sig_name", "sig_sid", "triggertimestampString", "src_ip", "dst_ip", "ip_proto", "customer_name", "correlated_ids"]
    separator => "| "
  }
  mutate {
    convert => {
      "sig_sid" => "integer"
      "ip_proto" => "integer"
      "correlated_ids" => "boolean"
    }
  }
  date {
    match => [ "triggertimestampString", "YYYY-MM-dd HH:mm:ss.SSS-SS ", "YYYY-MM-dd HH:mm:ss.SS-SS ", "YYYY-MM-dd HH:mm:ss.S-SS " ]
    target => "ttimestamp"
  }
}
output {
  elasticsearch {
    action => "index"
    index => "testI"
    workers => 1
  }
  stdout {}
}

The conversion is not done within Logstash; it is done with an Elasticsearch mapping that you apply yourself.
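For example, a minimal manually applied mapping can be set when you create the index, before indexing anything into it (this is just a sketch that borrows the index and type names from your config; note that Elasticsearch index names must be lowercase, so testI would have to become testi):

PUT /testi
{
  "mappings": {
    "txt_file": {
      "properties": {
        "src_ip": { "type": "ip" },
        "dst_ip": { "type": "ip" }
      }
    }
  }
}

A mapping applied this way only covers that one index, though.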

This can be done within a template, so that the mapping is automatically applied to each new index created (that matches the pattern). For example, you'd put the lines:

"src_ip": { "type": "ip"},
"dst_ip": { "type": "ip"),

immediately following the line starting with @version in this example template. You would then need to use the template management directives in the elasticsearch output plugin to use your updated template.
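For illustration, the edited properties section of your copy of the template would then look roughly like this (a sketch assuming the stock Logstash template; the @version entry is the existing line, and the rest of the section is left untouched, indicated by the ellipsis):

"properties": {
  "@version": { "type": "string", "index": "not_analyzed" },
  "src_ip": { "type": "ip" },
  "dst_ip": { "type": "ip" },
  ...
}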

I understand that the conversion doesn't happen within Logstash, but I am new to Elasticsearch and I don't know of any other way of applying a mapping to Elasticsearch.

Also, can you please tell me more about templates? Do I have to create a template, or is one created every time I make a new index?

Thank you!

Once a template is in place, all indices matching the name pattern will get that template's mappings. There is one that ships with Logstash (the one linked above). I suggest copying that one and editing it with the lines I recommended.

Then, in your elasticsearch output block, add:

template => "/path/to/your/new/template.json"
template_overwrite => true

This will only work as-is if you are using the default logstash-YYYY.MM.dd naming pattern, because the stock template only matches index names starting with logstash-. (If you do not have a custom index => directive in your elasticsearch block, then you are using the default; if you keep a custom index name, also change the "template" pattern in your copy of the template so it matches that name.)
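Putting the pieces together, the whole output block might end up looking something like this (the hosts value is just a placeholder for your own cluster address):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    template => "/path/to/your/new/template.json"
    template_overwrite => true
  }
  stdout {}
}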

I have been searching for this; thanks for the info, but unfortunately for new people it is not very useful on its own, so I am sharing the whole code to help :slight_smile: (I used the default template on Elasticsearch 5.1.2 as a starting point.)

sec_on-* is the index name pattern.

ID_RESP_H and ID_ORIG_H are my source and destination IP fields.

# Run the below to create the template
PUT /_template/sec_on
{
  "template": "sec_on-*",
  "order": 1,
  "settings": {
    "index": {
      "refresh_interval": "5s"
    }
  },
  "mappings": {
    "_default_": {
      "_all": {
        "norms": false,
        "enabled": true
      },
      "properties": {
        "ID_RESP_H": { "type": "ip" },
        "ID_ORIG_H": { "type": "ip" }
      }
    }
  }
}
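Once the template is in, you can check that it was stored and that newly created indices actually pick up the ip mapping, for example:

# Show the stored template
GET /_template/sec_on

# Show the mappings of any sec_on-* indices created since
GET /sec_on-*/_mapping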