Some of our field names contain dots (.), which is preventing us from upgrading from Elasticsearch 1.7.1 to 2.3.
Our plan is to reindex 433 indices, about 15.74 TB of data.
Here is a sample of our field names containing dots:
logs:gxc.xxxx, logs:gxc.x, logs:gxc.xxx
Can I use Logstash as follows to convert the dots into underscores?
Current Logstash version: 2.2.0
/opt/logstash/bin/plugin install logstash-filter-de_dot
/opt/logstash/bin/plugin update logstash-filter-elapsed
filter {
  ruby {
    code => "
      event.to_hash.keys.each { |k| event[k.gsub('.', '_')] = event.remove(k) if k.include?('.') }
    "
  }
}
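Since I already have the de_dot filter installed, I am also wondering whether using it directly would be simpler. As far as I can tell from the plugin docs it replaces dots in field names with underscores by default, so a minimal config would be something like this (the nested option is my guess for handling dotted sub-field names, I have not verified it):
filter {
  de_dot {
    # replaces "." in top-level field names with "_" by default
    # nested => true   # possibly needed if sub-field names also contain dots
  }
}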
I installed the plugin (/opt/logstash/bin/plugin install logstash-filter-de_dot) and tried to reindex, but somehow the new index still has dots in the field names. I was expecting the dots to be replaced with underscores, but nothing happened. I am not sure what I am doing wrong.
Here is my Logstash configuration:
input {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "index-name-2016.04.07"
    size => 500
    scroll => "5m"
    docinfo => true
  }
}
filter {
  ruby {
    code => "
      event.to_hash.keys.each { |k| event[k.gsub('.', '_')] = event.remove(k) if k.include?('.') }
    "
  }
}
output {
  elasticsearch {
    hosts => "remote-cluster:9200"
    index => "index-name-2016.04.07"
  }
  stdout {
    codec => "dots"
  }
}
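For reference, this is roughly how I have been checking whether the dots are still there after a reindex: I pull the mapping from the remote cluster and look for field names that still contain dots (the index name is just the one from the config above):
curl -s 'http://remote-cluster:9200/index-name-2016.04.07/_mapping?pretty'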