Is it possible to choose which information to index in Elasticsearch?

Hello,

(Sorry in advance for my poor English, I'm French, and thanks in advance for reading my topic.)
I just started using the ELK stack this morning, and I was wondering if you could help me with my problem.

My logs follow this pattern:

20150703_112230|Source17|requests/lr_pro/source1/sitemap_lr_0000|subscriberNotFound|0|0|0.0292961597443 <-web_sophia118712,41,6,,20,,wgs84lambert2,,nbResLinking=15,crossLinking=1:2,jumptopagennn=1||| La burbanche|||01||||RUB_807649:|1|subscriber,exact||||||||

The only information I really need to index is from the beginning up to "<-", so:

20150703_112230|Source17|requests/lr_pro/source1/sitemap_lr_0000|subscriberNotFound|0|0|0.0292961597443

I would like to know whether it is possible to configure that in the filter section of my Logstash configuration file.

For now, I have:

filter {
  csv {
    separator => "|"
    columns => ["a", "b", "c", "d", "e", "f", ..., "z"]
  }
}

but all the fields are indexed.

I hope I have expressed my issue clearly.
Thanks.

You can e.g. use the mutate filter to delete the fields you don't care about.

filter {
  csv {
    ...
  }
  mutate {
    remove_field => ["h", "i", "j", "k", ..., "z"]
  }
}
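For what it's worth, since everything you want to drop comes after the "<-" marker, another option is to strip that part of the message before the csv filter runs, for example with mutate's gsub. This is just a sketch; the column names below are made-up placeholders for the seven fields in your sample line:

filter {
  # Remove everything from " <-" to the end of the line, so only the
  # first seven pipe-separated fields are left for the csv filter.
  mutate {
    gsub => ["message", " <-.*$", ""]
  }
  csv {
    separator => "|"
    # Placeholder names for the seven remaining fields.
    columns => ["timestamp", "source", "request", "status", "count1", "count2", "duration"]
  }
}

That way you don't have to list all of the trailing columns in remove_field.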

Thank you very much, sir!
That was perfect for me. :+1: