Well, Logstash extracts borne and flow_type, so I think it extracts fields properly.
How do I check that it works correctly?
If Logstash works correctly, then yes, it's a problem with Kibana (or maybe with Elasticsearch?).
In Kibana I have the borne and flow_type fields, so why not the others?
Did you try refreshing the field list in Kibana?
In Kibana, go to Settings >> Indices.
Select the index you are having trouble with and use the yellow 'Refresh' button to refresh the field list.
I'll try a Logstash config with just the grok for this log and see if it works.
EDIT: with just this grok pattern, it works. I have all the fields.
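For reference, this is roughly the kind of isolated test config I mean; the grok pattern and field names below are placeholders, not my real one, and it just prints the parsed event so you can check that every expected field is extracted:

```
input {
  stdin { }
}

filter {
  grok {
    # Placeholder pattern: replace with the real DHCP pattern
    match => { "message" => "%{WORD:borne} %{WORD:flow_type} %{GREEDYDATA:details}" }
  }
}

output {
  # Print the full event so missing fields are easy to spot
  stdout { codec => rubydebug }
}
```

Run it with bin/logstash -f test.conf, paste a sample log line, and check that all the fields show up in the rubydebug output.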
Maybe another grok pattern matches first? (I checked my config file and I don't see anything obvious...)
EDIT 2: Here are the two config files I use; I hope it helps (the Cisco one is huge).
10-cisco.conf:
10-dhcp.conf:
EDIT 3: I FOUND THE ERROR.
In fact, grok patterns have an order of precedence: they are tried in order and the first match wins. I commented out 2 patterns and it works again.
Those 2 patterns had the same beginning pattern and nothing after it, so they matched first and the more specific pattern never ran.
Now I'll see how to get those 2 patterns back. I don't want any _grokparsefailure tags.
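If it's only the ordering, something like this should let me keep them: the patterns given to a single grok filter are tried in order and it stops at the first match (break_on_match defaults to true), so the more specific pattern has to come before the shorter prefix-only ones. The patterns below are illustrative, not my actual Cisco/DHCP ones:

```
filter {
  grok {
    match => {
      "message" => [
        # Most specific pattern first so it gets a chance to match
        "%{SYSLOGTIMESTAMP:timestamp} %{WORD:borne} %{WORD:flow_type} %{GREEDYDATA:details}",
        # Shorter prefix-only pattern last, as a fallback
        "%{SYSLOGTIMESTAMP:timestamp} %{WORD:borne}"
      ]
    }
  }
}
```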