I'm new to Elastic, so my question may be a 'stupid' one, but I hope it will help me understand the connection between Logstash and Elasticsearch.
In the end everything works fine, but... I would like to add to/change the field matching.
As I understand it, the part responsible for that is the filter section:
match => { "message" => "%{DATA:user} - %{UUID:uid} %{TIMESTAMP_ISO8601:timestamp} [%{DATA:information}] %{LOGLEVEL:loglevel} (%{JAVACLASS:java}) %{GREEDYDATA:log_message}" }
In my case it is as above. My understanding is that the line "match => ..." will assign each grok pattern (for example, UUID) to the corresponding field (uid). The whole decoded string is assigned to the field "message".
Then I want to add another field using add_field => [ "test", "%{host}" ], but this field is not visible in Kibana.
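For reference, here is a minimal sketch of how I believe the whole filter fits together (the surrounding grok block is my assumption; note that literal [ ] and ( ) would need to be escaped in a grok pattern, so they are escaped here):

filter {
  grok {
    match => { "message" => "%{DATA:user} - %{UUID:uid} %{TIMESTAMP_ISO8601:timestamp} \[%{DATA:information}\] %{LOGLEVEL:loglevel} \(%{JAVACLASS:java}\) %{GREEDYDATA:log_message}" }
    # the additional field from above
    add_field => [ "test", "%{host}" ]
  }
}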
So my question is: what am I missing? What should I look at?
Any hint or help will be appreciated.
In my case it is as above. My understanding is that the line "match => ..." will assign each grok pattern (for example, UUID) to the corresponding field (uid).
Yes.
The whole decoded string is assigned to the field "message".
The message field contains each line from the log file, yes.
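For example, with a hypothetical log line (all values made up) and assuming your pattern matches it, a parsed event would look roughly like this in rubydebug output:

{
        "message" => "alice - 9f8e7d6c-5b4a-4c2d-9e0f-a1b2c3d4e5f6 2016-05-10T12:34:56,789 [main] INFO (com.example.Main) Something happened",
       "@version" => "1",
     "@timestamp" => "2016-05-10T10:34:57.000Z",
           "host" => "myhost",
           "user" => "alice",
            "uid" => "9f8e7d6c-5b4a-4c2d-9e0f-a1b2c3d4e5f6",
      "timestamp" => "2016-05-10T12:34:56,789",
    "information" => "main",
       "loglevel" => "INFO",
           "java" => "com.example.Main",
    "log_message" => "Something happened",
           "test" => "myhost"
}

Note that message still holds the raw line, while each grok capture lands in its own field.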
Then I want to add another field using add_field => [ "test", "%{host}" ], but this field is not visible in Kibana.
Are you sure the grok filter is successful? Your event doesn't have a _grokparsefailure tag?
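Keep in mind that add_field inside the grok filter is only applied when the match succeeds, so a parse failure would also explain the missing field. As a sketch (this uses standard Logstash conditional syntax), you can route failed events somewhere visible:

output {
  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
}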
I strongly suggest that you use a stdout { codec => rubydebug } output as a debugging aid instead of sending events to Elasticsearch and looking at them via Kibana. There are many things that can go wrong along the way, and as a beginner you'll find it much harder to debug.
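In config form, that is simply:

output {
  stdout { codec => rubydebug }
}

Run Logstash in a terminal with this output and you'll see every event as it is emitted, including its tags and all its fields, without Elasticsearch or Kibana in the loop.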