Kibana Y-Axis problem with the SUM function

Hello,

I am analyzing a test log file. Example lines:

(the syntax is: date; script name; application name; data volume)

20170127174559;AC_hour-C-FWK-BMA-EDR-2-Zone-C;Eroe;10000
20170127164559;AC_hour-C-FWK-BMA-EDR-2-Zone-C;UCa;17000
20170127164559;AC_hour-C-FWK-BMA-EDR-2-Zone-C;RM;42000

I use this grok pattern to split the lines:

{ "message" => "%{DATA:date}[;]%{DATA:nom_compt}[;]%{DATA:Application}[;]%{NUMBER:volume}" }

Everything is OK, I can see the logs in Kibana Discover. I have refreshed the index pattern to save the new fields (volume, application, ...).

So, now I want to create a graph, but:

When I try to configure my Y-Axis, only Count is offered.
(I captured volume with the NUMBER pattern, though.)

But on the X-Axis, I do have my fields (as .keyword):


In the parameters, I see two volume fields. One is aggregatable, the other is not:

Can you help me or explain this?

Thank you

Hi !

Your volume field is a string; that's why you can't compute a sum.

I don't know why %{NUMBER:volume} comes out as a string... but you can try this:

filter {
   mutate {
      convert => { "volume" => "integer" }
   }
}
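As a side note (assuming standard Logstash grok behavior): grok stores every capture as a string by default, regardless of the pattern used, which is likely why %{NUMBER:volume} still gives a string. A sketch of an alternative, using grok's type-suffix syntax instead of a separate mutate:

```conf
filter {
  grok {
    # The :int suffix tells grok to store the capture as an integer
    # instead of the default string, so no mutate/convert is needed.
    match => { "message" => "%{DATA:date}[;]%{DATA:nom_compt}[;]%{DATA:Application}[;]%{NUMBER:volume:int}" }
  }
}
```

Either approach should work; the suffix just keeps the conversion in one place.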

You mean I should insert mutate in my filter, like this:

filter {
  if [type] == "q_compteur" {
    grok {
      match => { "message" => "%{DATA:date}[;]%{DATA:nom_compteur}[;]%{DATA:zone}[;]%{NUMBER:volume}" }
    }

    mutate {
      convert => { "volume" => "integer" }
    }

    date {
      match => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

I have a question: why not replace the NUMBER pattern directly with the INT pattern, like this:

{ "message" => "%{DATA:date}[;]%{DATA:nom_compt}[;]%{DATA:Application}[;]%{INT:volume}" }

?

Thank you very much for your help, sir.

I have applied the change, but nothing has changed.

volume still shows up as a string type.

I will test whether writing the INT pattern in the grok line works.

Have you refreshed your index pattern?

I think yes...

Try this to debug your grok: http://grokdebug.herokuapp.com/

OK, it's good. I don't know why, but my field has become a number!

(I believe Kibana can't change a field's type if it was declared as a string at the beginning.)

I created a new field with mutate (integer).

I have a question, please:

I want to upload old lines of my log file. These lines have dates in this style:

2017 01 01 17 45 59;[...] (YYYYMMDDHHmmss)
20170101184559;[..] (YYYYMMDDHHmmss)
20170101184559;[...] (YYYYMMDDHHmmss)

But when they are uploaded to Kibana, the current date is used instead (i.e. 07022017), so the resulting graph is wrong.

How can I make sure that Kibana saves them with the date contained in the line, so I can create a consistent graph from January, for example?

This has nothing to do with Kibana; it's about Logstash and correctly indexing your data.

For example, in your grok filter you capture the field date, but then in your date filter's match you are trying to use timestamp.

https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html

I suggest that you always check your data in the Discover tab in Kibana first: check that the fields have the correct types and that you are seeing the right values.

As mentioned above, the grok debugger can also be a great help.

But the date format that I use is YYYYMMDDHHmmss; does any pattern exist for that?

In my grok filter, I use this to cut the lines:

{ "message" => "%{DATA:date}[;]%{DATA:nom_compt}[;]%{DATA:Application}[;]%{NUMBER:volume}" }

So the date information uses the DATA pattern. After that, I want to index the old information with the correct date (my date format is YYYYMMDDHHmmss).

If I understand correctly, this is wrong, because I use timestamp in my date block:

date {
  match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
}

Will this be correct?

date {
  match => [ "date", "YYYYMMddHHmmss" ]
}

Thank you

That's correct... the first field in the match array is the field you would like to parse, and after it you can provide one or more formats.

You could also specify target in the date filter to choose which field to save the result to, but the default is already @timestamp, so that should be OK.
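For illustration, the full date filter with the target made explicit might look like this (target is shown only to make the default visible; it can be omitted):

```conf
date {
  match  => [ "date", "YYYYMMddHHmmss" ]
  target => "@timestamp"   # this is already the default
}
```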

I will go test my new conf, but this date problem seemed too easy, haha.

(I saw on a few forums that users declared their date format in a specific "patternpersonal" file, etc.)

I hope it succeeds!

OK, it's good, thank you; the date format now matches correctly.

I have a question. The lines in my text file are currently composed of 4 fields, but a few lines have only 3 fields.

With my current filter I declare 4 fields (date, compt, appli, volume), but where a line has only 3 fields (date, compt, volume), the volume field takes the value of the appli field.

How do I tell grok to handle a missing field?
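A common way to handle lines with a variable number of fields is to give grok an array of patterns; it tries them in order and uses the first one that matches. A sketch, assuming the 3-field lines really are date;compt;volume:

```conf
filter {
  grok {
    # grok tries each pattern in order; put the more specific
    # 4-field pattern first so it is preferred when it matches
    match => { "message" => [
      "%{DATA:date}[;]%{DATA:nom_compt}[;]%{DATA:Application}[;]%{NUMBER:volume:int}",
      "%{DATA:date}[;]%{DATA:nom_compt}[;]%{NUMBER:volume:int}"
    ] }
  }
}
```

On the 3-field lines the Application field will simply be absent from the event, which Elasticsearch handles fine.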

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.