Hi all,
I am new to Elasticsearch. Logstash is pulling the data from my cluster, but when I view it in Kibana the total filesystem size differs from what the actual box reports. Any suggestions on how to standardize the filesystem sizes in Logstash?
Hi,
welcome to elastic 
Could you provide some more details of your challenge? What is your Logstash setup, and which dashboard and metric are you using in Kibana?
Hi,
I have to identify the maximum file size used by different directories within HDFS. For that I am using Kibana (a vertical bar visualization), but the file sizes within HDFS use different units (MB, KB, GB, etc.). Will Kibana or Logstash convert those units? If so, how?
You should feed Elasticsearch consistent data.
If you store the sizes in Elasticsearch with mixed units, you could hack them back into a consistent form with Painless scripts at display time, but that is not nice.
Just do the conversion in Logstash, or get all sizes in the same unit at the source ... it will save you a lot of work.
Hi,
Thanks. How do I do that in Logstash? I have a file with sizes in MB, KB, and GB. How do I convert them all to MB and load them into Elasticsearch, so that every file size uses the same unit and Kibana shows me the sum of the file sizes in MB?
Well @sudi_2611, Google helps.
I found a good answer here in the forum: Convert strings with different data units (MB,GB,TB) to byte. Have a look, it may help.
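The conversion that thread describes can be sketched in plain Ruby (the language Logstash filters embed). This is a minimal illustration, not the exact code from the linked answer; the helper name `to_bytes` and the 1024-based unit table are assumptions:

```ruby
# Map of size-unit suffixes to their byte multipliers (binary, 1024-based).
UNITS = {
  "B"  => 1,
  "KB" => 1024,
  "MB" => 1024**2,
  "GB" => 1024**3,
  "TB" => 1024**4
}.freeze

# Convert a string like "1.5 GB" or "512 KB" to an integer byte count.
def to_bytes(size_str)
  value, unit = size_str.strip.split(/\s+/)
  (value.to_f * UNITS.fetch(unit.upcase)).round
end
```

With every value normalized to bytes on ingest, a single numeric field can be summed in Kibana and rendered with the built-in bytes format.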
Hi,
I have an fsimage snapshot from a Hadoop cluster. The CSV file contains file sizes in KB, MB, GB, etc. This is how I have configured Logstash:
filter {
  if [type] == "fsimage" {
    csv {
      separator => "|"
      columns => [ "HDFSPath", "replication", "ModificationTime", "AccessTime", "PreferredBlockSize", "BlocksCount", "FileSize", "NSQUOTA", "DSQUOTA", "permission", "user", "group" ]
      convert => {
        'replication' => 'integer'
        'PreferredBlockSize' => 'integer'
        'BlocksCount' => 'integer'
        'FileSize' => 'integer'
        'NSQUOTA' => 'integer'
        'DSQUOTA' => 'integer'
      }
    }
  }
}
In Kibana, under the index pattern, I have set the FileSize field to type number with format bytes. Because the source values use mixed units, I am not getting the correct total file size. So I want to convert the values with different units (KB, MB, GB) to bytes. How do I do that?
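One way to do that inside the pipeline above is a ruby filter that parses the unit suffix and writes a normalized byte count. A sketch, assuming FileSize arrives as a string like "1.5 GB" and that a new field name `FileSizeBytes` is acceptable (both the field name and the 1024-based multipliers are assumptions, not from the thread):

```
filter {
  if [type] == "fsimage" {
    ruby {
      code => '
        units = { "B" => 1, "KB" => 1024, "MB" => 1024**2,
                  "GB" => 1024**3, "TB" => 1024**4 }
        raw = event.get("FileSize").to_s
        value, unit = raw.strip.split(/\s+/)
        if value && unit
          # Store the normalized size in a separate numeric field.
          event.set("FileSizeBytes", (value.to_f * units.fetch(unit.upcase, 1)).round)
        end
      '
    }
  }
}
```

Note that the csv filter's `convert => { "FileSize" => "integer" }` would have to be dropped in that case, since `to_i` on a string like "1.5 GB" truncates at the first non-numeric character before the ruby filter ever sees the unit. In Kibana you would then point the index pattern's bytes format at FileSizeBytes.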
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.
© 2020. All Rights Reserved - Elasticsearch