How to calculate the file size in Logstash

Hi,

I am new to Elasticsearch and I am trying to calculate HDFS space consumption via Kibana.
I have an fsimage exported as a CSV file from the cluster. The file size column is in bytes, and my Logstash config is as follows:

input {
  file {
    path => '/etc/logstash/scripts/fsimage.csv.*'
    start_position => "beginning"
    type => "fsimage"
  }
}

filter {
  if [type] == "fsimage" {
    csv {
      separator => "|"
      columns => [ "HDFSPath", "replication", "ModificationTime", "AccessTime", "PreferredBlockSize", "BlocksCount", "FileSize", "NSQUOTA", "DSQUOTA", "permission", "user", "group" ]
      convert => {
        'replication' => 'integer'
        'PreferredBlockSize' => 'integer'
        'BlocksCount' => 'integer'
        'FileSize' => 'integer'
        'NSQUOTA' => 'integer'
        'DSQUOTA' => 'integer'
      }
    }
  }
}
Example line from the CSV file:
XXXXXX|3|2016-12-3011:34|2016-12-3011:34|134217728|1|88807|0|0|-rw-r--r--|kbd_b9xf|hdfs

The problem is that FileSize (88807 bytes here) is counted only once, but in HDFS the total space consumed is FileSize * replication count (for the example line above that would be 88807 × 3 = 266421 bytes). How do I calculate FileSize * replication in the config above?

Hello sudi,

As I see it, you have two options:

The Logstash solution (do the multiplication in Logstash): use the ruby filter plugin, as recommended by @magnusbaeck on this topic.
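
A minimal sketch of what that ruby filter could look like (the field names match the csv filter above; "TotalSize" is just a name I picked for the computed field, and this assumes the Logstash 5.x event API with event.get/event.set):

ruby {
  code => "
    fs  = event.get('FileSize')
    rep = event.get('replication')
    # multiply the logical file size by the replication factor to get the space actually consumed on HDFS
    event.set('TotalSize', fs * rep) if fs && rep
  "
}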

Or (my preference), use a Kibana visualisation to do the operation; it is more flexible to index the raw data and manipulate it through Kibana (you may need the raw values in the future).
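
For example, a scripted field on the Kibana index pattern could do the multiplication at query time (Painless; this assumes FileSize and replication are mapped as numeric fields, and the scripted-field name is up to you):

doc['FileSize'].value * doc['replication'].value

You can then build your visualisation on that scripted field instead of FileSize.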

Hope this helps!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.