Grok after CSV filter

Hi

I would be grateful if someone could help me extract a substring between two forward slashes using grok after the csv filter.
My CSV file looks like this:
GIS,/Dianon/CancerRegPHNMS,22876365,96589706,ID_Dianon_170708235043,2905,2017-07-09,01:05:10,Received,

Here is my Logstash conf file; all I need is to create a new field out of the MailboxName column.
input {
  file {
    path => "/apps/mft/ELK/MBXLogFiles/20170710.log"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Destination","MailboxName","BPId","MessageId","FileName","Bytes","Date","Time","Action"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.56.105:9200"]
    index => "mbxlog"
    document_type => "mbxlogdoctype"
  }
}

grok {
  match => ["MailboxName", "^/(?<whatever>[^/]+)/"]
}
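To see what this pattern captures, here is a small Python sketch of the same regex run against the MailboxName from the sample CSV line. Python spells named groups `(?P<name>...)` where grok's Oniguruma engine accepts `(?<name>...)`; the translation is an illustration, not part of the original answer.

```python
import re

# grok's ^/(?<whatever>[^/]+)/ in Python named-group syntax:
# capture everything between the first two forward slashes.
pattern = re.compile(r"^/(?P<whatever>[^/]+)/")

mailbox = "/Dianon/CancerRegPHNMS"  # MailboxName from the sample CSV line
m = pattern.search(mailbox)
print(m.group("whatever"))  # Dianon
```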

Thank you Magnus

What I didn't understand in your solution is "whatever"; is this the name of a new field to create?
grok {
  match => ["MailboxName", "^/(?<whatever>[^/]+)/"]
}

I have two CSV lines like the ones below, and I need to create a new field with the values ADP and Dianon. How can I achieve this with grok?
Internal,/ADP_WorkForce_IVR,22907388,96732247
Internal,/Dianon/LCLS/Orders,22907225,9673181,

What I didn't understand in your solution is "whatever"; is this the name of a new field to create?

Yes, that's the name of the field you want to create. See the grok filter documentation.

I have two CSV lines like the ones below, and I need to create a new field with the values ADP and Dianon. How can I achieve this with grok?
Internal,/ADP_WorkForce_IVR,22907388,96732247
Internal,/Dianon/LCLS/Orders,22907225,9673181,

So what exactly are the rules? Extract everything between the forward slashes but only up until the first underscore?

Hi Magnus,

oh ok.

Yes. In some CSV lines I need to extract the data between the first two forward slashes, and in other lines I need to extract the data after the first forward slash and before the underscore that follows it.

Please let me know how to achieve the above in the same grok expression.

Thanks
Somu

^/(?<whatever>[^/_]+)/
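A quick Python check of this pattern (with the named group translated to Python's `(?P<name>...)` syntax; this translation is mine, not part of the thread) suggests it extracts Dianon from the second sample line, but it cannot match mailbox names that have no second slash, because the trailing `/` in the pattern is still required:

```python
import re

# ^/(?<whatever>[^/_]+)/ in Python syntax: stop the capture at the first
# slash OR underscore, but still require a closing slash afterwards.
pattern = re.compile(r"^/(?P<whatever>[^/_]+)/")

print(pattern.search("/Dianon/LCLS/Orders").group("whatever"))  # Dianon
print(pattern.search("/ADP_WorkForce_IVR"))  # None - no slash after ADP
```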

Hi Magnus,

It has been running for a long time, and I have barely 5 lines in the CSV file. Here is the console output:

bin/logstash -f MBXLogstash20170709.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to /apps/mft/ELK/logstash/logs which is now configured via log4j2.properties
[2017-08-09T07:55:00,084][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.56.105:9200/]}}
[2017-08-09T07:55:00,098][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://192.168.56.105:9200/, :path=>"/"}
[2017-08-09T07:55:00,245][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x14200553>}
[2017-08-09T07:55:00,247][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-09T07:55:00,314][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-09T07:55:00,332][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x1b4a3154>]}
[2017-08-09T07:55:00,392][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-08-09T07:55:00,581][INFO ][logstash.pipeline ] Pipeline main started
[2017-08-09T07:55:00,653][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Here is the complete Logstash config:

$ cat MBXLogstash20170709.conf
input {
  file {
    path => "/apps/mft/ELK/MBXLogFiles/test.log"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Destination","MailboxName","BPId","MessageId","FileName","Bytes","Date","Time","Action"]
  }
  mutate {
    convert => { "Bytes" => "integer" }
  }
  grok {
    match => ["MailboxName", "^/(?<whatever>[^/]+)/"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.56.105:9200"]
    index => "mbx201707log"
    document_type => "mbx201707doctype"
  }
}

Could you please help with why it is not loading into Elasticsearch? My configuration seems to be correct.

I suspect it's tailing /apps/mft/ELK/MBXLogFiles/test.log and waiting for more data to be added to it. Read about sincedb in the file input documentation. Setting sincedb_path to /dev/null disables sincedb.

Perfect, Magnus. I added sincedb and now it is loading, but I have one error: it is not able to extract CMBP from the message below. Could you help?

"message" => "TP,/CMBP_PremiereGlobalServices,22875803,96589231,18805956550_2017_4600_0708_213524.DAT.tmp,79643,2017-07-09,00:02:34,Transmitted,",
"Date" => "2017-07-09",
"tags" => [
[0] "_grokparsefailure"
],

Here is my configuration with sincedb. As of now it is able to extract the data between the first two forward slashes, but not the data between the first forward slash and the underscore (_):

input {
  file {
    path => "/apps/mft/ELK/MBXLogFiles/test.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Destination","MailboxName","BPId","MessageId","FileName","Bytes","Date","Time","Action"]
  }
  grok {
    match => ["MailboxName", "^/(?<whatever>[^/_]+)/"]
  }
  mutate {
    convert => { "Bytes" => "integer" }
  }
}
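The _grokparsefailure on the CMBP line can be reproduced outside Logstash. A small Python check (my translation of the grok pattern into Python's `(?P<name>...)` named-group syntax) shows that the pattern still insists on a closing slash, which /CMBP_PremiereGlobalServices does not have:

```python
import re

# ^/(?<whatever>[^/_]+)/ from the config above, in Python syntax.
pattern = re.compile(r"^/(?P<whatever>[^/_]+)/")

# Works: a second slash follows the captured segment.
print(pattern.search("/Dianon/CancerRegPHNMS").group("whatever"))  # Dianon

# Fails: after "CMBP" comes an underscore and no second slash ever appears,
# so the required trailing "/" cannot match -> _grokparsefailure in Logstash.
print(pattern.search("/CMBP_PremiereGlobalServices"))  # None
```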

Magnus,

I am trying to experiment with the grok pattern at the URL below, but still no luck. If you have something in mind to fix it, let me know:
http://grokconstructor.appspot.com/do/match#result

Magnus,

It works for both now; I removed the extra forward slash at the end. Thanks a lot. Testing the grok pattern online at http://grokconstructor.appspot.com really helped me.

/(?<whatever>[^/_]*)
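As a sanity check, the final pattern without the trailing slash can be exercised in Python (named group rendered as `(?P<whatever>...)`; the translation is illustrative, not from the thread). It handles all three mailbox shapes seen above:

```python
import re

# /(?<whatever>[^/_]*) in Python syntax: capture after the first slash,
# stopping at the next slash or underscore; no closing slash required.
pattern = re.compile(r"/(?P<whatever>[^/_]*)")

print(pattern.search("/ADP_WorkForce_IVR").group("whatever"))            # ADP
print(pattern.search("/Dianon/LCLS/Orders").group("whatever"))           # Dianon
print(pattern.search("/CMBP_PremiereGlobalServices").group("whatever"))  # CMBP
```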

Thanks
Somu


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.