mangeshs
(Mangesh Shinde)
January 31, 2022, 8:25am
1
I have a log line as follows:
file_name,file_id,size,file_owner,folder_id,folder_name,path_ids,folder_owner,deleted,date_modified,date_uploaded,folder_paths
nash+animal models.docx,884054786846,20068,xyz@123.com,149986394846,PRATIBHA TEST,/148290247289/148291354564/149986382846/,xyz@123.com,No,2021-02-12,2021-11-12,/Folder Movement Test/For Sagar/ResearchHubFiles/
So there is a field called path_ids, which looks like this:
/148290247289/148291354564/149986382846/
It can contain multiple values separated by /.
I want to extract these values into separate fields in Elasticsearch so I can aggregate on them, using a Logstash filter.
mangeshs
(Mangesh Shinde)
January 31, 2022, 10:15am
2
I have used the following ruby code, but I am getting an exception.
I am new to Ruby; is this code correct?
ruby {
  code => '
    ids = event["path_ids"].split("/")
    ids.each { |i| event["path_id#{ids[i]}"] = values[i]}
  '
}
Tomo_M
(Tomohiro Mitani)
January 31, 2022, 11:00am
3
mangeshs
(Mangesh Shinde)
January 31, 2022, 11:14am
4
I have used the csv filter to get the fields, but now I have a field path_ids with a value like /12345/54564/5466/.
I want each numeric path id in a separate field,
and the number of values is dynamic.
Tomo_M
(Tomohiro Mitani)
January 31, 2022, 11:50am
5
If the depth of the path is not fixed, the ruby filter could be better, as you said.
I have not tried it, but how about this?
ruby {
  code => '
    # event["field"] accessors were removed in Logstash 5+; use event.get / event.set.
    # reject(&:empty?) drops the empty element produced by the leading "/".
    ids = event.get("path_ids").split("/").reject(&:empty?)
    ids.each_with_index { |id, i|
      event.set("path_id#{i}", id)
    }
  '
}
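The split logic above can be checked in plain Ruby outside Logstash. A minimal sketch, using the example value from the original log line (note that split("/") on a string with a leading slash yields an empty first element, which is why it must be dropped before indexing):

```ruby
# Sketch of the path_ids split logic, standalone (no Logstash event object)
path_ids = "/148290247289/148291354564/149986382846/"

# split("/") returns ["", "148290247289", ...] because of the leading "/",
# so drop empty elements before assigning indexed field names
ids = path_ids.split("/").reject(&:empty?)

fields = {}
ids.each_with_index { |id, i| fields["path_id#{i}"] = id }

fields.each { |k, v| puts "#{k} => #{v}" }
# path_id0 => 148290247289
# path_id1 => 148291354564
# path_id2 => 149986382846
```

Each id lands in its own numbered field regardless of path depth, which is what makes this work when the number of values is dynamic.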
system
(system)
Closed
February 28, 2022, 11:50am
6
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.