Hello,
Here is the issue: my Logstash config stops sending events to Elasticsearch when I use the full filter below:
filter {
  json {
    source       => "message"
    remove_field => [ "message" ]
  }
  ruby {
    path          => '/etc/logstash/conf.d/splitData.rb'
    script_params => { field => "activeAlarms" target => "activeAlarms" }
  }
  ruby {
    code => '
      a = []
      event.get("alarmMetadata").each { |k, v|
        h = Hash.new
        h["alarmLevel"] = k
        h["alarmValue"] = v
        a << h
      }
      event.remove("alarm")
      event.set("alarm", a)
    '
  }
  split { field => "alarm" }
  split { field => "entities" }
  if "tele" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "tele" ] }
  }
  if "ntf" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "ntf" ] }
  }
  if "db-access" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "db-access" ] }
  }
  if "storage" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "storage" ] }
  }
  if "diag" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "diag" ] }
  }
  if "ops" in [entities][sdn] {
    mutate { add_field => [ "ServerType", "ops" ] }
  }
  mutate {
    rename => { "scope" => "Server" }
  }
  grok {
    match => { "[entities][sdn]" => "(?<VM_number>(vm)[0-9])" }
  }
  mutate {
    add_field => { "new_Objld" => "%{Server} %{ServerType} %{VM_number}" }
  }
  mutate {
    convert => { "ServerType" => "string" }
  }
}
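For reference, the JSON carried in "message" has roughly this shape (the field names come from the filter above; the values here are made up, not real data):

{
  "scope": "server-01",
  "activeAlarms": [ "alarm-1", "alarm-2" ],
  "alarmMetadata": { "critical": 3, "minor": 7 },
  "entities": [ { "sdn": "tele-vm1" }, { "sdn": "storage-vm2" } ]
}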
BUT, when I comment out the second ruby block, it does work. And when I build another filter with only the two ruby blocks and none of the rest, that filter works too. The only reason I can see is that I create too many splits and Logstash ends up running out of memory, but maybe you can find another explanation, or even a way to make this work.
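To make the scale concrete (the counts below are made up, just a rough Ruby sketch, not measured data): one input event goes through three splits in a row, so the number of event copies is the product of the three array sizes:

# Hypothetical sizes, only to show how the three splits multiply events.
active_alarms_count  = 100  # elements split by splitData.rb
alarm_metadata_count = 10   # keys turned into the "alarm" array, then split
entities_count       = 20   # elements split by split { field => "entities" }

copies_per_input = active_alarms_count * alarm_metadata_count * entities_count
puts copies_per_input  # => 20000 full event copies per incoming line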
I get this error:
Dumping heap to java_pid27510.hprof ...
Heap dump file created [3015426727 bytes in 109.318 secs]
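If it really is heap exhaustion, the only workaround I see so far (a sketch, assuming a standard install where config/jvm.options still has the shipped defaults) would be to raise the JVM heap there, e.g.:

# config/jvm.options (example values, to be adjusted to the machine)
-Xms4g
-Xmx4g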
PS: the script referenced in the first ruby block (splitData.rb) is:
def register(params)
  @field  = params['field']
  @target = params['target']
end

def filter(event)
  # Read the array, drop it from the original event, then emit one clone
  # of the whole event per array element, with the target field set to it.
  data = event.get(@field)
  event.remove(@field)
  a = []
  data.each { |x|
    e = event.clone
    e.set(@target, x)
    a << e
  }
  a
end
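As far as I can tell, this script does the same thing the built-in split filter would do on that field, so (if I'm reading it right) the first ruby block could be replaced by:

split {
  field => "activeAlarms"
}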
thanks in advance