Hi Guys,
I want to remove a field that starts with an integer. How can I do that using Logstash?
Example of the field:
"1. 2015.07.27 17": "25:35 message sent"
Regards,
Gabriel
Well, if you build a grok pattern for the message and define that field, you can just use mutate and drop to remove it.
Hi Mark,
I receive the message in JSON format ... and that field gets created every hour. I want to be able to remove it before it gets inserted into ES.
Regards,
Gabriel
OK, so take a look at the mutate filter with drop.
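For a field whose name is known ahead of time, a minimal sketch of that suggestion might look like this (the literal field name below is just an illustration taken from later in the thread; mutate needs the exact name):

```
filter {
  mutate {
    # Removes a field by its literal name; mutate cannot match
    # field names by regex, so the exact name must be known.
    remove_field => [ "1. 2015.07.28 09" ]
  }
}
```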
@Gabriel_Rosca so I understand, your event has a field foo
which sometimes starts with an int and sometimes does not. And you want to remove it when it starts with an int? Is that correct?
@suyograo No. That field always starts with "1." followed by the date and hour.
EX:
"1. 2015.07.28 08"
I need to remove this field from the message before it gets into ES.
Something like remove_field => [ "^1.*" ]
Not sure how I can do that. The mutate filter works only if I know the name of the field, and the field name is not static: every hour a new message creates a new field.
So now that it is 9 AM, when I get a new message a new field gets created, named "1. 2015.07.28 09"
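As an aside, for dynamic field names like this, Logstash's prune filter (a separate plugin that may need to be installed) can remove fields whose names match a regex; a hedged sketch:

```
filter {
  prune {
    # blacklist_names takes a list of regexes matched against field
    # names; any field whose name starts with "1." is removed.
    blacklist_names => [ "^1\." ]
  }
}
```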
Regards,
Gabriel
It's not terribly clear what's going on here. Can you provide a sample message? What is creating these fields? What's your configuration?
Hi Magnus,
here is my config:
filter {
  if [type] == "zabbix-alerts" {
    json {
      source => "message"
      remove_field => "message"
    }
    csv {
      source => "trigger_hostgroup_name"
      columns => [ "group1", "group2", "group3", "group4", "group5", "group6", "group7", "group8", "group9", "group10", "group11", "group12", "group13", "group14", "group15" ]
    }
    mutate {
      gsub => [
        "group2", "^ ", "",
        "group3", "^ ", "",
        "group4", "^ ", "",
        "group5", "^ ", "",
        "group6", "^ ", "",
        "group7", "^ ", "",
        "group8", "^ ", "",
        "group9", "^ ", "",
        "group10", "^ ", "",
        "group11", "^ ", "",
        "group12", "^ ", "",
        "group13", "^ ", "",
        "group14", "^ ", "",
        "group15", "^ ", ""
      ]
    }
  }
}
Here is one sample message that I received:
I just want to remove the first field, "1. 2015.07.28 09", which gets created every hour as "1." plus the date and hour:
{
"_index": "logstash-ptc-zabbix-alerts-2015.07.28",
"_type": "zabbix-alerts",
"_id": "AU7V5kheXErN6gNFz_bs",
"_score": null,
"_source": {
"@version": "1",
"@timestamp": "2015-07-28T18:21:55.667Z",
"host": "10.250.26.108",
"type": "zabbix-alerts",
"event_status": "PROBLEM",
"event_ip1": "10.250.15.208",
"event_time": "14:16:37",
"event_value": "1",
"event_age": "5m",
"inventory_url_b1": "",
"trigger_hostgroup_name": [
"10.250.15.0/24, 10.250.15.0/24 Windows Agent, Discovered hosts, PTC, PTC PROD, PTC PROD Windows, Windows Servers"
],
"inventory_os1": "Microsoft(R) Windows(R) Server 2003, Standard Edition",
"event_ack_status": "No",
"item_lastvalue1": "3.85",
"trigger_events_unack": "1489",
"item_id1": "35917",
"event_recovery_status": "OK",
"inventory_poc_secondary_cell1": "1",
"event_dns1": "NYVM0428",
"inventory_poc_secondary_phone_a1": "example.com",
"trigger_template_name": "UN Template OS Windows",
"item_name1": "Processor load (1 min average)",
"trigger_expression": "{NYVM0428:system.cpu.load[percpu,avg1].avg(5m)}>5",
"inventory_model1": "VMware Virtual Platform",
"action_id": "14",
"inventory_poc_secondary_name1": "1",
"problem ended": "2015.07.28 14:21:37",
"item_name_orig1": "Processor load (1 min average)",
"inventory_serialno_a1": "VMware-42 1f cb f0 57 fc d9 44-ef e2 4e 0e cd 76 94 98",
"inventory_poc_secondary_phone_b1": "1023.39 MB",
"name": "zabbix_data",
"event_conn1": "10.250.15.208",
"inventory_vendor1": "Phoenix Technologies LTD",
"event_name1": "NYVM0428",
"trigger_value": "0",
"1. 2015.07.28 14": "16:41 message sent Logstash nysv0654 "Logstash User logstash (Logstash User)"",
"event_port1": "10050",
"trigger_severity": "Average",
"trigger_name_orig": "Processor load is too high on {HOST.NAME}",
"event_host1": "NYVM0428.",
"inventory_name1": "nyvm0428",
"trigger_status": "OK",
"item_key1": "system.cpu.load[percpu,avg1]",
"trigger_events_problem_unack": "744",
"item_value1": "3.85",
"trigger_events_ack": "0",
"event_recovery_time": "14:21:37",
"inventory_poc_primary_cell1": "",
"trigger_events_problem_ack": "0",
"event_recovery_value": "0",
"item_orgi1": "system.cpu.load[percpu,avg1]",
"event_id": "4663681",
"trigger_nseverity": "3",
"event_date": "2015.07.28",
"inventory_hw_arch1": "X86-based PC",
"action_name": "Logstash Notification",
"event_recovery_id": "4664691",
"trigger_id": "16318",
"date": "2015.07.28",
"trigger_name": "Processor load is too high on NYVM0428",
"event_recovery_date": "2015.07.28",
"time": "14:21:39",
"esc_history": "Problem started: 2015.07.28 14:16:37 Age: 5m",
"message_type": "recovery",
"group1": "10.250.15.0/24",
"group2": "10.250.15.0/24 Windows Agent",
"group3": "Discovered hosts",
"group4": "PTC",
"group5": "PTC PROD",
"group6": "PTC PROD Windows",
"group7": "Windows Servers"
},
"fields": {
"@timestamp": [
1438107715667
]
},
"sort": [
1438107715667
]
}
I can't find any field name or field value that contains the string "1. 2015.07.28 09" in the example that you posted.
I got it to work by using the ruby filter:
ruby {
  code => "
    event.to_hash.keys.each { |k|
      if k.start_with?('1.')
        event.remove(k)
      end
    }
  "
}
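That ruby filter walks the event's top-level keys and removes any that start with "1.". The same key-selection logic can be exercised in plain Ruby, outside Logstash (a sketch using a made-up, trimmed version of the event hash from this thread):

```ruby
# Sample event, trimmed to three fields; the "1. ..." key is the
# hourly field the thread is about.
event = {
  "trigger_severity" => "Average",
  "1. 2015.07.28 14" => "16:41 message sent",
  "event_status"     => "PROBLEM"
}

# reject returns a new hash without the keys the block matches,
# mirroring what event.remove(k) does inside the ruby filter.
cleaned = event.reject { |key, _value| key.start_with?("1.") }

puts cleaned.keys.inspect  # the "1. ..." key is gone
```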
This is the field I had marked in bold:
"1. 2015.07.28 14": "16:41 message sent Logstash nysv0654 "Logstash User logstash (Logstash User)"",
© 2020. All Rights Reserved - Elasticsearch