I want to automate the process of creating the pipeline below, e.g. via the command line, an API, etc.
This is because I want to send logs from Filebeat to Logstash instead of sending them directly to Elasticsearch.
GET _ingest/pipeline/filebeat-7.10.1-gsuite-drive-common
{
"filebeat-7.10.1-gsuite-drive-common" : {
"processors" : [
{
"geoip" : {
"field" : "source.ip",
"target_field" : "source.geo",
"ignore_missing" : true
}
},
{
"geoip" : {
"target_field" : "source.as",
"properties" : [
"asn",
"organization_name"
],
"ignore_missing" : true,
"database_file" : "GeoLite2-ASN.mmdb",
"field" : "source.ip"
}
},
{
"rename" : {
"field" : "source.as.asn",
"target_field" : "source.as.number",
"ignore_missing" : true
}
},
{
"rename" : {
"ignore_missing" : true,
"field" : "source.as.organization_name",
"target_field" : "source.as.organization.name"
}
},
{
"remove" : {
"field" : "json",
"ignore_missing" : true
}
},
{
"set" : {
"field" : "event.ingested",
"value" : "{{ _ingest.timestamp }}"
}
}
],
"on_failure" : [
{
"set" : {
"field" : "error.message",
"value" : "{{ _ingest.on_failure_message }}"
}
}
],
"description" : "Pipeline for parsing gsuite logs"
}
}
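To illustrate the kind of automation I mean: as far as I understand, the same pipeline could be created (or scripted) with the PUT ingest pipeline API. A minimal sketch, assuming a cluster on localhost:9200 (the JSON body below is trimmed to the first processor for brevity; the real call would carry the full processor list from the GET output above):

```shell
# Create/update the ingest pipeline directly via the Elasticsearch API.
# Body trimmed to one processor for readability -- use the full definition in practice.
curl -X PUT "http://localhost:9200/_ingest/pipeline/filebeat-7.10.1-gsuite-drive-common" \
  -H 'Content-Type: application/json' \
  -d '{
    "description": "Pipeline for parsing gsuite logs",
    "processors": [
      {
        "geoip": {
          "field": "source.ip",
          "target_field": "source.geo",
          "ignore_missing": true
        }
      }
    ]
  }'
```

This works for scripting, but it still means exporting the pipeline definition from somewhere first, which is why I am asking about tooling below.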
It looks like the pipeline is tied to the Filebeat version, so if there are any tools or commands in the git repo for generating it, that would be great.
Is there any tool that can generate the pipeline?
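For what it's worth, I came across the `filebeat setup` command, which I believe can load the module ingest pipelines into Elasticsearch directly. A sketch of what I have in mind (the `output.elasticsearch` host and the `gsuite` module name are assumptions based on my setup; since my actual output is Logstash, the Elasticsearch output is passed as a one-off override):

```shell
# Load only the ingest pipelines for the gsuite module into Elasticsearch,
# temporarily pointing Filebeat at the cluster even though its normal
# output is Logstash.
filebeat setup --pipelines --modules gsuite \
  -E 'output.logstash.enabled=false' \
  -E 'output.elasticsearch.hosts=["localhost:9200"]'
```

If this is the intended way to pre-load pipelines when shipping through Logstash, that would answer my question, but I would appreciate confirmation.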