I knew that, but I have a reason.
Currently I'm using both Logstash and an ES ingest pipeline to process documents.
I have 4 log sources (each with a different data format) and 4 pipelines to process them. My goal is to take all of the data and push it into a new index with a new format (rename, remove, enrich, etc.) while still keeping the original data (as processed by Logstash). So my data flow is: raw data -> Logstash (grok, mutate, enrich with the elasticsearch filter, etc.) -> ES ingest pipeline (reformat, rename, remove some fields, etc.).
What I need is the document after it has run through the ES ingest pipeline.
That's why I can't use Logstash to send the event to another output.
Or maybe there is a problem with my data flow.
Sample :
input {
  beats {
    port => 5044
    host => "0.0.0.0"
    client_inactivity_timeout => 100
    id => "tcp5044"
  }
}
filter {
  grok {}
  mutate {}
  elasticsearch {}   # to enrich the data
}
output {
  elasticsearch { index => "original" }
  if <condition> {
    elasticsearch {
      index    => "formated-index"
      pipeline => "format-cts"
    }
  }
}
ES ingest pipeline
{
  "description" : "Format data and remove some fields that are not necessary",
  "processors" : [
    {
      "set": {
        "field": "newfield.type",
        "value": "SOFTWARE"
      }
    },
    {
      "set": {
        "field": "newfield.field_name",
        "value": "{{data.name}}"
      }
    },
    {
      "set": {
        "field": "newfield.field1",
        "value": "{{data.version}}"
      }
    },
    {
      "set": {
        "field": "newfield.field2",
        "value": "{{data.type}}"
      }
    },
    {
      "uppercase": {
        "field": "field_L",
        "ignore_missing": true
      }
    },
    {
      "date" : {
        "field" : "timestamp",
        "target_field" : "@timestamp",
        "formats" : ["dd/MMM/yyyy:HH:mm:ss Z", "ISO8601"],
        "timezone" : "Asia/Ho_Chi_Minh",
        "ignore_failure" : true
      }
    },
    {
      "dot_expander": {
        "field": "http.request.uri"
      }
    },
    {
      "dot_expander": {
        "field": "http.response.body.bytes"
      }
    },
    {
      "dot_expander": {
        "field": "http.response.status_code"
      }
    },
    {
      "remove" : {
        "field": ["useragent", "http", "log", "agent", "input", "ecs", "message", "type", "tags", "data", "tenant"],
        "ignore_missing": true
      }
    }
  ]
}
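One way to see what the document looks like after it has passed through the ES ingest pipeline is the simulate API, which runs a sample document through a registered pipeline without indexing anything. A minimal sketch, assuming the pipeline is registered as format-cts on a local cluster; the sample document fields below are made up for illustration:

```shell
# Preview the output of the format-cts pipeline without indexing anything.
# Assumes Elasticsearch is reachable on localhost:9200 and the pipeline
# has already been stored under the id "format-cts".
curl -X POST "localhost:9200/_ingest/pipeline/format-cts/_simulate?pretty" \
  -H 'Content-Type: application/json' -d'
{
  "docs": [
    {
      "_source": {
        "timestamp": "21/Nov/2022:10:15:00 +0700",
        "data": { "name": "nginx", "version": "1.18", "type": "web" }
      }
    }
  ]
}'
```

The response shows each document as it would be indexed, so you can verify the set, date, and remove processors before wiring the pipeline into the Logstash output.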