Hello,
I created an ingest pipeline for my filebeat logs:
POST /_ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "dissect": {
          "field": "message",
          "pattern": "%{*p1}=\"%{&p1}\" %{*p2}=%{&p2} %{*p3}=%{&p3} %{*p4}=%{&p4} %{*p5}=%{&p5} %{*p6}=\"%{&p6}\" %{*p7}=%{&p7} %{*p8}=%{&p8}",
          "on_failure": [
            {
              "dissect": {
                "field": "message",
                "pattern": "%{*p1}=%{&p1} %{*p2}=%{&p2} %{*p3}=%{&p3} %{*p4}=%{&p4}",
                "ignore_failure": true
              }
            }
          ]
        }
      },
      {
        "dissect": {
          "field": "message",
          "pattern": "%{*p1}=%{&p1} %{*p2}=%{&p2} %{*p3}=%{&p3} %{*p4}=%{&p4} %{*p5}=%{&p5} %{*p6}=%{&p6} %{*p7}=%{&p7} %{*p8}=%{&p8} %{*p9}=%{&p9}",
          "ignore_failure": true
        }
      },
      {
        "convert": {
          "field": "duration_ms",
          "ignore_failure": true,
          "type": "float"
        }
      },
      {
        "convert": {
          "field": "duration",
          "ignore_failure": true,
          "type": "float"
        }
      },
      {
        "remove": {
          "field": "message",
          "if": "ctx.duration != null || ctx.duration_ms != null"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "method=GET path=/api/v1/search/barcoded_products format=json controller=Api::V1::SearchController action=barcoded_products status=200 duration=113.87 view=4.13 db=14.83"
      }
    },
    {
      "_source": {
        "message": "action=\"Create\" model=\"Delayed::Job\" user=2 duration_ms=1.471729"
      }
    },
    {
      "_source": {
        "message": "message=\"ETL finished\" usage=0 items=0 media=0 products=0 dates=\"2020-06-04, 2020-06-05\" site=3 duration_ms=0.005"
      }
    }
  ]
}
I have 3 different types of logs, so I'm using 3 dissect processors.
I know it's a messy pipeline, but it works when I test it with the simulate API. The problem is that when I use the ingest pipeline with Filebeat, it only works for the first log structure:

method=GET path=/api/v1/search/barcoded_products format=json controller=Api::V1::SearchController action=barcoded_products status=200 duration=113.87 view=4.13 db=14.83

For the other two types it doesn't work at all. Any ideas? Perhaps the problem has to do with the quotation marks, since the only difference between the simulate request and Filebeat is that I manually added backslashes before the quotation marks so the request body could be parsed successfully.
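For completeness, this is roughly how I point Filebeat at the pipeline in filebeat.yml (the pipeline id and host are placeholders for my actual values):

```yaml
# Excerpt from filebeat.yml: ship events to Elasticsearch and have
# them run through the stored ingest pipeline on the server side.
# "my_logs_pipeline" and the host are placeholders.
output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: "my_logs_pipeline"
```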