I am ingesting logs in ECS format from the Elastic Serverless Forwarder (ESF) into Elasticsearch. The logs are generated by the ECS Python logging library. Because they arrive via ESF, I have to expand the NDJSON events with an ingest pipeline rather than having an agent do it.
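For context, this is a minimal sketch of how I attach the pipeline; the index name below is only a placeholder for the index ESF writes to:

# "my-esf-logs" is a placeholder index name
PUT my-esf-logs/_settings
{
  "index.default_pipeline": "expand-json-events"
}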
However, when I ingest the logs, I get the following error message:
cannot add non-map fields to root of document
I do not get the same error if I extract the JSON into a target_field (sketched below). However, there is then no easy way to move everything from the target field to the root of the document, because not all fields are present in every event and some fields need to be merged (e.g., log.*).
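This is roughly what I mean by extracting into a target_field; the field name json_payload is just a placeholder for illustration:

{
  "json": {
    "description": "Expand JSON message payload into a target field",
    "field": "event.original",
    "target_field": "json_payload",
    "allow_duplicate_keys": false,
    "strict_json_parsing": true
  }
}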
Sample Log Entry
POST _ingest/pipeline/expand-json-events/_simulate
{
  "docs": [
    {
      "_index": "index",
      "_id": "a9247583-8b75-4fa4-b8e7-053500f2736e",
      "_source": {
        "@timestamp": "2023-08-15T14:12:01.486Z",
        "message": "{\"@timestamp\":\"2023-08-15T14:12:01.486Z\",\"log.level\":\"debug\",\"message\":\"My log message\",\"ecs\":{\"version\":\"1.6.0\"},\"log\":{\"logger\":\"logger_name\",\"origin\":{\"file\":{\"line\":123,\"name\":\"file.py\"},\"function\":\"function_name\"},\"original\":\"My log message\"},\"process\":{\"name\":\"MainProcess\",\"pid\":8,\"thread\":{\"id\":140241849009984,\"name\":\"MainThread\"}}}"
      }
    }
  ]
}
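For readability, the escaped message payload in the sample document decodes to the following (note that it contains both the dotted key "log.level" and a nested "log" object):

{
  "@timestamp": "2023-08-15T14:12:01.486Z",
  "log.level": "debug",
  "message": "My log message",
  "ecs": {
    "version": "1.6.0"
  },
  "log": {
    "logger": "logger_name",
    "origin": {
      "file": {
        "line": 123,
        "name": "file.py"
      },
      "function": "function_name"
    },
    "original": "My log message"
  },
  "process": {
    "name": "MainProcess",
    "pid": 8,
    "thread": {
      "id": 140241849009984,
      "name": "MainThread"
    }
  }
}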
Elasticsearch Ingest Pipeline
{
  "description": "Expand JSON events",
  "processors": [
    {
      "rename": {
        "description": "Save original event",
        "field": "message",
        "target_field": "event.original"
      }
    },
    {
      "json": {
        "description": "Expand JSON message payload",
        "field": "event.original",
        "add_to_root": true,
        "add_to_root_conflict_strategy": "replace",
        "allow_duplicate_keys": false,
        "strict_json_parsing": true
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    },
    {
      "set": {
        "field": "error.type",
        "value": "{{ _ingest.pipeline }}"
      }
    },
    {
      "set": {
        "field": "event.kind",
        "value": "pipeline_error"
      }
    }
  ]
}