I have an issue: I have written two ingest pipelines using Dev Tools in v6.7.0, but I do not know how to attach them to an index. Can someone please assist with this?
kibana 6.7.0 is EOL and no longer supported. Please upgrade ASAP.
Hello, you need to provide more context.
What pipelines are you talking about? Ingest pipelines? And what do you want to do with them?
PUT _ingest/pipeline/log-split-pipeline
{
  "description": "Split the log into two parts and extract JSON fields",
  "processors": [
    {
      "grok": {
        "field": "log",
        "patterns": ["%{GREEDYDATA} %{WORD:log_type} %{GREEDYDATA:json}"]
      }
    },
    {
      "script": {
        "source": """
          ctx.log_type = ctx.log_type.trim();
        """
      }
    },
    {
      "json": {
        "field": "json",
        "target_field": "parsed_log",
        "ignore_failure": true
      }
    },
    {
      "remove": {
        "field": "json"
      }
    },
    {
      "script": {
        "lang": "painless",
        "source": """
          if (ctx.parsed_log == null) {
            ctx.parsed_log = new HashMap();
            ctx.parsed_log.message = ctx.log;
          }
        """
      }
    },
    {
      "remove": {
        "field": "log"
      }
    },
    {
      "rename": {
        "field": "parsed_log",
        "target_field": "log"
      }
    }
  ]
}
The ingest pipeline above is used for parsing data in the log field, for example:
"log": "2023-06-14T11:27:53.113073096Z stdout F {\"severity\":\"info\",\"service\":\"cashin-adapter\",\"ip\":\"10.233.43.44\",\"func\":\"github.com.(*healthCheckController).HealthCheckHandler.func1\",\"timestamp\":\"2023-06-14T11:27:53.112958934Z\",\"message\":\"Running HealthCheckHandler\"}"
I wanted to split out all the JSON fields inside the single log field, i.e. severity, service, ip, func, timestamp, and message, but when I deployed this pipeline it parsed the data into two different fields:
"log_type": "F",
"log": {
"severity": "info",
"func": "github.com.(*healthCheckController).HealthCheckHandler.func1",
"service": "cashin-adapter",
"ip": "10.233.43.44",
"message": "Running HealthCheckHandler",
"timestamp": "2023-06-14T11:27:53.112958934Z"
}
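For anyone trying to reproduce this, the output above can be checked without indexing anything by running the sample document through the ingest simulate API. The request below is just a sketch using the log line quoted earlier:

POST _ingest/pipeline/log-split-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "log": "2023-06-14T11:27:53.113073096Z stdout F {\"severity\":\"info\",\"service\":\"cashin-adapter\",\"ip\":\"10.233.43.44\",\"func\":\"github.com.(*healthCheckController).HealthCheckHandler.func1\",\"timestamp\":\"2023-06-14T11:27:53.112958934Z\",\"message\":\"Running HealthCheckHandler\"}"
      }
    }
  ]
}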
Now I have written another pipeline to make it easier to visualize in the Discover tab:
PUT _ingest/pipeline/log-split-pipeline-2
{
  "description": "Split the log field into separate fields",
  "processors": [
    {
      "script": {
        "lang": "painless",
        "source": """
          if (ctx.log != null) {
            ctx.log_severity = ctx.log.severity;
            ctx.log_func = ctx.log.func;
            ctx.log_service = ctx.log.service;
            ctx.log_ip = ctx.log.ip;
            ctx.log_message = ctx.log.message;
            ctx.log_timestamp = ctx.log.timestamp;
            ctx.remove("log");
          }
          if (ctx.log_severity == null) {
            ctx.remove("log_severity");
          }
          if (ctx.log_func == null) {
            ctx.remove("log_func");
          }
          if (ctx.log_service == null) {
            ctx.remove("log_service");
          }
          if (ctx.log_ip == null) {
            ctx.remove("log_ip");
          }
          if (ctx.log_message == null) {
            ctx.remove("log_message");
          }
          if (ctx.log_timestamp == null) {
            ctx.remove("log_timestamp");
          }
        """
      }
    }
  ]
}
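For reference, one way to run the two pipelines as a unit is a small wrapper pipeline that calls each of them in turn with Elasticsearch's pipeline processor. This is only a sketch, and the name log-combined-pipeline is a placeholder:

PUT _ingest/pipeline/log-combined-pipeline
{
  "description": "Run the grok/JSON split and the flattening step in sequence",
  "processors": [
    {
      "pipeline": {
        "name": "log-split-pipeline"
      }
    },
    {
      "pipeline": {
        "name": "log-split-pipeline-2"
      }
    }
  ]
}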
This is my complete case.
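As for the original question of how to attach the pipelines to an index: index.default_pipeline only takes a single pipeline name, which is why a wrapper like the one sketched above is handy. The snippets below are illustrative, with my-logs as a placeholder index name. Either pass the pipeline query parameter on each index request:

PUT my-logs/_doc/1?pipeline=log-combined-pipeline
{
  "log": "2023-06-14T11:27:53.113073096Z stdout F {\"severity\":\"info\",\"message\":\"Running HealthCheckHandler\"}"
}

or set it as the index default so every document written to the index goes through it:

PUT my-logs/_settings
{
  "index.default_pipeline": "log-combined-pipeline"
}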