I am using Filebeat, Elasticsearch, and Kibana 7.10.2 in Docker containers.
I have a working pipeline:
PUT /_ingest/pipeline/test_grok_pipeline
{
  "description": "Test grok pattern",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:timeoffset}%{SPACE}%{WORD:thread}%{SPACE}%{HOSTNAME:processName}%{SPACE}%{HOSTNAME:sourceName}%{SPACE}%{WORD:logType}%{SPACE}%{GREEDYDATAMULTILINE:message}"""
        ],
        "on_failure": [
          {
            "set": {
              "field": "error.message_grok",
              "value": "error in grok processor"
            }
          }
        ],
        "pattern_definitions": {
          "MESSAGE": "(\r|\n|.)*",
          "GREEDYDATAMULTILINE": "(.|\n)*"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "target_field": "@timestamp",
        "formats": ["ISO8601"],
        "timezone": "America/Los_Angeles",
        "on_failure": [
          {
            "set": {
              "field": "error.message_date",
              "value": "error in date processor"
            }
          }
        ]
      }
    }
  ]
}
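For reference, I can test the pipeline with the Simulate Pipeline API; for example, with a made-up log line along the lines of what I am ingesting (the sample values are just for illustration):

POST /_ingest/pipeline/test_grok_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2021-01-27T10:23:45.123 PST main myProcess mySource INFO Something happened"
      }
    }
  ]
}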
except that the fields I am parsing with grok are not all typed, so I cannot use them as a Term in Kibana. So I added a convert processor to the same pipeline:
PUT /_ingest/pipeline/test_grok_pipeline
{
  "description": "Test grok pattern",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:timeoffset}%{SPACE}%{WORD:thread}%{SPACE}%{HOSTNAME:processName}%{SPACE}%{HOSTNAME:sourceName}%{SPACE}%{WORD:logType}%{SPACE}%{GREEDYDATAMULTILINE:message}"""
        ],
        "on_failure": [
          {
            "set": {
              "field": "error.message_grok",
              "value": "error in grok processor"
            }
          }
        ],
        "pattern_definitions": {
          "MESSAGE": "(\r|\n|.)*",
          "GREEDYDATAMULTILINE": "(.|\n)*"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "target_field": "@timestamp",
        "formats": ["ISO8601"],
        "timezone": "America/Los_Angeles",
        "on_failure": [
          {
            "set": {
              "field": "error.message_date",
              "value": "error in date processor"
            }
          }
        ]
      }
    },
    {
      "convert": {
        "field": "sourceName",
        "type": "string"
      }
    }
  ]
}
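The update replaces the stored pipeline under the same ID; to confirm the convert processor actually made it into the stored definition:

GET /_ingest/pipeline/test_grok_pipeline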
But this does not seem to be working: there is still the small '?' next to the field name in Discover, and I still cannot use it as a Term in a visualization.
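In case it is relevant, the field's mapping can be inspected directly; for example (I am assuming Filebeat's default filebeat-* index pattern here, substitute your actual index name):

GET filebeat-*/_mapping/field/sourceName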
What am I missing here?
Thanks!