The original task is to make an index pattern (here alb-logs*) in Elasticsearch searchable on a datetime field (so that the time histogram shows up).
So I approached this problem and found that this index does not have any datetime field. However, there is a log field, and the second whitespace-separated token in its string value is a TIMESTAMP_ISO8601 timestamp sandwiched between other text.
Example value in the log field:
"log": [
"http 2022-06-23T08:05:09.703732Z app/jupyter-notebook-alb/5369d658dabf1dc5 104.217.249.182:34438 - -1 -1 -1 301 - 331 329 "GET http://13.126.211.168:80/ HTTP/1.1" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:76.0) Gecko/20100101 Firefox/76.0" - - - "Root=1-62b41eb5-4ee4ff335df664a832aedc0f" "-" "-" 1 2022-06-23T08:05:09.481000Z "redirect" "[https://13.126.211.168:443/\](https://13.126.211.168/\)" "-" "-" "-" "-" "-""
]
So I followed up by learning about ingest pipelines and used a grok processor with the pattern "%{WORD:word}%{SPACE}%{TIMESTAMP_ISO8601:TIMEDATE}". This did extract the local datetime field for me (in the example above, word captures "http" and TIMEDATE captures "2022-06-23T08:05:09.703732Z").
For extra safety I even followed this field up with a date processor. The final request looks like this:
PUT _ingest/pipeline/abs_datetime_pipeline
{
  "description": "This extracts datetime field from the log.",
  "processors": [
    {
      "grok": {
        "field": "log",
        "patterns": [
          "%{WORD:word}%{SPACE}%{TIMESTAMP_ISO8601:TIMEDATE}"
        ]
      }
    },
    {
      "date": {
        "field": "TIMEDATE",
        "formats": [
          "ISO8601"
        ]
      }
    }
  ]
}
This extracted the @timestamp field (the date processor's default target) correctly from a sample test document I tried it on.
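For reference, this is roughly how I tested it, using the simulate API (the sample log line is shortened, and log is assumed to be a plain string here, not an array):

POST _ingest/pipeline/abs_datetime_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "log": "http 2022-06-23T08:05:09.703732Z app/jupyter-notebook-alb/5369d658dabf1dc5 104.217.249.182:34438 ..."
      }
    }
  ]
}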
Then I went to Index Management for alb-logs and set both index.default_pipeline and index.final_pipeline to the new abs_datetime_pipeline.
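In case it matters, I believe this is equivalent to what the UI did, expressed via the update settings API (assuming it accepts the alb-logs* wildcard here):

PUT alb-logs*/_settings
{
  "index": {
    "default_pipeline": "abs_datetime_pipeline",
    "final_pipeline": "abs_datetime_pipeline"
  }
}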
Now I expected the Discover page for alb-logs to show a histogram against the new datetime field, but sadly it didn't.
Can you point out what I am missing, where I am getting it wrong, or what I can do to solve the task?