Ingest Pipeline Convert Processor is not converting

I am using Filebeat, Elasticsearch, and Kibana 7.10.2 in Docker containers.
I have a working pipeline:

PUT /_ingest/pipeline/test_grok_pipeline
{
  "description": "Test grok pattern",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:timeoffset}%{SPACE}%{WORD:thread}%{SPACE}%{HOSTNAME:processName}%{SPACE}%{HOSTNAME:sourceName}%{SPACE}%{WORD:logType}%{SPACE}%{GREEDYDATAMULTILINE:message}"""
        ],
        "on_failure": [
          {
            "set": {
              "field": "error.message_grok",
              "value": "error in grok processor"
            }
          }
        ],
        "pattern_definitions": {
          "MESSAGE": "(\r|\n|.)*",
          "GREEDYDATAMULTILINE": "(.|\n)*"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "target_field": "@timestamp",
        "formats": ["ISO8601"], 
        "timezone": "America/Los_Angeles",
        "on_failure": [
          {
            "set": {
              "field": "error.message_date",
              "value": "error in date processor"
            }
          }
        ]
      }
    }
  ]
}

except that the fields I am parsing with grok are not all typed, so I cannot use them as a term in Kibana. So I added a convert processor:

PUT /_ingest/pipeline/test_grok_pipeline
{
  "description": "Test grok pattern",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          """%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{WORD:timeoffset}%{SPACE}%{WORD:thread}%{SPACE}%{HOSTNAME:processName}%{SPACE}%{HOSTNAME:sourceName}%{SPACE}%{WORD:logType}%{SPACE}%{GREEDYDATAMULTILINE:message}"""
        ],
        "on_failure": [
          {
            "set": {
              "field": "error.message_grok",
              "value": "error in grok processor"
            }
          }
        ],
        "pattern_definitions": {
          "MESSAGE": "(\r|\n|.)*",
          "GREEDYDATAMULTILINE": "(.|\n)*"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "target_field": "@timestamp",
        "formats": ["ISO8601"], 
        "timezone": "America/Los_Angeles",
        "on_failure": [
          {
            "set": {
              "field": "error.message_date",
              "value": "error in date processor"
            }
          }
        ]
      }
    },
    {
      "convert": {
        "field": "sourceName",
        "type": "string"
      }
    }
  ]
}

But this does not seem to be working. There is still the small '?' next to the field name in Discover, and I still cannot use it as a term in a visualization.

What am I missing here?

Thanks!

Heya @mhare

Good to see you're moving forward.

What version are you on?

Before 7.11, you have to go into Stack Management / Index Patterns, refresh the index pattern, and then go back to Discover before it will show the type in Discover and be usable in a visualization.

From 7.11 forward you do not need to do that.
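To confirm the pipeline itself is doing what you expect (the '?' in Discover only reflects Kibana's cached field list, not the actual documents), you can run the simulate API with a sample document. The log line below is just a made-up example; substitute one of your real log lines:

```
POST /_ingest/pipeline/test_grok_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2021-03-01T12:00:00.000 UTC thread1 procA srcB INFO something happened"
      }
    }
  ]
}
```

If the response shows the grok fields populated and no `error.message_grok` set, the pipeline side is fine and the problem is only in Kibana.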

Also, have you created a mapping / index template for this index so you define the field types ahead of time?

If not, each text field will be mapped as both text and keyword (usable as a term).
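For example, a minimal composable index template that maps some of the grok-extracted fields explicitly as keyword might look like the sketch below. The template name, index pattern (I'm guessing filebeat-*), and priority are placeholders you would adjust to your setup, and note it can conflict with the template Filebeat ships:

```
PUT /_index_template/test_grok_template
{
  "index_patterns": ["filebeat-*"],
  "priority": 200,
  "template": {
    "mappings": {
      "properties": {
        "sourceName":  { "type": "keyword" },
        "processName": { "type": "keyword" },
        "logType":     { "type": "keyword" }
      }
    }
  }
}
```

A template only applies to newly created indices, so you would need to roll over or reindex for existing data to pick it up.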


OK, I am such an idiot. I had completely forgotten to refresh the index pattern :roll_eyes:
That solved it right away.
So, if I understand correctly, in 7.11 I won't need to do the refresh? Maybe it's time to 'refresh' my containers :grinning:
Thanks so much for the continued help!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.