Invalid date format

I'm trying to import a CSV into Elasticsearch; I was following this post.

My data looks like this:

    From,To,NetEnforcer,Total Bandwidth (Gbps),In Bandwidth (Gbps),Out Bandwidth (Gbps),In Packets (Pps),Out Packets (Pps),Live Connections,New Connections (Conn/sec),Dropped Connections
    Sep 20 2018  08:50:31,Sep 20 2018  08:51:00,CSC01QOS02,0.913,0.637,0.276,0.0,0.0,54194,887.4,0.0

You can see straight away that the timestamp is non-standard; please note that there are 2 spaces between the year and the time.

I have tested my data with the pipeline _simulate API and it works fine:

    POST _ingest/pipeline/_simulate
    {
      "pipeline": {
        "description": "Parsing the Netenforcer logs",
        "processors": [
          {
            "grok": {
              "field": "net_enforcer",
              "patterns": [
                "%{CUSTOMSTAMP:time_from},%{CUSTOMSTAMP:time_to},%{HOSTNAME:netenforcer},%{BASE16FLOAT:total_bandwidth},%{BASE16FLOAT:in_bandwidth},%{BASE16FLOAT:out_bandwidth},%{BASE16FLOAT:in_packets},%{BASE16FLOAT:out_packets},%{BASE16FLOAT:live_connections},%{BASE16FLOAT:new_connections},%{BASE16FLOAT:dropped_connections}"
              ],
              "pattern_definitions": {
                "CUSTOMSTAMP": "%{MONTH} +%{MONTHDAY} +%{YEAR} \\s* %{HOUR}:%{MINUTE}:%{SECOND}"
              }
            }
          },
          {
            "remove": {
              "field": "net_enforcer"
            }
          }
        ]
      },
      "docs": [
        {
          "_index": "netenforcer_log",
          "_type": "entry",
          "_source": {
            "net_enforcer": "Sep 20 2018  08:50:31,Sep 20 2018  08:51:00,CSC01QOS02,0.913,0.637,0.276,0.0,0.0,54194,887.4,0.0"
          }
        }
      ]
    }
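
The date format itself can also be checked in isolation with a throwaway `date` processor in the same simulate endpoint (just a quick sanity check; the document below is a hand-made sample, not real output):

    POST _ingest/pipeline/_simulate
    {
      "pipeline": {
        "description": "Sanity check for the custom date format",
        "processors": [
          {
            "date": {
              "field": "time_to",
              "formats": ["MMM dd yyyy  HH:mm:ss"]
            }
          }
        ]
      },
      "docs": [
        {
          "_source": {
            "time_to": "Sep 20 2018  08:51:00"
          }
        }
      ]
    }

If the format is accepted, the simulate response shows the parsed value (it lands in @timestamp by default); if not, it should fail the same way the index request does.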

My template:

    PUT _template/netenforcer_template
    {
      "index_patterns": "netenforcer_log*",
      "settings": {
        "number_of_shards": 1
      },
      "mappings": {
        "net_enforcer": {
          "properties": {
            "time_from": {
              "type": "date",
              "format": "MMM dd yyyy  HH:mm:ss"
            },
            "time_to": {
              "type": "date",
              "format": "MMM dd yyyy  HH:mm:ss"
            },
            "netenforcer": {
              "type": "keyword"
            },
            "total_bandwidth": { "type": "float" },
            "in_bandwidth": { "type": "float" },
            "out_bandwidth": { "type": "float" },
            "in_packets": { "type": "float" },
            "out_packets": { "type": "float" },
            "live_connections": { "type": "float" },
            "new_connections": { "type": "float" },
            "dropped_connections": { "type": "float" }
          }
        }
      }
    }
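
Before indexing anything, the template can be read back as a quick sanity check that it is registered and will match the index name:

    GET _template/netenforcer_template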

My pipeline:

    PUT _ingest/pipeline/parse_netenforcer_csv
    {
      "description": "Parsing the Netenforcer logs",
      "processors": [
        {
          "grok": {
            "field": "net_enforcer",
            "patterns": [
              "%{CUSTOMSTAMP:time_from},%{CUSTOMSTAMP:time_to},%{HOSTNAME:netenforcer},%{BASE16FLOAT:total_bandwidth},%{BASE16FLOAT:in_bandwidth},%{BASE16FLOAT:out_bandwidth},%{BASE16FLOAT:in_packets},%{BASE16FLOAT:out_packets},%{BASE16FLOAT:live_connections},%{BASE16FLOAT:new_connections},%{BASE16FLOAT:dropped_connections}"
            ],
            "pattern_definitions": {
              "CUSTOMSTAMP": "%{MONTH} +%{MONTHDAY} +%{YEAR} \\s* %{HOUR}:%{MINUTE}:%{SECOND}"
            }
          }
        },
        {
          "remove": {
            "field": "net_enforcer"
          }
        }
      ]
    }
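
The saved pipeline can also be simulated by name, which rules out any drift between the ad-hoc simulate above and the stored copy (the document is the same sample line as before):

    POST _ingest/pipeline/parse_netenforcer_csv/_simulate
    {
      "docs": [
        {
          "_source": {
            "net_enforcer": "Sep 20 2018  08:50:31,Sep 20 2018  08:51:00,CSC01QOS02,0.913,0.637,0.276,0.0,0.0,54194,887.4,0.0"
          }
        }
      ]
    }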

My test:

    curl -XPOST 'http://docker.oc.lab:9200/netenforcer_log_test/net_enforcer?pipeline=parse_netenforcer_csv' \
      -H "Content-Type: application/json" -u elastic:changeme \
      -d "{ \"net_enforcer\": \"Sep 20 2018  08:50:31,Sep 20 2018  08:51:00,CSC01QOS02,0.913,0.637,0.276,0.0,0.0,54194,887.4,0.0\" }"

Error:

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse [time_to]"}],"type":"mapper_parsing_exception","reason":"failed to parse [time_to]","caused_by":{"type":"illegal_argument_exception","reason":"Invalid format: \"Sep 20 2018  08:51:00\""}},"status":400}

I don't see where I made a mistake.

@michal123 What version of Elasticsearch are you using? I was unable to reproduce the error on 6.3.1.

@jakelandis I'm on the latest, 6.4.1. My guess is that it doesn't like the fact that I have 2 date fields:

       "time_from": {
         "type": "date",
         "format": "MMM dd yyyy  HH:mm:ss"
       },
       "time_to": {
         "type": "date",
         "format": "MMM dd yyyy  HH:mm:ss"

I just found out that removing the time_to field from the grok pattern makes the error go away.

Is it possible to have 2 date fields and then just choose one in Index Patterns in Kibana?

Hmm, still can't reproduce on 6.4.1.

Could it be that your index mapping was wrong at the time of index creation? You can see the mapping used by the index:

    GET netenforcer_log_test/_mapping

...or you can just delete the index (assuming it is a test index) and try again, and the latest version of the mapping will get applied:

    DELETE netenforcer_log_test

    POST netenforcer_log_test/net_enforcer/1?pipeline=parse_netenforcer_csv
    {
      "net_enforcer": "Sep 20 2018  08:50:31,Sep 20 2018  08:51:00,CSC01QOS02,0.913,0.637,0.276,0.0,0.0,54194,887.4,0.0"
    }
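
If the mapping format keeps rejecting the double space, another option (a sketch only, I have not run it against your data) is to normalize the timestamps inside the pipeline with `date` processors, so the indexed values are plain ISO 8601:

    PUT _ingest/pipeline/parse_netenforcer_csv
    {
      "description": "Parsing the Netenforcer logs",
      "processors": [
        {
          "grok": {
            "field": "net_enforcer",
            "patterns": [
              "%{CUSTOMSTAMP:time_from},%{CUSTOMSTAMP:time_to},%{HOSTNAME:netenforcer},%{BASE16FLOAT:total_bandwidth},%{BASE16FLOAT:in_bandwidth},%{BASE16FLOAT:out_bandwidth},%{BASE16FLOAT:in_packets},%{BASE16FLOAT:out_packets},%{BASE16FLOAT:live_connections},%{BASE16FLOAT:new_connections},%{BASE16FLOAT:dropped_connections}"
            ],
            "pattern_definitions": {
              "CUSTOMSTAMP": "%{MONTH} +%{MONTHDAY} +%{YEAR} \\s* %{HOUR}:%{MINUTE}:%{SECOND}"
            }
          }
        },
        {
          "date": {
            "field": "time_from",
            "target_field": "time_from",
            "formats": ["MMM dd yyyy  HH:mm:ss"]
          }
        },
        {
          "date": {
            "field": "time_to",
            "target_field": "time_to",
            "formats": ["MMM dd yyyy  HH:mm:ss"]
          }
        },
        {
          "remove": {
            "field": "net_enforcer"
          }
        }
      ]
    }

With that in place, the time_from and time_to mappings in the template could just be "type": "date" with no custom format.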

> Is it possible to have 2 date fields and then just choose one in Index Patterns in Kibana?

Your mappings can have as many date field mappings as you like. The one chosen for that index pattern in Kibana will be the default one used by the date/time picker and the date histogram on the Discover page (as well as, possibly, other places where a date/time could be defaulted).
