Error: Saved field "timeStamp" of data view "index-name" is invalid for use with the "Date Histogram" aggregation. Please select a new field

Hello everyone,

I'm running Elastic Stack 8.3.0.
I'm encountering the error in the title when using Kibana Discover with one of my indices.

The index is populated from the following CSV file (some values have been redacted):

timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success,failureMessage,bytes,sentBytes,grpThreads,allThreads,URL,Latency,Hostname,IdleTime,Connect
2023/09/17 02:23:03.892,167,otcs_user_interface-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,10746,681,1,1,http://{URL},113,{HostName},0,30
2023/09/17 02:23:03.892,113,otcs_user_interface-1-0,302,,login-browse-upload-download-search-logout 1-1,text,true,,2420,143,1,1,http://{URL},113,{HostName},0,30
2023/09/17 02:23:04.008,51,otcs_user_interface-1-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,8326,538,1,1,http://{URL},51,{HostName},0,0
2023/09/17 02:23:04.100,3,Debug Sampler,200,OK,login-browse-upload-download-search-logout 1-1,text,true,,2657,0,1,1,null,0,{HostName},0,0
2023/09/17 02:23:04.104,9,login-2,200,,login-browse-upload-download-search-logout 1-1,text,true,,8041,471,1,1,http://{URL},9,{HostName},0,0
2023/09/17 02:23:04.114,97,login-5a,200,,login-browse-upload-download-search-logout 1-1,text,true,,5549,5258,1,1,http://{URL},79,{HostName},0,0
2023/09/17 02:23:04.114,79,login-5a-0,302,,login-browse-upload-download-search-logout 1-1,,true,,3158,1556,1,1,http://{URL},79,{HostName},0,0
2023/09/17 02:23:04.197,14,login-5a-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,2391,3702,1,1,http://{URL},14,{HostName},0,0
2023/09/17 02:23:04.217,11,login-6,200,,login-browse-upload-download-search-logout 1-1,text,true,,3262,2934,1,1,http://{URL},11,{HostName},0,0
2023/09/17 02:23:04.231,100,login-8,302,,login-browse-upload-download-search-logout 1-1,text,true,,1589,1140,1,1,http://{URL},100,{HostName},0,0
2023/09/17 02:23:04.335,772,login,200,,login-browse-upload-download-search-logout 1-1,text,true,,75118,609,1,1,http://{URL},50,{HostName},0,0
2023/09/17 02:23:04.335,50,login-0,302,,login-browse-upload-download-search-logout 1-1,text,true,,1403,304,1,1,http://{URL},50,{HostName},0,0
2023/09/17 02:23:04.386,721,login-1,200,,login-browse-upload-download-search-logout 1-1,text,true,,73715,305,1,1,http://{URL},669,{HostName},0,0
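For reference, this is roughly what I expect a single event to look like after the csv filter in the pipeline below has parsed it (my own sketch based on the first data row, before the date filter runs, so timeStamp is still a plain string):

{
     "timeStamp" => "2023/09/17 02:23:03.892",
       "elapsed" => 167,
         "label" => "otcs_user_interface-1",
    "threadName" => "login-browse-upload-download-search-logout 1-1",
       "success" => "true",
    ...
}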

I use Filebeat to ship this CSV log to Logstash. This is my Logstash pipeline:

input {
    beats {
        port => "${beats_port}"
        ssl => true
        ssl_certificate_authorities => ["${ssl_certificate_authorities}"]
        ssl_certificate => "${ssl_certificate}"
        ssl_key => "${ssl_key}"
        ssl_verify_mode => "peer"
    }
}

filter {

    if "login-browse-upload-download-search-logout" in [tags]{
            csv {
                columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "dataType", "success", "failureMessage", "bytes", "sentBytes", "grpThreads", "allThreads", "URL", "Latency", "Hostname", "IdleTime","Connect"]
                separator => ","
                convert => {"elapsed" => "integer"}
            }
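            # parse the timeStamp string (e.g. 2023/09/17 02:23:03.892) into a
            # date object, written back into the same field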
            date {
                match => [ "timeStamp", "yyyy/MM/dd HH:mm:ss.SSS" ]
                target => "timeStamp"
            }
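            # strip the " 1-1" suffix from threadName before renaming it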
            mutate {
                gsub => ["threadName", " 1-1", ""]
            }
            mutate {
                rename => ["threadName", "TestcaseName"]
                remove_field => ["responseCode", "responseMessage", "dataType", "bytes", "sentBytes", "grpThreads", "allThreads", "URL", "Latency", "IdleTime", "Connect"]
            }

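            # testcase summary rows get Testcase_* fields; all other rows are
            # treated as individual transactions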
            if [label] == "login_logout-testcase" or [label] == "browse-testcase" or [label] == "export_content-testcase" or [label] == "import_content-testcase" or [label] == "search-testcase" or [label] == "login-browse-upload-download-search-logout-testcase" {
                if [success] == "true" {
                    mutate {
                        add_field => {"Testcase_Status" => "success" }
                    }
                }
                else {
                    mutate {
                        add_field => {"Testcase_Status" => "failure" }
                    }
                }

                mutate {
                    rename => ["elapsed", "Testcase_ExecutionTime"]
                    remove_field => ["label", "success", "failureMessage"]
                }
            }
            else {
                mutate {
                    rename => ["label", "Transaction_Name"]
                    rename => ["failureMessage", "Transaction_ErrorMessage"]
                }

                if [success] == "true" {
                    mutate {
                        add_field => {"Transaction_Status" => "success" }
                    }
                }
                else {
                    mutate {
                        add_field => {"Transaction_Status" => "failure" }
                    }
                }

                mutate {
                    remove_field => ["elapsed", "success"]
                }
            }
    }

}

output {
    if "login-browse-upload-download-search-logout" in [tags] {
        elasticsearch {
            hosts => ["${elasticsearch_host}:${elasticsearch_port}"]
            index => "otcs_unit_transactions-%{+yyyy.MM.dd}-000001"
            user => "${logstash_writer_username}"
            password => "${logstash_writer_password}"
            ssl => true
            ssl_certificate_verification => true
            cacert => "${cacert}"
            ilm_enabled => true
            ilm_rollover_alias => "otcs_unit_transactions"
            ilm_pattern => "{now/d}-000001"
            ilm_policy => "otcs_unit_transactions_ilm"
        }
    }
}
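For what it's worth, something like this in Kibana Dev Tools should show how timeStamp actually ended up mapped in the index, i.e. whether it is a date or just a keyword/text field (I'm using the rollover alias from the output above):

GET otcs_unit_transactions/_mapping/field/timeStamp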

As you can see, I have a date filter plugin; I'm not sure if there is an error there that prevents Logstash from parsing the correct date format into the timeStamp field. Has anyone encountered this error? Any help would be appreciated. Thank you.
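In case it helps with reproducing, a minimal pipeline like this (just a local sanity check, reading the raw timestamp from stdin) should show whether the date filter itself can parse the format:

input { stdin {} }
filter {
    date {
        match => [ "message", "yyyy/MM/dd HH:mm:ss.SSS" ]
        target => "timeStamp"
    }
}
output { stdout { codec => rubydebug } }

Feeding it 2023/09/17 02:23:03.892 should produce a parsed timeStamp rather than a _dateparsefailure tag.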
