Date Filter Configuration Error Help Required

Hi All,

Version ELK Stack 6.4 / Windows 10 / zip downloads

Trying to ingest data from a CSV file into Elasticsearch 6.4.

I want to parse the CSV columns containing dates in a specific format with the Logstash date filter plugin. Format required: "16-07-2018 19:33:00"

Error in cmd console:

[2018-09-10T17:19:52,533][DEBUG][logstash.filters.csv     ] config LogStash::Filters::CSV/@autodetect_column_names = false
[2018-09-10T17:19:52,594][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"date", :type=>"filter", :class=>LogStash::Filters::Date}
[2018-09-10T17:19:52,625][ERROR][logstash.filters.date    ] Invalid setting for date filter plugin:

  filter {
    date {
      # This setting must be a string
      # Expected string, got ["Receive_Time", "Generate_Time", "Time_Logged", "Start_Time"]
      target => ["Receive_Time", "Generate_Time", "Time_Logged", "Start_Time"]
      ...
    }
  }
[2018-09-10T17:19:52,634][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/config/mixin.rb:86:in `config_init'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/filters/base.rb:126:in `initialize'", "D:/ELK Stack/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-date-3.1.9/lib/logstash/filters/date.rb:158:in `initialize'", "org/logstash/plugins/PluginFactoryExt.java:58:in `filter_delegator'", "org/logstash/plugins/PluginFactoryExt.java:226:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:166:in `plugin'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:71:in `plugin'", "(eval):39:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/pipeline.rb:90:in `initialize'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/pipeline_action/create.rb:38:in `execute'", "D:/ELK Stack/logstash-6.4.0/logstash-core/lib/logstash/agent.rb:309:in `block in converge_state'"]}
[2018-09-10T17:19:52,695][DEBUG][logstash.agent           ] Starting puma
[2018-09-10T17:19:52,704][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}

=================================================

Config file information (FYI: there are #-commented lines at multiple places in the config file which I have deleted from the version below. I mention this because I have had errors in the past that were resolved once the comments were removed, in case it is significant):

input{
    file{
        path => "D:/ADMM_Firewall_Logs/ADMM_Firewall_Logs_Set_1.csv"
        start_position => "beginning"
        sincedb_path => "nul"
    }
}
filter{
    csv{
        separator => ","
        skip_header => true
        columns => ["Domain","Receive_Time","Serial","Type","Threat_Content_Type","Config_Version","Generate_Time","Source_address","Destination_address","NAT_Source_IP","NAT_Destination_IP","Rule","Source_User","Destination_User","Application","Virtual_System","Source_Zone","Destination_Zone","Inbound_Interface","Outbound_Interface","Log_Action","Time_Logged","Session_ID","Repeat_Count","Source_Port","Destination_Port","NAT_Source_Port","NAT_Destination_Port","Flags","IP_Protocol","Action","Bytes","Bytes_Sent","Bytes_Received","Packets","Start_Time","Elapsed_Time_sec","Category","Padding","Sequence_Number","Action_Flags","Source_Country","Destination_Country","cpadding","pkts_sent","pkts_received","session_end_reason","dg_hier_level_1","dg_hier_level_2","dg_hier_level_3","dg_hier_level_4","Virtual_System_Name","Device_Name","action_source","Source_VM_UUID","Destination_VM_UUID","Tunnel_ID_IMSI","Monitor_Tag_IMEI","Parent_Session_ID","parent_start_time","tunnel"]
        id => "CSV_ADMMFWLogs_1"
    }
    date{
        match => ["Receive_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"] # Format required :16-07-2018  19:33:00
        target => "Receive_Time" 
        match => ["Generate_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Generate_Time"
        match => ["Time_Logged", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Time_Logged"
        match => ["Start_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Start_Time"
        timezone => "A***/D*****"
        locale => "en"
        id => "DATE_ADMMFWLogs_1"
    }
    mutate{
        convert => {
            "Serial" => "string"
            "Bytes" => "integer"
            "Bytes_Received" => "integer"
            "Bytes_Sent" => "integer"
            "Repeat_Count" => "integer"
            "Packets" => "integer"
            "pkts_received" => "integer"
            "pkts_sent" => "integer"
        } 
        id => "MUTATE-CONVERT_ADMMFWLogs_1"
    }
}
output{
    stdout{
        codec => "rubydebug"
        id => "-RUBY-" 
    }
    elasticsearch{
        hosts => ["localhost:9200"]
        id => "-ES:9200-" 
    }
}

============================================================================

Some help and guidance ASAP would be much appreciated!

Thanks in advance!

Regards,
JK

If you want to parse multiple fields, you need multiple date filters. You can't just list multiple match & target pairs in the same filter.
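
For example, a minimal sketch of what that could look like, using the field names and date pattern from the config above (the ids are placeholders, and the timezone setting is left out since it was redacted in the original config - re-add it to each filter as needed):

filter{
    # one date filter per field, each parsing into its own target
    date{
        match => ["Receive_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Receive_Time"
        locale => "en"
        id => "DATE_Receive_Time"
    }
    date{
        match => ["Generate_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Generate_Time"
        locale => "en"
        id => "DATE_Generate_Time"
    }
    date{
        match => ["Time_Logged", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Time_Logged"
        locale => "en"
        id => "DATE_Time_Logged"
    }
    date{
        match => ["Start_Time", "dd-MM-yyyy HH:mm:ss", "ISO8601"]
        target => "Start_Time"
        locale => "en"
        id => "DATE_Start_Time"
    }
}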

Thanks Magnus

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.