Changing the datatype with the date filter in Logstash doesn't change the datatype in Kibana

Hi ALL,

My issue is that I am using Logstash to parse a CSV file and then viewing the visualization in Kibana; however, the date field shows up as a string in Kibana. What I want is to pass the date field as a date and get a date in Kibana instead of a string. My code snippet is below.

Logstash version: logstash-5.3.0
Elasticsearch version: elasticsearch-5.3.0
Kibana version: kibana-5.3.0-windows-x86

```
input {
  file {
    path => "C:\Users\pramanick\Desktop\Projects\Accelarators\ES\bitcoin-data\mail2.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["Subject","DateSent","DateLastmodified","From","Last Modifier Name","Size"]
    convert => {"DateSent" => "date"}
    convert => {"DateLastmodified" => "date"}
  }
  date {
    match => ["DateSent", "yyyy-MM-dd HH:mm:ss"]
    target => "DateSent"
    timezone => "UTC"
  }
  date {
    match => ["DateLastmodified", "yyyy-MM-dd HH:mm:ss"]
    target => "DateLastmodified"
    timezone => "UTC"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "http://localhost:9200"
    index => "ajit3"
    workers => 1
  }
  stdout {}
}
```

Kindly help me with this.

Thanks
Prosenjit

A timestamp created by the date filter will be recognized as a date by ES, so if your date filters work you'll be fine.

However, the mapping of a field can't be changed, so if the DateSent and DateLastmodified fields in your ajit3 index have at some point been mapped as something other than a date, you'll have to reindex the data. If you currently only have garbage data because you're testing Logstash, you can just delete the index and recreate it.
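For example, assuming the index is still named ajit3 as in the configuration above, you could inspect the current mapping and then delete the index with Elasticsearch's REST API (a sketch; adjust the host and port to your setup):

```sh
# Show how the fields are currently mapped -- look for the "type" of
# DateSent and DateLastmodified (text/keyword vs date)
curl -XGET "http://localhost:9200/ajit3/_mapping?pretty"

# Delete the index so the next Logstash run recreates it with a fresh mapping
curl -XDELETE "http://localhost:9200/ajit3"
```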


Hi Magnus,

As suggested, I have deleted the index and recreated it under a completely new name.
The DateSent field in the CSV has values like 2/3/2017 2:49:00 PM, going back to 2016.

However, in Kibana I can see DateSent as a string only:

DateSent.keyword string

In Kibana's Discover tab, when I choose the new index I get only one bar, for the date 2017-07-01. Can you please help me here?

Thanks
Prosenjit

Please show an example document. Copy/paste from the JSON tab in Kibana's Discover panel.

Hi,

Thanks for getting back to me. Please find the JSON as requested:

{

"_index": "king",
"_type": "logs",
"_id": "AV03SMA37nVTasmtbISt",
"_score": null,
"_source": {
"path": "C:\Users\ppramanick\Desktop\Projects\Accelarators\ES\bitcoin-data\mail2.csv",
"@timestamp": "2017-07-12T14:51:39.199Z",
"DateSent": "2/3/2017 19:09",
"DateLastmodified": "2/11/2017 12:02",
"Size": "122935",
"@version": "1",
"host": "PPRAMANIC1",
"Last Modifier Name": "Pramanick, Prosenjit",
"From": "Juman",
"message": "RE: URGENT: Your Action Required: IT Setup - Azure training,2/3/2017 19:09,2/11/2017 12:02,"Juman","Pramanick",122935\r",
"Subject": "RE: URGENT: Your Action Required: IT Setup - Azure training",
"tags": [
"_dateparsefailure"
]
},
"fields": {
"@timestamp": [
1499871099199
]
},
"highlight": {
"From": [
"Jumani, @kibana-highlighted-field@Romit@/kibana-highlighted-field@"
],
"message": [
"RE: URGENT: Your Action Required: IT Setup - Azure training,2/3/2017 19:09,2/11/2017 12:02,"Jumani, @kibana-highlighted-field@Romit@/kibana-highlighted-field@ (US - Mumbai)","Pramanick, Prosenjit",122935\r"
]
},
"sort": [
1499871099199
]
}

These events have the _dateparsefailure tag, so your date filters aren't working. Comparing your configuration's "yyyy-MM-dd HH:mm:ss" date pattern with the "2/11/2017 12:02" date in your events, the pattern clearly doesn't match.
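As a quick way to sanity-check a pattern outside Logstash, you can use Python's `strptime` as a rough stand-in for the Joda-style patterns the date filter uses (the `%`-directives below are the strptime equivalents I'm assuming for the sample value, not the filter's own syntax):

```python
from datetime import datetime

value = "2/11/2017 12:02"

# The configured Joda pattern "yyyy-MM-dd HH:mm:ss" corresponds roughly to
# "%Y-%m-%d %H:%M:%S" -- it expects a year-first, dash-separated value,
# so it cannot parse a month-first, slash-separated one:
try:
    datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
except ValueError:
    print("yyyy-MM-dd HH:mm:ss does not match")

# A month-first pattern without seconds does parse it:
parsed = datetime.strptime(value, "%m/%d/%Y %H:%M")
print(parsed)  # 2017-02-11 12:02:00
```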

Hi ,

I previously had my settings like this:
```
filter {
  csv {
    separator => ","
    columns => ["Subject","DateSent","DateLastmodified","From","Last Modifier Name","Size"]
    convert => {"DateSent" => "date"}
    convert => {"DateLastmodified" => "date"}
  }
  date {
    match => ["DateSent", "yyyy-MM-dd"]
    target => "@timestamp"
    timezone => "UTC"
  }
  date {
    match => ["DateLastmodified", "yyyy-MM-dd"]
    target => "DateLastmodified"
    timezone => "UTC"
  }
}
```

In the CSV the field has values like:
2/3/2017 2:49:00 PM
1/31/2017 7:17:00 PM
12/15/2016 9:49:00 AM

After that I changed the filter section as below, but in the JSON I still get the _dateparsefailure tag:
```
filter {
  csv {
    separator => ","
    columns => ["Subject","DateSent","DateLastmodified","From","Last Modifier Name","Size"]
    convert => {"DateSent" => "date"}
    convert => {"DateLastmodified" => "date"}
  }
  date {
    match => ["DateSent", "mm/dd/yyyy"]
    target => "@timestamp"
    timezone => "UTC"
  }
  date {
    match => ["DateLastmodified", "mm/dd/yyyy"]
    target => "DateLastmodified"
    timezone => "UTC"
  }
}
```

Kindly suggest a workaround for this.

Thanks
Prosenjit

Try this (also for DateLastmodified, obviously):

match => ["DateSent", "M/dd/yyyy hh:mm a", "MM/dd/yyyy hh:mm a"]

If it still doesn't work, look in your Logstash log. When the date filter fails it'll tell you why.
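Putting that together, a filter block along these lines might work. This is a sketch only: it assumes the raw CSV values look like 2/3/2017 2:49:00 PM and 2/3/2017 19:09, so each filter lists both a seconds-plus-AM/PM pattern and a 24-hour one, and the csv convert lines are dropped since the date filter itself produces a proper timestamp. Adjust the patterns to your actual data:

```
filter {
  csv {
    separator => ","
    columns => ["Subject","DateSent","DateLastmodified","From","Last Modifier Name","Size"]
  }
  date {
    # Joda-style tokens: M = month, m = minute, h = 1-12 clock hour,
    # H = 0-23 hour, a = AM/PM marker
    match => ["DateSent", "M/d/yyyy h:mm:ss a", "M/d/yyyy HH:mm"]
    target => "DateSent"
    timezone => "UTC"
  }
  date {
    match => ["DateLastmodified", "M/d/yyyy h:mm:ss a", "M/d/yyyy HH:mm"]
    target => "DateLastmodified"
    timezone => "UTC"
  }
}
```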

Hi,

I changed it as per your instructions, but I still get the _dateparsefailure tag, and when I look at the console logs I don't find anything:

C:\Users\pramanick\Desktop\Projects\Accelarators\ES\logstash-5.3.0\logstash-5.3.0\bin>logstash -f C:\Users\pramanick\Desktop\Projects\Accelarators\ES\bitcoin-data\btcmails.conf

Could not find log4j2 configuration at path /Users/pramanick/Desktop/Projects/Accelarators/ES/logstash-5.3.0/logstash-5.3.0/config/log4j2.properties. Using default config which logs to console
21:35:31.346 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
21:35:31.346 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
21:35:31.471 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x6345980c URL:http://localhost:9200/>}
21:35:31.487 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
21:35:31.767 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
21:35:31.767 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::HTTP:0x126a8a81 URL:http://localhost:9200>]}
21:35:31.783 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
21:35:32.788 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
21:35:33.273 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
2017-07-12T16:05:32.898Z PPRAMANIC1 Subject,DateSent,DateLastmodified,From ,Last Modifier Name,Size
2017-07-12T16:05:32.898Z PPRAMANIC1 RE: URGENT: Your Action Required: IT Setup - Azure training,2/3/2017 14:49,2/11/2017 12:02,"Jumani, Romit (US - Mumbai)","Pramanick, Prosenjit (US - Mumbai)",119426

Kindly suggest

Weird. Try increasing the loglevel with --verbose or even --debug.

Hi

I tried the below, but I still don't get any logs:

C:\Users\ppramanick\Desktop\Projects\Accelarators\ES\logstash-5.3.0\logstash-5.3.0\bin>logstash -V --debug C:\Users\ppramanick\Desktop\Projects\Accelarators\ES\bitcoin-data\btcmails.conf

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Could not find log4j2 configuration at path /Users/ppramanick/Desktop/Projects/Accelarators/ES/logstash-5.3.0/logstash-5.3.0/config/log4j2.properties. Using default config which logs to console
logstash 5.3.0
jruby 1.7.25 (1.9.3p551) 2016-04-13 867cb81 on Java HotSpot(TM) 64-Bit Server VM 1.8.0_131-b11 +jit [Windows 10-amd64]
java 1.8.0_131 (Oracle Corporation)
jvm Java HotSpot(TM) 64-Bit Server VM / 25.131-b11
gem addressable 2.3.8
gem atomic 1.1.99
gem avl_tree 1.2.1
gem awesome_print 1.7.0
gem jmespath 1.3.1
gem aws-sdk-core 2.3.22
gem aws-sdk-resources 2.3.22
gem aws-sdk 2.3.22
gem json 1.8.6
gem nokogiri 1.7.0.1
gem aws-sdk-v1 1.66.0

without showing any error

With --debug Logstash will produce lots of logs. I find it very hard to believe that you're only getting what you've posted above.

Hi ,

I in fact created a LOGS folder to check whether logs were getting created, but they are not. I wish I could share my screen with you to show the output of --debug. Kindly suggest the way forward.

If you have too much to paste into a post here, use a gist or pastebin or some other external service.

Hi,

Please find the full logs below:

C:\Users\ppramanick\Desktop\Projects\Accelarators\ES\logstash-5.3.0\logstash-5.3.0\bin>logstash -V --debug C:\Users\ppramanick\Desktop\Projects\Accelarators\ES\bitcoin-data\btcmails.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Could not find log4j2 configuration at path /Users/ppramanick/Desktop/Projects/Accelarators/ES/logstash-5.3.0/logstash-5.3.0/config/log4j2.properties. Using default config which logs to console
logstash 5.3.0
jruby 1.7.25 (1.9.3p551) 2016-04-13 867cb81 on Java HotSpot(TM) 64-Bit Server VM 1.8.0_131-b11 +jit [Windows 10-amd64]
java 1.8.0_131 (Oracle Corporation)
jvm Java HotSpot(TM) 64-Bit Server VM / 25.131-b11
gem addressable 2.3.8
gem atomic 1.1.99
gem avl_tree 1.2.1
gem awesome_print 1.7.0
gem jmespath 1.3.1
gem aws-sdk-core 2.3.22
gem aws-sdk-resources 2.3.22
gem aws-sdk 2.3.22
gem json 1.8.6
gem nokogiri 1.7.0.1
gem aws-sdk-v1 1.66.0
gem bindata 2.3.5
gem buftok 0.2.0
gem builder 3.2.3
gem bundler 1.9.10
gem cabin 0.9.0
gem numerizer 0.1.1
gem chronic_duration 0.10.6
gem cinch 2.3.3
gem clamp 0.6.5
gem coderay 1.1.1
gem concurrent-ruby 1.0.0
gem unf 0.1.4
gem domain_name 0.5.20170223
gem dotenv 2.2.0
gem edn 1.1.1
gem multi_json 1.12.1
gem elasticsearch-api 5.0.3
gem multipart-post 2.0.0
gem faraday 0.9.2
gem elasticsearch-transport 5.0.3
gem elasticsearch 5.0.3
gem equalizer 0.0.10
gem ffi 1.9.17
gem minitar 0.5.4
gem file-dependencies 0.1.6
gem filesize 0.0.4
gem filewatch 0.9.0
gem gelfd 0.2.0
gem gems 0.8.3
gem hitimes 1.2.4
gem http-cookie 1.0.3
gem http-form_data 1.0.1
gem http_parser.rb 0.6.0
gem http 0.9.9
gem i18n 0.6.9
gem insist 1.0.0
gem jar-dependencies 0.3.10
gem jls-grok 0.11.4
gem jls-lumberjack 0.0.26
gem jrjackson 0.4.2
gem jrmonitor 0.4.2
gem jruby-openssl 0.9.16
gem jruby-stdin-channel 0.2.0
gem ruby-maven-libs 3.3.9
gem ruby-maven 3.3.12
gem logstash-core-event-java 5.3.0
gem logstash-core-queue-jruby 5.3.0
gem method_source 0.8.2
gem slop 3.6.0
gem spoon 0.0.6
gem pry 0.10.4
gem puma 2.16.0
gem rubyzip 1.1.7
gem rack 1.6.5
gem rack-protection 1.5.3
gem tilt 2.0.6
gem sinatra 1.4.8
gem stud 0.0.22
gem thread_safe 0.3.6
gem polyglot 0.3.5
gem treetop 1.4.15
gem logstash-core 5.3.0
gem logstash-core-plugin-api 2.1.12
gem logstash-codec-cef 4.1.2
gem logstash-codec-collectd 3.0.3
gem logstash-codec-dots 3.0.2
gem logstash-codec-edn 3.0.2
gem logstash-codec-line 3.0.2
gem logstash-codec-edn_lines 3.0.2
gem logstash-codec-es_bulk 3.0.3
gem msgpack-jruby 1.4.1
gem logstash-codec-fluent 3.0.2
gem logstash-codec-graphite 3.0.2
gem logstash-codec-json 3.0.2
gem logstash-codec-json_lines 3.0.2
gem logstash-codec-msgpack 3.0.2
gem logstash-patterns-core 4.0.2
gem logstash-codec-multiline 3.0.3
gem logstash-codec-netflow 3.3.0
gem logstash-codec-plain 3.0.2
gem logstash-codec-rubydebug 3.0.2
gem logstash-filter-clone 3.0.2
gem logstash-filter-csv 3.0.2
gem logstash-filter-date 3.1.3
gem logstash-filter-dissect 1.0.8
gem lru_redux 1.1.0
gem logstash-filter-dns 3.0.3
gem logstash-filter-drop 3.0.2
gem murmurhash3 0.1.6
gem logstash-filter-fingerprint 3.0.2
gem logstash-filter-geoip 4.0.4
gem logstash-filter-grok 3.4.0
gem logstash-filter-json 3.0.2
gem logstash-filter-kv 4.0.0
gem metriks 0.9.9.7
gem logstash-filter-metrics 4.0.2
gem logstash-filter-mutate 3.1.3
gem logstash-filter-ruby 3.0.2
gem logstash-filter-sleep 3.0.3
gem logstash-filter-split 3.1.1
gem logstash-filter-syslog_pri 3.0.2
gem logstash-filter-throttle 4.0.1
gem logstash-filter-urldecode 3.0.2
gem user_agent_parser 2.3.0
gem logstash-filter-useragent 3.0.3
gem logstash-filter-uuid 3.0.2
gem xml-simple 1.1.5
gem logstash-filter-xml 4.0.2
gem logstash-input-beats 3.1.12
gem logstash-input-couchdb_changes 3.1.1
gem logstash-input-elasticsearch 4.0.2
gem logstash-input-exec 3.1.2
gem logstash-input-file 4.0.0
gem logstash-input-ganglia 3.1.0
gem logstash-input-gelf 3.0.2
gem logstash-input-generator 3.0.2
gem logstash-input-tcp 4.1.0
gem logstash-input-graphite 3.0.2
gem logstash-input-heartbeat 3.0.2
gem logstash-input-http 3.0.3
gem manticore 0.6.1
gem logstash-mixin-http_client 4.0.3
gem tzinfo 1.2.2
gem rufus-scheduler 3.0.9
gem logstash-input-http_poller 3.1.1
gem mime-types 2.6.2
gem mail 2.6.4
gem logstash-input-imap 3.0.2
gem logstash-input-irc 3.0.2
gem sequel 4.43.0
gem tzinfo-data 1.2017.1
gem logstash-input-jdbc 4.1.3
gem logstash-input-kafka 5.1.6
gem logstash-input-log4j 3.0.3
gem logstash-input-lumberjack 3.1.1
gem logstash-input-pipe 3.0.2
gem march_hare 2.22.0
gem logstash-mixin-rabbitmq_connection 4.2.2
gem logstash-input-rabbitmq 5.2.2
gem redis 3.3.3
gem logstash-input-redis 3.1.2
gem logstash-mixin-aws 4.2.1
gem logstash-input-s3 3.1.3
gem snmp 1.2.0
gem logstash-input-snmptrap 3.0.2
gem logstash-input-sqs 3.0.2
gem logstash-input-stdin 3.2.2
gem logstash-input-syslog 3.2.0
gem memoizable 0.4.2
gem naught 1.1.0
gem simple_oauth 0.3.1
gem twitter 5.15.0
gem logstash-input-twitter 3.0.3
gem logstash-input-udp 3.1.0
gem logstash-input-unix 3.0.3
gem xmpp4r 0.5
gem logstash-input-xmpp 3.1.2
gem logstash-output-cloudwatch 3.0.4
gem logstash-output-file 4.0.1
gem logstash-output-csv 3.0.3
gem logstash-output-elasticsearch 6.2.6
gem logstash-output-email 4.0.4
gem logstash-output-graphite 3.1.1
gem logstash-output-http 4.1.0
gem logstash-output-irc 3.0.2
gem logstash-output-kafka 5.1.5
gem logstash-output-nagios 3.0.2
gem logstash-output-null 3.0.2
gem logstash-output-pagerduty 3.0.3
gem logstash-output-pipe 3.0.2
gem logstash-output-rabbitmq 4.0.6
gem logstash-output-redis 3.0.3
gem logstash-output-s3 4.0.6
gem logstash-output-sns 4.0.2
gem logstash-output-sqs 4.0.1
gem statsd-ruby 1.2.0
gem logstash-output-statsd 3.1.1
gem logstash-output-stdout 3.1.0
gem logstash-output-tcp 4.0.0
gem logstash-output-udp 3.0.2
gem snappy-jars 1.1.0.1.2
gem snappy 0.0.12
gem webhdfs 0.8.0
gem logstash-output-webhdfs 3.0.2
gem logstash-output-xmpp 3.0.2
gem mustache 0.99.8
gem paquet 0.2.0
gem pleaserun 0.0.28
gem ruby-progressbar 1.8.1

Thanks
Prosenjit

Hi, could you get any details after seeing the logs? Kindly suggest the way forward.

Thanks
Prosenjit

Until I can see logs I don't have anything to add.

Hi,

I could only get this much of the log; I have pasted the full logs in my earlier post. Can you please suggest another way to get the logs, as I am running Elasticsearch, Kibana, and Logstash on Windows?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.