Hello,
I can see that the date fields are being populated incorrectly in Elasticsearch. I am generating a CSV file and inserting the data through a Logstash conf file. Below is a screenshot where I have highlighted the date columns that are inserted incorrectly. For example, in the CSV the vendor_start_dt, vendor_end_dt and event_tm columns are 01-feb-2019, but when I view the data in Kibana, they show as 01-Jan-2019. This is the case for all the values.
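For context, a conf file for this kind of CSV load looks roughly like the sketch below; the file path, separator and column list are placeholders rather than my exact settings, and only the date columns are shown:

input {
  file {
    path => "/tmp/vendor_events.csv"        # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["vendor_start_dt", "vendor_end_dt", "event_tm"]   # date columns only; the real file has more
  }
}

output {
  elasticsearch {
    hosts => ["addd.xxxx.local:9200"]
    index => "rr_log_gen"
  }
}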
What does your Elasticsearch mapping look like for those fields?
GET addd.xxxx.local:9200/rr_log_gen/_mapping/_doc
I've previously seen issues where a user configured a date field with a format that used D (day of year) instead of d (day of month), which causes a similar effect.
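For illustration only, a mapping like the sketch below would reproduce exactly that symptom. The index and field names are taken from this thread, but the format string is an assumption about what your mapping might contain, not something I can see from here:

PUT rr_log_gen
{
  "mappings": {
    "_doc": {
      "properties": {
        "event_tm": {
          "type": "date",
          "format": "yyyy-MM-DD HH:mm:ss"
        }
      }
    }
  }
}

With that format, DD is day of year, so a value like 2019-02-01 00:00:00 can resolve to the first day of 2019, i.e. 01-Jan-2019. The fix would be yyyy-MM-dd HH:mm:ss (lowercase dd).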
I have attached the screenshot in my initial post. I am using a shell script to generate the date columns with the format +%Y-%m-%d %T, and if I view the CSV file, it shows the data in m/d/yyyy hh:mi:ss format.
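For reference, that format produces raw values like this (the timestamp below is only an example):

date "+%Y-%m-%d %T"
# 2019-02-01 14:30:00   (%T is equivalent to %H:%M:%S)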
The dates in your screenshot are not in the format you say you have specified. Sometimes visual editors like Excel present data differently from the raw bytes. Please paste the raw bytes from your CSV (perhaps open it with Notepad or some other basic editor that doesn't manipulate the presentation).