Write CSV to Elasticsearch using Logstash

Hello All,

I want to write a CSV file to Elasticsearch. After running ./logstash -f logstash.conf (with Elasticsearch and Kibana up), I can see the index is created properly, but it only contains ten records. It doesn't import all the data into ES. Can anyone tell me where the problem is and how to fix it? Thanks so much.

Here's my logstash.conf:

input {
  file {
    path => "/Users/simon/Desktop/simon.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    # Date,Open,High,Low,Close,Volume (BTC),Volume (Currency),Weighted Price
    columns => ["Date","Open","High","Low","Close","Volume (BTC)","Volume (Currency)","Weighted Price"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "bitcoin-prices"
  }
  stdout {}
}

Here's what one record looks like in Kibana when you check the index:
{
  "_index": "bitcoin-prices",
  "_type": "logs",
  "_id": "AVpt-S3DeuqB-XFNJ-4m",
  "_score": 1,
  "_source": {
    "High": "139.0",
    "Volume (BTC)": "6405.77673665",
    "Volume (Currency)": "878670.987072",
    "Weighted Price": "137.168531342",
    "message": "2013-10-07,137.01002,139.0,135.12,135.80001,6405.77673665,878670.987072,137.168531342",
    "Date": "2013-10-07",
    "tags": [],
    "Open": "137.01002",
    "path": "/Users/simon/Desktop/simon.csv",
    "@timestamp": "2017-02-24T02:35:30.624Z",
    "Low": "135.12",
    "@version": "1",
    "host": "simons-MacBook-Air.local",
    "Close": "135.80001"
  }
}

Here's the debug information from ES:
[DEBUG][o.e.a.b.TransportShardBulkAction] [ZUWG2gq] [bitcoin-prices][2] failed to execute bulk item (index) index {[bitcoin-prices][logs][AVpt-S3GeuqB-XFNJ-6L], source[{"High":"High","Volume (BTC)":"Volume (BTC)","Volume (Currency)":"Volume (Currency)","Weighted Price":"Weighted Price","message":"Date,Open,High,Low,Close,Volume (BTC),Volume (Currency),Weighted Price","Date":"Date","tags":[],"Open":"Open","path":"/Users/simon/Desktop/simon.csv","@timestamp":"2017-02-24T02:35:29.743Z","Low":"Low","@version":"1","host":"simons-MacBook-Air.local","Close":"Close"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [Date]

Thanks again

That error is likely because it's trying to treat the header line as a record.
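One way to skip the header row is a conditional drop in the filter block. This is just a sketch, and it assumes the csv filter parses the header line like any other row, so its "Date" column ends up containing the literal string "Date":

```
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume (BTC)","Volume (Currency)","Weighted Price"]
  }
  # Header row gets parsed like data, so [Date] == "Date" identifies it
  if [Date] == "Date" {
    drop {}
  }
}
```

With the header dropped, Elasticsearch never sees the event whose "Date" value is the unparseable string "Date".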

How many documents short are you?

What do you mean by "documents short"? Thx

I got this error :slight_smile:

Caused by: org.elasticsearch.ElasticsearchParseException: failed to parse date field [2015-05-01] with format [date_time]
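That error suggests the existing index mapping expects a full date_time value (e.g. 2015-05-01T00:00:00) while the CSV only supplies a day. A hedged sketch of one possible workaround, using a Logstash date filter to parse the "Date" column (field name taken from the config earlier in this thread) into a proper timestamp:

```
filter {
  # Parse the day-only "Date" column (e.g. 2015-05-01) into @timestamp,
  # so the event carries a fully qualified timestamp
  date {
    match => ["Date", "yyyy-MM-dd"]
    target => "@timestamp"
  }
}
```

You may also need to delete the existing bitcoin-prices index (or define an explicit mapping for the Date field) before re-importing, since a mapping created from earlier bad documents won't change on its own.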

You mentioned not all the data is in ES, so how much is missing?

1320 rows, with 1310 missing. Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.