S3 CSV data does not reach Elasticsearch

Hi community,

I am trying to send data through S3 > Logstash > AWS ES, so I use the s3 input plugin > csv filter > amazon_es output plugin.
Logstash appears to download the CSV files from S3, and they reach the csv filter as well, but I don't see the data in Kibana.
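My pipeline looks roughly like this (the bucket, prefix, and columns match the debug log below; the region, ES endpoint, and index name are placeholders, not my real values):

input {
  s3 {
    bucket => "testing3172019"
    prefix => "test/"
    region => "us-east-1"        # placeholder region
    type   => "s3"
  }
}

filter {
  csv {
    separator => ","
    columns   => ["Date", "Product Categories", "Geo", "Revenue"]
  }
}

output {
  amazon_es {
    hosts  => ["my-domain.us-east-1.es.amazonaws.com"]   # placeholder endpoint
    region => "us-east-1"                                # placeholder region
    index  => "s3-csv"                                   # placeholder index name
  }
}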

Here is the Logstash debug log:

[2019-07-31T15:22:32,199][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.125Z, "message"=>"?Date,Product Categories,Geo,Revenue\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,202][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.125Z %{host} ?Date,Product Categories,Geo,Revenue
}
[2019-07-31T15:22:32,208][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.125Z %{host} ?Date,Product Categories,Geo,Revenue
}
[2019-07-31T15:22:32,210][DEBUG][logstash.pipeline ] output received {"event"=>{"Geo"=>"Geo", "@timestamp"=>2019-07-31T07:22:32.125Z, "@version"=>"1", "Product Categories"=>"Product Categories", "Revenue"=>"Revenue", "message"=>"?Date,Product Categories,Geo,Revenue\r\n", "type"=>"s3", "Date"=>"?Date"}}
[2019-07-31T15:22:32,229][DEBUG][logstash.inputs.s3 ] S3 input processing {:bucket=>"testing3172019", :key=>"test/test2.csv"}
[2019-07-31T15:22:32,232][DEBUG][logstash.inputs.s3 ] S3 input: Download remote file {:remote_key=>"test/test2.csv", :local_filename=>"C:/Users/SHWESI~1/AppData/Local/Temp/logstash/test2.csv"}
[2019-07-31T15:22:32,238][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.165Z, "message"=>"1/1/17,Digital,Turkey,1738.04848\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,239][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.165Z %{host} 1/1/17,Digital,Turkey,1738.04848
}
[2019-07-31T15:22:32,241][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.165Z %{host} 1/1/17,Digital,Turkey,1738.04848
}
[2019-07-31T15:22:32,242][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.195Z, "message"=>"1/1/17,Movies,Turkey,3359.74848\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,243][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Movies,Turkey,3359.74848
}
[2019-07-31T15:22:32,244][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Movies,Turkey,3359.74848
}
[2019-07-31T15:22:32,249][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.195Z, "message"=>"1/1/17,Industrial,Turkey,3553.54288\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,250][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Industrial,Turkey,3553.54288
}
[2019-07-31T15:22:32,252][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Industrial,Turkey,3553.54288
}
[2019-07-31T15:22:32,253][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.195Z, "message"=>"1/1/17,Games,Turkey,257.30696\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,253][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Games,Turkey,257.30696
}
[2019-07-31T15:22:32,255][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.195Z %{host} 1/1/17,Games,Turkey,257.30696
}
[2019-07-31T15:22:32,256][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.196Z, "message"=>"1/1/17,Office Supplies,Turkey,7479.57508\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,257][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.196Z %{host} 1/1/17,Office Supplies,Turkey,7479.57508
}
[2019-07-31T15:22:32,263][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.196Z %{host} 1/1/17,Office Supplies,Turkey,7479.57508
}
[2019-07-31T15:22:32,265][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.197Z, "message"=>"1/1/17,Computers,Turkey,787.6508\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,265][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.197Z %{host} 1/1/17,Computers,Turkey,787.6508
}
[2019-07-31T15:22:32,267][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.197Z %{host} 1/1/17,Computers,Turkey,787.6508
}
[2019-07-31T15:22:32,268][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.198Z, "message"=>"1/1/17,Books,Turkey,1091.6226\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,268][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Books,Turkey,1091.6226
}
[2019-07-31T15:22:32,272][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Books,Turkey,1091.6226
}
[2019-07-31T15:22:32,274][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.198Z, "message"=>"1/1/17,Health,Turkey,540.40948\r\n", "type"=>"s3"}}
[2019-07-31T15:22:32,276][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Health,Turkey,540.40948
}
[2019-07-31T15:22:32,278][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Health,Turkey,540.40948
}
[2019-07-31T15:22:32,279][DEBUG][logstash.pipeline ] filter received {"event"=>{"@version"=>"1", "@timestamp"=>2019-07-31T07:22:32.198Z, "message"=>"1/1/17,Outdoors,Turkey,548.13056", "type"=>"s3"}}
[2019-07-31T15:22:32,281][DEBUG][logstash.filters.csv ] Running csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Outdoors,Turkey,548.13056}
[2019-07-31T15:22:32,283][DEBUG][logstash.filters.csv ] Event after csv filter {:event=>2019-07-31T07:22:32.198Z %{host} 1/1/17,Outdoors,Turkey,548.13056}
[2019-07-31T15:22:32,286][DEBUG][logstash.pipeline ] output received {"event"=>{"Geo"=>"Turkey", "@timestamp"=>2019-07-31T07:22:32.165Z, "@version"=>"1", "Product Categories"=>"Digital", "Revenue"=>"1738.04848", "message"=>"1/1/17,Digital,Turkey,1738.04848\r\n", "type"=>"s3", "Date"=>"1/1/17"}}

I thought I would be able to see that data in Kibana, because the log says the input, filter, and output all received it.
But I cannot see any data in Kibana.
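Is there anything else I can check? For example, I believe listing the indices with GET _cat/indices?v (e.g. from the Kibana Dev Tools console) should show whether anything was indexed at all.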

Any help is appreciated.

Can you update your configuration file?

Oh, thank you Jeeth!

It works now. I changed type => "csv" in the s3 input.
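For anyone who hits the same problem, the change was only in the input section; a sketch (all other settings unchanged):

input {
  s3 {
    bucket => "testing3172019"
    prefix => "test/"
    type   => "csv"    # changed from "s3"
  }
}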

