Logstash to ES with postgres

Hi there,

I'm trying to set up an ELK stack to get a better overview of some data.
I have two flash code scanners which fill up a PostgreSQL database (date, hour, ID, etc.).

So I installed ELK following a random tutorial, but I'm stuck on sending data to ES with Logstash:

First, I created an index in Elasticsearch:
curl -X PUT http://localhost:9200/production -d '{"index.mapper.dynamic" : true}'

curl -X PUT http://localhost:9200/production/_mapping/prod -d '
{
  "prod": {
    "properties": {
      "idtraca_datas": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "date_code": {
        "index" : "analyzed",
        "type" : "date"
      },
      "heure_data": {
        "index" : "analyzed",
        "type" : "date"
      },
      "code_produit": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "num_lot": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "code_lu": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "id_code": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "ligne": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "orientation": {
        "index" : "not_analyzed",
        "type" : "string"
      }
    }
  }
}'

Then, with this Logstash conf:

# file: ES_logstash.conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/production"
    jdbc_user => "test"
    jdbc_password => ""
    jdbc_validate_connection => true
    jdbc_driver_library => "/home/test/Documents/postgresql-9.4.1208.jre7.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => 'SELECT * from "Traca_Datas" limit 1'
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    document_id => "%{uid}"
    index => "production"
    document_type => "prod"
  }
}

I got this error message:

{:timestamp=>"2016-03-22T09:16:13.539000+0100", :message=>"Failed action. ", :status=>400, :action=>["index", {:_id=>"%{uid}", :_index=>"production", :_type=>"prod", :_routing=>nil}, #<LogStash::Event:0x65783cd8 @metadata_accessors=#<LogStash::Util::Accessors:0x3653c1ed @store={}, @lut={}>, @cancelled=false, @data={"idtraca_datas"=>15, "date_code"=>#<Date: 2013-07-29 ((2456503j,0s,0n),+0s,2299161j)>, "heure_data"=>"2016-03-22T17:22:23.020Z", "id_code"=>"Code 128", "code_lu"=>"00082", "ligne"=>"L1", "orientation"=>169, "code_produit"=>"74123", "num_lot"=>"74123", "@version"=>"1", "@timestamp"=>"2016-03-22T08:16:10.687Z"}, @metadata={}, @accessors=#<LogStash::Util::Accessors:0x172c38ef @store={"idtraca_datas"=>15, "date_code"=>#<Date: 2013-07-29 ((2456503j,0s,0n),+0s,2299161j)>, "heure_data"=>"2016-03-22T17:22:23.020Z", "id_code"=>"Code 128", "code_lu"=>"00082", "ligne"=>"L1", "orientation"=>169, "code_produit"=>"74123", "num_lot"=>"74123", "@version"=>"1", "@timestamp"=>"2016-03-22T08:16:10.687Z"}, @lut={"uid"=>[{"idtraca_datas"=>15, "date_code"=>#<Date: 2013-07-29 ((2456503j,0s,0n),+0s,2299161j)>, "heure_data"=>"2016-03-22T17:22:23.020Z", "id_code"=>"Code 128", "code_lu"=>"00082", "ligne"=>"L1", "orientation"=>169, "code_produit"=>"74123", "num_lot"=>"74123", "@version"=>"1", "@timestamp"=>"2016-03-22T08:16:10.687Z"}, "uid"]}>>], :response=>{"index"=>{"_index"=>"production", "_type"=>"prod", "_id"=>"%{uid}", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date_code]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2013-07-29 00:00:00 +0200" is malformed at " 00:00:00 +0200""}}}}, :level=>:warn}

I have an invalid format on the date, but how can I resolve it? And if you spot any other mistakes, thanks for telling me.

Thanks for reading and for your help.

Josselin

I'd put a date filter in to normalise it and stop this; take a look at https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html
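
Something like this minimal sketch, assuming date_code reaches Logstash as "2013-07-29 00:00:00 +0200" the way it appears in your error (the pattern below is a guess derived from that single value):

filter {
  date {
    # parse the JDBC date string and write it back as a proper timestamp
    match => ["date_code", "yyyy-MM-dd HH:mm:ss Z"]
    target => "date_code"
  }
}

Note that without target, the date filter would overwrite @timestamp rather than date_code.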

OK, now my transfer from Logstash to ES works, but I only get the last entry of my database.
In the statement of logstash.conf I select the entire table (select * from "Traca_Datas"), but I only see the last row of the table in Kibana.

idtraca_datas: 251186
date_code: March 7th 2016, 00:00:00.000
heure_data: March 22nd 2016, 12:45:37.330
id_code: Code 128
code_lu: 115940875492
ligne: L1
orientation: 179
code_produit: 79650u2
num_lot: 594000
@version: 1
@timestamp: March 22nd 2016, 11:07:37.727
_id: %{uid}
_type: prod
_index: production
_score: 1

It looks like Logstash cannot find a uid field to pick the ID from, and therefore ends up setting the format string itself as the ID on all records, causing each record to overwrite the previous one. Check your output and verify that you have successfully parsed this field and that it is all lower case.
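
For example, assuming idtraca_datas is the primary key of your table (it shows up in the event you posted), something like this should work:

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "production"
    document_type => "prod"
    # use a real column value so every row gets its own document ID
    document_id => "%{idtraca_datas}"
  }
}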

It is usually easier to debug these types of issues if you send the output to stdout using the rubydebug codec until you have verified that the document has the structure you expect.
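
For example, temporarily swap the elasticsearch output for this:

output {
  # print each event to the console in a readable Ruby-hash form
  stdout { codec => rubydebug }
}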

Hi there,

Sorry, but I've got a new problem.

I've still got my Postgres database with this inside:

This is my ES index:

curl -X PUT http://localhost:9200/ligneserac/_mapping/serac -d '
{
  "serac": {
    "properties": {
      "id": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "scan_date": {
        "type" : "date",
        "format" : "yyyy-MM-dd'T'HH:mm:ssZZ"
      },
      "scan_cab": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "line_id": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "nom_product": {
        "index" : "not_analyzed",
        "type" : "string"
      },
      "num_lot": {
        "index" : "not_analyzed",
        "type" : "string"
      }
    }
  }
}'

The problem is:
when I don't add a date format, my data doesn't seem to be transferred to ES
when I use this format: "yyyy-MM-dd'T'HH:mm:ssZZ", I have this error:

When I use the same format but without the 'T', "yyyy-MM-dd HH:mm:ssZZ", I get this error on each entry of my database:

The only solution I found is to create the index without the time-based event, but I need it, so :confused:
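
A sketch of how the earlier date filter advice could apply here too, assuming scan_date comes out of JDBC as a string like "2016-03-22 12:45:37 +0100" (that source pattern is an assumption, not something shown above):

filter {
  date {
    # normalise scan_date to a proper timestamp before indexing
    match => ["scan_date", "yyyy-MM-dd HH:mm:ss Z"]
    target => "scan_date"
  }
}

The filter rewrites scan_date as an ISO8601 timestamp, so the mapping could then omit the custom "format" entirely and rely on the default date parsing.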

Thanks for reading,

Josselin