Hi,
I'm trying to import basic data from an MSSQL table with two columns.
Using the built-in logstash-input-jdbc plugin I have managed to connect to the database and run a simple SELECT query that fetches all the data from the table.
But whenever I send that data to an output with Logstash, all I ever get in Elasticsearch 5 is the header (mapping) information.
I have tried many output options, but none of them gives me a result where I can see all the result rows stored in the Elasticsearch index.
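For completeness, the input part of the same conf file looks roughly like this (the driver path, connection string, credentials and table name below are placeholders, not my real values):
input {
  jdbc {
    # MSSQL JDBC driver and connection details (placeholders)
    jdbc_driver_library => "/path/to/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=testdb"
    jdbc_user => "user"
    jdbc_password => "password"
    # Simple SELECT that fetches both columns from the table
    statement => "SELECT id_status, status_name FROM dbo.status"
  }
}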
I have tried to run a conf file where the output part looks like this:
output {
  elasticsearch {
    action => "index"
    index => "testdb"
  }
  stdout { codec => json_lines }
}
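I start the pipeline with the usual command (the conf file name here is just what I call it locally):
bin/logstash -f mssql_to_es.conf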
When I look at the index via Kibana Dev Tools with GET /testdb, the result is like this:
{
  "testdb": {
    "aliases": {},
    "mappings": {
      "logs": {
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "id_status": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "status_name": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1480551071623",
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "1TkCev0STdKbYCqgZ6tdzA",
        "version": {
          "created": "5000099"
        },
        "provided_name": "testdb"
      }
    }
  }
}
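So the mapping for id_status and status_name is created, but I don't see any documents. (For reference, a typical way to check is a plain search request in Dev Tools:)
GET /testdb/_search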
The json_lines output on stdout, on the other hand, shows exactly what I actually need stored inside Elasticsearch:
[2016-12-01T01:11:11,604][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
{"id_status":"0001","@timestamp":"2016-12-01T00:11:10.768Z","status_name":"CardError","@version":"1"}
{"id_status":"0002","@timestamp":"2016-12-01T00:11:11.312Z","status_name":"CardDbError","@version":"1"}
{"id_status":"0003","@timestamp":"2016-12-01T00:11:11.318Z","status_name":"CardOk","@version":"1"}
What is the right way to get all of these output rows stored in the Elasticsearch index?