I am trying to push data that I collect from various sensors into Elasticsearch.
I am able to push the data directly into Elasticsearch indices using the POST method.
I am now attempting to push the data via Logstash.
For this I have created the following Logstash config:
input {
  http {
    port => 9600
    response_headers => {
      "Access-Control-Allow-Origin" => "*"
      "Content-Type" => "text/json"
      "Access-Control-Allow-Headers" => "Origin, X-Requested-With, Content-Type, Accept"
    }
  }
}

output {
  elasticsearch {
    index => "panda-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
When I post data using Postman, I can see that the data is pushed, but the previous data is lost and the doc count stays at one. How do I fix this so that all the data is kept?
The output is:
{
  "took": 2,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 1,
    "hits": [
      {
        "_index": "panda-2018.11.02",
        "_type": "%{[@metadata][_type]}",
        "_id": "%{[@metadata][_id]}",
        "_score": 1,
        "_source": {
          "@version": "1",
          "@timestamp": "2018-11-02T14:23:42.469Z",
          "message": "",
          "headers": {
            "cache_control": "no-cache",
            "http_host": "localhost:9600",
            "request_path": "/",
            "http_accept": "/",
            "postman_token": "61020876-8a8f-428b-8d99-af0ae669574a",
            "connection": "keep-alive",
            "accept_encoding": "gzip, deflate",
            "request_method": "POST",
            "test": "duper",
            "content_length": "0",
            "http_user_agent": "PostmanRuntime/7.2.0",
            "http_version": "HTTP/1.1"
          },
          "host": "0:0:0:0:0:0:0:1"
        }
      }
    ]
  }
}
Note: I have not created the Elasticsearch index myself; I am letting Logstash create it.
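One thing I notice in the hit above is that `_type` and `_id` contain the literal strings `%{[@metadata][_type]}` and `%{[@metadata][_id]}` rather than resolved values. As I understand it, Logstash's sprintf syntax leaves a `%{...}` reference verbatim when the referenced field does not exist on the event. Here is a minimal Python sketch of that behavior (the `sprintf` helper is my own simplified simulation, not actual Logstash code):

```python
import re

def sprintf(template, event):
    """Resolve %{[path][to][field]} references against an event dict.
    References to missing fields are kept verbatim, mimicking what
    Logstash's sprintf format appears to do."""
    def resolve(match):
        path = re.findall(r"\[([^\]]+)\]", match.group(1))
        value = event
        for key in path:
            if isinstance(value, dict) and key in value:
                value = value[key]
            else:
                return match.group(0)  # field missing: keep the literal reference
        return str(value)
    return re.sub(r"%\{(\[[^}]+\])\}", resolve, template)

# Two different events, neither of which carries [@metadata][_id]:
event_a = {"message": "sensor reading 1"}
event_b = {"message": "sensor reading 2"}

id_a = sprintf("%{[@metadata][_id]}", event_a)
id_b = sprintf("%{[@metadata][_id]}", event_b)

# Both events resolve to the same literal _id string, so the second
# index request would overwrite the first document.
print(id_a)          # %{[@metadata][_id]}
print(id_a == id_b)  # True
```

This matches the symptom in the search output: every event ends up with the same `_id`, so the doc count never grows past one.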