Push data into Elasticsearch indices from Logstash via HTTP

I am trying to push data that I am collecting from various sensors into Elasticsearch.

I am able to push my data directly into Elasticsearch indices using a POST request.
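
For reference, a direct request of that kind looks roughly like the one below (the host, index name, and payload here are just an example, not my actual setup):

POST http://localhost:9200/panda-2018.11.02/doc
Content-Type: application/json

{
  "sensor_id": "temp-01",
  "reading": 23.4
}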

I am now attempting to push the data via Logstash.

For this I have created a Logstash conf as follows:

input {
  http {
    port => 9600
    response_headers => {
      "Access-Control-Allow-Origin" => "*"
      "Content-Type" => "text/json"
      "Access-Control-Allow-Headers" => "Origin, X-Requested-With, Content-Type, Accept"
    }
  }
}
output {
  elasticsearch {
    index => "panda-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
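
Posting to this pipeline from Postman then amounts to a plain HTTP request like the one below (the JSON body is just a sample sensor payload; with a Content-Type of application/json, the http input should parse the body into event fields):

POST http://localhost:9600/
Content-Type: application/json

{
  "sensor_id": "temp-01",
  "reading": 23.4
}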

When I post data using Postman, I can see that the data is being pushed, but the previous data is lost and the doc count stays at one. How can I fix this so that all the data is stored?

The output is:

{
  "took": 2,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 1,
    "max_score": 1,
    "hits": [
      {
        "_index": "panda-2018.11.02",
        "_type": "%{[@metadata][_type]}",
        "_id": "%{[@metadata][_id]}",
        "_score": 1,
        "_source": {
          "@version": "1",
          "@timestamp": "2018-11-02T14:23:42.469Z",
          "message": "",
          "headers": {
            "cache_control": "no-cache",
            "http_host": "localhost:9600",
            "request_path": "/",
            "http_accept": "*/*",
            "postman_token": "61020876-8a8f-428b-8d99-af0ae669574a",
            "connection": "keep-alive",
            "accept_encoding": "gzip, deflate",
            "request_method": "POST",
            "test": "duper",
            "content_length": "0",
            "http_user_agent": "PostmanRuntime/7.2.0",
            "http_version": "HTTP/1.1"
          },
          "host": "0:0:0:0:0:0:0:1"
        }
      }
    ]
  }
}

Note: I have not created the Elasticsearch index myself; I am letting Logstash create it.

As you do not seem to set the [@metadata][_type] and [@metadata][_id] fields anywhere, the sprintf references in your output are never resolved and end up as literal strings, so every document is indexed with the same ID and each new event overwrites the previous one. Remove those two lines and the issue should be resolved.
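
If Elasticsearch is left to generate its own document IDs, a minimal output block would look something like this (the hosts line is an assumption for a local setup):

output {
  elasticsearch {
    # assuming a local Elasticsearch; adjust hosts as needed
    hosts => ["localhost:9200"]
    # daily index; document IDs are auto-generated, so nothing gets overwritten
    index => "panda-%{+YYYY.MM.dd}"
  }
}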

If I remove those lines, I get a warning and see no data at all!

[WARN ] 2018-11-02 20:13:25.882 [Ruby-0-Thread-8@[main]>worker2: :1] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"kenwin-2018.11.02", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x1ff44dcc>], :response=>{"index"=>{"_index"=>"kenwin-2018.11.02", "_type"=>"doc", "_id"=>"YvXg1GYBjBuPNLEf-h8I", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [kenwin-2018.11.02] as the final mapping would have more than 1 type: [%{[@metadata][_type]}, doc]"}}}}

That is because your index already contains documents created with the literal type %{[@metadata][_type]}, and Elasticsearch rejects adding a second mapping type (doc) to the same index. I would recommend deleting the index and letting Logstash create a new one.
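
Deleting the broken index is a single REST call against Elasticsearch, along these lines (adjust the index name to match the one in your logs):

DELETE http://localhost:9200/kenwin-2018.11.02

The next event Logstash indexes should then recreate the index with a clean mapping.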

It worked after deleting the index.
