Pass Metadata Field via HTTP Plugin

Hi,

I want to ingest data into Logstash with the following command:
curl -XPUT 'mylogstashurl.com' -d 'hello'

I get the data into Logstash, but unfortunately the @metadata field is always empty.
Is there any header I need to pass so I can override the @metadata field?

Thanks &BR,
Salko

The @metadata field is usually not included in the output, but for example for the stdout output with the rubydebug codec you can enable it: https://www.elastic.co/guide/en/logstash/7.1/event-dependent-configuration.html#metadata
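For example, a minimal sketch of such an output section (the rest of the pipeline stays as it is):

output {
  stdout {
    # print every event to stdout, including the normally hidden @metadata fields
    codec => rubydebug { metadata => true }
  }
}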

Thank you for the information, but I am struggling with another problem.

This is my Pipeline:

input {
  beats {
    port => 5044
  }
  http {
    port => 8080
  }
}
output {
  elasticsearch {
    hosts => ["{ELASTICSEARCH_HOST}:{ELASTICSEARCH_PORT}"]
    manage_template => false
    index => "%{[@metadata]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

As you can see, I am writing the data to the index "%{[@metadata]}-%{+YYYY.MM.dd}". With the command curl -XPUT 'mylogstashurl.com' -d 'hello', the index the data gets written to always ends up as something like [{}-2019.05.21].

When data is sent to Logstash via Beats, the index name works totally fine.

If you're still in a testing stage, try adding a stdout output like this:

 stdout { codec => rubydebug { metadata => true } }

This will output the document to stdout, and you'll see what's contained in the @metadata field. Most likely, your @metadata doesn't get populated the way you expect it to when using the http input, resulting in an unexpected Elasticsearch index name.
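If you want to drive the index name from an HTTP request, one option is to populate @metadata yourself in a filter. This is only a sketch and makes a few assumptions: that the http input puts the request headers into a [headers] field (the 7.x default), and that your client sends a custom header such as X-Index-Name, which would then show up lowercased with underscores as [headers][x_index_name]:

filter {
  # only touch events coming from the http input
  # (assumes beats events carry no [headers] field)
  if [headers] {
    mutate {
      # copy the custom request header into @metadata so the
      # elasticsearch output can use it in the index name
      add_field => { "[@metadata][index]" => "%{[headers][x_index_name]}" }
    }
  }
}

You could then send the header with something like curl -XPUT 'mylogstashurl.com:8080' -H 'X-Index-Name: myindex' -d 'hello' and reference %{[@metadata][index]} in the elasticsearch output's index setting. Check the rubydebug output first to confirm how the headers actually appear on your events.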
