Custom Index Name - Logstash

Hello!

I see that Logstash is creating a new index on the Elasticsearch server every day, named 'logstash-yyyy.mm.dd'.

However, I don't want a new index like this every day; I just want to give a custom name to a single index where each day's data gets stored.

How do I do this?


Check out https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-index
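For reference, a minimal elasticsearch output with a fixed index name looks like the sketch below. The name `myapp-logs` is just a placeholder; note that on Elasticsearch 5.x and later, index names must be lowercase.

```text
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # All events go into one fixed index instead of logstash-yyyy.mm.dd.
    # "myapp-logs" is a placeholder name.
    index => "myapp-logs"
  }
}
```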


I have modified my config to the following, but I am still getting the same default date-based name for the index. Any suggestions?

input {
  file {
    path => "C:/ELK/logstash-2.2.2/sample.log"
    type => "sample"
    start_position => "beginning"
    sincedb_path => "C:/ELK/logstash-2.2.2/dbfilea"
  }
}

filter {
  grok {
    match => { "message" => "%{DAY:day}\s%{MONTH:month}\s%{MONTHDAY:monthday}\s%{YEAR:year}\s%{TIME:time}\sGMT(?<tz_offset>[+-]\d\d\d\d)\s\((?<tz_name>[^)]+)\)\s%{NUMBER:temp}\s%{NUMBER:light}\s%{GREEDYDATA:room}" }
  }
}

output {
  elasticsearch {
    index => "TempLightLogs"
  }
  stdout {}
}

That should not happen with the configuration above. Please check again. This configuration change obviously only applies to new data.

It worked fine.

However, I have now lost the .raw fields that I need to build my visualizations the way I want. Any idea how I can get them back?

You need to set up an index template to manage those. Take a look at the _template API endpoint and copy the stock logstash template over to one whose pattern matches your index name.
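Alternatively, you can have Logstash install the template itself via the elasticsearch output's template options. A sketch, assuming you have exported the stock logstash template to a local JSON file and edited its `template` pattern to match your index (the index name and file path here are placeholders):

```text
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "templightlogs"
    # Install an edited copy of the logstash template for this index.
    manage_template => true
    template => "/path/to/my_template.json"   # placeholder path
    template_name => "templightlogs"          # name the template is stored under
  }
}
```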

I am trying to define a custom index through my Logstash config, but it doesn't work.

input {
  file {
    path => "Path" # hiding it for confidentiality
    type => "csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Title","Impact","Test Outcome","Recommendation","References","Affected asset","Risk Rating","Attack vector","Attack complexity","Privileges required","User interaction","Scope","Confidentiality","Integrity","Availability","Exploit code maturity","Remediation level","Report confidence","Confidentiality requirement","Integrity requirement","Availability requirement","Modified attack vector","Modified attack complexity","Modified privileges required","Modified user interaction","Modified scope","Modified confidentiality","Modified integrity","Modified availability","Design Issue","Configuration issue","Coding Issue"]
    separator => ","
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "xx"
    document_type => "Assessment"
  }
  stdout { codec => rubydebug }
}

Can anyone suggest what the issue with this config could be?

Please start a new thread for your question and describe exactly what is not working.