Overall objective: convert from Logstash to Filebeat with an ES ingest pipeline. I am looking for pointers on:
- Migrating my Logstash filters into an ES ingest pipeline (see the pipeline sketch after my config below).
- Migrating the grok patterns currently loaded from a directory. As I am using AWS Elasticsearch, it would be great if there were an API way of doing it. To confirm: this is about taking the patterns currently in the Logstash patterns directory and loading them into Elasticsearch so they can be used in an ES pipeline/filter. I cannot place them manually in a folder on Elasticsearch, so I was hoping ES has an API for adding patterns (see the pattern_definitions sketch at the end).
- Where should the multiline codec move to, Filebeat or ES, and how? (See the filebeat.yml sketch right after this list.)
- Where can I find examples of pipelines?
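For the multiline question, my current guess from the Filebeat docs is that the multiline settings on the log input replace the Logstash codec. This is a sketch of the filebeat.yml I have in mind (untested; note that Filebeat does not resolve grok pattern names, so ^%{YEAR} has to become a plain regex, and "my-app-logs" is just a placeholder name for the pipeline sketched further down):

filebeat.inputs:                        # filebeat.prospectors on older 5.x releases
- type: log
  paths:
    - /tmp/logs/**/*.log*
  encoding: latin1                      # ISO-8859-1, replaces the codec's charset option
  multiline.pattern: '^[0-9]{4}'        # plain regex instead of ^%{YEAR}
  multiline.negate: true
  multiline.match: after                # Logstash "what => previous" maps to "match: after"

output.elasticsearch:
  hosts: ["https://search-instance.us-east-1.es.amazonaws.com:443"]
  index: "myindex"
  pipeline: "my-app-logs"               # ingest pipeline to run on each event

Is that the right mapping, in particular negate + match for what => previous?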
Continuing the discussion from "Any good way to add custom grok pattern to elasticsearch for pipelines other then in the pipeline file?":
As already asked in that previous topic: is there a way to add custom patterns to Elasticsearch so that they can be used in a pipeline?
As I am experiencing problems with Logstash (runner error: An unexpected error occurred! {:error=>#<SocketError: initialize: name or service not known>), I am thinking about moving from Logstash to Filebeat and letting Elasticsearch do the filtering in an ingest pipeline.
My current Logstash configuration looks like this:
input {
  file {
    path => "/tmp/logs/**/*.log*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    #codec => plain {
    #  charset => "ISO-8859-1"
    #}
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{YEAR}"
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} \[%{LOGLEVEL:log-level}\] \[(?<app>[A-Za-z0-9.\s]*?)\] %{GREEDYDATA:message}" }
    patterns_dir => ["/usr/share/logstash/patterns", "/usr/share/logstash/patterns_extra"]
  }
  date {
    match => [ "timestamp", "yyyy/MM/dd HH:mm:ss" ]
    #match => [ "timestamp", "yyyy/MM/dd HH:mm:ss", "ISO8601" ]
  }
}

output {
  elasticsearch {
    hosts => ["https://search-instance.us-east-1.es.amazonaws.com:443"]
    index => "myindex"
  }
  stdout { codec => rubydebug }
}
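And this is what I imagine the equivalent ingest pipeline would look like, pieced together from the ingest node docs ("my-app-logs" is my own placeholder name, and I have not been able to verify this against AWS ES yet):

PUT _ingest/pipeline/my-app-logs
{
  "description": "Grok + date parsing moved over from the Logstash filter block",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{DATESTAMP:timestamp} \\[%{LOGLEVEL:log-level}\\] \\[(?<app>[A-Za-z0-9.\\s]*?)\\] %{GREEDYDATA:message}"]
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["yyyy/MM/dd HH:mm:ss"]
      }
    }
  ]
}

As far as I can tell, the date processor writes to @timestamp by default like the Logstash date filter does, but as written the grok would still fail on my yyyy/MM/dd timestamps, because the stock DATESTAMP pattern does not know about the DATE_YMD format. That is exactly the custom-patterns problem described below.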
### Inside /usr/share/logstash/patterns_extra, the date file (with the extra patterns) contains:
DATE_YMD %{YEAR}/%{MONTHNUM}/%{MONTHDAY}
DATE %{DATE_US}|%{DATE_EU}|%{DATE_YMD}
But I don't know how to set this up on AWS ES. As you can see above, the %{DATESTAMP:timestamp} match relies on these custom patterns: redefining DATE to include DATE_YMD is what lets DATESTAMP match yyyy/MM/dd timestamps. In my old setup I load these patterns from the patterns_dir directories, but on AWS I don't think there is a way to change files on the nodes manually, so it would be great if there were a way to do it through the API or directly in the pipeline. Any examples or pointers would be welcome; I tried following the documentation, but it is too generic and short on examples.
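The closest thing I have found so far is the grok processor's pattern_definitions parameter, which appears to let you define (or override) patterns inline in the pipeline itself instead of loading them from a directory. If that is the intended replacement for patterns_dir, I suppose my pipeline would become something like this (again untested):

PUT _ingest/pipeline/my-app-logs
{
  "description": "Same pipeline, with the custom date patterns inlined",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{DATESTAMP:timestamp} \\[%{LOGLEVEL:log-level}\\] \\[(?<app>[A-Za-z0-9.\\s]*?)\\] %{GREEDYDATA:message}"],
        "pattern_definitions": {
          "DATE_YMD": "%{YEAR}/%{MONTHNUM}/%{MONTHDAY}",
          "DATE": "%{DATE_US}|%{DATE_EU}|%{DATE_YMD}"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["yyyy/MM/dd HH:mm:ss"]
      }
    }
  ]
}

If this is right, I assume I could test it with the simulate API (POST _ingest/pipeline/my-app-logs/_simulate) before pointing Filebeat at it. Is pattern_definitions really per-processor only, or is there some way to register patterns once so that every pipeline can share them?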