Adding patterns in ES on AWS, pipeline definition through API

Overall objective: converting from Logstash to Filebeat with an ES pipeline. I am looking for pointers on:

  1. Migrating Logstash filters into an ES pipeline.
  2. Migrating the patterns currently loaded from a directory. As I am using AWS Elasticsearch, it would be great if there were an API way of doing it. To confirm: this is about taking the patterns currently in the Logstash patterns directory and loading them into Elasticsearch so they can be used in an ES pipeline/filter. I cannot manually place them in a folder on Elasticsearch, so I was hoping ES has an API to add patterns.
  3. Where should the multiline codec be moved to (Filebeat or ES), and how?
  4. Where can I find examples of pipelines?

Continuing the discussion from "Any good way to add custom grok pattern to elasticsearch for pipelines other then in the pipeline file?":

As already asked in that previous ticket, is there a way to add custom patterns to Elasticsearch so that they can be used in a pipeline?

As I am experiencing problems in Logstash (runner error: "An unexpected error occurred! {:error=>#<SocketError: initialize: name or service not known>"), I am thinking about moving from Logstash to Filebeat and letting Elasticsearch do the filtering in a pipeline.

My current filtering looks like this:

input { 
  file {
    path => "/tmp/logs/**/*.log*"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    #codec => plain {
    #  charset => "ISO-8859-1"
    #}
    codec => multiline {
      # Grok pattern names are valid! :)
      pattern => "^%{YEAR}"
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
    }
  }
}

filter {
  grok {
    match => { "message" => "%{DATESTAMP:timestamp} \[%{LOGLEVEL:log-level}\] \[(?<app>[A-Za-z0-9.\s]*?)\] %{GREEDYDATA:message}" }
    patterns_dir => ["/usr/share/logstash/patterns", "/usr/share/logstash/patterns_extra"]
  }
  date {
    match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss" ]
    #match => [ "timestamp" , "yyyy/MM/dd HH:mm:ss", "ISO8601" ]
  }
}

output {
  elasticsearch { 
    hosts => ["https://search-instance.us-east-1.es.amazonaws.com:443"]
    index => "myindex"
  }
  stdout { codec => rubydebug }
}

### Inside /usr/share/logstash/patterns_extra, the date file (for extra patterns) includes:

DATE_YMD %{YEAR}/%{MONTHNUM}/%{MONTHDAY}
DATE %{DATE_US}|%{DATE_EU}|%{DATE_YMD}
# But I don't know how to set this up on AWS ES.

As you can see above, the %{DATESTAMP:timestamp} match leverages the custom patterns. In my old setup I load these patterns from the patterns directory, but on AWS I don't think there is a way to change configuration manually, so it would be great if there were a way to do it through the API or directly in the pipeline. Any examples or pointers would be welcome, as I tried following the documentation but it is too generic and free of examples.
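
For reference, the ingest node grok processor has a pattern_definitions option that covers exactly this. Below is a minimal sketch of the filter above as an ES 6.x ingest pipeline; the pipeline name my-app-logs is made up:

PUT _ingest/pipeline/my-app-logs
{
  "description": "Sketch: grok + date, with the custom DATE patterns inlined via pattern_definitions",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{DATESTAMP:timestamp} \\[%{LOGLEVEL:log-level}\\] \\[(?<app>[A-Za-z0-9.\\s]*?)\\] %{GREEDYDATA:message}"
        ],
        "pattern_definitions": {
          "DATE_YMD": "%{YEAR}/%{MONTHNUM}/%{MONTHDAY}",
          "DATE": "%{DATE_US}|%{DATE_EU}|%{DATE_YMD}"
        }
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["yyyy/MM/dd HH:mm:ss"]
      }
    }
  ]
}

Because DATESTAMP references %{DATE}, overriding DATE in pattern_definitions flows through to it, which mirrors what the patterns_dir file does in Logstash.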

  1. That really depends on what you have/want to do.
  2. Have a look at Converting Ingest Node Pipelines | Logstash Reference [8.11] | Elastic
  3. This would be best done in Filebeat (see the sketch below this list).
  4. There's guidance in Ingest Node | Elasticsearch Guide [6.4] | Elastic, but we don't have a whole bunch of examples as yet, sorry.
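
On point 3, here is a minimal Filebeat 6.x sketch of the equivalent multiline setup. Paths, host, and index are carried over from the config above, and my-app-logs is the made-up pipeline name from the earlier sketch; note that Filebeat multiline takes plain regexes, not grok pattern names, so ^%{YEAR} becomes an explicit digit match:

filebeat.inputs:                     # filebeat.prospectors on versions before 6.3
- type: log
  paths:
    - /tmp/logs/**/*.log*
  encoding: latin1                   # ISO-8859-1, per the Logstash charset above
  # Equivalent of the multiline codec: any line that does NOT start
  # with a 4-digit year is appended to the previous line.
  multiline.pattern: '^[0-9]{4}'
  multiline.negate: true
  multiline.match: after             # equivalent of what => "previous"

output.elasticsearch:
  hosts: ["https://search-instance.us-east-1.es.amazonaws.com:443"]
  index: "myindex"
  pipeline: "my-app-logs"            # run the ingest pipeline on every event

# Required by Filebeat when overriding the default index name:
setup.template.name: "myindex"
setup.template.pattern: "myindex*"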

Date Processor | Elasticsearch Guide [6.4] | Elastic goes into how you can specify multiple patterns for the time formatting. Is that what you are looking for?
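
For instance, a sketch of a date processor covering both formats from the commented-out line in the Logstash config:

{
  "date": {
    "field": "timestamp",
    "formats": ["yyyy/MM/dd HH:mm:ss", "ISO8601"]
  }
}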

I don't want to be pedantic, but it's a topic. Calling it a ticket would imply a lot of things that don't apply :slight_smile:

Not related to your topic, but did you look at Elastic Cloud: Hosted Elasticsearch, Hosted Search | Elastic and AWS Marketplace: Elastic Cloud (Elasticsearch Service)?

Cloud by Elastic is one way to have access to all features, all managed by us. Think about what is already there, like Security, Monitoring, Reporting, and SQL, and what is coming, like Canvas...

Malkom, thank you for the reply; it looks like some of the things I am trying to do have been misunderstood/misdescribed.

I have just expanded point 2. I don't need to convert from an ES pipeline to Logstash but the opposite. As I am using a few custom DATE patterns loaded from a directory in Logstash (as you can see from the Logstash configuration, I add an extra patterns_dir; the patterns themselves are listed a few lines below the config), I am looking for a way to load those patterns into ES so the filter is fully functional. That would also let me drop Logstash and rely on just Filebeat -> ES with a pipeline.

David, thank you for pointing those out. I might also try the Elastic SaaS on AWS if my credits allow it. My blocker at the moment is described in this "topic", but I doubt moving to the ES service will make a difference.

Yeah. I did not mean that. Sorry if that was confusing.

@dadoonet it is OK, I think I will try Elastic Cloud; it looks like it is a bit cheaper than the massive instances I started with on the AWS service. Also, to move the patterns, I think I will need to look into https://www.elastic.co/guide/en/elasticsearch/reference/master/grok-processor.html#custom-patterns (the grok processor's pattern_definitions option sketched above).
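
For what it's worth, the _simulate endpoint is a handy way to check that the pattern_definitions behave before wiring Filebeat up. A sketch against the made-up pipeline name from above, with an invented log line:

POST _ingest/pipeline/my-app-logs/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2018/10/05 12:34:56 [INFO] [MyApp] something happened"
      }
    }
  ]
}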

Also, I need to look into the Reindex API and reindexing from a remote cluster so I can move to Elastic Cloud. Thanks to BaM` for the suggestion on IRC/Freenode.
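
A sketch of reindex-from-remote, run against the destination cluster; the credentials are placeholders, and the source host has to be whitelisted via reindex.remote.whitelist on the destination side (on Elastic Cloud this should be settable through the user settings):

POST _reindex
{
  "source": {
    "remote": {
      "host": "https://search-instance.us-east-1.es.amazonaws.com:443",
      "username": "placeholder",
      "password": "placeholder"
    },
    "index": "myindex"
  },
  "dest": {
    "index": "myindex"
  }
}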


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.