Could not execute action: PipelineAction::Create

Hello,

I'm a totally new user of the ELK stack, and I'm trying to make Logstash work with my log file format on a standalone Ubuntu server with Docker.
The Kibana, Elasticsearch, and Logstash containers are running with the default config, all on v7.9.1.
On a client machine a Filebeat instance reads the log, and the log lines are received on stdout in the Logstash console.

Next, to support my log format, I played with the Grok Debugger integrated in Kibana.
My log has the following format:
0HM06OF61KPIG:00000002 2020-06-02 09:08:49:225 Info 192.168.2.32 USERNAME GetFileMethod This is an error message

To support the date format YYYY-MM-DD, I use a custom pattern like this, validated with the Kibana Dev Tools:
TIMESTAMP_DOTNET_EU %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}
and store the content in logstash/pipeline/patterns/mypattern.conf.
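
For reference, this is roughly how I created the file on the host (paths are from my setup; the filename itself is arbitrary, since grok loads every file in patterns_dir):

# create the patterns folder next to the pipeline config
mkdir -p logstash/pipeline/patterns
# one pattern per line: NAME <space> definition
echo 'TIMESTAMP_DOTNET_EU %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}' > logstash/pipeline/patterns/mypattern.conf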

My logstash.conf looks like this:

input {
  beats {
    port => 5044
  }
}
filter {
    grok {
        patterns_dir => ["/usr/share/logstash/pipeline/patterns"]
        # or patterns_dir => ["./patterns"], which results in the same issue
        match => { "message" => "%{WORD:identifier}:%{WORD:sequence}\t%{TIMESTAMP_DOTNET_EU:date}\t%{LOGLEVEL:level}\t%{IPORHOST:client}\t%{NOTSPACE:user}\t%{NOTSPACE:method}\t%{GREEDYDATA:message}" }
    }
}
output {
  stdout {
    codec => rubydebug
  }
}
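
The pipeline folder (including the patterns subfolder) is mounted into the container roughly like this in my docker-compose file (a sketch; service name and host path follow my layout, yours may differ):

logstash:
  image: docker.elastic.co/logstash/logstash:7.9.1
  ports:
    - "5044:5044"
  volumes:
    # makes /usr/share/logstash/pipeline/patterns visible in the container
    - ./logstash/pipeline:/usr/share/logstash/pipeline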

I don't see what is wrong with the conf file. The Logstash instance shuts down and I get the error "Could not execute action". You can see the details in the log below; any help is appreciated.

Thanks.

OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/tmp/jruby-1/jruby7847550334605026780jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2020-10-02T13:43:28,302][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10-LTS on 11.0.8+10-LTS +indy +jit [linux-x86_64]"}
[2020-10-02T13:43:30,009][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
[2020-10-02T13:43:30,014][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Metricbeat to monitor Logstash. Documentation can be found at: 
https://www.elastic.co/guide/en/logstash/current/monitoring-with-metricbeat.html
[2020-10-02T13:43:31,509][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2020-10-02T13:43:31,825][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2020-10-02T13:43:32,066][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>7}
[2020-10-02T13:43:32,077][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-02T13:43:32,358][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2020-10-02T13:43:32,360][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2020-10-02T13:43:35,585][INFO ][org.reflections.Reflections] Reflections took 852 ms to scan 1 urls, producing 22 keys and 45 values 
[2020-10-02T13:43:36,168][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2020-10-02T13:43:36,209][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2020-10-02T13:43:36,240][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] ES Output version determined {:es_version=>7}
[2020-10-02T13:43:36,241][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-10-02T13:43:36,359][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
[2020-10-02T13:43:36,376][WARN ][logstash.javapipeline    ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2020-10-02T13:43:36,590][INFO ][logstash.javapipeline    ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x7fa8bb7c run>"}
[2020-10-02T13:43:36,802][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-10-02T13:43:37,704][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>1.11}
[2020-10-02T13:43:37,757][INFO ][logstash.javapipeline    ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2020-10-02T13:43:38,210][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-10-02T13:43:40,047][INFO ][logstash.javapipeline    ] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2020-10-02T13:43:40,152][INFO ][logstash.runner          ] Logstash shut down.

Set log.level to debug and see if you get additional messages that explain the failure.
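
You can do that either in logstash.yml or on the command line, for example (on the official Docker image, mount the yml or append the flag to the container command):

# config/logstash.yml
log.level: debug

# or as a command-line flag
bin/logstash --log.level=debug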

Thank you. Now the log file says that my custom grok pattern is not defined, although a few hundred lines earlier I can see the filter adding it:

Any ideas why?

[2020-10-02T15:35:50,899][DEBUG][logstash.filters.grok    ][main] Adding pattern {"TIMESTAMP_DOTNET_EU"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}"}
[2020-10-02T15:35:50,919][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<WORD:identifier>\b\w+\b)
[2020-10-02T15:35:50,920][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<WORD:sequence>\b\w+\b)
[2020-10-02T15:35:50,920][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<TIMESTAMP_DOTNET_EU:date>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})
[2020-10-02T15:35:50,921][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?>\d\d){1,2})
[2020-10-02T15:35:50,922][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:0?[1-9]|1[0-2]))
[2020-10-02T15:35:50,922][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))
[2020-10-02T15:35:50,923][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9]))
[2020-10-02T15:35:50,923][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2020-10-02T15:35:50,924][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:[0-5][0-9]))
[2020-10-02T15:35:50,924][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))
[2020-10-02T15:35:50,924][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<LOGLEVEL:level>([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?))
[2020-10-02T15:35:50,924][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<IPORHOST:client>(?:%{IP}|%{HOSTNAME}))
[2020-10-02T15:35:50,925][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?:%{IPV6}|%{IPV4}))
[2020-10-02T15:35:50,925][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}))|:)))(%.+)?)
[2020-10-02T15:35:50,925][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9]))
[2020-10-02T15:35:50,926][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?:\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\.?|\b))
[2020-10-02T15:35:50,926][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<NOTSPACE:user>\S+)
[2020-10-02T15:35:50,927][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<NOTSPACE:method>\S+)
[2020-10-02T15:35:50,927][DEBUG][logstash.filters.grok    ][main] replacement_pattern => (?<GREEDYDATA:message>.*)
[2020-10-02T15:35:50,934][DEBUG][logstash.filters.grok    ][main] Grok compiled OK {:pattern=>"%{WORD:identifier}:%{WORD:sequence}\\t%{TIMESTAMP_DOTNET_EU:date}\\t%{LOGLEVEL:level}\\t%{IPORHOST:client}\\t%{NOTSPACE:user}\\t%{NOTSPACE:method}\\t%{GREEDYDATA:message}", :expanded_pattern=>"(?<WORD:identifier>\\b\\w+\\b):(?<WORD:sequence>\\b\\w+\\b)\\t(?<TIMESTAMP_DOTNET_EU:date>(?:(?>\\d\\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])) (?:(?!<[0-9])(?:(?:2[0123]|[01]?[0-9])):(?:(?:[0-5][0-9]))(?::(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))(?![0-9])))\\t(?<LOGLEVEL:level>([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?))\\t(?<IPORHOST:client>(?:(?:(?:(?:((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?)|(?:(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9]))))|(?:\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b))))\\t(?<NOTSPACE:user>\\S+)\\t(?<NOTSPACE:method>\\S+)\\t(?<GREEDYDATA:message>.*)"}
[2020-10-02T15:35:50,974][DEBUG][logstash.filters.grok    ][main] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2020-10-02T15:35:50,976][DEBUG][logstash.filters.grok    ][main] Grok patterns path {:paths=>[]}
[2020-10-02T15:35:50,982][DEBUG][logstash.filters.grok    ][main] Match data {:match=>{"message"=>"%{WORD:identifier}:%{WORD:sequence}\\t%{TIMESTAMP_DOTNET_EU:date}\\t%{LOGLEVEL:level}\\t%{IPORHOST:client}\\t%{NOTSPACE:user}\\t%{NOTSPACE:method}\\t%{GREEDYDATA:message}"}}
[2020-10-02T15:35:50,983][DEBUG][logstash.filters.grok    ][main] regexp: /message {:pattern=>"%{WORD:identifier}:%{WORD:sequence}\\t%{TIMESTAMP_DOTNET_EU:date}\\t%{LOGLEVEL:level}\\t%{IPORHOST:client}\\t%{NOTSPACE:user}\\t%{NOTSPACE:method}\\t%{GREEDYDATA:message}"}

[2020-10-02T15:35:51,096][DEBUG][logstash.javapipeline    ][main] Pipeline terminated by worker error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{TIMESTAMP_DOTNET_EU:date} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1442:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.3.0/lib/logstash/filters/grok.rb:288:in `block in register'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.3.0/lib/logstash/filters/grok.rb:282:in `block in register'", "org/jruby/RubyHash.java:1415:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.3.0/lib/logstash/filters/grok.rb:277:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:226:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:225:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:560:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:238:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:183:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134:in `block in start'"], "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf", "/usr/share/logstash/pipeline/logstash.conf.save"], :thread=>"#<Thread:0x50af2ffd run>"}

Ah, I found what was wrong: editing my logstash.conf with nano automatically created a backup copy, logstash.conf.save, and Logstash then tried to load both files from the pipeline folder (you can see both in pipeline.sources above). Presumably the backup still held my old filter without the patterns_dir, which is why the pattern was reported as not defined.

[2020-10-02T15:35:46,445][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/usr/share/logstash/config/pipelines.yml"}
[2020-10-02T15:35:46,543][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/usr/share/logstash/CONTRIBUTORS", "/usr/share/logstash/Gemfile", "/usr/share/logstash/Gemfile.lock", "/usr/share/logstash/LICENSE.txt", "/usr/share/logstash/NOTICE.TXT", "/usr/share/logstash/bin", "/usr/share/logstash/config", "/usr/share/logstash/data", "/usr/share/logstash/lib", "/usr/share/logstash/logstash-core", "/usr/share/logstash/logstash-core-plugin-api", "/usr/share/logstash/modules", "/usr/share/logstash/pipeline", "/usr/share/logstash/tools", "/usr/share/logstash/vendor", "/usr/share/logstash/x-pack"]}
[2020-10-02T15:35:46,546][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/usr/share/logstash/pipeline/logstash.conf"}
[2020-10-02T15:35:46,553][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/usr/share/logstash/pipeline/logstash.conf.save"}
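
In case someone else hits this: deleting the backup fixed it, and restricting the pipeline glob prevents a recurrence (a sketch, assuming the default paths of the official image):

# remove the nano backup that Logstash also loaded
rm /usr/share/logstash/pipeline/logstash.conf.save

# or, in config/pipelines.yml, load only *.conf files
- pipeline.id: main
  path.config: "/usr/share/logstash/pipeline/*.conf"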

Thanks.
