Logstash NLP error: cannot create pipeline

http://blog.jaywayco.co.uk/analyze-sentiment-and-log-all-the-things/

I followed the steps in this blog post to download the NLP jars.

My logstash.conf file looks as follows:

input {
  stdin {}
}
filter {
  nlp {
    source => "message"
  }
}
output {
  stdout {}
}

After running this I get the following error:

[root@node1 bin]# ./logstash -f /home/node1/ES/syed/twitter_conf.conf
Sending Logstash's logs to /home/node1/ES/logstash-5.4.1/logs which is now configured via log4j2.properties
[2017-06-22T15:19:03,557][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, { at line 12, column 7 (byte 435) after filter"}

It fails to create the pipeline at the nlp block inside filter.

Any help would be appreciated. Thanks in advance! Cheers

You're showing us logstash.conf but Logstash is complaining about twitter_conf.conf.

Hi Magnusbaeck,
Thanks for your response.

Here is my twitter_conf.conf

input {
  twitter {
    consumer_key => "KEY"
    consumer_secret => "Key"
    oauth_token => "key"
    oauth_token_secret => "key"
    keywords => [ "Antman", "spiderman", "batman", "superman" ]
    full_tweet => true
  }
}

filters {
  nlp {
    source => "message"
  }
}

output {
  stdout {
    codec => dots
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "twitter"
    document_type => "tweets"
  }
}

This is my full twitter_conf.conf file.
When I run this conf without the filter it works fine and inserts data into ES.
Please assist me, thanks once again! :slight_smile:

filters {
  nlp {
    source => "message"
  }
}

"filter", not "filters".
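
With that one-word fix the config should parse; the filter section would then look like this (keeping the `source => "message"` setting from your config):

```
filter {
  nlp {
    source => "message"
  }
}
```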

Thanks

The filter is working now, but I hit another issue:

Sending Logstash's logs to /home/node1/ES/logstash-5.4.1/logs which is now configured via log4j2.properties
[2017-06-22T20:04:16,583][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-06-22T20:04:16,587][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-06-22T20:04:16,664][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x22021d6e URL:http://localhost:9200/>}
[2017-06-22T20:04:16,665][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-06-22T20:04:16,702][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-06-22T20:04:16,707][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x408f31ac URL://localhost:9200>]}
Adding annotator tokenize
TokenizerAnnotator: No tokenizer type provided. Defaulting to PTBTokenizer.
Adding annotator ssplit
edu.stanford.nlp.pipeline.AnnotatorImplementations:
Adding annotator parse
Loading parser from serialized file edu/stanford/nlp/models/lexparser/englishPCFG.ser.gz ...done [0.5 sec].
Adding annotator sentiment
[2017-06-22T20:04:17,542][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>24, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>3000}
[2017-06-22T20:04:17,743][INFO ][logstash.pipeline ] Pipeline main started
[2017-06-22T20:04:17,747][INFO ][logstash.inputs.twitter ] Starting twitter tracking {:track=>"icc,Champions trophy,india,pakistan,bcci"}
[2017-06-22T20:04:17,775][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-06-22T20:04:19,702][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"Direct event field references (i.e. event['field']) have been disabled in favor of using event get and set methods (e.g. event.get('field')). Please consult the Logstash 5.0 breaking changes documentation for more details.", "backtrace"=>["/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/event.rb:48:in `method_missing'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/nlp.rb:30:in `filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filter_delegator.rb:43:in `multi_filter'", "(eval):46:in `filter_func'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:370:in `filter_batch'", "org/jruby/RubyProc.java:281:in `call'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:369:in `filter_batch'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:350:in `worker_loop'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:317:in `start_workers'"]}
[2017-06-22T20:04:19,753][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: Direct event field references (i.e. event['field']) have been disabled in favor of using event get and set methods (e.g. event.get('field')). Please consult the Logstash 5.0 breaking changes documentation for more details.>, :backtrace=>["/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/event.rb:48:in `method_missing'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/nlp.rb:30:in `filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/filter_delegator.rb:43:in `multi_filter'", "(eval):46:in `filter_func'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:370:in `filter_batch'", "org/jruby/RubyProc.java:281:in `call'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223:in `each'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:369:in `filter_batch'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:350:in `worker_loop'", "/home/node1/ES/logstash-5.4.1/logstash-core/lib/logstash/pipeline.rb:317:in `start_workers'"]}


It looks like the nlp filter isn't Logstash 5.x compatible.
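
That matches the Logstash 5.0 breaking change the error message cites: direct field references (`event['field']`) were removed in favor of `event.get`/`event.set`, and the backtrace points at exactly such an access in the plugin's nlp.rb. A minimal sketch of the change follows; the `Event` class below is a hash-backed stand-in so the example runs outside Logstash, and the field values are illustrative. The real fix would go in the plugin's `filter` method, not in your config.

```ruby
# Minimal stand-in for LogStash::Event, only to illustrate the 5.x API change.
class Event
  def initialize(data)
    @data = data
  end

  # Logstash 5.x replaced event['field'] reads with get...
  def get(field)
    @data[field]
  end

  # ...and event['field'] = value writes with set.
  def set(field, value)
    @data[field] = value
  end
end

event = Event.new("message" => "I love this movie")

# Pre-5.0 plugin style (roughly what nlp.rb does) raises NoMethodError on 5.x:
#   text = event["message"]
#   event["sentiment"] = score
#
# Logstash 5.x-compatible accessors:
text = event.get("message")
event.set("sentiment", "Positive")  # placeholder; the real score comes from CoreNLP
puts text
```

If patching the plugin yourself isn't an option, you'd need a version of it (or an alternative) written against the 5.x event API.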

Thanks Magnusbaeck for your help.

I want to do some Twitter sentiment analysis with ELK; do you have any knowledge about it? I am a newbie and don't have much knowledge about ELK or sentiment analysis. From Google I tried APIs like AlchemyAPI and NLP, but both are not usable as of now. Have you ever faced a question like this? How can I do Twitter sentiment analysis with ELK?

Any help would be appreciated. Thanks in advance!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.