Configure Logstash to input JSON files into Elasticsearch


I'm accessing tweets with Tweepy and generating a complete .json file for each tweet.
I set up Logstash to read each of these files and index them in Elasticsearch.
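For context, the files are written roughly like this. This is a minimal sketch, not the actual script: the `fetched_tweets` list, the sample dict, and the `twitter_logs` output directory are placeholders; with Tweepy the per-tweet dict would come from the status object's raw JSON.

```python
import json
import os

# Placeholder for tweets fetched via Tweepy; in the real script each
# element would be the raw tweet dict returned by the Twitter API.
fetched_tweets = [
    {"id_str": "1",
     "created_at": "Tue May 21 22:18:17 +0000 2019",
     "text": "Ainda nao superou o final de Game of Thrones?"},
]

out_dir = "twitter_logs"  # placeholder directory
os.makedirs(out_dir, exist_ok=True)

for tweet in fetched_tweets:
    # One complete JSON document per file, which is what the Logstash
    # json codec expects to read.
    path = os.path.join(out_dir, "twitter_logs_%s.json" % tweet["id_str"])
    with open(path, "w", encoding="utf-8") as f:
        json.dump(tweet, f, ensure_ascii=False)
```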

Below is my Logstash configuration file:

input {
  file {
    codec => json
    path => ["C:/----my directory-----/twitter_logs*.json*"]
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "twitter"
  }
}
Below is an excerpt from one of the .json files:

"created_at":"Tue May 21 22:18:17 +0000 2019",
"text":"Ainda não superou o final de Game of Thrones? Nós também não! Confira o que preparamos para os fãs que já não sab…",
"source":"Twitter Web Client",
"name":"Conheça a BRQ",
"location":"São Paulo",
"description":"Paixão por transformar negócios com tecnologia, esse é o propósito que move a BRQ em 26 anos de história.",
"created_at":"Fri Aug 27 14:25:19 +0000 2010",
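Since the json codec and filter only accept complete, valid JSON documents, one quick sanity check before pointing Logstash at the directory is to parse each file yourself. A minimal sketch; the glob pattern is a placeholder for the real path:

```python
import glob
import json

def check_json_files(pattern):
    """Try to parse every file matching pattern; return (ok, bad) path lists."""
    ok, bad = [], []
    for path in glob.glob(pattern):
        try:
            with open(path, encoding="utf-8") as f:
                json.load(f)  # must be one complete JSON document
            ok.append(path)
        except (json.JSONDecodeError, UnicodeDecodeError):
            bad.append(path)
    return ok, bad
```

Any path that lands in the `bad` list would make the Logstash json codec emit a parse-failure tag rather than a clean event.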

When I run Logstash, it does not pick up any of the files in the directory.

I think I'm setting something up wrong.
Can someone help me?

What happens when you output to the screen?

output { stdout { codec => rubydebug } }

Same result: Logstash starts but doesn't read any files.

Try using a backslash (\) rather than a forward slash (/) in your path.
