Elastic App Search -- Importing JSON files

Hello guys, I have a problem importing JSON files into my Elastic App Search.

I've gone through nearly all the existing threads, here and elsewhere, but I couldn't get my problem fixed.

I'm currently trying to import a JSON file with Logstash; here is my current .conf file:

    input {
        stdin {}
        file {
            codec => "json"
            path => ["AbsolutePathToMyFile"]
            start_position => "beginning"
            sincedb_path => "nul"  # "nul" because I'm on Windows
        }
    }

    filter {}

    output {
        elastic_app_search {
            engine => "MyEngine"
            api_key => "MyKey"
            host => "MyHost"
        }

        stdout { codec => rubydebug }
    }

As you can see, it's as basic as it gets, but it still doesn't work.

Running the conf file doesn't seem to be the problem, and Logstash starts up fine, but it just won't import any JSON files into App Search, although I can see an empty GET request in the API logs.
The stdin input works fine.
Here is also a snippet from the logs:

    [2021-06-04T15:10:32,646][INFO ][logstash.runner          ] Log4j configuration path used is: D:\Projekt\logstash-7.12.1\config\log4j2.properties
    [2021-06-04T15:10:32,651][WARN ][logstash.runner          ] DEPRECATION WARNING: The flag ["--verbose"] has been deprecated, please use "--log.level=info" instead.
    [2021-06-04T15:10:32,652][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.12.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 15.0.2+7-27 on 15.0.2+7-27 +indy +jit [mswin32-x86_64]"}
    [2021-06-04T15:10:32,710][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2021-06-04T15:10:33,160][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2021-06-04T15:10:33,480][INFO ][org.reflections.Reflections] Reflections took 25 ms to scan 1 urls, producing 23 keys and 47 values 
    [2021-06-04T15:10:35,449][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["D:/Gruppenprojekt/logstash-7.12.1/logstash.conf"], :thread=>"#<Thread:0x7b19ae01 run>"}
    [2021-06-04T15:10:36,011][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.56}
    [2021-06-04T15:10:36,217][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
    [2021-06-04T15:10:36,245][INFO ][filewatch.observingtail  ][main][28af2cc680ae8ccc0fe7b252e2207fa9f8297cf6078757e1752d3e864ddfdd22] START, creating Discoverer, Watch with file and sincedb collections
    [2021-06-04T15:10:36,260][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

Set log.level to trace and see what the filewatch module used by the file input has to say.
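For example, from the Logstash install directory on Windows (the config file name here is illustrative; use whatever you pass today):

    REM --log.level is the supported flag, as the deprecation warning in your own log notes
    bin\logstash.bat -f logstash.conf --log.level=trace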

Thank you very kindly, I found the solution :grinning:
The problem was that I tried to feed in a message that was too long and did not contain any newlines.
I got it to work after I increased the file input's file_chunk_size and file_chunk_count settings.
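For anyone else hitting this, here is a minimal sketch of where that setting lives in the file input; the value below is a placeholder, not the exact one I used:

    input {
        file {
            codec => "json"
            path => ["AbsolutePathToMyFile"]
            start_position => "beginning"
            sincedb_path => "nul"  # "nul" because I'm on Windows
            # placeholder: read 10 MB per chunk instead of the 32 KB default;
            # file_chunk_count (default 4611686018427387903) is tunable the same way
            file_chunk_size => 10485760
        }
    }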

The configuration you said you were using had the default values: up to 4,611,686,018,427,387,903 chunks of 32 KB each, which works out to about 2^77 bytes, roughly 150 zettabytes, so effectively unbounded. What did you increase them to?
