Move JSON file to Elasticsearch

I'm trying to move a file from a folder into Elasticsearch using Logstash.

Here's my json.conf:

input {
    file {
        path => "/usr/share/logstash/json"
        start_position => "beginning"
    }
}

output {
    elasticsearch {
        hosts => "10.2.20.21:9200"
        index => "json_test"
    }
}
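
In case it matters, I've also seen suggestions to use a glob for path and to disable the sincedb while testing, so the file is re-read from the beginning on every run. Something like this is what I'd try next (the glob pattern is a guess on my part):

```
input {
    file {
        # Glob is a guess; the file input matches files, not a bare directory.
        path => "/usr/share/logstash/*.json"
        start_position => "beginning"
        # Forget read positions between runs, so tests always start at the top.
        sincedb_path => "/dev/null"
    }
}
```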

I just want to see the raw data and then filter it down after that. However, I keep getting this error no matter what I try in the .conf:

No configuration found in the configured sources

Oh, by the way:
everything here is running ELK version 6.8...

How are you setting path.config?

path.config?
Sorry, I'm not entirely sure what you mean by that.
I'm still learning this stuff...

path.config tells logstash where to look for configuration files. It can be set on the command line using -f. It can be set in logstash.yml. If you are using pipelines it can be set in pipelines.yml. I expect it also has a default.
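
For example, setting it in logstash.yml would look something like this (the directory is just an illustration, not necessarily where your file lives):

```
# logstash.yml
path.config: "/etc/logstash/conf.d/*.conf"
```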

Provided you are not using pipelines.yml, if you add '--log.level debug' then logstash will log where it is looking for configurations.

[DEBUG][logstash.runner          ] *path.config: "/home/user/test.conf"

So here's what's in my logstash.yml:

# Which directory should be used by logstash and its plugins
# for any persistent needs. Defaults to LOGSTASH_HOME/data
#
path.data: /var/lib/logstash

Would I use this on the command line when running the .conf?

Here's some other stuff from my logstash.yml:

# ------------ Pipeline Configuration Settings --------------
#
# Where to fetch the pipeline configuration for the main pipeline
#
# path.config:
#
# Pipeline configuration string for the main pipeline
#
# config.string:
#
# At startup, test if the configuration is valid and exit (dry run)
#
# config.test_and_exit: false
#
# Periodically check if the configuration has changed and reload the pipeline
# This can also be triggered manually through the SIGHUP signal
#
# config.reload.automatic: false
#
# How often to check if the pipeline configuration has changed (in seconds)
#
# config.reload.interval: 3s
#
# Show fully compiled configuration as debug log message
# NOTE: --log.level must be 'debug'
#
# config.debug: false
#
# When enabled, process escaped characters such as \n and \" in strings in the
# pipeline configuration files.
#
# config.support_escapes: false

I don't have any pipelines configured that I know of... if there are, they're the default pipelines set up by a base install.

OK, so you are not using pipelines and not setting it in logstash.yml. So you are either setting it on the command line (-f or --path.config) or using the default, which is platform dependent.

Yeah, that sounds about right.
The conf text is above, and I run the following command from /usr/share/logstash/bin:

./logstash -f json.conf

OK, so it is telling you that /usr/share/logstash/bin/json.conf does not exist.

So I have to put json.conf in the ./bin directory?

I would suggest giving -f the full path to the file, but yes, that would work too.

OK, I'll give that a try later today and see what happens...
Will post results.

OK, so I ran it...
I think it's still running.

Anyway, here's what came back:

/usr/share/logstash/bin# ./logstash -f json.conf

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-08-04 09:19:00.786 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-08-04 09:19:00.810 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.8.1"}
[WARN ] 2019-08-04 09:19:11.467 [Converge PipelineAction::Create<main>] elasticsearch - You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"json_test", id=>"42a0ffc6c2f9eb89b77c0068e554dabedb6d72311d1e93ee7d792c354934dee7", hosts=>[//10.2.20.21:9200], document_type=>"json", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_bccad9aa-960d-41d5-ab10-4de9097e104d", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[INFO ] 2019-08-04 09:19:11.546 [Converge PipelineAction::Create<main>] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2019-08-04 09:19:12.224 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.2.20.21:9200/]}}
[WARN ] 2019-08-04 09:19:12.539 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://10.2.20.21:9200/"}
[INFO ] 2019-08-04 09:19:13.335 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2019-08-04 09:19:13.340 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2019-08-04 09:19:13.397 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.2.20.21:9200"]}
[INFO ] 2019-08-04 09:19:13.402 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[INFO ] 2019-08-04 09:19:13.516 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[INFO ] 2019-08-04 09:19:14.111 [[main]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_794f8fd11acdfc7933430a216d92f250", :path=>["/usr/share/logstash/misp-data.json"]}
[INFO ] 2019-08-04 09:19:14.158 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0xc0f6a15 run>"}
[INFO ] 2019-08-04 09:19:14.259 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-08-04 09:19:14.277 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-08-04 09:19:14.784 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid4378.hprof ...
Heap dump file created [1113405528 bytes in 20.444 secs]
Exception in thread "Ruby-0-Thread-11: :1" java.lang.OutOfMemoryError: Java heap space
Exception in thread "[main]>worker0" java.lang.OutOfMemoryError: Java heap space
Exception in thread "[main]>worker2" java.lang.OutOfMemoryError: Java heap space
Exception in thread "Ruby-0-Thread-14: :1" java.lang.OutOfMemoryError: Java heap space
Exception in thread "Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6" java.lang.OutOfMemoryError: Java heap space

No [ERROR] so far at least...

I checked the jvm.options file. The heap was set to 1g, and the file is larger than that, so I set it to 4g. Rebooting the box and running the .conf again.
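
For reference, the heap lines in jvm.options now read (minimum and maximum set to the same 4g value):

```
-Xms4g
-Xmx4g
```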

OK, I increased the box's RAM to 12 GB. It pegged out.
The .conf finished with this error:

java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid1753.hprof ...
Heap dump file created [2636015002 bytes in 35.975 secs]
warning: thread "[main]>worker3" terminated with exception (report_on_exception is true):
java.lang.OutOfMemoryError: Java heap space
        at java.util.Arrays.copyOf(Arrays.java:3236)
        at java.io.ByteArrayOutputStream.grow(ByteArrayOutputStream.java:118)
        at java.io.ByteArrayOutputStream.ensureCapacity(ByteArrayOutputStream.java:93)
        at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:153)
        at com.fasterxml.jackson.core.json.UTF8JsonGenerator._flushBuffer(UTF8JsonGenerator.java:2085)
        at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeUTF8Segment2(UTF8JsonGenerator.java:1748)
        at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeUTF8Segment(UTF8JsonGenerator.java:1728)
        at com.fasterxml.jackson.core.json.UTF8JsonGenerator._writeUTF8Segments(UTF8JsonGenerator.java:1712)
        at com.fasterxml.jackson.core.json.UTF8JsonGenerator.writeUTF8String(UTF8JsonGenerator.java:581)
        at com.jrjackson.RubyUtils.writeBytes(RubyUtils.java:147)
        at com.jrjackson.RubyAnySerializer.serialize(RubyAnySerializer.java:168)
        at com.jrjackson.RubyAnySerializer.serializeHash(RubyAnySerializer.java:226)
        at com.jrjackson.RubyAnySerializer.serialize(RubyAnySerializer.java:162)
        at com.jrjackson.JrJacksonBase.generate(JrJacksonBase.java:70)
        at java.lang.invoke.LambdaForm$DMH/315860201.invokeStatic_L3_L(LambdaForm$DMH)
        at java.lang.invoke.LambdaForm$BMH/1947378744.reinvoke(LambdaForm$BMH)
        at java.lang.invoke.LambdaForm$reinvoker/2126392903.dontInline(LambdaForm$reinvoker)
        at java.lang.invoke.LambdaForm$MH/1620253123.guard(LambdaForm$MH)
        at java.lang.invoke.LambdaForm$reinvoker/2126392903.dontInline(LambdaForm$reinvoker)
        at java.lang.invoke.LambdaForm$MH/1620253123.guard(LambdaForm$MH)
        at java.lang.invoke.LambdaForm$MH/1653309853.linkToCallSite(LambdaForm$MH)
        at usr.share.logstash.logstash_minus_core.lib.logstash.json.RUBY$method$jruby_dump$0(/usr/share/logstash/logstash-core/lib/logstash/json.rb:24)
        at java.lang.invoke.LambdaForm$DMH/1702660825.invokeStatic_L7_L(LambdaForm$DMH)
        at java.lang.invoke.LambdaForm$MH/1164692340.invokeExact_MT(LambdaForm$MH)
        at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:91)
        at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:90)
        at org.jruby.internal.runtime.methods.AliasMethod.call(AliasMethod.java:135)
        at org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:183)
        at java.lang.invoke.LambdaForm$DMH/1377992370.invokeVirtual_L6_L(LambdaForm$DMH)
        at java.lang.invoke.LambdaForm$BMH/1505774574.reinvoke(LambdaForm$BMH)
        at java.lang.invoke.LambdaForm$reinvoker/1067588937.dontInline(LambdaForm$reinvoker)
        at java.lang.invoke.LambdaForm$MH/1097684722.guard(LambdaForm$MH)
[ERROR] 2019-08-04 09:40:01.785 [LogStash::Runner] Logstash - java.lang.OutOfMemoryError: Java heap space

I increased the JVM heap option to 8g.
Now it hangs.

I'll post where it hangs momentarily; I just accidentally killed it...

Here's where it hangs now...

/usr/share/logstash/bin# ./logstash -f json.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-08-04 10:05:18.850 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-08-04 10:05:18.867 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.8.1"}
[WARN ] 2019-08-04 10:05:26.257 [Converge PipelineAction::Create<main>] elasticsearch - You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"json_test", codec=><LogStash::Codecs::JSON id=>"json_bd40bad6-a579-40b4-b425-b70762950ab8", enable_metric=>true, charset=>"UTF-8">, id=>"4b1c929a3baaee6d9752cc8b258f996588026323c4b4c8e77e01bfa849d35a26", hosts=>[//10.2.20.21:9200], document_type=>"json", enable_metric=>true, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>false, ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[INFO ] 2019-08-04 10:05:26.324 [Converge PipelineAction::Create<main>] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2019-08-04 10:05:26.855 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.2.20.21:9200/]}}
[WARN ] 2019-08-04 10:05:27.099 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://10.2.20.21:9200/"}
[INFO ] 2019-08-04 10:05:27.310 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2019-08-04 10:05:27.314 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2019-08-04 10:05:27.372 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.2.20.21:9200"]}
[INFO ] 2019-08-04 10:05:27.407 [Ruby-0-Thread-5: :1] elasticsearch - Using default mapping template
[INFO ] 2019-08-04 10:05:27.474 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[INFO ] 2019-08-04 10:05:27.889 [[main]-pipeline-manager] file - No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_794f8fd11acdfc7933430a216d92f250", :path=>["/usr/share/logstash/misp-data.json"]}
[INFO ] 2019-08-04 10:05:27.935 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x18699736 run>"}
[INFO ] 2019-08-04 10:05:28.038 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-08-04 10:05:28.061 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2019-08-04 10:05:28.442 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

With that configuration I would only expect it to need memory for a couple of hundred lines from the file. How big is the file and how long are the lines? (wc -lc /usr/share/logstash/json)
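
If the byte count comes back in the megabytes while the line count is 0 or 1, the whole document is on a single line, which would explain the heap exhaustion: the file input buffers an entire line before emitting it as one event. Here's a sketch of the check and one possible fix, run against a tiny stand-in file since I don't have yours (the real path and python3 availability are assumptions; `jq -c '.[]'` would do the same conversion):

```shell
# Tiny stand-in file; substitute the real path, e.g. /usr/share/logstash/misp-data.json.
IN=/tmp/example.json
printf '[{"event":1},{"event":2},{"event":3}]' > "$IN"

# Line and byte counts: many bytes but 0 or 1 lines means one giant line.
wc -lc "$IN"

# Possible fix: convert a top-level JSON array to newline-delimited JSON
# so the file input sees one small event per line.
python3 -c '
import json, sys
for obj in json.load(open(sys.argv[1])):
    print(json.dumps(obj, separators=(",", ":")))
' "$IN" > /tmp/example.ndjson

wc -l /tmp/example.ndjson   # prints "3 /tmp/example.ndjson"
```

With the data one event per line, the earlier config (plus a `codec => json` on the input) should index each event individually instead of trying to hold the whole file in memory at once.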