Using Logstash to synchronize a JSON file to Elasticsearch produces no data

No data arrives in Elasticsearch, and even when the file is changed the console does not output anything.

  1. This is my conf file, test.conf

    input {
      file {
        path => "D:\logstash\test\demo.log"
        codec => "json"
      }
    }

    output {
      elasticsearch {
        ecs_compatibility => disabled
        doc_as_upsert => true
        action => "update"
        hosts => ["....."]
        index => "demo"
        document_id => "%{id}"
      }
    }

  2. This is the json file, demo.log
    {"id":"1","count":"2"}

  3. logstash-plain.log (the other log files have no content)
    [2021-08-11T18:59:00,368][INFO ][logstash.runner ] Log4j configuration path used is: D:\logstash\logstash-7.12.0\config\log4j2.properties
    [2021-08-11T18:59:00,379][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.12.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.201-b09 on 1.8.0_201-b09 +indy +jit [mswin32-x86_64]"}
    [2021-08-11T18:59:00,474][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2021-08-11T18:59:01,389][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
    [2021-08-11T18:59:02,554][INFO ][org.reflections.Reflections] Reflections took 35 ms to scan 1 urls, producing 23 keys and 47 values
    [2021-08-11T18:59:03,493][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[......]}}
    [2021-08-11T18:59:03,714][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"......"}
    [2021-08-11T18:59:03,791][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
    [2021-08-11T18:59:03,794][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
    [2021-08-11T18:59:03,886][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//......"]}
    [2021-08-11T18:59:03,902][WARN ][logstash.javapipeline ][main] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
    [2021-08-11T18:59:03,978][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
    [2021-08-11T18:59:03,989][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["D:/logstash/logstash-7.12.0/plugins/test.conf"], :thread=>"#<Thread:0x6d5cae26 run>"}
    [2021-08-11T18:59:04,075][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
    [2021-08-11T18:59:04,608][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.62}
    [2021-08-11T18:59:04,983][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/logstash/logstash-7.12.0/data/plugins/inputs/file/.sincedb_90fca5704cee10e880a16fd13edc3a88", :path=>["D:\logstash\test\demo.log"]}
    [2021-08-11T18:59:05,006][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
    [2021-08-11T18:59:05,076][INFO ][filewatch.observingtail ][main][2faf73846c0d4b5b75aaf8bf82521563e9772e09c90b1453b053aa5f4f1e9a7c] START, creating Discoverer, Watch with file and sincedb collections
    [2021-08-11T18:59:05,076][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>}

By default a file input will seek to the end of the file and wait for new lines to be appended. Use the start_position option to change this. The sincedb will have saved the current EOF position, so once Logstash has started you will need to append new data to the file for anything to be read.
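As a minimal sketch (assuming you want to re-read the whole file while testing), the input could look something like this. Note that start_position only applies to files Logstash has not seen before; sincedb_path => "NUL" on Windows throws the sincedb away so the file is re-read on every restart:

    input {
      file {
        path => "D:/logstash/test/demo.log"
        codec => "json"
        # read the file from the beginning instead of seeking to EOF
        start_position => "beginning"
        # on Windows, NUL discards the sincedb so no read position is remembered;
        # remove this once you want Logstash to track how far it has read
        sincedb_path => "NUL"
      }
    }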

Also, all paths must use /; you cannot use \.

That is also true. \ is used as an escape character, even when the character that follows does not need to be escaped. You can make it work with either / or \\.
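For example, either of these forms of the path from the original config should work:

    file {
      # forward slashes are fine on Windows
      path => "D:/logstash/test/demo.log"
    }

    file {
      # or escape each backslash
      path => "D:\\logstash\\test\\demo.log"
    }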
