File transfer error from Logstash to Elasticsearch

On my Windows machine, Logstash works fine when run as administrator. When I type raw data into the console (with input set to stdin { }), it is indexed into Elasticsearch. But when I try to read a file instead, nothing appears under Index Management in Elasticsearch.

Below is my Logstash config:

input {
  file {
    path => "C:\Users\desktop\Downloads\jsonfiles\Phone_to_Smartphone.json"
    sincedb_path => "NULL"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://elastic:<myverystrongpassword>@localhost:9200"]
    index => "demo-json"
  }
  stdout {
    codec => rubydebug
  }
}

This is the console output from the command:

PS C:\Users\desktop\Documents\elasticco\logstash-8.4.1\bin> .\logstash.bat -f C:\Users\desktop\Documents\elasticco\logstash-8.4.1\config\logstash-json.config
Using LS_JAVA_HOME defined java: C:\Program Files\Java\jdk-11.0.16
The system cannot find the file specified.
"WARNING: Logstash comes bundled with the recommended JDK(), but is overridden by the version defined in LS_JAVA_HOME. Consider clearing LS_JAVA_HOME to use the bundled JDK."
Java HotSpot(TM) 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to C:/Users/desktop/Documents/elasticco/logstash-8.4.1/logs which is now configured via log4j2.properties
[2022-09-08T10:44:01,133][INFO ][logstash.runner          ] Log4j configuration path used is: C:\Users\desktop\Documents\elasticco\logstash-8.4.1\config\log4j2.properties
[2022-09-08T10:44:01,139][WARN ][logstash.runner          ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2022-09-08T10:44:01,139][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.4.1", "jruby.version"=>"jruby 9.3.6.0 (2.6.8) 2022-06-27 7a2cbcd376 Java HotSpot(TM) 64-Bit Server VM 11.0.16+11-LTS-199 on 11.0.16+11-LTS-199 +indy +jit [x86_64-mswin32]"}
[2022-09-08T10:44:01,142][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-09-08T10:44:01,213][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-09-08T10:44:02,636][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-09-08T10:44:03,001][INFO ][org.reflections.Reflections] Reflections took 72 ms to scan 1 urls, producing 125 keys and 434 values
[2022-09-08T10:44:04,199][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2022-09-08T10:44:04,237][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elastic:xxxxxx@localhost:9200/"]}
[2022-09-08T10:44:04,440][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:9200/]}}
[2022-09-08T10:44:04,595][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@localhost:9200/"}
[2022-09-08T10:44:04,605][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.4.1) {:es_version=>8}
[2022-09-08T10:44:04,606][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2022-09-08T10:44:04,635][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-09-08T10:44:04,636][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-09-08T10:44:04,640][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-09-08T10:44:04,650][INFO ][logstash.filters.json    ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-09-08T10:44:04,676][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-09-08T10:44:04,707][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["C:/Users/desktop/Documents/elasticco/logstash-8.4.1/config/logstash-json.config"], :thread=>"#<Thread:0x3198c92f run>"}
[2022-09-08T10:44:05,304][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.6}
[2022-09-08T10:44:05,377][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-09-08T10:44:05,410][INFO ][filewatch.observingtail  ][main][39b776fe14f8058ae0f999d2d2822df11c2d3079d9516d804279ca5c69f23757] START, creating Discoverer, Watch with file and sincedb collections
[2022-09-08T10:44:05,422][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

You have to use forward slashes (/) in the path on Windows:
path => "C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.json"


I implemented your solution, but no luck. Still the same issue.

This should be NUL. Using NULL will make Logstash create a file named NULL in its home directory to track what it has read from your file.

If you want to reread the file C:\Users\desktop\Downloads\jsonfiles\Phone_to_Smartphone.json you should use NUL.

I tried it, but that didn't solve the issue either.
sincedb_path => "NUL" is not the issue. The issue is that no error is logged in the console while importing the file, yet nothing is saved to Elasticsearch.

Your hosts URL is wrong.

Use this config file instead:

input {
  file {
    path => "C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.json"
    sincedb_path => "NUL"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["http://host:9200"]
    index => "demo-json"
    user => "elastic"
    password => "pass"
  }
  stdout {
    codec => rubydebug
  }
}

Yeah, I did that too, and there was still an error.

What error? Please share the logs.

Also, what is the content of the file C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.json? Is it by any chance a single-line file?

This is the content of the file:

[
   {
      "id":1,
      "Brand":"Nokia",
      "Name":"Nokia 3210",
      "Status":"Discontinued",
      "coverImage":"https:\/\/fdn2.gsmarena.com\/vv\/bigpic\/no3210b.gif",
      "released_at":"Released 1999",
      "body":"151g, 22.5mm thickness",
      "os":"Feature phone",
      "Chipset":null,
      "Colors":"User exchangeable front and back covers",
      "camera_pixels":"NO         ",
      "battery_size":"&nbsp;",
      "Type":"Removable Li-Ion battery",
      "Dual":null,
      "Triple":null,
      "Charging":null,
      "Size":"",
      "Resolution":"5 lines",
      "Other_specifications":"{\"storage\":\"No card slot\",\"video_pixels\":\"No video recorder\",\"battery_type\":\"Li-Ion\",\"Technology\":\"GSM\",\"2G bands\":\"GSM 900 \\\\\\\/ 1800 \",\"GPRS\":\"No\",\"EDGE\":\"No\",\"Announced\":\"1999\",\"Dimensions\":\"123.8 x 50.5 x 16.7-22.5 mm\",\"SIM\":\"Mini-SIM\",\"Type\":\"Removable Li-Ion battery\",\"Card slot\":\"No\",\"Phonebook\":\"SIM only\",\"Call records\":\"8 dialed, 8 received, 8 missed calls\",\"Loudspeaker \":\"No\",\"Alert types\":\"Downloadable monophonic ringtones, composer\",\"3.5mm jack \":\"No\",\"WLAN\":\"No\",\"Bluetooth\":\"No\",\"GPS\":\"No\",\"Radio\":\"No\",\"USB\":\"\",\"Sensors\":\"\",\"Messaging\":\"SMS\",\"Browser\":\"\",\"Games\":\"Rotation, Snake, and Memory\",\"Languages\":\"11 from 35 total\",\"Java\":\"No\",\"Stand-by\":null,\"Talk time\":null,\"Infrared port\":null,\"SAR\":null,\"SAR EU\":null,\"CPU\":null,\"Internal\":null,\"OS\":null,\"Keyboard\":null,\"Single\":null,\"Video\":null,\"3G bands\":null,\"Speed\":null,\"Features\":null,\"Price\":null,\"Music play\":null,\"Protection\":null,\"GPU\":null,\"Models\":null,\"Loudspeaker\":null,\"Audio quality\":null,\"4G bands\":null}"
   },
   {
      "id":2,
      "Brand":"Nokia",
      "Name":"Nokia 5110",
      "Status":"Discontinued",
      "coverImage":"https:\/\/fdn2.gsmarena.com\/vv\/bigpic\/no5110b.gif",
      "released_at":"Released 1998",
      "body":"170g, 31mm thickness",
      "os":"Feature phone",
      "Chipset":null,
      "Colors":"Xpress-On covers, 4 basic, 7 metallic",
      "camera_pixels":"NO         ",
      "battery_size":"600 mAh ",
      "Type":"Removable Li-Po 600 mAh battery",
      "Dual":null,
      "Triple":null,
      "Charging":null,
      "Size":"",
      "Resolution":"5 lines",
      "Other_specifications":"{\"storage\":\"No card slot\",\"video_pixels\":\"No video recorder\",\"battery_type\":\"Li-Po\",\"Technology\":\"GSM\",\"2G bands\":\"GSM 900 \",\"GPRS\":\"No\",\"EDGE\":\"No\",\"Announced\":\"1998\",\"Dimensions\":\"132 x 47.5 x 31 mm, 143 cc (5.20 x 1.87 x 1.22 in)\",\"SIM\":\"Mini-SIM\",\"Type\":\"Removable Li-Po 600 mAh battery\",\"Card slot\":\"No\",\"Phonebook\":\"SIM only\",\"Call records\":\"8 dialed, 5 received, 5 missed calls\",\"Loudspeaker \":\"No\",\"Alert types\":\"Downloadable monophonic ringtones\",\"3.5mm jack \":\"No\",\"WLAN\":\"No\",\"Bluetooth\":\"No\",\"GPS\":\"No\",\"Radio\":\"No\",\"USB\":\"\",\"Sensors\":\"\",\"Messaging\":\"SMS\",\"Browser\":\"\",\"Games\":\"3 (Memory, Snake, Logic)\",\"Languages\":\"28\",\"Java\":\"No\",\"Stand-by\":\"60-270 h\",\"Talk time\":\"3-5 h\",\"Infrared port\":null,\"SAR\":null,\"SAR EU\":null,\"CPU\":null,\"Internal\":null,\"OS\":null,\"Keyboard\":null,\"Single\":null,\"Video\":null,\"3G bands\":null,\"Speed\":null,\"Features\":null,\"Price\":null,\"Music play\":null,\"Protection\":null,\"GPU\":null,\"Models\":null,\"Loudspeaker\":null,\"Audio quality\":null,\"4G bands\":null}"
   }
]

You will have another issue later because your file is a multiline file, but first you need to get Logstash to read it at all.

Try running this config to see if anything is sent to Elasticsearch. The sincedb_path needs to be NUL in this case: since you already tried to read this file, you need to make sure Logstash is not tracking it.

input {
  file {
    path => "C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.json"
    sincedb_path => "NUL"
  }
}

output {
  elasticsearch {
    hosts => ["http://host:9200"]
    index => "demo-json"
    user => "elastic"
    password => "pass"
  }
}
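As a follow-up on the multiline point raised above: since the file input emits one event per line, a pretty-printed JSON array won't parse cleanly with a per-event json filter. One common workaround (a sketch only; the helper name and paths below are placeholders, not part of the configs in this thread) is to convert the array into NDJSON so each line is one complete JSON object:

```python
import json

def to_ndjson(src, dst):
    """Convert a file containing one JSON array into NDJSON (one object per line)."""
    with open(src, encoding="utf-8") as f:
        records = json.load(f)  # parse the whole array at once
    with open(dst, "w", encoding="utf-8") as f:
        for record in records:
            # json.dumps with default settings emits a single compact line
            f.write(json.dumps(record) + "\n")

# Usage (paths are placeholders -- adjust to your environment):
# to_ndjson("C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.json",
#           "C:/Users/desktop/Downloads/jsonfiles/Phone_to_Smartphone.ndjson")
```

Pointing the file input at the resulting .ndjson file should then produce one event per phone record, with the json filter parsing each event's message field.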