Hello,

We have been successfully using Logstash to parse our JSON log data and import it into an Elasticsearch database, but recently we had failures on some machines. Here's the error Logstash displays:
Registering file input {:path=>["D:/Octopus/Applications/prod-ndoa/Bridge.Web/logs/BridgeSoap..txt"], :level=>:info}
Pipeline started {:level=>:info}
A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::File add_field=>{"_environment"=>"prod-ndoa", "_application"=>"bridge_rest"}, path=>["D:/Octopus/Applications/prod-ndoa/Bridge.Rest.Host/logs/BridgeRest..txt"], sincedb_path=>"D:/Octopus/Applications/prod-ndoa/Bridge.Rest.Host/logs/sincedb", tags=>["bridge_rest"], start_position=>"end">
Error: incompatible encodings: Windows-1252 and UTF-8 {:level=>:error}
The input file is a set of JSON documents in UTF-8 encoding (with a BOM). If I edit the file and remove the BOM characters, the import goes fine.
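One thing worth trying is declaring the charset explicitly on the input's codec. Below is a minimal sketch of the failing input from the error above with that one change. The charset option on the plain and json codecs is real; whether it actually neutralizes the BOM on this Logstash version is untested here.

input {
  file {
    path => ["D:/Octopus/Applications/prod-ndoa/Bridge.Rest.Host/logs/BridgeRest..txt"]
    sincedb_path => "D:/Octopus/Applications/prod-ndoa/Bridge.Rest.Host/logs/sincedb"
    start_position => "end"
    tags => ["bridge_rest"]
    add_field => { "_environment" => "prod-ndoa" "_application" => "bridge_rest" }
    # Declare the charset explicitly instead of relying on the default;
    # the json codec accepts the same charset option as plain.
    codec => plain { charset => "UTF-8" }
  }
}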
Thank you Aaron, done. I've created an issue. But I'd like to find out if there's a workaround for this problem. What's really strange is that the same Logstash installation works with similar JSON files on other machines.

Vagif
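As a stop-gap while the issue is open, a mutate/gsub filter can scrub a leading byte-order mark out of the message field. This is only a sketch: it helps when the BOM survives into the event, but not when the input plugin crashes before the event reaches the filter stage, as the log above suggests.

filter {
  mutate {
    # U+FEFF is the byte-order mark; strip it from the decoded message.
    gsub => [ "message", "\uFEFF", "" ]
  }
}

The more robust workaround is upstream: configure whatever writes these log files to emit UTF-8 without a BOM, so Logstash never sees it.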
We use the HTTP protocol from Logstash to send to Elasticsearch, and therefore we have never had this issue.

There is a version of ES bundled with Logstash, and if it doesn't match the version of ES you are using to store the logs, then you may see problems if you don't use the HTTP protocol.

Brian
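For reference, a minimal sketch of the setup Brian describes. The protocol and host options exist on the 1.x elasticsearch output; the hostname below is hypothetical.

output {
  elasticsearch {
    # HTTP talks to the REST API, so the ES version bundled with Logstash
    # does not have to match the cluster, unlike the node/transport protocols.
    protocol => "http"
    host => "es.example.org"  # hypothetical host, replace with your cluster
  }
}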