Does Logstash have a limit on the maximum number of characters per line?
I have a 12 MB JSON file that contains a single line of JSON with multiple objects.
Logstash does not process this file and I don't know why: no error, no debug output, just the status line
[INFO ] 2021-01-04 18:29:35.648 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
and that's all.
The Logstash config itself is fine: I tested it by splitting a 3,357-character piece out of the original JSON line, and that works perfectly. But with the original 12 MB file (12,463,152 characters) nothing happens.
CPU load stays below 5%.
My Logstash config:
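For reference, the character count above can be reproduced with standard tools, and it is also worth checking whether the file ends with a newline: the Logstash file input splits its stream on "\n" (the default delimiter), so a single-line file with no trailing newline can sit in the buffer without ever completing an event. A sketch; `/tmp/sample.json` is a hypothetical stand-in, point `f` at the real file:

```shell
# Hypothetical stand-in file; replace with the real path.
f=/tmp/sample.json
printf '{"data":[1,2,3]}' > "$f"   # one line, deliberately NO trailing newline

wc -c < "$f"                       # character (byte) count of the file
if [ -z "$(tail -c 1 "$f")" ]; then
  echo "trailing newline present"
else
  echo "no trailing newline"
fi
```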
input {
  file {
    path => ["/tmp/multi_valid.json"] # OK, works
    # path => ["/home/pumba/change/all-items-2020-07-01.log"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}
filter {
  split { field => "[data]" }
  #split { field => "[jobs][builds]" }
}
output {
  # elasticsearch {
  #   hosts => ["http://localhost:9200"]
  #   index => "setlogs-%{+YYYY.MM.dd}"
  # }
  file {
    path => "/tmp/logstash_output.txt"
  }
  stdout {
    codec => rubydebug
  }
}
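As a diagnostic (a sketch, not a known fix): feeding the same data through a stdin input takes the file input's tailing and sincedb machinery out of the picture, which helps isolate whether the problem is in the file input or in the json codec. The config file name `stdin_test.conf` is hypothetical:

```
input {
  stdin { codec => "json" }
}
filter {
  split { field => "[data]" }
}
output {
  stdout { codec => rubydebug }
}
```

Run it as: cat /home/pumba/change/all-items-2020-07-01.log | bin/logstash -f stdin_test.conf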
My little sample file:
{"Magazine_ID":4,"part":1,"__v":0,"createdAt":"2020-12-03T17:17:07.579Z","data":[{"updated_at":1606980911000,"prices":{"first_seen":1378162800000,"unstable_reason":"LOW_SALES_3PLUS_MONTHS","unstable":true,"sold":{"avg_daily_volume":null,"last_90d":98,"last_30d":22,"last_7d":8,"last_24h":1},"safe_ts":{"last_90d":90.13,"last_30d":90.07,"last_7d":86.92,"last_24h":68.48},"safe":94.71,"median":89.108,"mean":95.03,"max":114.99,"avg":94.95,"min":68.48,"latest":68.48},"image":"https://community.cloudflare.com/economy/image","border_color":"#8650AC","market_hash_name":"chicken","nameID":"1358786"},{"updated_at":1606979738000,"prices":{"first_seen":1376694000000,"unstable_reason":"LOW_SALES_3PLUS_MONTHS","unstable":true,"sold":{"avg_daily_volume":null,"last_90d":56,"last_30d":9,"last_7d":2,"last_24h":0},"safe_ts":{"last_90d":276.21,"last_30d":246.81,"last_7d":238.18,"last_24h":0},"safe":288.12,"median":284.08,"mean":294.98,"max":429.01,"avg":294.88,"min":224.42,"latest":226.64},"image":"https://community.cloudflare.com/image","border_color":"#8650AC","market_hash_name":"frozen_chiken","nameID":"1338724"},{"updated_at":1606967754000,"prices":{"first_seen":1518912000000,"unstable_reason":"LOW_SALES_WEEK","unstable":true,"sold":{"avg_daily_volume":6,"last_90d":721,"last_30d":180,"last_7d":53,"last_24h":7},"safe_ts":{"last_90d":157.02,"last_30d":152.91,"last_7d":147.65,"last_24h":126.83},"safe":152.91,"median":157.3665,"mean":155.62,"max":247.97,"avg":154.85,"min":118.29,"latest":133.81},"image":"https://economy.com/image","border_color":"#8650AC","market_hash_name":"salat","nameID":"175967141"},{"updated_at":1606999857000,"prices":{"first_seen":1442530800000,"unstable_reason":false,"unstable":false,"sold":{"avg_daily_volume":12,"last_90d":1411,"last_30d":392,"last_7d":89,"last_24h":14},"safe_ts":{"last_90d":2.41,"last_30d":2.42,"last_7d":2.36,"last_24h":2.4},"safe":2.36,"median":2.44,"mean":2.36,"max":2.71,"avg":2.36,"min":1.21,"latest":2.58},"image":"https://sfreesh.net/economy/image","border_color":"#CF6A32","market_hash_name":"frozen_fish","market_name":"frozen_fish","nameID":"67067158"}]}
My original big file has exactly the same structure as this little sample, just with more characters. The big file was validated with jsonlint-php and got no errors.
Please help, I don't know what else to try.
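For a second opinion besides jsonlint-php, a few lines of Python can validate the line and report the counts the split filter would see. A sketch; the inline sample is a shortened, hypothetical stand-in for the real file:

```python
import json

# Shortened stand-in for the real one-line file (hypothetical values).
line = '{"Magazine_ID":4,"part":1,"data":[{"nameID":"1358786"},{"nameID":"1338724"}]}'

doc = json.loads(line)   # raises json.JSONDecodeError if the line is invalid
print(len(line))         # character count of the line
print(len(doc["data"]))  # number of objects the split filter would emit
```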