The issue is that I don't get the time component (I get zero for the time) in the Date field in the Kibana output. In Kibana I see: myapp.myproject.notice.student.request-time = Dec 13, 2019 @ 00:00:00.000
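If Kibana shows midnight for every event, the timestamp is probably being parsed as a date only. A date filter that matches the full date-and-time value could fix that; a minimal sketch, assuming the field holds values like `2019-12-13 12:37:01.4` (the patterns cover one- and three-digit fractional seconds):

```
filter {
  date {
    # Parse both the date and the time portion of the value.
    match => ["myapp.myproject.notice.student.request-time",
              "yyyy-MM-dd HH:mm:ss.SSS",
              "yyyy-MM-dd HH:mm:ss.S"]
    # Write the parsed timestamp back to the same field.
    target => "myapp.myproject.notice.student.request-time"
  }
}
```

If no `target` is given, the date filter writes to `@timestamp` instead, which may or may not be what you want here.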
Okay... I tried with the mutate filter, but the fields and values are not printed to stdout when I start Logstash. It does not work. See this:
[2019-12-18T15:14:28,130][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-12-18T15:14:30,422][INFO ][org.reflections.Reflections] Reflections took 46 ms to scan 1 urls, producing 20 keys and 40 values
[2019-12-18T15:14:31,961][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-12-18T15:14:31,967][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x194809b3 run>"}
[2019-12-18T15:14:32,277][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-12-18T15:14:32,387][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-12-18T15:14:32,406][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-12-18T15:14:32,796][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
No, this is wrong. I just read your initial message; the whole thing is one line. So you should split your message in two by # and then by =. Do this:
filter {
  mutate {
    # This divides your line into two parts, before and after "#",
    # which end up in [message][0] and [message][1].
    split => ["message", "#"]
    add_field => { "part1" => "%{[message][0]}" }
    add_field => { "part2" => "%{[message][1]}" }
  }
  # Now split each part again, this time on "=".
  mutate {
    split => ["part1", "="]
    split => ["part2", "="]
  }
}
Check the output and you should have everything separated out.
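As an alternative to the repeated splits, the kv filter can pull out all the key/value pairs in one pass. A sketch, assuming the pairs are separated by `#` and keys from values by `=` (trim_key and trim_value strip the surrounding spaces), with a rubydebug stdout output so you can verify the fields on the console:

```
filter {
  kv {
    field_split => "#"
    value_split => "="
    trim_key   => " "
    trim_value => " "
  }
}
output {
  # Print every field of each event for debugging.
  stdout { codec => rubydebug }
}
```

With your sample line this should produce one field per key, e.g. `myapp.myproject.notice.student.request-time` and `...response-time`, each with its timestamp string as the value.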
I restarted Logstash, but it did not print the keys and values. See this:
[2019-12-18T15:47:57,523][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-12-18T15:47:59,944][INFO ][org.reflections.Reflections] Reflections took 45 ms to scan 1 urls, producing 20 keys and 40 values
[2019-12-18T15:48:01,590][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-12-18T15:48:01,597][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x555ef8da run>"}
[2019-12-18T15:48:01,959][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-12-18T15:48:02,058][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-12-18T15:48:02,061][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-12-18T15:48:02,443][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
[2019-12-18T15:55:11,449][INFO ][logstash.runner ] Logstash shut down.
[2019-12-18T15:55:37,533][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.4.0"}
[2019-12-18T15:55:39,958][INFO ][org.reflections.Reflections] Reflections took 44 ms to scan 1 urls, producing 20 keys and 40 values
[2019-12-18T15:55:41,574][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-12-18T15:55:41,580][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x19a19969 run>"}
[2019-12-18T15:55:41,862][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2019-12-18T15:55:41,980][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-12-18T15:55:42,009][INFO ][filewatch.observingtail ][main] START, creating Discoverer, Watch with file and sincedb collections
[2019-12-18T15:55:42,356][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
I was trying your logstash.conf file with a test from stdin, but it is not running either. Here it is:
[root@ip-xx-x-x-xx logstash]# echo "myapp.myproject.notice.student.request-time = 2019-12-13 12:37:01.4 # myapp.myproject.notice.student.response-time = 2019-12-13 12:37:19.276" | /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf
Thread.exclusive is deprecated, use Thread::Mutex
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-12-18 16:43:38.388 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2019-12-18 16:43:38.408 [LogStash::Runner] runner - Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[ERROR] 2019-12-18 16:43:38.421 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
[root@ip-xx-x-x-xx ]#
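That FATAL error means the stdin test collided with the Logstash service that is still running: both instances tried to use the same data directory. One way around it is to give the test run its own `path.data` (the `/tmp/logstash-test` path below is just an example):

```
echo "myapp.myproject.notice.student.request-time = 2019-12-13 12:37:01.4 # myapp.myproject.notice.student.response-time = 2019-12-13 12:37:19.276" | \
  /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf --path.data /tmp/logstash-test
```

Alternatively, stop the running Logstash service before testing from stdin.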