Hello,
I'm trying to use a simple .txt file as an input and get a parsed .txt file as the output/result. In between, I'm using a grok filter to match a specific pattern.
This is a simple test and it doesn't work.
My.conf:
input {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt"
    start_position => "beginning"
    #sincedb_path => "NUL"
  }
}

filter {
  grok {
    match => {
      "message" => "MIAM(%{GREEDYDATA:test_filter})"
    }
    remove_field => "message"
  }
}

output {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test3.txt"
  }
  stdout { codec => rubydebug }
}
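For reference, here is the file input variant I understand is recommended on Windows, with the sincedb line uncommented (as far as I know, the file input records how far it has read in a sincedb file, so pointing it at the NUL device makes every run re-read the file from the beginning; please correct me if I've misunderstood):

input {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt"
    start_position => "beginning"
    # NUL is the Windows null device, so no read position is persisted between runs
    sincedb_path => "NUL"
  }
}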
Started with: logstash -f my.conf
C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt content:
MIAM(DONE)
C:/Users/x/Desktop/logstash-7.2.0/bin/test3.txt is not created at all.
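(One thing I'm not sure about: I read that in the default tail mode the file input may only pick up lines appended after Logstash has started, so a way to test this would be to append a line from another cmd window while Logstash is running:)

echo MIAM(DONE)>> C:\Users\x\Desktop\logstash-7.2.0\bin\test.txt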
An interesting fact is that if I use input { stdin {} } instead of the file input, everything works fine: I just have to type MIAM(test) in the console, and it outputs the expected result, with the expected file test3.txt containing test_filter : DONE. So I think the problem comes from the input.
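For clarity, the working variant is the same config with only the input block swapped (filter and output unchanged):

input {
  stdin {}
}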
OS: Windows 10 x64
Console output:
Sending Logstash logs to C:/Users/x/Desktop/logstash-7.2.0/logs which is now configured via log4j2.properties
[2019-07-09T17:09:38,774][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-07-09T17:09:38,787][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-07-09T17:09:45,999][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-07-09T17:09:46,004][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x2218960d run>"}
[2019-07-09T17:09:46,777][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-09T17:09:46,833][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-09T17:09:46,836][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-09T17:09:47,393][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Thanks