Facing an issue sending the log file to Elastic

Hi All,

I am unable to send log messages to Elasticsearch using Logstash; it would be a great help if someone could assist me with this. Here is my test code (in fact, I tried several scripts, but without success :frowning: ).
Example #1

input {
  file {
    path => "C:/Users/UX013032/AppData/Local/UiPath/2019-08-26_Execution.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => ["%{COMBINEDAPACHELOG}"] }
    match => { "message" => ["%{*passedd}"] }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "uipath-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Example #2
input {
  file {
    path => "C:/Users/UX013032/AppData/Local/UiPath/2019-08-26_Execution.log"
    start_position => "beginning"
  }
}

filter {}

output {
  elasticsearch {
    hosts => "localhost"
    index => "uipath"
  }
}

In the Logstash console I see the message below.

c:\Softwares\logstash-7.3.1\logstash-7.3.1\bin>logstash -f C:\Softwares\logstash-7.3.1\logstash-7.3.1\NovusProjectConfigFile\novus2.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to c:/Softwares/logstash-7.3.1/logstash-7.3.1/logs which is now configured via log4j2.properties
[2019-08-28T18:03:53,826][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-08-28T18:03:53,868][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.1"}
[2019-08-28T18:03:56,413][INFO ][org.reflections.Reflections] Reflections took 85 ms to scan 1 urls, producing 19 keys and 39 values
[2019-08-28T18:03:58,699][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-08-28T18:03:59,113][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-08-28T18:03:59,230][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-28T18:03:59,238][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-08-28T18:03:59,285][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-08-28T18:03:59,410][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-08-28T18:03:59,596][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-08-28T18:03:59,609][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0xf90798 run>"}
[2019-08-28T18:03:59,709][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-08-28T18:04:01,742][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"c:/Softwares/logstash-7.3.1/logstash-7.3.1/data/plugins/inputs/file/.sincedb_87b3fb1daacda33c0274de8c8b01b4eb", :path=>["C:/Users/UX013032/AppData/Local/UiPath/2019-08-26_Execution.log"]}
[2019-08-28T18:04:01,815][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-08-28T18:04:01,930][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-28T18:04:01,943][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-28T18:04:02,697][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Actually, when I search for the index pattern in Kibana, I am unable to find my index. I'm not sure whether Logstash executed and sent the log info to the ELK stack.

Please help...

Is your file changing?
If not, you will not reingest the data because the sincedb file knows about it.

You can remove the sincedb file and try again.
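Alternatively, you can disable sincedb tracking entirely by pointing `sincedb_path` at the null device, so the file is re-read from the beginning on every run. A minimal sketch, assuming your original log path (note it is "NUL" on Windows, which is the OS null device, not a literal file named "null"):

```
input {
  file {
    path => "C:/Users/UX013032/AppData/Local/UiPath/2019-08-26_Execution.log"
    start_position => "beginning"
    # Windows null device; on Linux/macOS use "/dev/null" instead
    sincedb_path => "NUL"
  }
}
```

This is only suitable for testing, since every restart will reingest the whole file.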

Thank you so much, Sir, for responding quickly. I tried it, but my log file is still not ingested into ES :frowning: . The second example I posted above is a similar one.

I tried another way as well; it still says Logstash started successfully, but nothing gets updated in ES:

input {
  file {
    path => "//AppData/Local/UiPath/Logs/2019-08-26_Execution.log"
    sincedb_path => "null"
  }
}

filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    #flush_size => 5000
    #idle_flush_time => 30
    index => "novus-m-%{+YYYY.MM}"
  }
  stdout {}
}

c:\Softwares\logstash-7.3.1\logstash-7.3.1\bin>logstash -f C:\Softwares\logstash-7.3.1\logstash-7.3.1\NovusProjectConfigFile\novus5.conf
Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to c:/Softwares/logstash-7.3.1/logstash-7.3.1/logs which is now configured via log4j2.properties
[2019-08-28T22:14:10,164][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-08-28T22:14:10,202][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.1"}
[2019-08-28T22:14:12,983][INFO ][org.reflections.Reflections] Reflections took 84 ms to scan 1 urls, producing 19 keys and 39 values
[2019-08-28T22:14:16,380][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-08-28T22:14:16,798][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-08-28T22:14:16,911][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-28T22:14:16,919][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-08-28T22:14:16,970][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-08-28T22:14:17,110][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-08-28T22:14:17,246][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-08-28T22:14:17,751][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-08-28T22:14:17,762][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0xe64962 run>"}
[2019-08-28T22:14:19,433][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-08-28T22:14:19,589][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-28T22:14:19,623][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-28T22:14:20,535][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.