Web log config file


(Rahul) #1

Hi, I am new to ELK.

I have created indices and deleted them through Kibana by query, and also manually from C:\ELK\elasticsearch-5.6.3\elasticsearch-5.6.3\data\nodes\0\indices.

When I try to create the indices again after modifying the config file, I get the message below, but the new index is not created.

Need help

Sending Logstash's logs to C:/ELK/logstash-5.6.3/logstash-5.6.3/logs which is now configured via log4j2.properties
[2018-03-01T19:05:32,566][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash-5.6.3/logstash-5.6.3/modules/fb_apache/configuration"}
[2018-03-01T19:05:32,576][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash-5.6.3/logstash-5.6.3/modules/netflow/configuration"}
[2018-03-01T19:05:36,526][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-03-01T19:05:36,529][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-03-01T19:05:36,798][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-03-01T19:05:36,893][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-03-01T19:05:36,899][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-03-01T19:05:36,914][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-03-01T19:05:37,217][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2018-03-01T19:05:39,950][INFO ][logstash.pipeline ] Pipeline main started
[2018-03-01T19:05:40,219][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}


(Magnus Bäck) #2

I have created indices and deleted them through Kibana by query, and also manually from C:\ELK\elasticsearch-5.6.3\elasticsearch-5.6.3\data\nodes\0\indices.

Don't ever do that. Always delete indices via the APIs.
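
For example, with curl (your_index_name is a placeholder for whichever index you want to remove):

curl -XDELETE http://localhost:9200/your_index_name

or, equivalently, from the Kibana Dev Tools console:

DELETE /your_index_name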

When I try to create the indices again after modifying the config file, I get the message below, but the new index is not created.

You haven't given us any details, but I'm guessing you're using a file input. If you want it to reprocess an old file, you need to delete the sincedb file.
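
While testing, you can also point the file input's sincedb at the Windows null device so Logstash forgets where it stopped and reprocesses the file on every run. A minimal sketch (the log path is a placeholder):

file {
  path => "C:\path\to\your.log"
  start_position => "beginning"
  # NUL is the Windows null device; use /dev/null on Linux.
  sincedb_path => "NUL"
}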


(Rahul) #3

Thanks for your suggestion.

Yes, I am using a file input; here it is.
My sample log data:

1**...5* - - [21/Apr/2017:00:31:46 -0600] "GET /api/releaseInfo HTTP/1.1" 200 1479
1**...5* - - [21/Apr/2017:00:32:46 -0600] "GET /api/releaseInfo HTTP/1.1" 200 1479


input {
  file {
    path => "C:\ELK\input_logs\weblogs\Weblogic_logs.txt"
    start_position => "beginning"
  }
}
filter {
  if [type] == "weblogic" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:cs-method} %{URIPATH:cs-uri-stem} %{NUMBER:sc-status:int} %{NUMBER:cs-bytes:int}" }
    }
    date {
      match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
      timezone => "UTC"
    }
  }
  mutate {
    remove_field => [ "log_timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["weblogic_1"]
    document_type => "weblogic_1"
    # user => elastic
    # password => changeme
  }
  stdout {}
}



(Magnus Bäck) #4

As I said, if you want it to reprocess an old file, you need to delete the sincedb file. See the file input documentation for details.


(Rahul) #5

I'll check and get back to you. Thanks!

And one more thing: please let me know whether my input config is correct.


(Magnus Bäck) #6

  • You're never setting the type to weblogic, so your filters will be skipped.
  • The grok expression doesn't match the input data.
  • The date pattern doesn't match the timestamp format in the input.
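
Untested, but a corrected configuration could look something like the sketch below. It assumes the stock COMMONAPACHELOG grok pattern fits your data, since your sample lines are in the Apache common log format; verify it against the real logs:

input {
  file {
    path => "C:\ELK\input_logs\weblogs\Weblogic_logs.txt"
    start_position => "beginning"
    # Set the type here so the conditional in the filter block matches.
    type => "weblogic"
  }
}
filter {
  if [type] == "weblogic" {
    grok {
      # COMMONAPACHELOG captures client IP, timestamp, method, path,
      # status code, and bytes from Apache-style access log lines.
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
    date {
      # The timestamp captured by COMMONAPACHELOG looks like
      # 21/Apr/2017:00:31:46 -0600, which this pattern matches.
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

The output section can stay as you have it.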

(Rahul) #7

Thank you, Magnus. I'll update the config and try again.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.