Hi,
I am trying to push a custom log file generated by my program into Elasticsearch (5.3.2) via Logstash (5.3.2).
I don't get any errors, but the index is not getting created either. Am I missing something? I am pretty new to the ELK stack and am trying things out on my own by reading the ELK documentation.
Here is the output:
C:\Data\ELK\logstash-5.3.2\logstash-5.3.2\bin>logstash -f AE-log.conf
Could not find log4j2 configuration at path /Data/ELK/logstash-5.3.2/logstash-5.3.2/config/log4j2.properties. Using default config which logs to console
13:06:15.598 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@localhost:9200/]}}
13:06:15.604 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@localhost:9200/, :path=>"/"}
13:06:15.827 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x48d8acfb URL:http://elastic:xxxxxx@localhost:9200/>}
13:06:15.829 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
13:06:16.209 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
13:06:16.221 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x327ae81d URL://localhost:9200>]}
13:06:16.499 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
13:06:17.440 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
13:06:17.568 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
Here is the config file:
input {
  file {
    path => [ "C:\Data\ELK\input\AElogs\AssistEdge_SE.log" ]
    start_position => "beginning"
    type => "log"
  }
}

filter {
  kv {
    value_split => "~"
    field_split => " "
  }
  grok {
    match => { "message" => "%{NUMBER:id} %{WORD:level} %{NUMBER:priority} %{WORD:srcmodule} %{WORD:method} %{WORD:message} %{WORD:description} %{WORD:userid} %{IP:client}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    user => "elastic"
    password => "XXXXXX"
    action => "index"
    index => "assistedge_se"
  }
  stdout { }
}
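In case it helps with debugging, this is the cut-down pipeline I plan to use to test the filters in isolation, pasting log lines on stdin and printing the parsed event with rubydebug (just a sketch based on my reading of the docs, not the config I actually run):

input {
  stdin { }
}

filter {
  kv {
    value_split => "~"
    field_split => " "
  }
}

output {
  stdout { codec => rubydebug }
}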
Here is the log file content for reference:
instid~1 level~Info priority~1 srcmodule~Utilities.Logging method~Logging message~Loaded logUserID from app.config. Value is : True description~NA userid~L3\john.kam ipaddress~10.1.99.121
instid~1 level~Info priority~1 srcmodule~Utilities.Logging method~Logging message~Loaded logErrorDetails from app.config. Value is : True description~NA userid~L3\john.kam ipaddress~10.1.99.121
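Based on the kv settings, the event I am hoping to get for the first line looks roughly like this (hand-written expectation, not actual output; the field names come straight from the log line):

{
       "instid" => "1",
        "level" => "Info",
     "priority" => "1",
    "srcmodule" => "Utilities.Logging",
       "method" => "Logging",
      "message" => "Loaded logUserID from app.config. Value is : True",
  "description" => "NA",
       "userid" => "L3\john.kam",
    "ipaddress" => "10.1.99.121"
}

One thing I am not sure about: the message and userid values themselves contain spaces, so I don't know whether kv with field_split => " " can handle those pairs cleanly, or whether that could be related to my problem.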