Apache log not loading into Elasticsearch through Logstash

Hi, I am trying to load an Apache log (dummy data) into Elasticsearch through Logstash. Below is my config file.
I do not see any index created in Kibana. Am I doing something wrong?

input {
  file {
    path => "C:\Elastic\Data\log"
    type => "logs"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => {
      "message" => "%{COMBINEDAPACHELOG}"
    }
  }
  mutate {
    convert => { "bytes" => "integer" }
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    locale => "en"
    remove_field => "timestamp"
  }
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  stdout {
    codec => dots
  }
  elasticsearch {
  }
}

Thanks,
Gopal

You cannot use backslashes in the path option of a file input. Use forward slashes.
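For example, the path in your input would become:

  path => "C:/Elastic/Data/log"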


Thanks for the response. I changed the backslashes to forward slashes, but I still could not load any messages. Here is what I am getting in the Logstash terminal:

C:\Elastic\logstash-7.1.1\bin>logstash -f C:\Elastic\Data\apache.conf
Sending Logstash logs to C:/Elastic/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-07-26T16:44:30,201][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-07-26T16:44:30,254][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-07-26T16:44:43,202][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2019-07-26T16:44:43,637][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2019-07-26T16:44:43,720][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-07-26T16:44:43,729][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-07-26T16:44:43,783][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1"]}
[2019-07-26T16:44:43,808][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"C:/Elastic/logstash-7.1.1/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.1-java/vendor/GeoLite2-City.mmdb"}
[2019-07-26T16:44:43,844][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-07-26T16:44:44,348][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-07-26T16:44:44,688][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x1ab1a2c run>"}
[2019-07-26T16:44:50,193][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Elastic/logstash-7.1.1/data/plugins/inputs/file/.sincedb_945ff527acaef3c416995fe0eeb93367", :path=>["C:/Elastic/Data/log"]}
[2019-07-26T16:44:50,252][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-26T16:44:50,368][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-26T16:44:50,368][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-26T16:44:51,105][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Below is the input section of my config file:

input {
  file {
    path => "C:/Elastic/Data/log"
    type => "logs"
    start_position => "beginning"
  }
}

Add '--log.level trace' to the command line and see what filewatch has to say.
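For example, something like:

logstash -f C:/Elastic/Data/apache.conf --log.level trace

In the trace output, look for messages from the filewatch components (the Discoverer and Watch created in the START line above) that show whether the file is being discovered and read.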

Hi, I added the option you mentioned but could not find any error. I am not sure what I need to look for.

Below is the result of the command:
C:\Elastic\logstash-7.1.1\bin>logstash -f C:\Elastic\Data\apache.conf --log.level trace
Sending Logstash logs to C:/Elastic/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-07-30T11:28:11,477][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"C:/Elastic/logstash-7.1.1/modules/fb_apache/configuration"}
[2019-07-30T11:28:11,525][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x1a344c7 @directory="C:/Elastic/logstash-7.1.1/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2019-07-30T11:28:11,534][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"C:/Elastic/logstash-7.1.1/modules/netflow/configuration"}
[2019-07-30T11:28:11,550][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0xe91e0b @directory="C:/Elastic/logstash-7.1.1/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2019-07-30T11:28:11,921][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2019-07-30T11:28:11,926][DEBUG][logstash.runner ] node.name: "NJ1L668BKC2"
[2019-07-30T11:28:11,929][DEBUG][logstash.runner ] *path.config: "C:\Elastic\Data\apache.conf"
[2019-07-30T11:28:11,937][DEBUG][logstash.runner ] path.data: "C:/Elastic/logstash-7.1.1/data"
[2019-07-30T11:28:11,939][DEBUG][logstash.runner ] modules.cli:
[2019-07-30T11:28:11,957][DEBUG][logstash.runner ] modules:
[2019-07-30T11:28:11,966][DEBUG][logstash.runner ] modules_list:
[2019-07-30T11:28:11,977][DEBUG][logstash.runner ] modules_variable_list:
[2019-07-30T11:28:11,981][DEBUG][logstash.runner ] modules_setup: false
[2019-07-30T11:28:12,005][DEBUG][logstash.runner ] config.test_and_exit: false
[2019-07-30T11:28:12,021][DEBUG][logstash.runner ] config.reload.automatic: false
[2019-07-30T11:28:12,029][DEBUG][logstash.runner ] config.reload.interval: 3000000000
[2019-07-30T11:28:12,033][DEBUG][logstash.runner ] config.support_escapes: false
[2019-07-30T11:28:12,111][DEBUG][logstash.runner ] config.field_reference.parser: "STRICT"
[2019-07-30T11:28:12,164][DEBUG][logstash.runner ] metric.collect: true
[2019-07-30T11:28:12,240][DEBUG][logstash.runner ] pipeline.id: "main"
[2019-07-30T11:28:12,291][DEBUG][logstash.runner ] pipeline.system: false
[2019-07-30T11:28:12,326][DEBUG][logstash.runner ] pipeline.workers: 4
[2019-07-30T11:28:12,332][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2019-07-30T11:28:12,351][DEBUG][logstash.runner ] pipeline.batch.delay: 50
[2019-07-30T11:28:12,372][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2019-07-30T11:28:12,396][DEBUG][logstash.runner ] pipeline.java_execution: true
[2019-07-30T11:28:12,406][DEBUG][logstash.runner ] pipeline.reloadable: true
[2019-07-30T11:28:12,425][DEBUG][logstash.runner ] path.plugins:
[2019-07-30T11:28:12,451][DEBUG][logstash.runner ] config.debug: false
[2019-07-30T11:28:12,473][DEBUG][logstash.runner ] *log.level: "trace" (default: "info")
[2019-07-30T11:28:12,476][DEBUG][logstash.runner ] version: false
[2019-07-30T11:28:12,482][DEBUG][logstash.runner ] help: false
[2019-07-30T11:28:12,517][DEBUG][logstash.runner ] log.format: "plain"
[2019-07-30T11:28:12,561][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2019-07-30T11:28:12,595][DEBUG][logstash.runner ] http.port: 9600..9700
[2019-07-30T11:28:12,604][DEBUG][logstash.runner ] http.environment: "production"
[2019-07-30T11:28:12,612][DEBUG][logstash.runner ] queue.type: "memory"
[2019-07-30T11:28:12,645][DEBUG][logstash.runner ] queue.drain: false
[2019-07-30T11:28:12,658][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2019-07-30T11:28:12,699][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2019-07-30T11:28:12,744][DEBUG][logstash.runner ] queue.max_events: 0
[2019-07-30T11:28:12,793][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2019-07-30T11:28:12,838][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2019-07-30T11:28:12,879][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2019-07-30T11:28:12,887][DEBUG][logstash.runner ] queue.checkpoint.retry: false
[2019-07-30T11:28:12,895][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2019-07-30T11:28:12,901][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2019-07-30T11:28:12,903][DEBUG][logstash.runner ] slowlog.threshold.warn: -1
[2019-07-30T11:28:12,908][DEBUG][logstash.runner ] slowlog.threshold.info: -1
[2019-07-30T11:28:12,933][DEBUG][logstash.runner ] slowlog.threshold.debug: -1
[2019-07-30T11:28:12,936][DEBUG][logstash.runner ] slowlog.threshold.trace: -1
[2019-07-30T11:28:12,936][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2019-07-30T11:28:12,941][DEBUG][logstash.runner ] keystore.file: "C:/Elastic/logstash-7.1.1/config/logstash.keystore"
[2019-07-30T11:28:12,957][DEBUG][logstash.runner ] path.queue: "C:/Elastic/logstash-7.1.1/data/queue"
[2019-07-30T11:28:12,968][DEBUG][logstash.runner ] path.dead_letter_queue: "C:/Elastic/logstash-7.1.1/data/dead_letter_queue"
[2019-07-30T11:28:12,972][DEBUG][logstash.runner ] path.settings: "C:/Elastic/logstash-7.1.1/config"
[2019-07-30T11:28:12,978][DEBUG][logstash.runner ] path.logs: "C:/Elastic/logstash-7.1.1/logs"
[2019-07-30T11:28:13,000][DEBUG][logstash.runner ] xpack.management.enabled: false
[2019-07-30T11:28:13,006][DEBUG][logstash.runner ] xpack.management.logstash.poll_interval: 5000000000
[2019-07-30T11:28:13,052][DEBUG][logstash.runner ] xpack.management.pipeline.id: ["main"]
[2019-07-30T11:28:13,053][DEBUG][logstash.runner ] xpack.management.elasticsearch.username: "logstash_system"
[2019-07-30T11:28:13,054][DEBUG][logstash.runner ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]

I am unable to explain that. The log level is clearly trace, yet there are no TRACE level messages. It is as though you do not have a file input configured at all.
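One thing worth trying (an assumption on my part, not something the logs above confirm): the file input persists its read position in the sincedb file shown in your earlier run, so a file that has already been read once will not be re-read on restart, even with start_position => "beginning". While testing on Windows you can point sincedb_path at NUL so the position is never saved:

input {
  file {
    path => "C:/Elastic/Data/log"
    start_position => "beginning"
    # NUL is the Windows equivalent of /dev/null; the read position is
    # never persisted, so the whole file is re-read on every run.
    sincedb_path => "NUL"
  }
}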
