Logstash + historical import + oddball timestamps

I am having trouble getting any of my log files to use the timestamp in the log line rather than the time it was ingested. I have been googling, reading, and even explored the "Your topic is similar to..." suggestions from the website. I am a total noob, so I could still be missing something, but I am hoping it's something obvious to an advanced user.

My Apache config...
```
input { stdin { } }

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z", "MMM dd yyyy HH:mm:ss", "dd/MMM/yyyy:HH:mm:ss", "EEE MMM dd HH:mm:ss yyyy", "[EEE MMM dd HH:mm:ss yyyy]", "MMM dd HH:mm:ss yyyy" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
```

The test...
```
[root@localhost incoming]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/11-apachelog-filter.conf
OpenJDK 64-Bit Server VM warning: If the number of processors is expected to increase from one, then you should configure the number of parallel GC threads appropriately using -XX:ParallelGCThreads=N
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs to console
12:11:29.332 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
12:11:29.334 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
12:11:29.512 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
12:11:29.559 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
12:11:29.958 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
12:11:29.964 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
12:11:30.063 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
12:11:30.156 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
The stdin plugin is now waiting for input:
```
The test input...
```
[Tue Sep 05 07:13:49 2017] [error] [client 10.10.10.10] client sent HTTP/1.1 request without hostname (see RFC2616 section 14.23): /2jzswap
```
The return output...
```
{
    "@timestamp" => 2017-09-11T16:13:11.683Z,
      "@version" => "1",
          "host" => "localhost.localdomain",
       "message" => "[Tue Sep 05 07:13:49 2017] [error] [client 10.10.10.10] client sent HTTP/1.1 request without hostname (see RFC2616 section 14.23): /2jzswap",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}
```

What am I missing here?
Thanks in advance.

Just a small update...

The Apache ACCESS logs are parsed correctly; it's the ERROR logs that are not, as the comparison below shows.
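
For comparison, here is a combined-format access line (this one is made up for illustration) next to the error line from the test above; `%{COMBINEDAPACHELOG}` matches the first but not the second:

```
10.10.10.10 - - [05/Sep/2017:07:13:49 +0000] "GET /2jzswap HTTP/1.1" 200 2326 "-" "Mozilla/5.0"
[Tue Sep 05 07:13:49 2017] [error] [client 10.10.10.10] client sent HTTP/1.1 request without hostname (see RFC2616 section 14.23): /2jzswap
```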

As the `_grokparsefailure` tag indicates, your grok filter isn't matching, so it never extracts the timestamp into a field of its own, and the date filter that is supposed to parse that field obviously won't work. `%{COMBINEDAPACHELOG}` only matches access-log lines; the Apache error log uses a different layout, so it needs its own grok pattern.
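
Here is a minimal sketch of an error-log filter built from stock grok primitives (the field names like `errormessage` are just illustrative; newer grok pattern sets also ship an `HTTPD_ERRORLOG` pattern that may cover this directly):

```
filter {
  # Match lines like:
  # [Tue Sep 05 07:13:49 2017] [error] [client 10.10.10.10] client sent HTTP/1.1 request ...
  grok {
    match => { "message" => "\[%{DAY:day} %{MONTH:month} %{MONTHDAY:monthday} %{TIME:time} %{YEAR:year}\] \[%{LOGLEVEL:loglevel}\] \[client %{IPORHOST:clientip}\] %{GREEDYDATA:errormessage}" }
  }
  # Stitch the captured pieces back into a single field for the date filter
  mutate {
    add_field => { "timestamp" => "%{day} %{month} %{monthday} %{time} %{year}" }
  }
  # EEE = day-of-week name in Joda patterns; on success this overwrites
  # @timestamp with the time from the log line instead of the ingest time
  date {
    match => [ "timestamp", "EEE MMM dd HH:mm:ss yyyy" ]
  }
}
```

Run the same stdin test against that, and the rubydebug output should show `@timestamp` as 2017-09-05 rather than the time of import.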
