Using my log timestamp in logstash sample log (first-pipeline.conf)


(hasaN khaN) #1

I have just successfully completed the first example (first-pipeline.conf) from the Logstash tutorial, shipping logs from Filebeat to Logstash to Elasticsearch.

In Kibana, while creating the new index pattern "logstash-*", I am not getting a timestamp field from the logs themselves; instead I only get @timestamp, which is the actual log upload time.

Links used

Sample Apache Logs used:
https://download.elastic.co/demos/logstash/gettingstarted/logstash-tutorial.log.gz

Parsing Logs with Logstash:
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html


(Javier) #2

Hello:

This is normal and is covered in the doc you're mentioning.

If you want to match the log's own date + time fields, I think you might need to use a grok pattern that matches the date + time piece of each log entry and maps it to a particular field of your choice.
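As a rough sketch of that idea for Apache access logs (the field name "log_timestamp" is illustrative, not from the original config):

```
filter {
  grok {
    # HTTPDATE matches the Apache access-log time format,
    # e.g. 04/Jan/2015:05:13:42 +0000. The captured value lands
    # in a field of your choice; "log_timestamp" is illustrative.
    match => { "message" => "\[%{HTTPDATE:log_timestamp}\]" }
  }
}
```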


(hasaN khaN) #3

Thanks, I am using the first-pipeline.conf as given on
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html

input {
  beats {
    port => "5043"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }

  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

In the filter section I am using "%{COMBINEDAPACHELOG}" as given in the example/doc. Isn't that enough, or do I need to make changes in the filter { } section for each and every field? Since these are Apache web logs and Logstash understands the format, I believe it should be enough.


(Magnus Bäck) #4

You need a date filter. See https://www.elastic.co/guide/en/logstash/current/config-examples.html#_processing_apache_logs.
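Following that example, the missing piece can be sketched like this; the COMBINEDAPACHELOG grok pattern already captures the request time into a field named "timestamp", and the date filter then uses it as the event's @timestamp:

```
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    # "timestamp" is set by COMBINEDAPACHELOG,
    # e.g. 04/Jan/2015:05:13:42 +0000
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```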


(hasaN khaN) #5

Thanks, it's working.
Now I want to use the same first-pipeline.conf file to parse my application logs, which look like this.

Each line is divided in 3 parts
a) Date
b) LogType
c) Message

My first query is how do I parse a line into these 3 parts,
and the second is how do I parse the message of each line, which is different every time.

2017-04-16 04:17:24.497+05:30 [I] " Data: ABC Server started.."
2017-04-16 04:17:35.606+05:30 [D] " Data: XYZ List Generation - Start"
2017-04-16 04:18:15.309+05:30 [D] " Data: Restricted User List Generated. MaxTime [4/3/2017 12:32:24 PM]"
2017-04-16 04:18:20.106+05:30 [I] " Data: Normal Order Enabled in : MY_SERVER_A"
2017-04-16 04:18:20.841+05:30 [D] " Data: BulkOrderHome is Created Successfully..."
2017-04-16 04:18:22.778+05:30 [D] " Data: Bulk Report Disabled in : MY_SERVER_B"
2017-04-16 05:46:13.466+05:30 [D] " Data: Logged In Clients,0,Total Clients,0,P.Send Q,0,S.Send Q,0,Ack Size 0"
2017-04-19 06:12:56.312+05:30 [I] " Data: Holiday Master Loaded from 20170423 to 20171230"
2017-04-19 10:00:00.609+05:30 [I] " Data: GetLastSentDataSize: 12511212"
2017-04-19 10:00:02.546+05:30 [I] " Data: General Process Q : R=t|L=INDIA123"
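A minimal sketch of a filter for the three-part format above, assuming the sample lines are representative (the field names log_timestamp, loglevel, and msg are illustrative, not established names):

```
filter {
  grok {
    # TIMESTAMP_ISO8601 matches e.g. 2017-04-16 04:17:24.497+05:30;
    # [I] / [D] is captured as loglevel, the rest of the line as msg.
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:msg}" }
  }
  date {
    # Use the log's own date + time as the event's @timestamp.
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSSZZ", "ISO8601" ]
  }
}
```

The varying free-text message part would stay in msg; pulling structured values out of it would need further per-message grok or kv patterns.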


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.