Unable to establish connection between Logstash and Elasticsearch to parse iis log files

OS : Ubuntu 16.04
Installed : Elasticsearch, Kibana, Logstash (All 3 are of version 6.1)
Log type : iis

This is my config file for Logstash

input {
    file {
        type => "iis-w3c"
        path => "/home/harsha/test.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    if [message] =~ "^#" {
        drop {}
    }

    grok {
        match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:serviceName} %{WORD:serverName} %{IP:serverIP} %{WORD:method} %{URIPATH:uriStem} %{NOTSPACE:uriQuery} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clientIP} %{NOTSPACE:protocolVersion} %{NOTSPACE:userAgent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:requestHost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:win32response} %{NUMBER:bytesSent} %{NUMBER:bytesReceived} %{NUMBER:timetaken} %{IPORHOST:OriginalIP}"]
    }

    date {
        match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
        timezone => "Etc/UTC"
    }

    mutate {
        convert => ["bytesSent", "integer"]
        convert => ["bytesReceived", "integer"]
        convert => ["timetaken", "integer"]

        add_field => { "clientHostname" => "%{clientIP}" }

        remove_field => [ "log_timestamp" ]
    }
}

output {
    elasticsearch {
        embedded => false
        host => "localhost"
        port => 9200
        protocol => "http"
        index => "my_index"
    }

    stdout { codec => rubydebug }
}

This file is in the correct location, /etc/logstash/conf.d.

I have ES installed and localhost:9200 is giving me what it is supposed to.
I have kibana installed as well and it is correctly configured to ES.

(Do I need to create the index in ES beforehand? Because I didn't.)

When I start Logstash with "sudo systemctl start logstash.service", nothing happens.

The test.log file is in the correct location and an example line of it is

2018-01-24 23:59:51 W3SVC11 happyweb 10.2.0.5 GET /blog/wp-content/uploads/2017/08/Mobile-accessories-min.jpg - 443 - 172.68.58.11 HTTP/1.1 Mozilla/5.0+(compatible;+MSIE+9.0;+Windows+NT+6.0;+Trident/5.0;++Trident/5.0) __cfduid=d422c9e6ed63a5957e526427f6341b8891516838382 - 200 0 0 50988 617 485 131.253.25.73

I have tested my format with the Grok Debugger, and according to it the filters are fine.
So I don't know what is missing. Can someone please help me figure it out?

By default, Logstash tails test.log and only picks up new lines. Read the file input documentation and pay special attention to the sincedb_path and start_position options.
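
As a sketch, a file input that always re-reads the file from the top looks something like this (the path is just an example; sincedb_path => "/dev/null" discards the recorded position so nothing is remembered between restarts):

    input {
        file {
            path => "/home/harsha/test.log"
            # read unseen files from the top instead of tailing them
            start_position => "beginning"
            # do not persist the read position, so the file is re-read on every restart
            sincedb_path => "/dev/null"
        }
    }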


Thanks @magnusbaeck, but what do I have to do to index already-existing logs into ES? I even added a few new lines to the test.log file and they are not being inserted into ES either, and when I check the status of Logstash it is up and running.

Your elasticsearch output section looks like it was copied from a very old example; in fact, localhost and 9200 are the defaults, so they don't need to be specified. It should look something like this:

  elasticsearch {
    index => "my_index"
  }

My guess is that the Logstash service is being continually restarted because it exits with a configuration error, and that error message ends up in the journalctl logs.

You can download another copy of Logstash to a different folder and run it manually. What do you see in the console?
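
For example, commands along these lines should surface the error (the paths below assume a default .deb install; adjust them to wherever your extra copy lives):

    # see what the service has been logging
    sudo journalctl -u logstash.service

    # validate the pipeline configuration and exit, printing any errors to the console
    /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit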


Thanks for the answer, it helped, but the reason my logs weren't indexed turned out to be the data directory used by Logstash. I changed the path to a new directory and it worked fine.

Anyway, is there a way around this? I am experimenting with the Elastic Stack, and each time I change my configuration I have to give Logstash a new directory for its data. Thank you!

Read this: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#_tracking_of_current_position_in_watched_files
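
In short, the position in each watched file is tracked in a sincedb file under Logstash's data directory, which is why moving the data directory made the file look new again. Rather than moving the whole directory, one approach (sketched below; the sincedb path is just an example) is to give the input its own sincedb file and delete that file whenever you want the log re-read:

    input {
        file {
            path => "/home/harsha/test.log"
            start_position => "beginning"
            # per-input position file; delete it to make Logstash re-read from the top
            sincedb_path => "/tmp/sincedb-test-log"
        }
    }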


Thank you very much @guyboertje

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.