Since updating to Logstash 5.2 (from 2.x) glob is []

Since updating my system (CentOS 6.5), Logstash, and Elasticsearch, there has been no input from Logstash. The files are in the correct directory (it's an NTFS share from a Windows server) and everyone has read access to the files. The only thing that's changed on the system is the Logstash and Elasticsearch version (this was working perfectly before updating). At this point every link on Google is purple and I've completely run out of ideas.

My Config

    file {
      path => ["/home/User/Desktop/Windows-Logs/u_ex1605*.log"] # if I run vi on this file, it opens fine
      start_position => "beginning"
      tags => ["iis-access", "local-file", "iislog", "may"]
      ignore_older => 0
      #sincedb_path => "/dev/null"
      codec => "plain"
    }

The Logstash log (some parts removed due to size):

    [2017-02-14T07:57:39,431][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@absolute_healthcheck_path = false
    [2017-02-14T07:57:39,431][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
    [2017-02-14T07:57:39,431][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
    [2017-02-14T07:57:39,437][DEBUG][logstash.agent           ] starting agent
    [2017-02-14T07:57:39,439][DEBUG][logstash.agent           ] starting pipeline {:id=>"main"}
    [2017-02-14T07:57:39,602][TRACE][logstash.inputs.file     ] Registering file input {:path=>["/home/search_op/Desktop/Windows-Logs/u_ex1605*.log"]}
    [2017-02-14T07:57:51,672][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
    [2017-02-14T07:57:51,680][DEBUG][logstash.inputs.file     ] _globbed_files: /home/search_op/Desktop/Windows-Logs/u_ex1605*.log: glob is: []
    [2017-02-14T07:57:51,979][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2017-02-14T07:57:51,980][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2017-02-14T07:57:52,076][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x18cf3a34 URL:http://localhost:9200/>}
    [2017-02-14T07:57:52,077][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
    [2017-02-14T07:57:52,124][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    [2017-02-14T07:57:52,128][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
    [2017-02-14T07:57:52,128][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x5d81cfcd URL://localhost>]}
    [2017-02-14T07:57:52,129][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
    [2017-02-14T07:57:52,137][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2017-02-14T07:57:52,137][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2017-02-14T07:57:52,141][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x57893aa2 URL:http://localhost:9200/>}
    [2017-02-14T07:57:52,142][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
    [2017-02-14T07:57:52,149][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    [2017-02-14T07:57:52,153][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}

    [2017-02-13T15:53:37,156][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
    [2017-02-13T15:53:37,164][INFO ][logstash.pipeline        ] Pipeline main started
    [2017-02-13T15:53:37,171][DEBUG][logstash.agent           ] Starting puma
    [2017-02-13T15:53:37,173][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
    [2017-02-13T15:53:37,173][DEBUG][logstash.api.service     ] [api-service] start
    [2017-02-13T15:53:37,199][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2017-02-13T15:53:42,165][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
    [2017-02-13T15:53:47,166][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
    [2017-02-13T15:53:50,520][DEBUG][logstash.inputs.file     ] _globbed_files: /home/search_op/Desktop/Windows-Logs/u_ex1605*.log: glob is: []
    [2017-02-13T15:53:52,168][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
    [2017-02-13T15:53:57,169][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
    [2017-02-13T15:54:02,169][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline

Logstash is also a sudo user; I added "logstash" and "Logstash" just to eliminate that...

What user does Logstash run as? What are the permissions and ownership/group of /home/User/Desktop/Windows-Logs/u_ex1605*.log and all directories leading up to that file?

Logstash is also a sudo user; I added "logstash" and "Logstash" just to eliminate that...

What do you mean by sudo user? A user that's allowed to run sudo? That doesn't help here.

Hi, thanks for the reply!

I added Logstash ALL=(ALL) NOPASSWD: ALL to the sudoers file (via sudo visudo).
I'm replacing the username with "user".

This is the output of ls -la:

Permissions on the user directory:

    drwxrw-rw-. 32 user user 4096 Feb 14 07:53 user

Permissions on the user's Desktop directory:

    drwxrwxrwx. 6 user user 4096 Feb 13 14:38 Desktop

Permissions on the Windows-Logs directory:

    drwxr-xr-x 1 root root 98304 Jan 11 09:00 Windows-Logs

Permissions on one of the files in the Windows-Logs directory:

    -rwxr-xr-x 0 root root 69652305 Dec 31 00:00 u_ex161230.log

I'm relatively new(ish) to Linux; I've been using it for a few months. If there's anything I've missed, just let me know! :slight_smile:

The user directory is mode drwxrw-rw- (766), but it's important that the directory is executable (e.g. drwxrwxrwx, 777). It doesn't need to be writable by everyone, though, so what you're really looking for is drwxr-xr-x (755). The same goes for the Desktop directory; it doesn't have to be writable by anyone but you.

You've solved it! All this time I thought the permissions were not the problem, when in fact they were. Thanks a bunch for the help!

I just have one more query (I can make a new thread if that's the preferred way to handle sub-questions).

If I wanted to take the year and month from within the log to use as the index name in Elasticsearch, how do I do that? At the minute I add the month in the tags and do an 'if' in the output, but this means I have 12 if statements. I'd ideally like to get the month and year from the date_time log field.

The date filter for the date_time field:

    date {
      match => [ "date_time", "YYYY-MM-dd HH:mm:ss" ]
      target => "date_time"
    }

I guess the best way would be to split the date_time field on '-' and then use the first and second parts? But I can't seem to find out how to actually use a field in the output index string.

i.e. this is what I have at the moment:

   if "june" in [tags] {
            elasticsearch {
                    hosts => "localhost"
                    index => "logstash-%{+YYYY}.06"
            }

    }

I'd prefer something like:

 if "june" in [tags] {
                elasticsearch {
                        hosts => "localhost"
                        index => "logstash-%{date_time[0]}.%{date_time[1]}" #assuming i've already split the date_time field
                }

        }

If I wanted to take the year and month from within the log to use as the index name in Elasticsearch, how do I do that?

Just use %{+YYYY.MM} in the index option of your elasticsearch output.
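
For example, a minimal sketch of that suggestion, reusing the hosts value from the snippets above:

    elasticsearch {
      hosts => "localhost"
      index => "logstash-%{+YYYY.MM}"   # year and month are taken from the event's @timestamp
    }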

I think that provides the current year and month, if I'm not mistaken?
I need the value from the log itself, as the logs range from two years old to the present!
Is there a way to do that? I've tried passing in the fields, but I end up with an index called:

    logstash-%{year}.%{month}

And this is how I'm setting it:

    index => "logstash-%{year}.%{month}"

and this is how year and month are set (within a ruby filter block):

 code => "
                 dateSplit = event.get('date_time')
                 dateSplit = String.try_convert(dateSplit).split('-') //not sure if this is correct, but i'm not getting an error.
                 event.set('year',dateSplit[0])
                 event.set('month',dateSplit[1]) "
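
One possible reason those fields never get set (an assumption, since the ruby filter's log output isn't shown): the date filter above has target => "date_time", so by the time the ruby filter runs, date_time is a timestamp object rather than a string, String.try_convert returns nil, and the split then raises. A sketch of a more defensive version of that filter, converting to a string first:

    ruby {
      code => "
        dt = event.get('date_time').to_s   # to_s works whether date_time is still a string or already a timestamp
        parts = dt.split('-')
        if parts.length >= 2
          event.set('year', parts[0])
          event.set('month', parts[1])
        end
      "
    }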

I think that provides the current year and month, if I'm not mistaken?

That notation uses the timestamp in the @timestamp field, which should be the timestamp of when the event occurred (but it defaults to the current time).
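
In other words, one way to pick up the year and month from the log itself is to let the date filter write to @timestamp (its default target) and keep %{+YYYY.MM} in the index name. A minimal sketch, assuming date_time still arrives in the YYYY-MM-dd HH:mm:ss format shown earlier:

    date {
      match => [ "date_time", "YYYY-MM-dd HH:mm:ss" ]
      # no target here, so @timestamp is set from the log's own date_time value
    }

With that in place, index => "logstash-%{+YYYY.MM}" resolves per event, so a May 2016 log line would end up in logstash-2016.05.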

Awesome, I've not got a fully working stack again! Thanks for your help.

Oops, meant to say 'now' got a working stack again...!

chmod to 755 did not work for me on 5.2.1 - I get "attempting to install template" regardless of the permissions set.

@GregMeadows, please start your own thread for your unrelated issue.

I already started a new thread for this issue.
