Want to create multiple indices for multiple inputs


(Vivek Pandey) #1

Hi, I have multiple log files, created per user. I want to take all the user log files as input and create an index for each.

For example, the input section:
input {
  file {
    add_field => [ "host", "my-dev-host" ]
    path => "D:\JHipster_Demo\adminlogFile.%d{yyyy-MM-dd}.log"
    codec => "plain"
  }
  file {
    add_field => [ "host", "my-dev-host" ]
    path => "D:\JHipster_Demo\unknownlogFile.%d{yyyy-MM-dd}.log"
    codec => "plain"
  }
}

The filter section:

filter {
  grok {
    match => [ "path", "D:/JHipster_Demo/(?<project>[^]+)/" ]
  }
  date {
    match => [ "timestamp" , "YYYY/MM/DD:HH:mm:ss Z" ]
  }
}

The output section:

output {
  elasticsearch {
    index => "global2-%{project}-%{+YYYY.MM.dd}"
    hosts => [ "localhost:9200" ]
  }
  stdout { codec => rubydebug }
}

I am getting the error below:

[2017-08-02T18:51:57,754][ERROR][logstash.agent ] Pipeline aborted due to error
{:exception=>#<RegexpError: premature end of char-class: /D:/JHipster_Demo/(?<project>[^]+)//m>, :backtrace=>[
  "org/jruby/RubyRegexp.java:1434:in `initialize'",
  "D:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:127:in `compile'",
  "D:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:286:in `register'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "D:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:280:in `register'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "D:/logstash-5.5.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.4.2/lib/logstash/filters/grok.rb:275:in `register'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:281:in `register_plugin'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:292:in `register_plugins'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:302:in `start_workers'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/pipeline.rb:226:in `run'",
  "D:/logstash-5.5.0/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
[2017-08-02T18:51:57,988][INFO ][logstash.agent ] Successfully started


(Magnus Bäck) #2

What is [^]+ supposed to match? [^x] means "match anything except x" but you're not saying what x is.

Are your files really named D:\JHipster_Demo\adminlogFile.%d{yyyy-MM-dd}.log?


(Vivek Pandey) #3

Hi, thanks for the reply.

In the post "How to handle multiple inputs with Logstash to different indices" you said to create a different topic, so I created this one.

My intention is to take all the logs from one folder as input. Suppose there is one log file per user; I want to take those log files as input and create multiple indexes based on the log file names.

Please suggest how I can take all the log files as input. Do I need a for loop or anything like that?

And how would I create the indexes? With a for loop?


(Magnus Bäck) #4

The Logstash configuration language doesn't have loops and I don't understand why you'd need any loops.

My intention is to take all the logs from one folder as input. Suppose there is one log file per user; I want to take those log files as input and create multiple indexes based on the log file names.

Yes, that much is clear. The problem is that your grok expression doesn't work.

Please answer the questions I asked previously.


(Vivek Pandey) #5

I have one folder where all the log files are created, one per user: if there are 10 users, 10 log files will be created. I can't write the input section 10 times, and if the number of users grows I can't keep adding new inputs to the Logstash config file every time.

So I want the input to take all the log files from that folder and create the respective indexes.

Is it clear now what I mean?

Thanks for your reply.


(Magnus Bäck) #6

Your current approach is correct but your grok expression is bad. You don't need a loop.

Now, last chance to answer my questions:

  • What is [^]+ supposed to match?
  • Are your files really named D:\JHipster_Demo\adminlogFile.%d{yyyy-MM-dd}.log?

(Vivek Pandey) #7

Thanks for your reply.
"What is [^]+ supposed to match?" I copied that from your example.
"Are your files really named D:\JHipster_Demo\adminlogFile.%d{yyyy-MM-dd}.log?" Yes, that is the filename.

What expression should I use so that it will match?
input {
  file {
    add_field => [ "host", "my-dev-host" ]
    path => "D:\JHipster_Demo\adminlogFile.%d{yyyy-MM-dd}.log"
    codec => "plain"
  }
}

What should I write instead of "adminlogFile.%d{yyyy-MM-dd}.log" so that it will pick up all my log files?

grok {
  match => [ "path", "D:/JHipster_Demo/(?<project>[^]+)/" ]
}

What do I need to use instead of (?<project>[^]+)/?

index => "global2-%{project}-%{+YYYY.MM.dd}"

What do I need to use instead of %{project}-%{+YYYY.MM.dd}?


(Magnus Bäck) #8

That i copied from your example

What example?

What should I write instead of "adminlogFile.%d{yyyy-MM-dd}.log" so that it will pick up all my log files?

Use a wildcard like *.log.
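For example, a minimal sketch of such a file input (reusing the directory from this thread):

```text
file {
  path => "D:/JHipster_Demo/*.log"   # the * wildcard matches every .log file in the folder
  codec => "plain"
}
```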

What do I need to use instead of (?<project>[^]+)/?

What do you want to match? The whole filename? Just the date? Something else?


(Vivek Pandey) #9

Thanks for replying.

1. Now my input is:

input {
  file {
    add_field => [ "host", "my-dev-host" ]
    path => "D:\JHipster_Demo\*.log"     // So it will take all the log files as input?
    codec => "plain"
  }
}

Now I want to filter with grok:

grok {
  match => [ "path", "D:/JHipster_Demo/(?<project>[^]+)/" ]
}

2. What should I use here instead of (?<project>[^]+)/ so that the filter is applied to the log files?

3. How do I create an index for each log file, based on the log file name?

The first point is clear: using * it will take all the log files as input. Please correct me if I'm wrong.

For the second and third points, how do I do it? Please help me.


(Magnus Bäck) #10

Please answer my questions.


(Vivek Pandey) #11

Which question are you talking about?
The one about the wildcard character I used in the last example, where you said to create a new thread?

Please state your question again.


(Magnus Bäck) #12

The one about the wildcard character I used in the last example, where you said to create a new thread?

I participate in dozens of threads each day and I can't keep track of them all.

Which question are you talking about?

What do you want to match? The whole filename? Just the date? Something else?


(Vivek Pandey) #13

Thanks for your reply.
Please see this link; that is where I copied the wildcard character from.

I just want to match the file name, e.g.
admin.log
user1.log
For each of the above log files I need a respective index.
I hope you understand.


(Magnus Bäck) #14

To match the name of the path component that follows "D:/JHipster_Demo" you can use this grok expression:

D:/JHipster_Demo/(?<filename>[^/]+)

Then, if the path field contains "D:/JHipster_Demo/adminlogFile.%d{yyyy-MM-dd}.log" you'll end up getting "adminlogFile.%d{yyyy-MM-dd}.log" in the filename field.
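Putting the pieces together, the whole pipeline might then look like this sketch (untested; paths and the filename field are taken from this thread, the mutate step is an addition). Two caveats: Elasticsearch index names must be lowercase, hence the lowercase step; and on Windows the path field may contain backslashes (D:\JHipster_Demo\...), in which case the forward-slash pattern below will not match and events will be tagged _grokparsefailure:

```text
input {
  file {
    path => "D:/JHipster_Demo/*.log"
    add_field => [ "host", "my-dev-host" ]
    codec => "plain"
  }
}

filter {
  grok {
    # capture everything after the folder name up to the next slash, i.e. the file name
    match => [ "path", "D:/JHipster_Demo/(?<filename>[^/]+)" ]
  }
  mutate {
    lowercase => [ "filename" ]   # Elasticsearch index names must be lowercase
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "global2-%{filename}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
```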


(Vivek Pandey) #15

Thanks for the reply.
OK, so now my input is:

file {
  add_field => [ "host", "my-dev-host" ]
  path => "D:\JHipster_Demo*.log"
  codec => "plain"
}

The filter is:

grok {
  match => [ "path", "D:/JHipster_Demo/(?<filename>[^/]+)" ]   // Same as your filter; I am not able to type the same thing due to the editor
}

The output is:

index => "global2-%{filename}-%{+YYYY.MM.dd}"

is this correct?


(Magnus Bäck) #16

is this correct?

Maybe. Try it out. I strongly suggest that you comment out the elasticsearch output first and use a stdout { codec => rubydebug } output for verifying that events look as expected. In this case you'd verify that the filename field is created and has the expected contents.
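Concretely, that debugging step could look like this sketch: comment out the elasticsearch block and keep only stdout until the filename field looks right:

```text
output {
  # elasticsearch {
  #   hosts => [ "localhost:9200" ]
  #   index => "global2-%{filename}-%{+YYYY.MM.dd}"
  # }
  stdout { codec => rubydebug }   # prints each event so you can inspect the filename field
}
```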


(Vivek Pandey) #17

I am not getting any errors, but no log details are coming to the console.

I think the input is not working.

file {
  add_field => [ "host", "my-dev-host" ]
  path => "D:\JHipster_Demo*.log"
  codec => "plain"
}


(Magnus Bäck) #18

Make sure you use D:\JHipster_Demo\*.log (or D:/JHipster_Demo/*.log) as the filename pattern. If it still doesn't work, point to an exact file and make sure you're able to get data from it.

If you want to parse these files from the beginning you need to adjust the file input's start_position parameter and clear any existing sincedb state. This has been covered here a hundred times before so I will not elaborate.
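For reference, those two adjustments might look like this sketch (sincedb_path => "NUL" is the usual way to disable the sincedb on Windows; use /dev/null on Linux):

```text
input {
  file {
    path => "D:/JHipster_Demo/*.log"
    start_position => "beginning"   # read existing files from the start, not only new lines
    sincedb_path => "NUL"           # don't remember read positions between runs (Windows)
  }
}
```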

Over and out.


(Vivek Pandey) #19

Yes, when I give the exact file name it works, but when I give "*.log" it doesn't.

I did not ask about the point below:

"If you want to parse these files from the beginning you need to adjust the file input's start_position parameter and clear any existing sincedb state. This has been covered here a hundred times before so I will not elaborate."

My only concern is that "*.log" is not working, as you suggested.

Also, the grok parse is not giving the correct name. I am getting this output on the console:

{
          "path" => "D:\\JHipster_Demo\\logFile.2017-08-04.log",
    "@timestamp" => 2017-08-04T05:29:21.985Z,
      "@version" => "1",
          "host" => [
        [0] "FS-WS195-D45",
        [1] "my-dev-host"
    ],
       "message" => " 2017-08-04 10:59:21,907 INFO  [metrics-logger-reporter-1-thread-1] metrics: type=TIMER, name=com.hexaware.rad.web.rest.UserResource.updateUser, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds\r",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}


(system) #20

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.