Multiple inputs for Logstash from Filebeat

Hi,

I'm shipping CSV files from two Windows Server 2012 machines with Filebeat, sending them to Logstash with Elasticsearch as the output.

Test1.csv contains this info:

"(PDH-CSV 4.0) (W. Europe Daylight Time)(-120)","\\Server2012ONE\Processor(_Total)\% Idle Time","\\SEN-MAILMIG\Processor(_Total)\% Processor Time" "07/06/2016 14:38:54.903","21946.437706803892","0,33","2222","3333"

Test2.csv contains this info:

"(PDH-CSV 4.0) (W. Europe Daylight Time)(-120)","\\Server2012TWO\PhysicalDisk(0 C:)\Disk Read Bytes/sec" "05/19/2016 10:57:35.915","98.920604165647148","0.047241671696285348"

Q1: How do I configure Logstash to output three values from a CSV file in Elasticsearch:

  1. The time to match @timestamp
  2. Make the values searchable
  3. To send the last two values from the first line as text, e.g. PhysicalDisk(0 C:) and Disk Read Bytes/sec

Q2: How do I use a distinct filter per CSV file, so test1.csv gets a different filter than test2.csv?

My current Logstash beats.conf is configured like this:

input {
  beats {
    port => 5044
  }
}

filter {
  csv {
    columns => ["date", "Cthing"]
    separator => ","
  }
}


output {
  elasticsearch {
    hosts => ["192.168.43.51:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Too much?

The time to match @timestamp

Use a date filter to parse the timestamp field into @timestamp.
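A minimal sketch, assuming the csv filter above has put the first column into a field named "date" and that the timestamps look like the ones in your sample files:

filter {
  date {
    # "date" is the column produced by the csv filter above; the pattern
    # matches timestamps like "07/06/2016 14:38:54.903".
    match => ["date", "MM/dd/yyyy HH:mm:ss.SSS"]
    # The parsed result is stored in @timestamp by default.
  }
}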

Make the values searchable

They will be.

To send the last two values from the first line as text.

As part of every event picked up from that file? Sorry, that's not possible. You can send the header row as one event, but it won't remember those fields for the subsequent events.

How do I use a distinct filter per CSV file, so test1.csv gets a different filter than test2.csv?

You can e.g. set a custom field on the Filebeat end to indicate what kind of file an event comes from, then use conditionals in your Logstash configuration to choose between different filters.
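A sketch of both ends, assuming the Filebeat 1.x prospector syntax; the field name, its values, and the column names are only placeholders:

# filebeat.yml
filebeat:
  prospectors:
    - paths:
        - c:\PerfLogs\Test1.csv
      fields:
        mycustomvar: CPU
    - paths:
        - c:\PerfLogs\Test2.csv
      fields:
        mycustomvar: HDDIO

And on the Logstash side:

filter {
  if [fields][mycustomvar] == "HDDIO" {
    csv {
      # columns for test2.csv
      columns => ["date", "disk_read_bytes_sec"]
    }
  } else {
    csv {
      # columns for test1.csv
      columns => ["date", "idle_time", "processor_time"]
    }
  }
}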

Alright, I understand.

The output of Filebeat is:

{"@timestamp":"2016-07-12T10:22:37.950Z","beat":{"hostname":"sen-mailmig","name":"sen-mailmig"},"count":1,"fields":{"mycustomvar":"HDDIO"},"input_type":"log","message":"\"07/11/2016 17:17:02.339\",\"1.5\"","offset":368,"source":"c:/PerfLogs/Test2.csv","type":"log"}

How do I filter this correctly in Logstash?
The 'message' field contains the real timestamp, not @timestamp.

Do I use the json filter?

Use either the json filter or the json codec.
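A minimal sketch of the filter approach, assuming the JSON document arrives in the "message" field:

filter {
  json {
    # Parse the JSON string in "message" and merge its keys into the event.
    source => "message"
  }
}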

Yes, I'm almost there!
How do I overwrite @timestamp with the timestamp that is in 'message'? And how do I make the 1.7 value searchable in Elasticsearch?

$ logstash -f stdinstdout.conf
Using JAVA_HOME=C:\Program Files (x86)\Java\jre1.8.0_91 retrieved from C:\ProgramData\Oracle\java\javapath\java.exe
io/console not supported; tty will not be manipulated
Settings: Default pipeline workers: 1
Pipeline main started
{"@timestamp":"2016-07-12T11:32:15.238Z","beat":{"hostname":"sen-mailmig","name":"sen-mailmig"},"count":1,"fields":null,"input_type":"log","message":"\"07/12/2016 12:22:02.339\",\"1.7\"","offset":434,"source":"c:/PerfLogs/Test2.csv","type":"log"}
{
       "message" => "\"07/12/2016 12:22:02.339\",\"1.7\"",
    "@timestamp" => "2016-07-12T11:32:15.238Z",
          "host" => "sen-mailmig",
         "count" => 1,
        "source" => "c:/PerfLogs/Test2.csv"
}

Please don't post screenshots. Use copy/paste.

Use a grok filter to extract the timestamp and the "1.7" string into their own fields. Use the date filter to parse the data in the timestamp field and store it in @timestamp.
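Something like this, as a sketch; the field names "newdate" and "value" are placeholders:

filter {
  grok {
    # Extract the quoted timestamp and number from lines like:
    #   "07/12/2016 12:22:02.339","1.7"
    match => { "message" => "\"%{DATESTAMP:newdate}\",\"%{NUMBER:value:float}\"" }
  }
  date {
    # Parse the extracted timestamp into @timestamp.
    match => ["newdate", "MM/dd/yyyy HH:mm:ss.SSS"]
  }
}

The :float coercion also turns "1.7" into a real number, so Elasticsearch can index it as such.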

Finally got it. Nice.

grok {
  # Note: every %{NUMBER} capture needs a field name; "b" below is a placeholder.
  match => {
    "message" => ["\"%{DATESTAMP:newdate}\",\"%{NUMBER:b}\",\"%{NUMBER:c}\""]
  }
}

Come to think of it, you could also have used the csv filter.
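For example, a sketch with placeholder column names:

filter {
  csv {
    # Let the csv filter split the quoted line into named columns.
    columns => ["newdate", "value"]
    separator => ","
  }
  mutate {
    # Make the value numeric so Elasticsearch indexes it as a number.
    convert => { "value" => "float" }
  }
}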

nvm, got it working.
Time to write my .json file.

Thread closed.

@Ajay1, please start a new thread for your unrelated question.