Issues with Logstash reading a CSV file

Hello! I'm pretty new to the ELK Stack, so I'm not sure how to troubleshoot this error.

I've set up Elasticsearch and Kibana successfully, but I'm having trouble getting Logstash to read my .csv file into the ELK Stack. I ran Logstash with the "--log.level debug" flag, and the error I keep running into is the following:

I'm not sure how to go about solving this issue. Any ideas?
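For reference, I'm launching Logstash roughly like this from the Logstash directory (the config file name here is just what I called mine):

```
bin\logstash.bat -f csv-pipeline.conf --log.level debug
```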

My config file is as follows:

```
input {
  file {
    path => "D:\Graduate School - Fall Semester 2021\Insure Research Project\Canadian Dataset\Processed Traffic Data for ML Algorithms\Friday-02-03-2018_TrafficForML_CICFlowMeter.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}

filter {
  csv {
    separator => ","
    columns => [ "Dst Port", "Protocol", "Timestamp", "Flow Duration", "Tot Fwd Pkts", "Tot Bwd Pkts", "TotLen Fwd Pkts", "TotLen Bwd Pkts", "Fwd Pkt Len Max", "Fwd Pkt Len Min", "Fwd Pkt Len Mean", "Fwd Pkt Len Std", "Bwd Pkt Len Max", "Bwd Pkt Len Min", "Bwd Pkt Len Mean", "Bwd Pkt Len Std", "Flow Byts/s", "Flow Pkts/s", "Flow IAT Mean", "Flow IAT Std", "Flow IAT Max", "Flow IAT Min", "Fwd IAT Tot", "Fwd IAT Mean", "Fwd IAT Std", "Fwd IAT Max", "Fwd IAT Min", "Bwd IAT Tot", "Bwd IAT Mean", "Bwd IAT Std", "Bwd IAT Max", "Bwd IAT Min", "Fwd PSH Flags", "Bwd PSH Flags", "Fwd URG Flags", "Bwd URG Flags", "Fwd Header Len", "Bwd Header Len", "Fwd Pkts/s", "Bwd Pkts/s", "Pkt Len Min", "Pkt Len Max", "Pkt Len Mean", "Pkt Len Std", "Pkt Len Var", "FIN Flag Cnt", "SYN Flag Cnt", "RST Flag Cnt", "PSH Flag Cnt", "ACK Flag Cnt", "URG Flag Cnt", "CWE Flag Count", "ECE Flag Cnt", "Down/Up Ratio", "Pkt Size Avg", "Fwd Seg Size Avg", "Bwd Seg Size Avg", "Fwd Byts/b Avg", "Fwd Pkts/b Avg", "Fwd Blk Rate Avg", "Bwd Byts/b Avg", "Bwd Pkts/b Avg", "Bwd Blk Rate Avg", "Subflow Fwd Pkts", "Subflow Fwd Byts", "Subflow Bwd Pkts", "Subflow Bwd Byts", "Init Fwd Win Byts", "Init Bwd Win Byts", "Fwd Act Data Pkts", "Fwd Seg Size Min", "Active Mean", "Active Std", "Active Max", "Active Min", "Idle Mean", "Idle Std", "Idle Max", "Idle Min", "Label" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "network_data"
  }
  stdout {}
}
```

It's quite a large dataset, so I'm reading the files in one at a time to make sure everything works (this file is about 300 MB; the full set is about 7 GB).

Try this. I changed the backslashes (\) in the file path to forward slashes (/) and NULL to NUL. On Windows the file input treats the path setting as a glob, which needs forward slashes, and NUL is the Windows null device (the equivalent of /dev/null), so Logstash won't persist its read position and will read the file from the beginning on each run.

```
input {
  file {
    path => "D:/Graduate School - Fall Semester 2021/Insure Research Project/Canadian Dataset/Processed Traffic Data for ML Algorithms/Friday-02-03-2018_TrafficForML_CICFlowMeter.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```
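Once Logstash is running, you can sanity-check that events are actually landing in the index with something like this (assuming Elasticsearch is on localhost:9200, as in your output block):

```
curl "http://localhost:9200/network_data/_count?pretty"
```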

This worked like a charm! Thank you for your help, I really appreciate it! You saved me a headache today.
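Next step is pointing the input at the whole folder so I can load the rest of the 7 GB. My plan is a glob along these lines (untested sketch, same folder as above, all CSVs):

```
input {
  file {
    # Pick up every CSV in the folder instead of naming files one by one
    path => "D:/Graduate School - Fall Semester 2021/Insure Research Project/Canadian Dataset/Processed Traffic Data for ML Algorithms/*.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```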

