CSV data upload problem


(Ellis.K.Issac) #1

hi there.
I have two issues with Logstash, but first, here is my test setup:

  • Windows 10
  • Elasticsearch, Kibana, Logstash v6.2.2
  1. CSV data upload
    It works in the v6.2.2 family, but the same method does not work in 6.4 or 6.5.0.
    I found something in the config file:
    input { file { sincedb_path => "/dev/null"
    This works in v6.2.2, but higher versions report 'not found C:\dev\null'.
    I then commented out that line with '#': no more error, but also no upload.
    What can I do?

  2. replace @timestamp
    I create log data as a .CSV, and it has its own timestamp.
    I want to replace @timestamp with it, but @timestamp records when I ran Logstash instead.
    What can I do?

Here is my config.

input {
    file {
        path => "C:\SimDvcLk\tempLog.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter {
    date {
        match => ["Time", "yyyy-MM-dd HH:mm:ss"]
        target => "@timestamp"
    }

    csv {
        separator => ","
        columns => ["Time", "ControlMode", "ActTemp", "SetTemp", "Output"]
    }

    mutate { convert => ["ControlMode", "string"] }
    mutate { convert => ["ActTemp", "float"] }
    mutate { convert => ["SetTemp", "float"] }
    mutate { convert => ["Output", "float"] }
}
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "templog"
    }
    stdout {}
}

(Mark Walkom) #2

Should be just nul on Windows.

Logstash executes things sequentially, so your date filter needs to go after the CSV one, otherwise it has no idea where the Time field is.
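Putting both points together, the filter section of the config above would be reordered like this (a sketch based on the original post's field names; the csv filter now runs first so Time exists before date tries to parse it):

filter {
    # Parse the CSV line first so the Time field exists...
    csv {
        separator => ","
        columns => ["Time", "ControlMode", "ActTemp", "SetTemp", "Output"]
    }
    # ...then the date filter can copy it into @timestamp.
    date {
        match => ["Time", "yyyy-MM-dd HH:mm:ss"]
        target => "@timestamp"
    }
}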


(Ellis.K.Issac) #3

Thanks for the reply,
but I still have a problem. ;(
First, I upgraded to v6.5.0.

  1. CSV data upload
    sincedb_path => "nul" no longer causes an error. Great. :slight_smile:
    But Logstash still does not upload the .CSV data.

  2. replace @timestamp
    I changed the .CSV file's timestamps to ISO8601 format,
    then changed the config file to follow the Logstash getting-started video.
    It does not work in v6.2.2 either;
    it still does not behave like the video.

please help.


(Christian Dahlqvist) #4

If you can show us a sample log line, it may be easier to help find out why it is not working.


(Mark Walkom) #5

Also, please don't post pictures of text; they are difficult to read, and some people may not even be able to see them.


(Ellis.K.Issac) #6

sorry about the picture, my bad.

The sample CSV format is like this:

  Time,ControlMode,ActTemp,SetTemp,Output
  2018-11-15T02:27:57.965Z,idle,25,50,0
  2018-11-15T02:27:57.967Z,tuning,25.2,50,100

The rest of the file follows the same format.

And the config file is:

input{
	file{
		path => "C:\SimDvcLk\tempLog.csv"
		start_position => "beginning"
		sincedb_path => "nul"
	}
}
filter{
	csv {
		separator => ","
		columns => ["Time","ControlMode","ActTemp","SetTemp","Output"]
	}

	mutate{convert=>["Time","string"]}
	mutate{convert=>["ControlMode","string"]}
	mutate{convert=>["ActTemp","float"]}
	mutate{convert=>["SetTemp","float"]}
	mutate{convert=>["Output","float"]}
}
output {
	elasticsearch {
		hosts => "http://localhost:9200"
		index => "templog"
	}
	stdout { codec => "rubydebug" }
}

I removed the date filter for now.
Thanks for helping me.
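For the ISO8601 timestamps in the sample above, the removed date filter could be re-added after the csv filter like this (a sketch; ISO8601 is a built-in pattern name the Logstash date filter accepts):

filter {
	csv {
		separator => ","
		columns => ["Time","ControlMode","ActTemp","SetTemp","Output"]
	}
	# ISO8601 matches timestamps like 2018-11-15T02:27:57.965Z
	date {
		match => ["Time", "ISO8601"]
		target => "@timestamp"
	}
}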


#7

hello,

if you are using a Windows machine, make sure to change the path from

path => "C:\SimDvcLk\tempLog.csv"

to

path => "C:/SimDvcLk/tempLog.csv"

Hope this helps,

Regards,
Balu
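Combining the suggestions in this thread, the Windows input block would look like this (a sketch; the path is the one from the original post):

input {
	file {
		# Forward slashes work in Logstash paths on Windows and avoid escape issues
		path => "C:/SimDvcLk/tempLog.csv"
		start_position => "beginning"
		# On Windows use "nul" instead of /dev/null to disable the sincedb
		sincedb_path => "nul"
	}
}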


(Ellis.K.Issac) #8

Still the same, but don't worry.
I use v6.5.0 for everything except Logstash;
Logstash v6.2.2 can upload to v6.5.0 Elasticsearch.
Thank you.


#9

Check this sample example of a CSV upload, which is working fine,