CSV data to Elasticsearch using Logstash

Hello, I'm trying to push CSV data to Elasticsearch using Logstash. Here's my Logstash conf file:

input{
	file{
		path => "C:\Users\zk3i\Desktop\ELK\kpassdemo_vue_qsse.CSV"
		start_position => "beginning"
		sincedb_path => "C:\Users\zk3i\Desktop\ELK\null.txt"
	}
}

filter{
	csv{
		separator => ","
		columns => ["Nom","Etape","Etape [id<:ver>]","Nature évènement","Nature évènement [id<:ver>]","Lieu" ,"Lieu [id<:ver>]","Détecté par", "Détecté par [id<:ver>]","Date de détection"]
	}	
}

output{
	elasticsearch{
		hosts => "http://localhost:9200"
		index => "qsse"
	}
	
	stdout{ }
}

The config file is located at C:\Users\zk3i\Desktop\ELK\logstach_qsse.conf.
Elasticsearch is running as a service; here's what I get when I go to localhost:9200:
{
  "name" : "elk_local",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "zzjDN3PKQOyvBcpi9_I6Ng",
  "version" : {
    "number" : "7.2.0",
    "build_flavor" : "unknown",
    "build_type" : "unknown",
    "build_hash" : "508c38a",
    "build_date" : "2019-06-20T15:54:18.811730Z",
    "build_snapshot" : false,
    "lucene_version" : "8.0.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}
I don't know what I'm doing wrong; I'm really new to the Elastic Stack.
When I run ./bin/logstash -f C:\Users\zk3i\Desktop\ELK\logstach_qsse.conf I get tons of errors. You can find the error log here: https://pastebin.com/n7dMiiiF

I couldn't paste it here because posts are limited to 7000 chars :confused:

There are two basic errors here. The first is

java.io.IOException: Could not create directory C:\Program Files\Elastic\Logstach\7.2.0\logs

That in turn results in hundreds of lines of errors from the log4j RollingFileAppender, as it cannot create log files in that directory. I suggest you create that directory and make sure it is writable by the user that Logstash runs as.

The next error is

[2019-07-24T16:15:50,074][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.

What is path.config set to?

Thirdly, you cannot use backslash in the path option of a file input. Change them to forward slash.
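
For example, here is the file input from your config with the same paths, just reslashed (backslashes get interpreted as escape characters in the path option, while forward slashes work fine on Windows):

input{
	file{
		path => "C:/Users/zk3i/Desktop/ELK/kpassdemo_vue_qsse.CSV"
		start_position => "beginning"
		sincedb_path => "C:/Users/zk3i/Desktop/ELK/null.txt"
	}
}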


Thanks for your answer. I created the logs folder and I still have that issue. path.config is set to C:\Users\zk3i\Desktop\ELK\logstach_qsse.conf

Run with '--log.level debug'. Check the line

[2019-07-24T14:58:07,125][DEBUG][logstash.runner          ] *path.config: "/home/user/logstash.conf"

and make sure that path.config contains what you expect.


path.config is what I expect, and I still get this error: [2019-07-24T17:37:55,467][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.

That error means the configuration is empty. I cannot explain that.


I fixed it! I noticed that it was using the Git folder as the home folder because I'm using the Git terminal x). Now I'm having another error:
[2019-07-25T10:00:17,757][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-07-25T10:00:22,817][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-07-25T10:00:22,817][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-07-25T10:00:23,484][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-07-25T10:00:27,765][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

The index is created, but its health isn't green, it's yellow.

And I'm also getting some CSV parse failures:

[2019-07-25T10:57:09,113][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>#<LogStash::Event:0x5b9ae3f5>}
[2019-07-25T10:57:09,113][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"EVT-QSSE-2018-0034,4. Clôturé,116,Qualité,1,\"\",\"\",Nicolas Duval,1,\"\"\r", :exception=>#<RuntimeError: Invalid FieldReference: `[Etape [id<:ver>]]`>}
[2019-07-25T10:57:09,114][DEBUG][logstash.filters.csv     ] Running csv filter {:event=>#<LogStash::Event:0x78b13507>}
[2019-07-25T10:57:09,115][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"EVT-QSSE-2019-0004,2. En traitement,114,Santé,3,Atelier,1,Nicolas Duval,1,\"\"\r", :exception=>#<RuntimeError: Invalid FieldReference: `[Etape [id<:ver>]]`>}
[2019-07-25T10:57:09,116][WARN ][logstash.filters.csv     ] Error parsing csv {:field=>"message", :source=>"EVT-QSSE-2018-0026,2. En traitement,114,Sécurité,2,Atelier,1,Sébastien Lopez,2,24/06/2018\r", :exception=>#<RuntimeError: Invalid FieldReference: `[Etape [id<:ver>]]`>}

Those DEBUG messages are normal.

If you have a single node your index health will normally be yellow because there is nowhere to assign the replica shards to.
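
If the yellow status bothers you on a single-node setup, you can set the replica count for that index to zero via the index settings API (a sketch; adjust the index name if yours differs):

PUT /qsse/_settings
{
  "index" : { "number_of_replicas" : 0 }
}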

Remove the square brackets from your column names.
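
For example, here is the csv filter from your config with the brackets stripped out. The renamed columns like "Etape id_ver" are only a suggestion; any names without square brackets will do, since brackets are reserved for Logstash field reference syntax:

filter{
	csv{
		separator => ","
		columns => ["Nom","Etape","Etape id_ver","Nature évènement","Nature évènement id_ver","Lieu","Lieu id_ver","Détecté par","Détecté par id_ver","Date de détection"]
	}
}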


Working perfectly, thanks! Can you please suggest a good website or YouTube channel with Elasticsearch courses?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.