Import CSV file to Elasticsearch using Logstash

Hi,
I need to import a CSV file to Elasticsearch using Logstash, so I created a config file:
input {
  file {
    path => "/Users/salma/Desktop/creditcard.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Time","V1","V2","V3","V4","V5","V6","V7","V8","V9","V10","V11","V12","V13","V14","V15","V16","V17","V18","V19","V20","V21","V22","V23","V24","V25","V26","V27","V28","Amount"]
    remove_field => ["class"]
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    action => "index"
    index => "dataset"
  }
}

And this is my data:

Time,"V1","V2","V3","V4","V5","V6","V7","V8","V9","V10","V11","V12","V13","V14","V15","V16","V17","V18","V19","V20","V21","V22","V23","V24","V25","V26","V27","V28","Amount","Class"
0,-1.3598071336738,-0.0727811733098497,2.53634673796914,1.37815522427443,-0.338320769942518,0.462387777762292,0.239598554061257,0.0986979012610507,0.363786969611213,0.0907941719789316,-0.551599533260813,-0.617800855762348,-0.991389847235408,-0.311169353699879,1.46817697209427,-0.470400525259478,0.207971241929242,0.0257905801985591,0.403992960255733,0.251412098239705,-0.018306777944153,0.277837575558899,-0.110473910188767,0.0669280749146731,0.128539358273528,-0.189114843888824,0.133558376740387,-0.0210530534538215,149.62,"0"
0,1.19185711131486,0.26615071205963,0.16648011335321,0.448154078460911,0.0600176492822243,-0.0823608088155687,-0.0788029833323113,0.0851016549148104,-0.255425128109186,-0.166974414004614,1.61272666105479,1.06523531137287,0.48909501589608,-0.143772296441519,0.635558093258208,0.463917041022171,-0.114804663102346,-0.183361270123994,-0.145783041325259,-0.0690831352230203,-0.225775248033138,-0.638671952771851,0.101288021253234,-0.339846475529127,0.167170404418143,0.125894532368176,-0.00898309914322813,0.0147241691924927,2.69,"0"
1,-1.35835406159823,-1.34016307473609,1.77320934263119,0.379779593034328,-0.503198133318193,1.80049938079263,0.791460956450422,0.247675786588991,-1.51465432260583,0.207642865216696,0.624501459424895,0.066083685268831,0.717292731410831,-0.165945922763554,2.34586494901581,-2.89008319444231,1.10996937869599,-0.121359313195888,-2.26185709530414,0.524979725224404,0.247998153469754,0.771679401917229,0.909412262347719,-0.689280956490685,-0.327641833735251,-0.139096571514147,-0.0553527940384261,-0.0597518405929204,378.66,"0"
1,-0.966271711572087,-0.185226008082898,1.79299333957872,-0.863291275036453,-0.0103088796030823,1.24720316752486,0.23760893977178,0.377435874652262,-1.38702406270197,-0.0549519224713749,-0.226487263835401,0.178228225877303,0.507756869957169,-0.28792374549456,-0.631418117709045,-1.0596472454325,-0.684092786345479,1.96577500349538,-1.2326219700892,-0.208037781160366,-0.108300452035545,0.00527359678253453,-0.190320518742841,-1.17557533186321,0.647376034602038,-0.221928844458407,0.0627228487293033,0.0614576285006353,123.5,"0"
2,-1.15823309349523,0.877736754848451,1.548717846511,0.403033933955121,-0.407193377311653,0.0959214624684256,0.592940745385545,-0.270532677192282,0.817739308235294,0.753074431976354,-0.822842877946363,0.53819555014995,1.3458515932154,-1.11966983471731,0.175121130008994,-0.451449182813529,-0.237033239362776,-0.0381947870352842,0.803486924960175,0.408542360392758,-0.00943069713232919,0.79827849458971,-0.137458079619063,0.141266983824769,-0.206009587619756,0.502292224181569,0.219422229513348,0.215153147499206,69.99,"0"

I run it with logstash -f file.conf,
but nothing shows up in Kibana or in Elasticsearch.

Any ideas, please?
Thanks!

Temporarily comment out your elasticsearch output and use a stdout { codec => rubydebug } output instead. Are you getting anything in the log?
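
For reference, a minimal sketch of such a debug output (swap it back for the elasticsearch output once you see events):

output {
  stdout {
    codec => rubydebug
  }
}

If the file is being read and parsed, each CSV row should then be printed to the console as a structured event.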

Hi Magnus, when my output is stdout { codec => rubydebug } I get this in the terminal:

Could not find log4j2 configuration at path /Project/elk/logstash/config/log4j2.properties. Using default config which logs to console
14:18:06.588 [[main]-pipeline-manager] INFO  logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
14:18:07.492 [[main]-pipeline-manager] INFO  logstash.pipeline - Pipeline main started
14:18:07.967 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}

Nothing else? Try increasing Logstash's log level to get additional clues.
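
For example, assuming Logstash 5.x, you can raise the log level from the command line:

logstash -f file.conf --log.level debug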

Just this. I ran with --debug, and I get this in the terminal endlessly; it doesn't stop:

15:13:30.917 [Ruby-0-Thread-10: C:/Project/elk/logstash/logstash-core/lib/logstash/pipeline.rb:514] DEBUG logstash.pipeline - Pushing flush onto pipeline
15:13:35.918 [Ruby-0-Thread-10: C:/Project/elk/logstash/logstash-core/lib/logstash/pipeline.rb:514] DEBUG logstash.pipeline - Pushing flush onto pipeline
15:13:40.922 [Ruby-0-Thread-10: C:/Project/elk/logstash/logstash-core/lib/logstash/pipeline.rb:514] DEBUG logstash.pipeline - Pushing flush onto pipeline
15:13:42.345 [[main]<file] DEBUG logstash.inputs.file - _globbed_files: /Users/salma/Desktop/BCHARTS-MTGOXUSD.csv: glob is: ["/Users/salma/Desktop/BCHARTS-MTGOXUSD.csv"]

Okay. What comes prior to that? I'm looking for clues about what Logstash is doing with the .csv file. The last line in the log snippet you posted at least tells us that Logstash is able to locate the file, which is a good sign.
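
One more thing worth double-checking, since the paths in your log look like a Windows box: /dev/null does not exist on Windows, so if you want the file input to ignore the sincedb and re-read the file on every run, the usual approach there is to point sincedb_path at NUL instead. A sketch (the path is only an example, adjust it to wherever your CSV actually lives):

input {
  file {
    path => "C:/Users/salma/Desktop/creditcard.csv"   # example path, adjust to your file's real location
    start_position => "beginning"
    sincedb_path => "NUL"                             # Windows equivalent of /dev/null
  }
}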

This comes first, and then it repeats in a non-stop loop:

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Could not find log4j2 configuration at path /Project/elk/logstash/config/log4j2.properties. Using default config which logs to console
15:33:26.426 [LogStash::Runner] DEBUG logstash.runner - -------- Logstash Settings (* means modified) ---------
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - node.name: "salma-PC"
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - *path.config: "test1.conf"
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - path.data: "C:/Project/elk/logstash/data"
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - config.test_and_exit: false
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - config.reload.automatic: false
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - config.reload.interval: 3
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - metric.collect: true
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.id: "main"
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.workers: 4
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.output.workers: 1
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.size: 125
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.delay: 5
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - pipeline.unsafe_shutdown: false
15:33:26.431 [LogStash::Runner] DEBUG logstash.runner - path.plugins: []
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - config.debug: false
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - *log.level: "debug" (default: "info")
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - version: false
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - help: false
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - log.format: "plain"
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - http.host: "127.0.0.1"
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - http.port: 9600..9700
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - http.environment: "production"
15:33:26.436 [LogStash::Runner] DEBUG logstash.runner - queue.type: "memory"
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.page_capacity: 262144000
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.max_bytes: 1073741824
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.max_events: 0
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.acks: 1024
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.writes: 1024
15:33:26.441 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.interval: 1000
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.warn: -1
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.info: -1
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.debug: -1
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.trace: -1
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - path.queue: "C:/Project/elk/logstash/data/queue"
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - path.settings: "C:/Project/elk/logstash/config"
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - path.logs: "C:/Project/elk/logstash/logs"
15:33:26.466 [LogStash::Runner] DEBUG logstash.runner - --------------- Logstash Settings -------------------
15:33:26.481 [LogStash::Runner] DEBUG logstash.agent - Agent: Configuring metric collection
15:33:26.486 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.os - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
15:33:26.511 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.jvm - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
15:33:26.536 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.persistentqueue - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Could not find log4j2 configuration at path /Project/elk/logstash/config/log4j2.properties. Using default config which logs to console
15:53:27.698 [LogStash::Runner] DEBUG logstash.runner - -------- Logstash Settings (* means modified) ---------
15:53:27.702 [LogStash::Runner] DEBUG logstash.runner - node.name: "salma-PC"
15:53:27.702 [LogStash::Runner] DEBUG logstash.runner - *path.config: "test.conf"
15:53:27.702 [LogStash::Runner] DEBUG logstash.runner - path.data: "C:/Project/elk/logstash/data"
15:53:27.703 [LogStash::Runner] DEBUG logstash.runner - config.test_and_exit: false
15:53:27.703 [LogStash::Runner] DEBUG logstash.runner - config.reload.automatic: false
15:53:27.703 [LogStash::Runner] DEBUG logstash.runner - config.reload.interval: 3
15:53:27.704 [LogStash::Runner] DEBUG logstash.runner - metric.collect: true
15:53:27.704 [LogStash::Runner] DEBUG logstash.runner - pipeline.id: "main"
15:53:27.705 [LogStash::Runner] DEBUG logstash.runner - pipeline.workers: 4
15:53:27.706 [LogStash::Runner] DEBUG logstash.runner - pipeline.output.workers: 1
15:53:27.708 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.size: 125
15:53:27.708 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.delay: 5
15:53:27.709 [LogStash::Runner] DEBUG logstash.runner - pipeline.unsafe_shutdown: false
15:53:27.709 [LogStash::Runner] DEBUG logstash.runner - path.plugins: []
15:53:27.710 [LogStash::Runner] DEBUG logstash.runner - config.debug: false
15:53:27.710 [LogStash::Runner] DEBUG logstash.runner - *log.level: "debug" (default: "info")
15:53:27.710 [LogStash::Runner] DEBUG logstash.runner - version: false
15:53:27.711 [LogStash::Runner] DEBUG logstash.runner - help: false
15:53:27.711 [LogStash::Runner] DEBUG logstash.runner - log.format: "plain"
15:53:27.711 [LogStash::Runner] DEBUG logstash.runner - http.host: "127.0.0.1"
15:53:27.712 [LogStash::Runner] DEBUG logstash.runner - http.port: 9600..9700
15:53:27.712 [LogStash::Runner] DEBUG logstash.runner - http.environment: "production"
15:53:27.713 [LogStash::Runner] DEBUG logstash.runner - queue.type: "memory"
15:53:27.714 [LogStash::Runner] DEBUG logstash.runner - queue.page_capacity: 262144000
15:53:27.715 [LogStash::Runner] DEBUG logstash.runner - queue.max_bytes: 1073741824
15:53:27.715 [LogStash::Runner] DEBUG logstash.runner - queue.max_events: 0
15:53:27.715 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.acks: 1024
15:53:27.716 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.writes: 1024
15:53:27.716 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.interval: 1000
15:53:27.716 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.warn: -1
15:53:27.717 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.info: -1
15:53:27.717 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.debug: -1
15:53:27.718 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.trace: -1
15:53:27.718 [LogStash::Runner] DEBUG logstash.runner - path.queue: "C:/Project/elk/logstash/data/queue"
15:53:27.718 [LogStash::Runner] DEBUG logstash.runner - path.settings: "C:/Project/elk/logstash/config"
15:53:27.718 [LogStash::Runner] DEBUG logstash.runner - path.logs: "C:/Project/elk/logstash/logs"
15:53:27.719 [LogStash::Runner] DEBUG logstash.runner - --------------- Logstash Settings -------------------
15:53:27.735 [LogStash::Runner] DEBUG logstash.agent - Agent: Configuring metric collection
15:53:27.737 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.os - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
15:53:27.759 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.jvm - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
15:53:27.783 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.persistentqueue - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
15:53:27.794 [LogStash::Runner] DEBUG logstash.agent - Reading config file {:config_file=>"C:/Project/elk/logstash/bin/test.conf"}
15:53:27.968 [LogStash::Runner] ERROR logstash.agent - Cannot load an invalid configuration {:reason=>"Expected one of #, input, filter, output at line 21, column 8 (byte 529) after ", :backtrace=>["C:/Project/elk/logstash/logstash-core/lib/logstash/pipeline.rb:47:in `initialize'", "C:/Project/elk/logstash/logstash-core/lib/logstash/pipeline.rb:139:in `initialize'", "C:/Project/elk/logstash/logstash-core/lib/logstash/agent.rb:277:in `create_pipeline'", "C:/Project/elk/logstash/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'", "C:/Project/elk/logstash/logstash-core/lib/logstash/runner.rb:264:in `execute'", "C:/Project/elk/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "C:/Project/elk/logstash/logstash-core/lib/logstash/runner.rb:183:in `run'", "C:/Project/elk/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "C:\\Project\\elk\\logstash\\lib\\bootstrap\\environment.rb:71:in `(root)'"]}
15:53:27.972 [Ruby-0-Thread-3: C:/Project/elk/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/task.rb:22] DEBUG logstash.agent - starting agent
15:53:27.977 [Api Webserver] DEBUG logstash.agent - Starting puma
15:53:27.977 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.os - PeriodicPoller: Stopping
15:53:27.977 [Api Webserver] DEBUG logstash.agent - Trying to start WebServer {:port=>9600}
15:53:27.978 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.jvm - PeriodicPoller: Stopping
15:53:27.979 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.persistentqueue - PeriodicPoller: Stopping
15:53:27.980 [Api Webserver] DEBUG logstash.api.service - [api-service] start

What? Now you're suddenly posting a log from a Windows box running Logstash, with an error in the configuration file that prevents Logstash from starting.
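
That "Expected one of #, input, filter, output" error usually means unbalanced braces or stray characters in the config file. A quick way to check the syntax without starting the pipeline (assuming Logstash 5.x) is the config test flag:

bin/logstash -f test.conf --config.test_and_exit

Once that command reports the configuration as OK, run Logstash normally again.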

Hi, to solve the error

ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.

You must replace, in the runner.rb file at
([your_path_to]/logstash-5.3.0\logstash-core\lib\logstash\runner.rb), the line:

LogStash::Logging::Logger::initialize("file://" + log4j_config_location)

with the line:

LogStash::Logging::Logger::initialize("file:///" + log4j_config_location)

I had this problem and solved it, but I'm still having a problem similar to yours (the index is not created in Elasticsearch). I posted it in a new topic here.

Hi, thank you, I had been stuck on this error for a week :frowning:

You are welcome.
Did the solution I mentioned solve all your problems? Is the CSV now correctly imported into Elasticsearch? I ask because I had the same problem (the CSV was not imported into ES), but even after solving the log4j2 problem the CSV is still not being imported.

I solved only the log4j2 problem, thank you :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.