Pororo
February 13, 2018, 6:45pm
1
Hi, I was trying to import a JSON file into Elasticsearch through Logstash, but something seems to be wrong with my setup.
Here is a sample line from my JSON:
{"jurHash":"112256955","txnId":"vipus0E48BEC71E28C81F","ts":"2018-01-29 00:03:30.085 +0000","result.statusMessage":"Success","durationMillis":2,"requestId":"9999","extUserId":"testto demo","result.status":"0000","wsdlVersion":"1_8","operation":"createUser","_id":"car4be-w2-tc.1517184210085.15247464"}
And this is my config:
input {
  file {
    path => "/Users/apple/Desktop/SampleData/event.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json_lines
  }
}
filter {
  json {
    source => "message"
    #target => "doc"
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "realmockdata"
    document_type => "eventdata"
  }
  stdout {}
}
I can start Logstash successfully, but it just gets stuck at "Pipeline running".
warkolm
(Mark Walkom)
February 13, 2018, 9:52pm
2
Sounds like it is waiting for more input.
You might have better luck using stdin instead of a file input.
Pororo
February 13, 2018, 10:26pm
3
That is one way, but my JSON file has hundreds of entries like that, so I think I have to use the file input.
warkolm
(Mark Walkom)
February 13, 2018, 10:36pm
4
Nope, cat $file | logstash ... and you are good.
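For example, a minimal sketch of that approach, keeping the rest of the pipeline unchanged (the pipeline file name and the Logstash install path are assumptions, not from the thread):

# stdin-pipeline.conf -- read events from stdin instead of tailing a file
input {
  stdin {
    codec => json_lines
  }
}
# then pipe the file in from the shell, e.g.:
# cat /Users/apple/Desktop/SampleData/event.log | bin/logstash -f stdin-pipeline.conf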
Badger
February 13, 2018, 10:36pm
5
I cannot speak to the question you asked, but if you use a json_lines codec on the input, you do not need a json filter. The line gets parsed as JSON on input.
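In other words, the pipeline could be simplified to something like this (a sketch based on the config above, not a tested configuration):

input {
  file {
    path => "/Users/apple/Desktop/SampleData/event.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json_lines
  }
}
# no json filter block needed; each line is already parsed as JSON by the codec
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "realmockdata"
    document_type => "eventdata"
  }
  stdout {}
}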
Pororo
February 14, 2018, 7:09am
6
I successfully imported the JSON into ES, but it seems the individual attributes of my JSON were not mapped. Each document just contains the whole line in the message field.
system
(system)
Closed
March 14, 2018, 7:09am
7
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.