Parsing JSON error in Logstash


(SuperK) #1

Hi all,

I am trying to import a JSON file, but I get the following error:

filter {
json {
# This setting must be a string
# Expected string, got ["followUpDate", "dateOfBirth", "medical.dateFinalised", "medical.dateOfEntry", "medical.dateOfSuspension", "medical.dateOfWithdrawal", "medical.gapDateFinalised", "medical.gapDateOfEntry", "medical.gapDateOfSuspension", "medical.gapDateOfWithdrawal", "medical.loyaltyDateFinalised", "medical.loyaltyDateOfEntry", "medical.loyaltyDateOfSuspension", "medical.loyaltyDateOfWithdrawal", "medical.loyalty", "medical.loyaltyStatus", "medical.monthlyTotalContribution", "medical.monthlyRiskContribution", "medical.monthlyMSAContribution", "medical.monthlyLoyaltyContribution", "medical.monthlyOtherContribution", "medical.totalYearToDate", "medical.mostRecentBillDate", "medical.chronic", "medical.msaPayoutChoice", "medical.msaCarryOver", "medical.msaAnnual", "medical.msaProrated", "medical.msaBalance", "medical.annualThreshold", "medical.thresholdProrated", "medical.msaSpent", "medical.amountAccumulatedToThreshold", "medical.currentSPG", "medical.thresholdIndicator", "medical.ihPaid", "medical.lastUpdated", "medical.status", "title", "firstname", "surname", "idNumber", "gender", "employerName", "employer.value", "employer.label", "employer.type_of", "employer.employerNumber", "employer_id", "employerBranch", "occupation", "email", "cellphone", "salary", "status", "initials", "fsp_id", "consultant_id", "type_of", "broker.name", "broker.houseName", "broker.houseCode", "broker.code", "dates.dateOfEntry", "dates.dateOfWithdrawal", "updateSource", "history.0.creator.label", "history.0.creator.value", "history.0.type", "history.0.from", "history.0.to", "history.0.data", "history.0.source", "history.0.timestamp", "currentScheme", "employerNumber", "employerBranchNumber", "memberNumber", "employeeNumber", "language", "numDependants", "numSpouses", "numAdults", "numChildren", "currentOption", "workTelCode", "workTelNumber", "postalAddress1", "postalAddress2", "postalAddress3", "postalSuburb", "postalPostCode", "deleted_at"]
source => ["followUpDate", "dateOfBirth", "medical.dateFinalised", "medical.dateOfEntry", "medical.dateOfSuspension", "medical.dateOfWithdrawal", "medical.gapDateFinalised", "medical.gapDateOfEntry", "medical.gapDateOfSuspension", "medical.gapDateOfWithdrawal", "medical.loyaltyDateFinalised", "medical.loyaltyDateOfEntry", "medical.loyaltyDateOfSuspension", "medical.loyaltyDateOfWithdrawal", "medical.loyalty", "medical.loyaltyStatus", "medical.monthlyTotalContribution", "medical.monthlyRiskContribution", "medical.monthlyMSAContribution", "medical.monthlyLoyaltyContribution", "medical.monthlyOtherContribution", "medical.totalYearToDate", "medical.mostRecentBillDate", "medical.chronic", "medical.msaPayoutChoice", "medical.msaCarryOver", "medical.msaAnnual", "medical.msaProrated", "medical.msaBalance", "medical.annualThreshold", "medical.thresholdProrated", "medical.msaSpent", "medical.amountAccumulatedToThreshold", "medical.currentSPG", "medical.thresholdIndicator", "medical.ihPaid", "medical.lastUpdated", "medical.status", "title", "firstname", "surname", "idNumber", "gender", "employerName", "employer.value", "employer.label", "employer.type_of"]
...
}
}
[ERROR] 2018-06-25 09:48:01.205 [LogStash::Runner] agent - Cannot create pipeline {:reason=>"Something is wrong with your configuration."}

My logstash config file is the following:
input {
  file {
    path => "/home/uadmin/Healthcare/nmgmem.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  json {
    source => [ "followUpDate", "dateOfBirth", "medical.dateFinalised", "medical.dateOfEntry", "medical.dateOfSuspension", "medical.dateOfWithdrawal",
      "medical.gapDateFinalised", "medical.gapDateOfEntry", "medical.gapDateOfSuspension", "medical.gapDateOfWithdrawal", "medical.loyaltyDateFinalised",
      "medical.loyaltyDateOfEntry", "medical.loyaltyDateOfSuspension", "medical.loyaltyDateOfWithdrawal", "medical.loyalty", "medical.loyaltyStatus",
      "medical.monthlyTotalContribution", "medical.monthlyRiskContribution", "medical.monthlyMSAContribution", "medical.monthlyLoyaltyContribution",
      "medical.monthlyOtherContribution", "medical.totalYearToDate", "medical.mostRecentBillDate", "medical.chronic", "medical.msaPayoutChoice",
      "medical.msaCarryOver", "medical.msaAnnual", "medical.msaProrated", "medical.msaBalance", "medical.annualThreshold", "medical.thresholdProrated",
      "medical.msaSpent", "medical.amountAccumulatedToThreshold", "medical.currentSPG", "medical.thresholdIndicator", "medical.ihPaid",
      "medical.lastUpdated", "medical.status", "title", "firstname", "surname", "idNumber", "gender", "employerName", "employer.value",
      "employer.label", "employer.type_of"]
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "nmg-jsontest"
    document_type => "test"
  }
  stdout {}
}


(Christian Dahlqvist) #2

The source parameter in the JSON filter should indicate which field to parse JSON from. I do not know what you are trying to do, but what you have specified there is not correct. Have a look at the documentation for an example.
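For reference, the json filter's source option takes a single field name (the field containing the JSON text, typically message, which the file input populates with each line), not a list of target fields. A minimal sketch of the documented usage, assuming the JSON arrives in message:

```conf
filter {
  json {
    # Parse the JSON text held in the "message" field;
    # the resulting keys become top-level event fields.
    source => "message"
  }
}
```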


(SuperK) #3

Thank you for your response. I am actually trying to pipe a MongoDB JSON export into Elasticsearch for analysis.

Here is one document: {"_id":{"$oid":"5aeaf4e916df6f665a26b742"},"dependants":[],"tags":[],"followUpDate":null,"idUnknown":true,"dateOfBirth":"09/06/1984","medical":{"dateFinalised":null,"dateOfEntry":"2018/06/01","dateOfSuspension":null,"dateOfWithdrawal":"9999/12/31","gapDateFinalised":null,"gapDateOfEntry":null,"gapDateOfSuspension":null,"gapDateOfWithdrawal":null,"loyaltyDateFinalised":null,"loyaltyDateOfEntry":null,"loyaltyDateOfSuspension":null,"loyaltyDateOfWithdrawal":null,"loyalty":"NO","loyaltyStatus":"","monthlyTotalContribution":"2773","monthlyRiskContribution":"2080","monthlyMSAContribution":"693","monthlyLoyaltyContribution":"0","monthlyOtherContribution":"0","totalYearToDate":"2773","mostRecentBillDate":"01/06/2018","chronic":"NO","msaPayoutChoice":"DH Rate"}
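Since a mongoexport file typically contains one JSON document per line, another option is to let the input parse each line directly with the json codec, so no json filter is needed at all. A sketch under that one-document-per-line assumption (the path is the one from the config above):

```conf
input {
  file {
    path => "/home/uadmin/Healthcare/nmgmem.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # Each line is a complete JSON document, so decode it at the input stage.
    codec => "json"
  }
}
```

Note that MongoDB's extended JSON wraps the ObjectId, so the parsed event will contain an [_id][$oid] subfield rather than a plain _id string.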

Hope that makes sense. I will check the docs.

Regards

K


(Christian Dahlqvist) #4

If you output to the stdout plugin with a rubydebug codec, you will be able to see what the event looks like, which makes it easier to apply the correct filters. This also makes it a lot easier for anyone helping out.
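The suggested debugging output can be sketched as:

```conf
output {
  stdout {
    # Pretty-print each event so you can see exactly which
    # fields Logstash produced before adding further filters.
    codec => rubydebug
  }
}
```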


(SuperK) #5

Thank you I will try that route, appreciate the help.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.