Logstash problem

My Logstash config file is as follows:

input {
  jdbc {
    jdbc_driver_library => "D:\mysql-connector-java-5.1.44\mysql-connector-java-5.1.44\mysql-connector-java-5.1.44-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/sample"
    jdbc_user => "root"
    jdbc_password => "root"
    jdbc_fetch_size => 10000
    schedule => "* * * * *"
    statement => "SELECT * from sample"
    codec => "json"
  }
}




output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "clinical"
    document_id => "%{RRH_MR_NUM}"
  }
  stdout { codec => rubydebug }
}
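For context, the `document_id` setting above is what makes Elasticsearch deduplicate: indexing a second document with the same id overwrites the first rather than adding a new hit. A minimal Python sketch of that upsert-by-id behavior, using a plain dict as a stand-in for the index and hypothetical row values:

```python
# Sketch: an index keyed by document_id behaves like a dict.
# Indexing again with the same id overwrites, it does not duplicate.
index = {}

events = [
    {"rrh_mr_num": "3439829", "rrh_pat_sex": "Male"},
    {"rrh_mr_num": "3439829", "rrh_pat_sex": "Male"},    # same row, next schedule run
    {"rrh_mr_num": "1104158", "rrh_pat_sex": "Female"},  # a different row (hypothetical)
]

for event in events:
    doc_id = event["rrh_mr_num"]  # what document_id should resolve to per event
    index[doc_id] = event         # same id -> overwrite, not a new hit

print(len(index))  # 2 distinct ids -> 2 hits
```

If every event resolves to a distinct id the hit count grows; if they all resolve to the same id, the count stays at 1.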

When I see the output in Kibana, it is as follows:

cpa_addr_2: - cpa_addr_area: - mcs_crt_uid:1104158 cpa_addr_1:GALSI- rrh_mr_num:3439829 cpa_pin_code: - rrh_pat_sex:Male mcs_crt_dt:2015-04-18T
mcs_case_summary: MRD No: 3439829 Patient Name: MR. AKKASH S K Gender: Male Age: 59 Year(s) ReportDate: 2015-04-18 Examined By: MOUPIYA DAS Consultant: External File Uploaded - Patient InfoInvestigation Report - FUN

When I click the refresh button at the top right of the Kibana Discover page, the record above refreshes, but the count (hits) shows only 1.

Each time I click refresh, the count still shows 1, but the record changes.

Is there any problem in my Logstash config? Please let me know.

Does that field map to a column in the database table?

My MySQL query is as follows:

select * from sample

It gives the following output:

RRH_MR_NUM cpa_addr cpa_addr_area mcs_crt_uid cpa_addr_1 cpa_pin_code rrh_pat_sex mcs_Crt_dt mcs_case_summary

1104158 - - 1104158 GALSI- 3439829 - male 2015-04-18T MRD No: 3439829 Patient Name: MR. AKKASH S K Gender: Male Age: 59 Year(s) ReportDate: 2015-04-18 Examined By: MOUPIYA DAS Consultant: External File Uploaded - Patient InfoInvestigation Report - FUN

The Kibana screen shows the fields as follows:

RRH_MR_NUM
CPA_ADDR
CPA_ADDR_AREA
MCS_CRT_UID
CPA_ADDR_1
CPA_Pin_CODE
RRH_PAT_SEX
MCS_CRT_DT
MCS_CASE_SUMMARY

Yes, the field is mapped to a column in the database.

Is any more information required?

I am importing a CSV file. From that CSV, I load the data into a MySQL table, the sample table mentioned above.

As I already mentioned, Elasticsearch and Logstash display the details of that sample table in Kibana.

Yes, that field is mapped to a column of the database table.

Do you want any more information?

Is there any mistake in my Logstash config file code for removing duplicates?

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "clinical"
    document_id => "%{RRH_MR_NUM}"
  }
  stdout { codec => rubydebug }
}

I sent the screenshot as follows.

In that screenshot, the RRH_MR_NUM column is mapped.

Please let me know how to solve this and avoid duplicates in Logstash.

Or can we avoid duplicates in Kibana, or in Elasticsearch?

Can anyone please tell me how to solve this? Please let me know.

Just a guess: maybe it's case sensitive? The jdbc input lowercases column names by default, so have you tried using %{rrh_mr_num} instead?
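To illustrate that guess: Logstash field references use the `%{field}` sprintf syntax, and when the referenced field does not exist, the pattern is left in the string as literal text. So if the jdbc input has lowercased the column to `rrh_mr_num`, every document would get the same literal id `%{RRH_MR_NUM}` and overwrite the previous one, which would explain a hit count of 1. A small Python sketch of that substitution behavior (row values hypothetical):

```python
import re

def sprintf(pattern, event):
    # Mimic Logstash %{field} substitution: a reference to a missing
    # field is left in place as literal text instead of failing.
    return re.sub(r"%\{(\w+)\}",
                  lambda m: str(event.get(m.group(1), m.group(0))),
                  pattern)

# The jdbc input lowercases column names by default, so events carry
# rrh_mr_num, not RRH_MR_NUM (values here are hypothetical).
events = [{"rrh_mr_num": "3439829"}, {"rrh_mr_num": "1104158"}]

index = {}
for event in events:
    doc_id = sprintf("%{RRH_MR_NUM}", event)  # wrong case: never resolves
    index[doc_id] = event                     # every event shares one literal id

fixed = {}
for event in events:
    fixed[sprintf("%{rrh_mr_num}", event)] = event  # resolves per event

print(len(index), len(fixed))  # 1 vs 2
```

With the lowercase reference, each row gets its own id and the hit count matches the number of distinct RRH_MR_NUM values.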

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.