Hello, I am new to Elastic and need help with Logstash development: I could not get Logstash to automatically create indexes in Elasticsearch.
I have created two Logstash .conf files and neither creates the indexes. I'm including both here — can anyone tell me how to do it?
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conf.d/mssql-jdbc-8.4.0.jre11.jar"
    jdbc_connection_string => "jdbc:sqlserver://IPADDRESS:PORT;databaseName=DASHBOAR_TABLEAU;user=XXXXXX;password=XXXXXXXXXX"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    schedule => "5 * * * *"
    jdbc_user => "XXXXXX"
    statement => "SELECT [APPLICATION_NAME]
      ,[TRANSACTION_NAME]
      ,[LOCATION_NAME]
      ,[STATUS_ID]
      ,[EM_RESULT_VALUE]
      ,[FECHA] FROM ICBC_BPM_TRANS_ALL_DBS WHERE FECHA >= DATEADD(MINUTE,-15, getdate())
      and FECHA <= getdate()"
  }
}
output {
  elasticsearch {
    hosts => ["https://<host>:<connection port>"]
    user => ""
    password => "<local Elastic Cloud admin password>"
    index => "Testing"
  }
  stdout { codec => rubydebug }
}
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["https://<host>:<connection port>"]
    user => ""
    password => "<local Elastic Cloud admin password>"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
Does the ES user in your output settings have privileges to create an index? (I assume yes, since the password placeholder suggests you are using the out-of-the-box ES admin password.) Here is a base Logstash output sample setting.
I suggest setting a simple literal index name, like "test123" — note that Elasticsearch index names must be lowercase, so "Testing" will be rejected. I'm not sure the variables and concatenation in your second output are correct, so I would "go back to basics" and verify that at least the index gets created.
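For reference, a minimal output sketch along those lines — the host, user, and password here are placeholders for your own values, not real settings:

```
output {
  elasticsearch {
    hosts => ["https://<your-host>:<port>"]
    user => "<your-es-user>"
    password => "<your-es-password>"
    index => "test123"    # fixed, lowercase name; no variables
  }
}
```

Once documents flow with this fixed name, you can switch back to a dynamic index pattern.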
Also, to make sure your input and filter are fine, I suggest running Logstash from the command line with the output set to JSON, to confirm you're actually getting data.
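For example, while debugging you can temporarily replace the elasticsearch output with a stdout-only output (this is just a sketch; the file path below assumes a default Linux package install):

```
# temporary output for testing: print each event as JSON to the console
output {
  stdout { codec => json }
}
```

Then run Logstash against that single file, e.g. `bin/logstash -f /etc/logstash/conf.d/test.conf` from your Logstash install folder, and watch the console for events.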
For the jdbc configuration: the schedule "5 * * * *" runs at the 5th minute of every hour. Regardless of whether you run Logstash from the command line or as a service, you need to leave the pipeline running so the query can execute at, for example, 1:05 pm, 2:05 pm, 3:05 pm. I recommend first removing the schedule and trying again, or modifying the schedule to run every 5 minutes. Also take a look at your Logstash logs for errors (under your Logstash install folder).
# run every 5 mins (0, 5, 10, 15...)
schedule => "*/5 * * * *"
Try this in your input settings: don't pass the SQL user/password in the JDBC connection string — use jdbc_user and jdbc_password only. Also start with a simple SQL statement with no WHERE conditions, just pulling the top X rows, to make sure data is returned.
input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://INSTANCE-NAME:PORT;databaseName=DASHBOAR_TABLEAU;integratedSecurity=false;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "YOUR-SQL-ID"    # I recommend using a local SQL id instead of an AD/domain id
    jdbc_password => "YOUR-SQL-ID-PASSWORD"
    #schedule => "* * * * *"
    #statement => "SELECT [APPLICATION_NAME], [TRANSACTION_NAME], [LOCATION_NAME], [STATUS_ID], [EM_RESULT_VALUE], [FECHA] FROM ICBC_BPM_TRANS_ALL_DBS WHERE FECHA >= DATEADD(mi,-15,GETDATE()) and FECHA <= getdate()"
    statement => "SELECT TOP 10 [APPLICATION_NAME], [TRANSACTION_NAME], [LOCATION_NAME], [STATUS_ID], [EM_RESULT_VALUE], [FECHA] FROM ICBC_BPM_TRANS_ALL_DBS"
  }
}