Logstash does not create an index in Elasticsearch

Hello, I am new to the Elastic Stack and need help with Logstash: I cannot get any of my Logstash pipelines to automatically create an index in Elasticsearch.

I have created two Logstash .conf files and neither of them creates its index. I am including both below; can anyone tell me what I am doing wrong?

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conf.d/mssql-jdbc-8.4.0.jre11.jar"
    jdbc_connection_string => "jdbc:sqlserver://IPADDRESS:PORT;databaseName=DASHBOAR_TABLEAU;user=XXXXXX;password=XXXXXXXXXX"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    schedule => "5 * * * *"
    jdbc_user => "XXXXXX"
    statement => "SELECT [APPLICATION_NAME]
      ,[TRANSACTION_NAME]
      ,[LOCATION_NAME]
      ,[STATUS_ID]
      ,[EM_RESULT_VALUE]
      ,[FECHA]
      FROM ICBC_BPM_TRANS_ALL_DBS
      WHERE FECHA >= DATEADD(MINUTE, -15, GETDATE())
      AND FECHA <= GETDATE()"
  }
}

output {
  elasticsearch {
    hosts => ["https://:<connection port>"]
    user => ""
    password => "<local admin password of Elastic Cloud>"
    index => "Testing"   # note: Elasticsearch index names must be lowercase
  }
  stdout { codec => rubydebug }
}

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["https://:<connection port>"]
    user => ""
    password => "<local admin password of Elastic Cloud>"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
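A quick way to check whether either pipeline actually created its index is the cat indices API; the host, port, and credentials below are placeholders for your own cluster:

```shell
# List all indices with doc counts; replace the host and credentials with your own
curl -u elastic:your_password "https://your-cluster-url:9243/_cat/indices?v"
```

If the index never shows up here while Logstash reports no errors, the problem is usually in the output section (host, credentials, privileges, or an invalid index name).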

Does the Elasticsearch user in your output settings have privileges to create an index? (I assume yes, since the password looks like the out-of-the-box Elasticsearch admin password.) Here is a basic sample of Logstash elasticsearch output settings:
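If you want to verify those privileges explicitly, the security API can tell you whether the authenticated user may create a given index; the cluster URL, credentials, and index name below are placeholders:

```shell
# Ask Elasticsearch whether this user can create and write to the index "test123"
curl -u your_id:your_pwd -H 'Content-Type: application/json' \
  "https://your-cluster-url:9200/_security/user/_has_privileges" -d '
{
  "index": [
    { "names": ["test123"], "privileges": ["create_index", "write"] }
  ]
}'
```

The response marks each requested privilege true or false, which rules the permissions question in or out quickly.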

elasticsearch {
  user => "your_id"
  password => "your_pwd"

  hosts => ["server_1:9200","server_n:9200","server_x:9200"]
  document_id => "optional-if-you-want-set-for-specific-input-prop-as-id"

  index => "my_sample_index_%{+YYYYMMdd}"
  doc_as_upsert => true
  action => "update"
}

I suggest setting a plain index name like "test123"; I am not sure the variables and concatenation in your index name are correct. Go back to basics and verify that at least the index gets created.

Also, to make sure your input and filter are fine, run Logstash from the command line and set the output to JSON to confirm you are actually getting data, i.e.:

output {
  stdout { codec => json }
}
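Before running the pipeline for real, it can also help to have Logstash validate the configuration and exit; the path below is an example:

```shell
# Check the pipeline configuration for syntax errors without starting it
bin/logstash -f /etc/logstash/conf.d/pivote.conf --config.test_and_exit
```

This catches malformed config before you spend time waiting on a schedule that never fires.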

Regarding the jdbc configuration: the schedule "5 * * * *" runs at the 5th minute of every hour (1:05 pm, 2:05 pm, 3:05 pm, and so on), and regardless of whether you run Logstash from the command line or as a service, the pipeline must be left running for the schedule to fire. I recommend first removing the schedule and trying again, or changing it to run every 5 minutes. Also take a look at your Logstash logs for errors (under your Logstash install folder).

# run every 5 mins  (5, 10, 15, 20...)
schedule => "*/5 * * * *"

Here is the configuration modified according to your recommendations; the index has still not been created in Elasticsearch.

After creating the .conf file, I run: bin/logstash --debug -f /etc/logstash/conf.d/pivote.conf

My operating system is Debian 9.

The Logstash run shows no errors, but the index "test_index_" is not created in Elasticsearch.

The SQL and Elastic credentials are correct.

I don't know what else I can do

input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conf.d/mssql-jdbc-8.4.0.jre11.jar"
    jdbc_connection_string => "jdbc:sqlserver://IPADDRESS:PORT;databaseName=DASHBOAR_TABLEAU;user=XXXXXX;password=XXXXXXXXXX"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    schedule => "* * * * *"
    jdbc_user => "XXXXXX"
    statement => "SELECT [APPLICATION_NAME]
      ,[TRANSACTION_NAME]
      ,[LOCATION_NAME]
      ,[STATUS_ID]
      ,[EM_RESULT_VALUE]
      ,[FECHA]
      FROM ICBC_BPM_TRANS_ALL_DBS
      WHERE FECHA >= DATEADD(MINUTE, -15, GETDATE())
      AND FECHA <= GETDATE()"
  }
}

output {
  elasticsearch {
    user => "XXXX"
    password => "XXXXX"

    hosts => ["URL Elastic Cloud"]
    document_id => "testid"   # note: a fixed document_id makes every row upsert the same single document

    index => "test_index_%{+YYYYMMdd}"
    doc_as_upsert => true
    action => "update"
  }

  stdout { codec => rubydebug }
}
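One thing to watch in the output above: document_id => "testid" is a constant, so with action => "update" every row from the query upserts the same single document, and the index will never hold more than one document. A sketch of deriving the id from the row data instead, assuming the column names from your SELECT (the fingerprint filter is one common way to build a stable id):

```
filter {
  fingerprint {
    # concatenate the columns that uniquely identify a row (assumed here)
    source => ["application_name", "transaction_name", "fecha"]
    concatenate_sources => true
    method => "SHA1"
    target => "[@metadata][doc_id]"
  }
}

output {
  elasticsearch {
    # ... hosts, credentials, index as before ...
    document_id => "%{[@metadata][doc_id]}"
  }
}
```

With that in place, re-running the same time window updates existing rows instead of collapsing everything into one document.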

Try this in your input settings: don't pass the SQL user/password in the JDBC connection string; use jdbc_user and jdbc_password only. Also start with a simple SQL statement with no WHERE conditions that just pulls the top X rows, to make sure data is returned.

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://INSTANCE-NAME:PORT;databaseName=DASHBOAR_TABLEAU;integratedSecurity=false;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "YOUR-SQL-ID"   # I recommend using a local SQL id instead of an AD/domain id
    jdbc_password => "YOUR-SQL-ID-PASSWORD"
    #schedule => "* * * * *"
    #statement => "SELECT [APPLICATION_NAME], [TRANSACTION_NAME], [LOCATION_NAME], [STATUS_ID], [EM_RESULT_VALUE], [FECHA] FROM ICBC_BPM_TRANS_ALL_DBS WHERE FECHA >= DATEADD(mi,-15,GETDATE()) AND FECHA <= GETDATE()"
    statement => "SELECT TOP 10 [APPLICATION_NAME], [TRANSACTION_NAME], [LOCATION_NAME], [STATUS_ID], [EM_RESULT_VALUE], [FECHA] FROM ICBC_BPM_TRANS_ALL_DBS"
  }
}
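Combined with a plain JSON stdout output, that stripped-down input gives a minimal pipeline for confirming the JDBC side works before involving Elasticsearch at all; paths and credentials are placeholders:

```
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conf.d/mssql-jdbc-8.4.0.jre11.jar"
    jdbc_connection_string => "jdbc:sqlserver://INSTANCE-NAME:PORT;databaseName=DASHBOAR_TABLEAU;integratedSecurity=false;"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user => "YOUR-SQL-ID"
    jdbc_password => "YOUR-SQL-ID-PASSWORD"
    statement => "SELECT TOP 10 [APPLICATION_NAME], [FECHA] FROM ICBC_BPM_TRANS_ALL_DBS"
  }
}

output {
  # if rows print here, the input works and the problem is on the Elasticsearch side
  stdout { codec => json }
}
```

Running this once with no schedule executes the query immediately, so you get a pass/fail answer in seconds.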

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.