Hi,
I'm fairly new to Elastic and haven't been able to find a solution to my problem.
I'm trying to push an index template to Elasticsearch through Logstash, because I want certain fields to be mapped with the `ip` type, but I get a message saying "failed to install template".
If there is another way to map a field as type `ip` than managing a template, I'll gladly take that solution too.
I'm running the ELK stack in Docker on a single node, for testing.
Here is my configuration:
Logstash log:
[2020-01-02T08:12:29,024][INFO ][logstash.outputs.elasticsearch] Attempting to install template
[2020-01-02T08:12:29,375][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/pfw-logstash
[2020-01-02T08:12:30,076][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-01-02T08:12:30,090][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2020-01-02T08:12:30,127][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://10.100.8.1:9200/_template/pfw-logstash'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:291:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:278:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:373:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:285:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:352:in `template_put'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:86:in `template_install'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/template_manager.rb:28:in `install'", 
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/template_manager.rb:16:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:130:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.1.0-java/lib/logstash/outputs/elasticsearch/common.rb:51:in `block in setup_after_successful_connection'"]}
[2020-01-02T08:12:30,187][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2020-01-02T08:12:30,190][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-01-02T08:12:31,600][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=>"/_monitoring/bulk?system_id=logstash&system_api_version=7&interval=1s", password=>, hosts=>[http://10.100.8.1:9200], sniffing=>false, manage_template=>false, id=>"87fe6c07823850f253c000b06f936133987a38e6ec2b3b9edcd620a19a1b5ae3", user=>"elastic", document_type=>"%{[@metadata][document_type]}", enable_metric=>true, codec=>"plain_f64b91c8-ce41-484e-8ba6-f0bebe082993", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-01-02T08:12:31,653][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@10.100.8.1:9200/]}}
[2020-01-02T08:12:31,675][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@10.100.8.1:9200/"}
[2020-01-02T08:12:31,688][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2020-01-02T08:12:31,690][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-01-02T08:12:31,696][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://10.100.8.1:9200"]}
[2020-01-02T08:12:31,706][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, :thread=>"#"}
[2020-01-02T08:12:31,770][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2020-01-02T08:12:31,783][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
[2020-01-02T08:12:32,142][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
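The log only shows the 400 status code, not the reason Elasticsearch rejected the template. The same PUT that Logstash attempts can be replayed by hand to see the full response body (a sketch: host, credentials and template path are taken from the configuration; `-u elastic` prompts for the password):

```shell
# Replay the template install that Logstash attempts; the HTTP
# response body contains the actual reason for the 400.
curl -i -u elastic -X PUT "http://10.100.8.1:9200/_template/pfw-logstash" \
  -H 'Content-Type: application/json' \
  -d @/usr/share/logstash/template/logstash-pfw-threat.json
```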
logstash.conf:
input {
  beats {
    port => 5044
  }
}
## Add your filters / logstash plugins configuration here
filter {
  dissect {
    mapping => {
      "message" => "%{Syslog_infos},%{Receive_Time},%{Serial},%{Type},%{Threat_Content_Type},%{Config_Version},%{Generate_Time},%{Source_address},%{Destination_address},%{NAT_Source_IP},%{NAT_Destination_IP},%{Rule},%{Source_user},%{Destination_user},%{Application},%{Virtual_System},%{Source_zone},%{Destination_Zone},%{Inbound_Interface},%{Outbound_Interface},%{Log_Action},%{Time_Logged},%{Session_ID},%{Repeat_Count},%{Source_Port},%{Destination_Port},%{Nat_Source_Port},%{Nat_Destination_Port},%{Flags},%{IP_Protocol},%{Action},%{URL_FileName},%{Threat_Content_Name},%{Category},%{Severity},%{Direction},%{Sequence_Number},%{Action_Flags},%{Source_Country},%{Destination_Country},%{Cpadding},%{Contentype},%{Pcap_id},%{Filedigest},%{cloud},%{Url_idx},%{User_agent},%{Filetype},%{Xff},%{Referer},%{Sender},%{Subject},%{Recipient},%{Reportid},%{DG_Hierarchy_Level_1},%{DG_Hierarchy_Level_2},%{DG_Hierarchy_Level_3},%{DG_Hierarchy_Level_4},%{Virtual_System_Name},%{Device_Name},%{File_URL},%{Source_VM_UUID},%{Destination_VM_UUID},%{HTTP_method},%{Tunnel_ID_IMSI},%{Monitor_TAG_IMEI},%{Parent_Session_ID},%{Tunnel},%{Thr_Category},%{Contentver},%{Sig_Flags},%{SCTP_Association_ID},%{Payload_Protocol_ID},%{Http_headers},%{URL_Category_List},%{UUID_for_rule},%{HTTP2_connection}"
    }
  }
  date {
    timezone => "Europe/Paris"
    match => ["Receive_Time", "YYYY/MM/dd HH:mm:ss"]
  }
  mutate {
    convert => {
      "Serial"               => "integer"
      "Config_Version"       => "integer"
      "Session_ID"           => "integer"
      "Repeat_Count"         => "integer"
      "Source_Port"          => "integer"
      "Destination_Port"     => "integer"
      "Nat_Source_Port"      => "integer"
      "Nat_Destination_Port" => "integer"
      "Sequence_Number"      => "integer"
      "Pcap_id"              => "integer"
      "Url_idx"              => "integer"
      "Reportid"             => "integer"
      "DG_Hierarchy_Level_1" => "integer"
      "DG_Hierarchy_Level_2" => "integer"
      "DG_Hierarchy_Level_3" => "integer"
      "DG_Hierarchy_Level_4" => "integer"
      "Tunnel_ID_IMSI"       => "integer"
      "Parent_Session_ID"    => "integer"
      "Payload_Protocol_ID"  => "integer"
      "Http_headers"         => "integer"
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://10.100.8.1:9200"]
    index => "paloalto-%{+YYYY.MM.dd}"
    user => "foo"
    password => "password"
    manage_template => true
    template => "/usr/share/logstash/template/logstash-pfw-threat.json"
    template_name => "pfw-logstash"
    template_overwrite => true
  }
}
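For reference, a frequent cause of a 400 on `_template` against Elasticsearch 7 is a template file written for an older release, e.g. one using the removed top-level `template` key instead of `index_patterns`. A minimal 7.x-compatible version of the template file that maps the address fields as `ip` could look like the sketch below (the `paloalto-*` pattern and the choice of which fields to map are assumptions based on the config above, not the actual file content):

```shell
# Sketch of an Elasticsearch 7-compatible legacy template.
# ES 6+ replaced the top-level "template" key with "index_patterns";
# a file still using the old key is rejected with HTTP 400.
cat > /usr/share/logstash/template/logstash-pfw-threat.json <<'EOF'
{
  "index_patterns": ["paloalto-*"],
  "mappings": {
    "properties": {
      "Source_address":      { "type": "ip" },
      "Destination_address": { "type": "ip" },
      "NAT_Source_IP":       { "type": "ip" },
      "NAT_Destination_IP":  { "type": "ip" }
    }
  }
}
EOF
```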