Hey guys,
Inside "elasticsearch_deprecation.log" I see the occasional warning stating: [types removal] Specifying types in bulk requests is deprecated.
I want to remove this. I am running my Elastic Stack on v7.7. I looked online but keep going down rabbit holes; I can't seem to find anything that simply states a command or setting to resolve this.
Can someone shed some light on this?
Thanks.
dadoonet
(David Pilato)
May 22, 2020, 3:33pm
How are you indexing documents?
Hey there,
Apologies for the newbie response. I hope the below might answer this, as I don't exactly understand how to answer that question; I am still learning the Elastic Stack.
I used Kibana's Index Patterns. I have indices in the format "filebeat-", so I set up a pattern to match "filebeat-*".
Are you asking me to provide my Logstash setup, my Filebeat setup, or something else?
Whatever you need to assist me, I will provide.
Again, apologies in advance!
dadoonet
(David Pilato)
May 22, 2020, 4:58pm
I was asking about the tools you are using to index data into Elasticsearch. Is it Logstash or Filebeat?
Anyway, make sure you are using the latest version of those components.
Hey Dadoonet,
I am using Filebeat to send the data to Logstash.
Logstash is on the latest version; Filebeat actually is not, since I have many agents.
I will update Filebeat on the servers now and report back.
Hopefully that resolves this.
Thanks for the suggestion!
Hey Dadoonet,
I just updated Filebeat on all my servers.
I can confirm the entire Elastic Stack and its components are now all running on 7.7.0.
Unfortunately, I can still see that warning populating:
[types removal] Specifying types in bulk requests is deprecated.
Do I have to clean up my current indices, perhaps? Maybe they are contaminated?
Thanks.
dadoonet
(David Pilato)
May 22, 2020, 6:40pm
What are you using Logstash for?
What is its configuration?
Hey,
I have Logstash installed as a Windows service, and I run it with a parameter such as "-f C:\Logstash\logstash.conf". Inside logstash.conf is:
input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "check1" {
    grok {
      match => {
        "message" => [
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} - %{DATA:[fields][sourceontext]} %{GREEDYDATA:unparsed} : (?<message>.*?(?=[a-zA-Z.]*Exception:))%{GREEDYDATA:exception}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} - %{DATA:[fields][sourceontext]} %{GREEDYDATA:unparsed} : %{GREEDYDATA:message}"
        ]
      }
      overwrite => [ "message" ]
    }
  }
  else if [fields][log_type] == "check2" {
    grok {
      match => {
        "message" => [
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} : %{GREEDYDATA:message}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} %{DATA:[fields][sourceontext]} : %{GREEDYDATA:message}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} %{DATA:[fields][sourceontext]} : (?<message>.*?(?=[a-zA-Z.]*Exception:))%{GREEDYDATA:exception}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} %{DATA:[fields][sourceontext]} %{GREEDYDATA:unparsed} : %{GREEDYDATA:message}"
        ]
      }
      overwrite => [ "message" ]
    }
  }
  else if [fields][log_type] == "check3" {
    grok {
      match => {
        "message" => [
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:sessionid} %{IP:ip} %{WORD:level} %{DATA:[fields][sourceontext]} : %{GREEDYDATA:message}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:sessionid} %{WORD:ip} %{WORD:level} %{DATA:[fields][sourceontext]} %{GREEDYDATA:unparsed} : (?<message>.*?(?=[a-zA-Z.]*Exception:))%{GREEDYDATA:exception}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:sessionid} %{WORD:ip} %{WORD:level} %{DATA:[fields][sourceontext]} %{GREEDYDATA:unparsed} : %{GREEDYDATA:message}"
        ]
      }
      overwrite => [ "message" ]
    }
  }
  else if [fields][log_type] == "check4" {
    grok {
      match => {
        "message" => [
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{DATA:fieldid} %{WORD:level} %{DATA:loggingclass} %{DATA:[fields][sourceontext]} : %{GREEDYDATA:message}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{DATA:fieldid} %{WORD:level} %{DATA:[fields][sourceontext]} : %{GREEDYDATA:message}"
        ]
      }
      overwrite => [ "message" ]
    }
  }
  else {
    grok {
      match => {
        "message" => [
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} - %{DATA:[fields][sourceontext]} \[%{DATA:[fields][requestid]}\] %{GREEDYDATA:unparsed} : (?<message>.*?(?=[a-zA-Z.]*Exception:))%{GREEDYDATA:exception}",
          "(?m)%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:[fields][threadid]}\] %{WORD:level} - %{DATA:[fields][sourceontext]} \[%{DATA:[fields][requestid]}\] %{GREEDYDATA:unparsed} : %{GREEDYDATA:message}"
        ]
      }
      overwrite => [ "message" ]
    }
  }
  date {
    match => ["timestamp", "ISO8601"]
    remove_field => ["timestamp"]
  }
  json {
    skip_on_invalid_json => true
    source => "unparsed"
    target => "[fields][scope]"
  }
  mutate {
    remove_field => ["unparsed"]
  }
  if "beats_input_codec_plain_applied" in [tags] {
    mutate {
      remove_tag => ["beats_input_codec_plain_applied"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://ElasticSearch:9200"]
    ssl => true
    cacert => "file.pem"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    user => "${ES_USERNAME}"
    password => "${ES_PASSWORD}"
  }
}
My logs are formatted in several different ways, so the checks handle the different log types.
Thanks.
dadoonet
(David Pilato)
May 22, 2020, 7:58pm
I moved the question to #logstash, as it looks like Logstash is sending the type field in bulk requests, which I believe should be removed.
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-document_type
I guess it's there for compatibility reasons. Maybe the experts can comment.
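For what it's worth, the setting in question is document_type on the elasticsearch output. A minimal sketch of the output section (host and index pattern copied from the config above; the commented-out line is illustrative only) would leave it unset so that, assuming a sufficiently recent logstash-output-elasticsearch plugin, bulk requests are sent without a type:

```conf
output {
  elasticsearch {
    hosts => ["https://ElasticSearch:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # document_type => "doc"  # deprecated on 7.x: setting this puts a _type
    #                         # into every bulk request and can trigger the
    #                         # "[types removal]" deprecation warning
  }
}
```

If the warning persists with document_type unset, updating the output plugin itself (not just Logstash core) may help, since older plugin versions still sent a type for backwards compatibility.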
Badger
May 22, 2020, 9:14pm
@irishwill2008 that is a known issue. Hopefully it is fixable now.
system
(system)
Closed
June 19, 2020, 9:14pm
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.