Logstash dynamic id


(Gh) #1

I am using Elasticsearch + Logstash + Kibana.
I have a question. I get log data like this:
2018-03-01 18:03:13.504 INFO 30613 --- [nio-8040-exec-3] com.gh.controller.LogController : log/SearchIndex requestdata:|index:oodkfke|type:user|id=1|data={"data1":"data11","data2":"data22"}|

I want to use oodkfke as the index, user as the type, and 1 as the id, then save the data into Elasticsearch under that index, type, and id.

You can look at my Logstash config file below; I don't know where the error is.
By the way, I am not good at grok.

input {
  file {
    type => "tomcatlog"
    path => ["/usr/local/tomcat8/logs/catalina.out"]
    discover_interval => 1
    #start_position => "beginning"
    sincedb_path => "/usr/local/logstash/config/sincedb_order.txt"
    sincedb_write_interval => 1
    codec => multiline {
      charset => "UTF-8"
      pattern => "^%{DATESTAMP_CN}"
      negate => true
      what => "next"
    }
  }
  file {
    type => "demon"
    path => ["/usr/local/elasticsearch/log/demon.log"]
    discover_interval => 1
    sincedb_path => "/usr/local/logstash/config/sincedb_demon.txt"
    sincedb_write_interval => 1
    codec => multiline {
      charset => "UTF-8"
      pattern => "^%{DATESTAMP_CN}"
      negate => true
      what => "next"
    }
  }
}

Output to elasticsearch:

filter {
  if [type] == "tomcatlog" {
    grok {
      match => { "message" => "%{DATESTAMP_CN:[@metadata][logdate]} .* | %{WORD:index}|%{WORD:type}|id:%{DATA:id}|%{WORD:data}|" }
    }
    ruby {
      code => "event.set('indexid', '%{id}');
               event.set('typeid', '%{id}');
               event.set('idid', '%{id}');
               event.set('dataid', '%{id}')"
    }
  } else if [type] == "demon" {
  }
}
output {
  if "_grokparsefailure" not in [tags] and "_dateparsefailure" not in [tags] {
    elasticsearch {
      index => "oodkfkd"
      document_type => "syslog"
      document_id => "%{idid}"
      hosts => ["192.168.198.132:9200"]
      manage_template => true
      template_overwrite => true
      template_name => "orderservice"
      template => "/usr/local/my_logstash_template/orderservice_template.json"
    }
  } else {
    elasticsearch {
      hosts => ["192.168.198.132:9200"]
      index => "err"
      document_type => "err"
    }
  }
  stdout {
    # 1: output the collected data to elasticsearch; 2: output in JSON format
    codec => rubydebug
    #codec => json_lines
  }
}
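A quick way to check the field extraction outside Logstash is a plain Ruby regex (a sketch, not the grok engine itself; the same named-capture syntax works in grok). Two things stand out in the pattern above: the `|` characters are unescaped, so they act as regex alternation rather than literal pipes, and the log line uses `id=1` while the pattern expects `id:`.

```ruby
# Hypothetical offline check of the extraction, assuming the log line from the post.
line = '2018-03-01 18:03:13.504 INFO 30613 --- [nio-8040-exec-3] ' \
       'com.gh.controller.LogController : log/SearchIndex ' \
       'requestdata:|index:oodkfke|type:user|id=1|data={"data1":"data11","data2":"data22"}|'

# Literal pipes must be escaped as \| and the log uses "id=", not "id:".
m = line.match(/\|index:(?<index>\w+)\|type:(?<type>\w+)\|id=(?<id>[^|]+)\|data=(?<data>[^|]+)\|/)

puts m[:index]  # oodkfke
puts m[:type]   # user
puts m[:id]     # 1
```

The same fixes carried into grok would look like `\|index:%{WORD:index}\|type:%{WORD:type}\|id=%{DATA:id}\|`, though field names and details here are assumptions to illustrate the escaping.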


(Gh) #2

I have seen this, but I don't know why it works on his server:

input {
  tcp{
    port=>3362
    type="mf_data"
    codec=>"json_lines"
  }
}

filter{
  json{source=>"message"}
  grok{match=>"message","documentID:%{DATA:documentID}"]}
}

output{
  elasticsearch{
    host=>"localhost"
    index_type=>"customType"
    index=>"event_%{documentID}"
  }
}
Input is {"domain":"test.com","documentID":"cAmii"}
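As quoted, that snippet would not even load: `type="mf_data"` is missing `=>`, the `grok` line is a syntax error, and `host`/`index_type` are not current option names. A cleaned-up sketch of what it was presumably meant to be (option names assumed for a modern Logstash release):

```
input {
  tcp {
    port  => 3362
    type  => "mf_data"            # was: type="mf_data" (missing =>)
    codec => "json_lines"
  }
}
filter {
  # The json_lines codec already parses each event, so a json filter is redundant;
  # a syntactically valid form of the quoted grok would be:
  grok {
    match => { "message" => "documentID:%{DATA:documentID}" }
  }
}
output {
  elasticsearch {
    hosts         => ["localhost:9200"]     # was: host => "localhost"
    document_type => "customType"           # index_type was removed long ago
    index         => "event_%{documentID}"
  }
}
```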


(Tag V) #3

Your question seems incomplete. What went wrong with your config, and what exactly is your requirement? If you got any errors, post them here.


(Gh) #4

My bad, there is no error.
This is the log: 2018-03-01 18:03:13.504 INFO 30613 --- [nio-8040-exec-3] com.gh.controller.LogController : log/SearchIndex request data:|index:oodkfke|type:user|id=1|data={"data1":"data11","data2":"data22"}|
I want to set the Elasticsearch _id to 1, the index to oodkfke, and the type to user,
but I cannot get the values out of the log data.



(Tag V) #5

I think this is your requirement:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "user"
    document_id => "%{documentID}"   # documentID you are getting from grok
  }
}

(Gh) #6

I have tried that, but it has no effect; the id is still the Elasticsearch default UUID.


(Gh) #7
My log data is :slight_smile:

2018-03-01 18:03:13.504 INFO 30613 --- [nio-8040-exec-3] com.gh.controller.LogController : log/SearchIndex requestdata:|index:oodkfke|type:user|id=1|data={"data1":"data11","data2":"data22"}|

My grok is:

   match => { "message" => "%{DATESTAMP_CN:[@metadata][logdate]} .* | %{WORD:index}|%{WORD:type}|id:%{DATA:id}|%{WORD:data}|" }

You mean I should use this?
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "user"
    document_id => "%{id}"
  }
}
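If the grok capture works, all three values can be referenced with the same sprintf syntax directly in the output. A sketch, assuming the captures are renamed first: grok's `%{WORD:type}` would otherwise overwrite the event's own `type` field that the file input sets to "tomcatlog", so names like `es_index`/`es_type`/`es_id` are safer:

```
output {
  elasticsearch {
    hosts         => ["192.168.198.132:9200"]
    index         => "%{es_index}"   # oodkfke from the log line
    document_type => "%{es_type}"    # user
    document_id   => "%{es_id}"      # 1
  }
}
```

With this, the ruby filter that copies `id` into `idid` is unnecessary; sprintf references read the grok-captured fields directly.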


(Tag V) #8

Yep. It will take the value of the field "id", which the grok pattern creates for every message, and pass it to the elasticsearch output. Post the error if you get any.



(Gh) #9

The error is that idid is nil, and _id = %{idid}.
Obviously, the value is not being copied into idid.
I doubt whether my grok is right.
I put my grok pattern and input into this website, http://grokdebug.herokuapp.com/?#, but got no response, so I cannot judge whether my grok is right or not.


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.