Can you give an example of how to send UDP data to Elasticsearch with Logstash?
How do I configure a udp input on port 3232, and how should the filter and the Elasticsearch output be configured?
It's about the .conf file settings for Logstash.
Thanks so much.

What have you tried so far?

How should the filter's grok match be configured?

You have not shown what you have tried nor what the data received looks like, so I am not sure how I can help.

@Christian_Dahlqvist
The data looks like this:
1521102697403000002|!xxx|!11.xx.xx.ipv4|! |! |!1521102697021|!1521102697403|!xx.xx.xxx.xxx|!|!4|!61894|!xx.xx.xxx.ipv4|!|!4|!531|!xx.xx.xx.ipv4|!514|!EE:FF:FF:FF:FF:FF|!MACADD|!24|!tcp|!xxx|!17=0 18=0 19=0 20=0 21=0 22=0 23=0 24=0 25=0 26=0 27=0 28=0 29=0 30=0 31=0 32=0 33=11844|! |!
How do I parse it?

This looks like csv-formatted data. Why not try a csv filter with |! as the separator? You can then apply a kv filter to the field containing 17=0 18=0 19=0 20=0 21=0 22=0 23=0 24=0 25=0 26=0 27=0 28=0 29=0 30=0 31=0 32=0 33=11844 if you want to parse these out as fields too.

This is Logstash UDP data, not a CSV file.
Can you give an example of how to configure the UDP input for this data?

Some of the fields are timestamps.
Which patterns can I use?
@Christian_Dahlqvist

If the data received over UDP is in csv format (which it looks like), you can use the csv filter to parse it. I do not think you necessarily need to use grok here.

OK, I will try.
The Elastic docs won't open right now. :joy:
Can you give a csv filter example for my data?
@Christian_Dahlqvist

Try something like this:

input {
  udp {
    port => 3232
  } 
} 

filter {
  csv {
    columns => ["c1", "c2", "c3", "c4", "c5", "c6", "c7", "c8", "c9", "c10", "c11", "c12", "c13", "c14", "c15", "c16", "c17", "c18", "c19", "c20", "c21", "c22", "c23", "c24", "c25"]
    separator => "|!"
  }
}

output {
  stdout { codec => rubydebug }
}

Rename the columns as appropriate, then use a mutate filter to cast numbers and a date filter to parse the appropriate timestamp.
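
A minimal sketch of those two steps plus an Elasticsearch output, assuming c7 holds an epoch-milliseconds timestamp (the sample data contains values like 1521102697403) and c10 is one of the numeric columns — adjust the field names, hosts, and index to your setup:

```
filter {
  mutate {
    # assumption: c10 is one of the numeric columns
    convert => { "c10" => "integer" }
  }
  date {
    # assumption: c7 holds an epoch-milliseconds timestamp
    match => ["c7", "UNIX_MS"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # adjust to your cluster
    index => "udp-data-%{+YYYY.MM.dd}"   # hypothetical index name
  }
}
```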

How do I do this with kv?

Look at the docs for details. Something like this might work:

kv {
  source => "c23"
}

Not sure the source field is correct...

Yes, the kv data is in column 23.
How do I parse it?
In 26=0, the 26 is the id and 0 is the value.

Thanks so much.

@Christian_Dahlqvist

Now in Kibana the data looks like:
"_source": {
"1": "1",
"2": "0",
"3": "0",
"4": "0",
"5": "1",
"6": "0",
"7": "1",
"8": "0",
"9": "0",
"10": "0",
"11": "0",
"12": "0",
"13": "0",
"14": "0",
"15": "0",
"16": "0",
"code": "1=1 2=0 3=0 4=0 5=1 6=0 7=1 8=0 9=0 10=0 11=0 12=0 13=0 14=0 15=0 16=0",

How do I change "16": "0" to "testname": "0" — i.e. use a custom name instead of the numeric id?
Should this be done in the Logstash configuration file or in the Elasticsearch mapping?
How can I do it? Thanks.
@Christian_Dahlqvist
@warkolm

I suspect you may need to use a ruby filter to change the field names.
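
A sketch of such a ruby filter, assuming the kv fields ended up at the top level of the event with numeric names, and using a hypothetical mapping from "16" to "testname" (the Logstash ruby filter exposes the event via get/set/remove):

```
filter {
  ruby {
    code => '
      # hypothetical mapping from numeric ids to custom field names
      names = { "16" => "testname" }
      names.each do |old_name, new_name|
        if event.include?(old_name)
          event.set(new_name, event.get(old_name))
          event.remove(old_name)
        end
      end
    '
  }
}
```

For a small, fixed set of fields, a mutate filter with rename => { "16" => "testname" } would also work without any Ruby code.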

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.