tomer (tomer zaks) | February 27, 2017, 1:07pm | #1
I am new to the whole ELK stack. I have a field called "msgDeliveryTime" that I receive through JSON as a String even though it is an Integer. So I tried to learn grok and improvised the following:
input {
  beats {
    port => 5044
  }
}

filter {
  mutate {
    convert => { "msgDeliveryTime" => "integer" }
  }
}

output {
  elasticsearch { hosts => ["192.1.1.1:9200"] }
}
This did not budge the output at all.
I know that this is basic, but for some reason I am not able to google it. Any help would be great!
P.S.
My real goal is to make "msgDeliveryTime" my timestamp (it is given as mm)
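A minimal sketch of that goal, assuming "msgDeliveryTime" holds a Unix epoch in milliseconds (only the field name comes from the post; the format is an assumption): the date filter's UNIX_MS pattern can parse it into @timestamp.

filter {
  date {
    # Assumption: msgDeliveryTime is a Unix epoch in milliseconds
    match => ["msgDeliveryTime", "UNIX_MS"]
    # @timestamp is the default target, stated here for clarity
    target => "@timestamp"
  }
}

Note that the field must actually parse as a number for this to work; if it arrives empty or as "0" the resulting timestamp will be the epoch start.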
Comment out your elasticsearch output and insert a stdout { codec => rubydebug }
output instead. What results do you get in your log, i.e. what do your events really look like?
A possible problem is that the msgDeliveryTime
field already has been mapped as a string, and the mapping of fields can't be changed without reindexing.
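You can check how the field is currently mapped with a request like this (sketch; the host is taken from your output config, but the logstash-* index pattern is an assumption, so adjust it to your index name):

curl -XGET 'http://192.1.1.1:9200/logstash-*/_mapping?pretty'

If msgDeliveryTime shows up there as "type": "text" or "keyword", new documents will keep being indexed as strings in that index regardless of what Logstash sends, until you reindex with a corrected mapping.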
tomer (tomer zaks) | February 27, 2017, 1:40pm | #3
Thank you for the quick reply!
Well, I checked using "stdout { codec => rubydebug }" and the script works like before.
For confidentiality reasons my company might not want me to post a log here, but it is something like this:
{"dataCoding":0,"from":"123","identifier":"123","longMessageSequenceNumber":"123", ... (and many more fields like this) ... }
What does it mean that it is mapped?
For confidentiality reasons my company might not want me to post a log here
Then obfuscate the parts that are sensitive. I'm interested in exactly what the msgDeliveryTime
field looks like.
What does it mean that it is mapped?
Please read about mappings in the Elasticsearch documentation.
tomer (tomer zaks) | February 27, 2017, 1:48pm | #5
{"recordType":"MT","callingNumber":"123","callingImsi":"","callingMsc":"","billable":"","calledNumber":"123","calledImsi":"","calledMsc":"","msgSubmissionTime":"1484752652741","clientId":"","gmt1":"-1","msgDeliveryTime":"0","originatingProtocol":"MAP","gmt2":"-1","campignId":"","channel":"","destinationProtocol":"MAP","terminationCause":"STRING","transactionId":"123","msgLength":"0","concatenated":"FALSE","concatenatedFrom":"1","sequence":"0","priority":"","deferred":"","numOfAttemp":"0"}
Sorry about the confusion, but I need the msgSubmissionTime field as a long or int (I tried with msgSubmissionTime as well and also did not succeed).
I will read up on mappings.
Well, at least Logstash 2.4.0 is able to convert the field:
$ cat test.config
input { stdin { codec => json_lines } }
output { stdout { codec => rubydebug } }
filter {
  mutate {
    convert => ["msgSubmissionTime", "integer"]
  }
}
$ echo '{"msgSubmissionTime": "1484752652741"}' | /opt/logstash/bin/logstash -f test.config
Settings: Default pipeline workers: 8
Pipeline main started
{
    "msgSubmissionTime" => 1484752652741,
             "@version" => "1",
           "@timestamp" => "2017-02-27T15:14:25.741Z",
                 "host" => "lnxolofon"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
tomer (tomer zaks) | February 27, 2017, 3:40pm | #7
I see. Well, I am using Logstash 5.2.0, but I do not think that is the problem.
It is probably a configuration issue between Filebeat, Logstash, and Elasticsearch!
Many thanks!
system (system) | Closed | March 27, 2017, 3:40pm | #8
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.