Problem converting string to IP


(SatishKumar) #1

Hi, I tried to convert a string field to the ip type by following the steps provided in other threads, but it's not working for me. I am getting the following error when starting Logstash.

08:24:50.444 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Installing elasticsearch template to _template/logstash
08:24:50.486 [[main]-pipeline-manager] ERROR logstash.outputs.elasticsearch - Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://localhost:9200/_template/logstash'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError"}
08:24:50.488 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}

I created a template.json as below, with the ip type specified:

{
  "template": "logstash-*",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index": {
      "query": { "default_field": "@message" },
      "store": { "compress": { "stored": true, "tv": true } }
    }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_source": { "compress": true },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "mapping": { "type": "string", "index": "not_analyzed" },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "@fields": { "type": "object", "dynamic": true, "path": "full" },
        "@message": { "type": "string", "index": "analyzed" },
        "@source": { "type": "string", "index": "not_analyzed" },
        "@source_host": { "type": "string", "index": "not_analyzed" },
        "@source_path": { "type": "string", "index": "not_analyzed" },
        "@tags": { "type": "string", "index": "not_analyzed" },
        "@timestamp": { "type": "date", "index": "not_analyzed" },
        "@type": { "type": "string", "index": "not_analyzed" },
        "ipAddress": { "type": "ip" }
      }
    }
  }
}

I specified the following in the output section:

output {
  stdout { codec => "json" }
  elasticsearch {
    hosts => ["localhost:9200"]
    template => "C:/satish/satish/template.json"
    template_overwrite => true
    codec => "json"
    index => "logstash-%{secEventType}"
  }
}
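The index name here is expanded per event from the secEventType field, so this output should produce indices like logstash-authentication and logstash-dataaccess. Once events flow, the resulting indices can be listed (a sketch, assuming the default host above):

```shell
# Lists all indices matching logstash-*, one per secEventType value.
curl -XGET "http://localhost:9200/_cat/indices/logstash-*?v"
```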

Can you help me resolve this issue?


(Magnus Bäck) #2

I'm surprised there aren't any better error messages in the Logstash log. Have you looked for more clues in the Elasticsearch log?


(SatishKumar) #3

Hi Magnus,

I didn't get any info in the Elasticsearch log. This is the complete Logstash log from the console:

08:24:50.251 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
08:24:50.255 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x68930d0f URL:http://localhost:9200>, :healthcheck_path=>"/"}
08:24:50.388 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x68930d0f URL:http://localhost:9200>}
08:24:50.389 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>"C:/satish/satish/template.json"}
08:24:50.438 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"number_of_shards"=>1, "number_of_replicas"=>0, "index"=>{"query"=>{"default_field"=>"@message"}, "store"=>{"compress"=>{"stored"=>true, "tv"=>true}}}}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>false}, "_source"=>{"compress"=>true}, "dynamic_templates"=>[{"string_template"=>{"match"=>"*", "mapping"=>{"type"=>"string", "index"=>"not_analyzed"}, "match_mapping_type"=>"string"}}], "properties"=>{"@fields"=>{"type"=>"object", "dynamic"=>true, "path"=>"full"}, "@message"=>{"type"=>"string", "index"=>"analyzed"}, "@source"=>{"type"=>"string", "index"=>"not_analyzed"}, "@source_host"=>{"type"=>"string", "index"=>"not_analyzed"}, "@source_path"=>{"type"=>"string", "index"=>"not_analyzed"}, "@tags"=>{"type"=>"string", "index"=>"not_analyzed"}, "@timestamp"=>{"type"=>"date", "index"=>"not_analyzed"}, "@type"=>{"type"=>"string", "index"=>"not_analyzed"}, "ipAddress"=>{"type"=>"ip"}}}}}}
08:24:50.444 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Installing elasticsearch template to _template/logstash
08:24:50.486 [[main]-pipeline-manager] ERROR logstash.outputs.elasticsearch - Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://localhost:9200/_template/logstash'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError"}
08:24:50.488 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
08:24:50.491 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
08:24:50.496 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
{"ipaddress":"169.254.133.10","secEventType":"authentication","messageID":"messageID0","userName":"userName0","Info":"Info0","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:088","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.239Z","info1":"abc0","@version":"1","host":"L24660WIN","ID":"000891b7-1907-40ed-a1d5-d0a51be07ad3","info2":"xyz0"}
{"ipaddress":"169.254.133.10","secEventType":"dataaccess","messageID":"messageID1","userName":"userName1","Info":"Info1","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:089","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.501Z","info1":"abc1","@version":"1","host":"L24660WIN","ID":"ffd1a034-022e-44a5-9aa0-0f9312db8b38","info2":"xyz1"}
{"ipaddress":"169.254.133.10","secEventType":"authentication","messageID":"messageID2","userName":"userName2","Info":"Info2","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:089","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.503Z","info1":"abc2","@version":"1","host":"L24660WIN","ID":"62331fcc-562b-4bd8-9c5f-b80a860b02ed","info2":"xyz2"}
{"ipaddress":"169.254.133.10","secEventType":"dataaccess","messageID":"messageID3","userName":"userName3","Info":"Info3","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:089","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.506Z","info1":"abc3","@version":"1","host":"L24660WIN","ID":"7bca07ca-cbbf-42ae-b085-0ba1ec2cc7d0","info2":"xyz3"}
{"ipaddress":"169.254.133.10","secEventType":"authentication","messageID":"messageID4","userName":"userName4","Info":"Info4","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:089","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.506Z","info1":"abc4","@version":"1","host":"L24660WIN","ID":"59ad02fb-4f62-411b-ad59-d825ba045a32","info2":"xyz4"}
{"ipaddress":"169.254.133.10","secEventType":"dataaccess","messageID":"messageID5","userName":"userName5","Info":"Info5","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:090","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.509Z","info1":"abc5","@version":"1","host":"L24660WIN","ID":"e91e31d1-a31f-458c-8b7e-761161f340c1","info2":"xyz5"}
{"ipaddress":"169.254.133.10","secEventType":"authentication","messageID":"messageID6","userName":"userName6","Info":"Info6","tags":["_dateparsefailure"],"timeStamp":"Tue 2017 Jan 24, 02:36:31:090","path":"C:/satish/SecurityEvent-2017-Jan-24.log","@timestamp":"2017-01-24T02:54:50.510Z","info1":"abc6","@version":"1","host":"L24660WIN","ID":"b7d8fc2c-aed0-40a2-87b8-53d5e45a901e","info2":"xyz6"}
08:24:50.624 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}


(Magnus Bäck) #4

Okay. Then try posting your template file to ES yourself using curl or whatever REST client you prefer to use. Then you'll get any error messages straight from ES.
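A minimal sketch of that, assuming the default host and the template path from the posts above:

```shell
# -i prints the HTTP status line; the response body contains
# Elasticsearch's actual explanation of the 400 error.
curl -i -XPUT "http://localhost:9200/_template/logstash" \
  -H "Content-Type: application/json" \
  -d @C:/satish/satish/template.json
```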


(SatishKumar) #5

Hi Magnus,

It's finally resolved. When I POSTed the template directly, Elasticsearch reported errors on certain mappings and settings; after fixing them, it started working.

The following is the working template:

{
  "template": "logstash-*",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index": {
      "query": { "default_field": "@message" }
    }
  },
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "mapping": { "type": "string", "index": "not_analyzed" },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "@fields": { "type": "object", "dynamic": true },
        "@message": { "type": "string", "index": "analyzed" },
        "@source": { "type": "string", "index": "not_analyzed" },
        "@source_host": { "type": "string", "index": "not_analyzed" },
        "@source_path": { "type": "string", "index": "not_analyzed" },
        "@tags": { "type": "string", "index": "not_analyzed" },
        "@timestamp": { "type": "date", "index": "not_analyzed" },
        "@type": { "type": "string", "index": "not_analyzed" },
        "ipAddress": { "type": "ip", "index": "analyzed" }
      }
    }
  }
}
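To confirm the template actually landed in Elasticsearch, it can be fetched back (a sketch, default host assumed):

```shell
# Returns the installed template; an empty object {} means
# the template was never stored.
curl -XGET "http://localhost:9200/_template/logstash?pretty"
```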
Thanks for the suggestions.


(John Jones) #6

I'm not by any means good at this yet, but this sticks out:

"ipAddress" : {"type" : "ip", "index" : "analyzed"}

Shouldn't that be formatted like the others, with "@"?


(Magnus Bäck) #7

Shouldn't that be formatted like the others, with "@"?

For consistency perhaps, but otherwise no. It's actually quite unusual that @satishsnv has chosen to prefix the fields like that.


(SatishKumar) #8

All the other fields are part of the default template; I added ipAddress, which is a field in my log. If I use @ipAddress, I will get a new field of type ip, but my ipAddress field will remain a string. To avoid creating a new field, I didn't use "@". Does this have any impact?


(Magnus Bäck) #9

All the other fields are part of the default template,

For Logstash 0.x and maybe early 1.x releases perhaps. The only @ fields in reasonably recent releases are @timestamp and @version.

I added ipAddress, which is a field in my log. If I use @ipAddress, I will get a new field of type ip, but my ipAddress field will remain a string. To avoid creating a new field, I didn't use "@". Does this have any impact?

The name itself doesn't matter, but how the field you're using for IP addresses is mapped obviously matters.
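One way to check how the field actually got mapped (a sketch; the index name logstash-authentication is an assumption based on the index pattern in the config above):

```shell
# Shows the concrete mapping each field received in the index,
# so you can verify that ipAddress really came out as type "ip".
curl -XGET "http://localhost:9200/logstash-authentication/_mapping?pretty"
```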


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.