Nmap plugin/module: still working?

Has anyone had any luck getting the Nmap scan XML codec/plugin to work with v7?

I've got it reading the data in with the plugin, but the template won't import, and the data is pretty rough around the edges.

Any suggestions would be appreciated.

Hi,

Can you be more precise about the error? How do you import the template?

By the way, I didn't know about this plugin. Are we talking about this one: https://github.com/logstash-plugins/logstash-codec-nmap/tree/v0.0.21 ?

I found it here: https://www.elastic.co/guide/en/logstash/current/plugins-codecs-nmap.html which links to: https://github.com/logstash-plugins/logstash-codec-nmap

I'm using the template straight off GitHub with this command:

curl -XPUT -H 'Content-Type: application/json' http://192.168.1.60:9200/_template/nmap_template -d@elasticsearch_nmap_template.json

The result/error I get:
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters: [nmap_port : {properties={addresses={properties={addr={index=not_analyzed, type=string}, type={index=not_analyzed, type=string}}}, address={index=not_analyzed, type=string}, geoip={properties={timezone={index=not_analyzed, type=string}, area_code={index=not_analyzed, type=string}, ip={type=ip}, latitude={type=double}, continent_code={index=not_analyzed, type=string}, city_name={type=string}, country_code2={index=not_analyzed, type=string}, country_name={index=not_analyzed, type=string}, dma_code={type=integer}, country_code3={index=not_analyzed, type=string}, location={type=geo_point}, ...LOTS LOTS MORE... run_stats={properties={elapsed={type=double}, summary={index=not_analyzed, type=string}, end_time={index=not_analyzed, type=string}, exit_status={index=not_analyzed, type=string}}}, type={index=not_analyzed, type=string}, version={index=not_analyzed, type=string}, tags={index=not_analyzed, type=string}}}]"}},"status":400}

I have the same issue ... running Logstash 7.6.2

{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"Root mapping definition has unsupported parameters:  [nmap_port : {properties={addresses={properties={addr={index=not_analyzed, type=string}, type={index=not_analyzed, type=string}}}, address={index=not_analyzed, type=string}, geoip={properties={timezone={index=not_analyzed, type=string}, area_code={index=not_analyzed, type=string}, ip={type=ip}, latitude={type=double}, continent_code={index=not_analyzed, type=string}, city_name={type=string}, country_code2={index=not_analyzed, type=string}, country_name={index=not_analyzed, type=string}, dma_code={type=integer}, country_code3={index=not_analyzed, type=string}, location={type=geo_point}, region_name={index=not_analyzed, type=string}, real_region_name={index=not_analyzed, type=string},

There's also this guide ...

Hi,

I took a look at the mapping and it needs to be updated: the template still declares a custom mapping type (nmap_port) and uses string/not_analyzed fields, both of which were removed by Elasticsearch 7.

I might dive into it, but you could try to re-create it with the new standards.
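As a rough, untested starting point, here is a trimmed-down sketch of what the template body could look like on 7.x. Only the handful of fields visible in your error are shown, the index pattern is an assumption, and you would still need to convert the rest of the fields the same way (string/not_analyzed becomes keyword, analyzed string becomes text, and the nmap_port type level disappears). It can be pushed with the same curl -XPUT command you used:

{
  "index_patterns": ["nmap*"],
  "mappings": {
    "properties": {
      "address": { "type": "keyword" },
      "addresses": {
        "properties": {
          "addr": { "type": "keyword" },
          "type": { "type": "keyword" }
        }
      },
      "geoip": {
        "properties": {
          "ip": { "type": "ip" },
          "location": { "type": "geo_point" },
          "latitude": { "type": "double" },
          "city_name": { "type": "text" },
          "country_name": { "type": "keyword" }
        }
      },
      "run_stats": {
        "properties": {
          "elapsed": { "type": "double" },
          "summary": { "type": "keyword" },
          "end_time": { "type": "keyword" },
          "exit_status": { "type": "keyword" }
        }
      },
      "type": { "type": "keyword" },
      "version": { "type": "keyword" },
      "tags": { "type": "keyword" }
    }
  }
}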

I found a little Linux terminal program that converts the XML to JSON, so I wrote a little script that runs through and converts all my XML scans to JSON. Then I have a Logstash pipeline that takes those into ES. It's not an ideal solution, but it does work.
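Roughly, the pipeline looks something like this (the path and index name are just placeholders, not my exact config, and it assumes the converted files have one JSON document per line):

input {
  file {
    path => "/data/nmap/json/*.json"    # hypothetical location of the converted scans
    start_position => "beginning"
    sincedb_path => "/dev/null"         # re-read files every run; drop this outside testing
    codec => "json"                     # assumes one JSON document per line
  }
}

filter {
  # enrichment goes here, e.g. geoip { source => "address" } if that field exists in your JSON
}

output {
  elasticsearch {
    hosts => ["http://192.168.1.60:9200"]
    index => "nmap-%{+YYYY.MM.dd}"
  }
}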

Hi,

Glad you made it work! I took a look at this plugin a while ago for a similar purpose and was considering using it too.

nmap2json works well. It converts the XML to JSON, then you just run it through Logstash for more enrichment. I'm integrating the Shodan API now; lookingglass will be next. I wish there was a better way, but it's good enough.

Don't hesitate to use the prune filter plugin when dealing with JSON; sometimes a lot of useless fields can be removed easily :wink:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-prune.html
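For example, something along these lines (the field names here are just illustrative, not taken from the actual nmap output):

filter {
  prune {
    # keep only the fields you care about; everything not matching is dropped
    whitelist_names => ["^@timestamp$", "^address$", "^ports$", "^geoip$"]
  }
}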

Thank you, I was not aware of this filter. I'll check it out.
